Once a field of self-taught hackers, cybersecurity education shifts to universities

Over the past year, colleges and universities across the country have received millions in funding from the government and foundations to launch cybersecurity initiatives. The result is a stark change for an industry made up of programmers who have often learned by trial and error.

Stanford University, pictured above, received an infusion of money last year from the William and Flora Hewlett Foundation to initiate new programs focused on cybersecurity policy.

AP/File

April 22, 2015

For years, the best way to learn about your computer was to take a screwdriver to it.

That’s how Jon Miller learned cybersecurity: trial and error, advice from friends, and constant tinkering. In the 1980s and early 1990s, that’s how everyone else did it, too. Network security was self-taught in basements and bedrooms. And it worked pretty well. Without formal training, Mr. Miller worked his way into a role as vice president of strategy at the security firm Cylance. He’s lectured at colleges – without ever taking a class in one.

“I learned from getting new hardware and problem solving,” says Miller. “But if colleges had offered courses in security at that high level, I would have taken them.”


There's a sea change occurring in how information security is taught. Millions of dollars are pouring into universities to launch cybersecurity programs. While cybersecurity is still an industry that celebrates self-taught outsiders and hackers working for good, the future is sure to bring more engineers and specialists trained in the classroom.

It's been a year of incredible growth for college departments in the field. A host of schools such as Utah Valley University, the University of Texas at El Paso, Missouri State, and the University of Tampa launched cybersecurity degree programs. A multimillion-dollar, multiuniversity National Science Foundation cryptography initiative broke ground at top universities across the country. A bevy of community colleges announced specialized training programs. And the Obama administration announced a $25 million grant for historically black universities to train students in cybersecurity.

It's a stark change from when Sushil Jajodia launched the country's first academic center devoted to computer security, the Center for Secure Information Systems at George Mason University, more than two decades ago.

“When we started, there was no World Wide Web,” says Dr. Jajodia, who opened the center in 1990 and still runs it today. “There were no security products, and no security classes. The only people interested in our work were the Department of Defense and National Security Agency for the protection of their networks.”

In 2000, a decade after Jajodia started his George Mason center, he was the main interviewee in a Chronicle of Higher Education story on the “struggle” to establish cybersecurity education programs in colleges. “I don’t see things getting better in the near term,” he told the publication. At the time, there were only a handful of programs, and the few professors in the field were regularly poached for higher-paying industry jobs.


“If they have one course I’m happy,” Jajodia said at the time, of computer science departments around the country.

Today he's happy (well, happier). This year’s undergraduate catalog at the University of Alabama lists six classes mentioning “security” in the computer science department. In 2008 it listed zero. 

But for all the growth over the past few years, nothing nurtured higher education’s interest in cybersecurity like the recent growth of threats. “Four or five months ago, I went to the dean and said, 'This is the time we’ve been waiting for! Finally the awareness has hit everyone,'” Jajodia says.

Balancing academics with real-world training

There are real advantages to learning a subject such as cybersecurity in a school rather than in a basement workshop.

Throughout history, novices turning to online cliques of experts have been told – often in colorful ways – to “read the manual.” For people who broke through, including Miller, being self-taught means occasional knowledge gaps.

Miller also acknowledges that he got lucky by being able to afford equipment – there has always been a high financial barrier to learning cybersecurity. And without today’s gigantic code repositories (GitHub wouldn’t launch until 2008), learning demanded patience and an ability to pick things up quickly. Some of those hurdles can be overcome with the help of teachers, standardized curricula, computer labs, and textbooks.

While Jajodia and other academics welcome the new attention to their field, they still worry about schools offering computer science and engineering degrees without integrating security training. It baffles Jajodia that students can still take a software design class that doesn’t incorporate principles of security. That, he says, is a problem in the academic perception of cybersecurity – it’s still seen as a separate discipline from mainstream computer science.

And while schools work to shoehorn cybersecurity into computer and software design, they also face an issue that reports from the White House, the nonprofit RAND Corp., and the Association for Computing Machinery all describe as the tent-post problem: Classes often devote more time to academic theories about cybersecurity than to training in the real-world scenarios professionals will face.

The state of Michigan, along with the tech nonprofit Merit, got ahead of this problem. In 2012, Michigan launched the “Cyber Range,” a "live fire" facility many of its colleges use to simulate real-world cyberattacks. But that is not a common feature in higher education. Even as funding flows into cybersecurity education, Jajodia worries the money will be earmarked solely for research rather than for improving training.

“Professors,” he says, “don’t have the modern tools students need to train on.”

New focus on cybersecurity policy

But with the influx of new funding, some schools are taking cybersecurity education in bold directions, including new degrees in law and policy.

The Massachusetts Institute of Technology, for instance, is using a $15 million grant from the Hewlett Foundation to launch a Cybersecurity Policy Initiative. The grant is one of three that Hewlett announced last year for cybersecurity efforts at MIT, Stanford University, and the University of California at Berkeley.

Daniel Weitzner is heading up the MIT center. As a former deputy chief technology officer at the White House and the founder of the advocacy group the Center for Democracy and Technology, he's familiar with the policy issues and process surrounding cybersecurity. That experience has given him deep insight into the current need for educated government advisers and for the policy research necessary to inform them. 

“Imagine if the chairman of the Federal Reserve had absolutely no guidance in what the outcome of their policies would be. We’d never let them say, 'Oh, let’s just raise the interest rate half a percent and see what happens.' That's where we are today," says Dr. Weitzner. Programs like his, then, could have a tangible effect on national policy.

The MIT program will focus on metrics, ways to calculate the effects of different actions.

Beyond helping society make more informed choices, he says, programs such as his will prepare students for a growing number of unfilled positions in a rapidly expanding field.

“You see it in big companies – the Googles – looking to hire students with a computer science background and a public policy background,” he says. And the Center for Democracy and Technology, he says, "is now hiring as many lawyers as computer scientists."

The law school and the Center for Health and Homeland Security at the University of Maryland are partnering to offer a new master's degree in cybersecurity to fill the same need, which they also see as massive. As Michael Greenberger, director of the CHHS, puts it: “In our consulting work, we see continued interest in cybersecurity law. We are doing this because there is a demand for knowledgeable people. We aren’t just doing this for tuition.”

Will classrooms replace self-taught hacking? 

Mr. Greenberger's comments about tuition represent a serious concern among some security experts. Jajodia, the George Mason professor, worries that the thirst for cybersecurity programs will encourage schools looking to add prestige or cash flow to offer inferior programs. It's not so much the traditional universities he worries about. It's for-profit schools.

But whatever the school, there's no real consensus on how graduates who wait until college to start learning will fit into an industry largely staffed by the self-taught, or on whether there will be a culture clash or gaps in the quality of their education. With a growing job market, new modes of education aren't just inevitable – they're necessary. But there may never be a full shift to the academy, and it's doubtful the industry would want one.

“Some of our best people have no security background,” says Jay Kaplan, chief executive officer of the penetration-testing firm Synack. “They’re motivated software or hardware engineers with an interest in security. Some of the best experts didn’t train for this or practice it full time.”

Synack operates as a curated network of freelancers, so Mr. Kaplan has recruited a lot of cybersecurity professionals over the years. And he has no interest in where their knowledge base came from. He tests applicants with simulated security projects. University of Phoenix students face the same quiz as ones from Harvard University – the same quiz self-taught hackers take. 

Mr. Kaplan speaks from experience, having learned cybersecurity both at a university and on the job. He went to George Washington University during the last influx of money into cybersecurity education, as one of the first few students to receive the Clinton-era “CyberCorps” scholarship intended to create a well-trained army of computer-savvy public servants. The scholarship, which is still around today, allows students to attend National Security Agency-certified “National Centers of Academic Excellence” in information assurance or equivalent programs. When Kaplan applied to college in 2003, there were only a handful of qualifying schools. Now there are more than 100.

He earned a bachelor’s degree in computer science before working as an internal network analyst at the NSA, but the job required skills different from the conceptual academic training he received in school. So Kaplan had to pick up practical skills in the workplace. “Obviously it required a very technical understanding very specific to the job,” says Kaplan. “It’s a common problem for people coming from a theoretical background.”

There is, he says, one advantage to learning pen testing in school rather than through a lifetime of hacking: it’s tougher to trust someone to guard your network whose last job was breaking into it. “At conferences, I ask them where they live and what they do for a living. If they don’t have a good answer to how they live in a nice area but have no job, I can’t hire them.”

But will universities exceed, or even meet, the self-trained educations of years past? From his office at Cylance, Jon Miller is optimistic but not entirely sold.

“I don’t think a four-year program alone will by any means be enough. I’m a strong believer in putting in the time, and students need to put in a lot more time outside of class,” he says. Self-trained security professionals spend lifetimes learning a field colleges claim to teach in four years. “But kids now have had next-generation Internet all of their lives. It’s possible they could be prepared to learn cybersecurity in four years.”