It's been a bit over a year since Heartbleed was disclosed in April 2014, the first in a series of headaches for open source software users. In this two-part interview, I sat down with Susan Sons, senior systems analyst at Indiana University's Center for Applied Cybersecurity Research (CACR), to discuss what went wrong with Heartbleed, Shellshock, and Poodle, and what is being done to fix it.
At CACR, Susan helps NSF-funded science and infrastructure projects improve their security and aids in operational security at a DHS-funded static code analysis project, sneaking in some of her own research and various attempts to save the world from poor information security practices whenever she can. Susan volunteers as director of the Internet Civil Engineering Institute, a nonprofit dedicated to supporting and securing the common software infrastructure we all depend on.
Andrew: Recently, we've seen Heartbleed, Shellshock, and Poodle. What are your thoughts on the amount of "marketing," e.g., logos, websites, and other branding, that goes into vulnerability disclosure today?
Susan: It's definitely raised the fear factor in the mainstream (read: among people who will never read the code). Whether that's a good thing or a bad thing depends a lot on what the situation was in your world beforehand. In places with resource-strapped InfoSec staff with no power to actually fix anything, it was a boon. It got C-level execs to pay attention. In places that were already mostly doing it right, it was a pain, because it caused InfoSec and IT teams to bleed off time and resources calming people down. On the upside, the press generated lots of mocking-worthy lines. I keep getting laughs at dinner parties.
Andrew: So, you mentioned the people who are reading the code. Eric S. Raymond coined Linus's Law, named in honor of Linus Torvalds, which states "given enough eyeballs, all bugs are shallow." Essentially, the claim is that bugs, especially security bugs, should be obvious in open source because people are reviewing and contributing to the code. Why did this fall apart, especially with Heartbleed and then Shellshock?
Susan: The flip side of that is that if nobody's looking, some bugs are impossible to find. OpenSSL, for example, was relied upon for security in countless places: financial systems, core internet infrastructure, even life-critical systems. But it was maintained as a volunteer side project by three guys with lots of other things on their plates, inadequate security training, and virtually no resources.
Andrew: How is that the case, though? I know for a fact that versions of OpenSSL with the vulnerability were FIPS certified by NIST for encryption. And doesn't the NSA have some skin in the game here as well?
Susan: They do, but a lot happens by reputation. Or by having someone who doesn't have time to learn the intricacies of the codebase do a weeklong "audit." Or by assuming that a static analysis tool will capture any problems that matter. Nobody threw an army of security experts at this, ever, as far as I can tell.
Next week, we'll continue with part two of this interview, where we will discuss who's got money and who's spending money to fix this.