What the security industry can learn from the World Health Organization

The discovery of a computer bug can be a marketing boon for a cybersecurity firm. But one critic says the industry should take a page from the health profession and select names for flaws that aren't designed to stoke fear or generate buzz.

The logos for recent vulnerabilities (clockwise): Heartbleed, Venom, Ghost, and SandWorm


May 22, 2015

As soon as the cybersecurity firm CrowdStrike announced its discovery last week of a computer vulnerability it dubbed "Venom," the ominous headlines began.

"Venom vulnerability: Serious computer bug shatters cloud security," wrote Fortune.

But as the buzz died down and more experts weighed in, much of the initial dread about Venom turned to a collective shrug. "Blink and you may have missed [Venom]," The Wall Street Journal wrote just a day after it said the vulnerability was sending companies "scrambling." The vulnerability is indeed widespread, but most agreed it would be difficult to exploit. The initial surge of press attention, it turned out, did not match the actual danger.

“The sexier the name, the more media attention,” says Christopher Dawson, editor-at-large of network security firm Fortinet's blog. “Yet when you [name bugs], it creates this sensational thing.”

The National Vulnerability Database, a government-hosted repository of computer bugs, ranks Venom just past the borderline between medium and high risk – a 7.5 out of 10. But this year alone, it has listed nearly 800 bugs as high risk, and there is no shortage of 10s. Many of those involve extraordinarily popular software such as major operating systems and Web browsers.

But few have rivaled the publicity that Venom generated for CrowdStrike. That's because most bugs go only by the serial numbers assigned to them before they are listed in the database. Venom, which even had a slick logo, had a marketing and public relations team working behind it.

Drumming up attention for certain bugs can be a victory for cybersecurity firms, but it may pose a problem for the broader computer security field, say Mr. Dawson and other experts, because it steals attention from other, more dangerous flaws. Dawson has come up with a solution: When it comes to selecting names for computer flaws, the security industry should look to another field that knows a thing or two about naming maladies.

The World Health Organization, he says, gets it right. And its recently released naming standards could translate neatly to the computer security industry.


WHO’s way

The WHO has been examining the problems that arise from disease naming for more than a decade, but it released formal guidelines for the process only on May 8.

Typically, the first piece of information people hear about a disease isn’t a symptom. It’s the name. And that first impression can carry consequences. In 2009, Egypt ordered the slaughter of 300,000 pigs, believing it would help stop transmission of swine flu. But people diagnosed with swine flu typically contract it from other humans – killing pigs likely did very little.

“We did an intensive process over more than a decade to investigate all the ways naming could be misleading or just not provide a good description of a disease, including both past and potential problems,” says Dr. Elizabeth Mumford, a researcher who worked on the WHO guidelines.

Past problems included names that were misleading (such as swine flu), diseases named after people rather than describing the problem, names that were hard to pronounce or remember, and – in a move that generated criticism that the WHO was being overly politically correct – names that seemed to imply a disease was specific to a certain location.

“The Spanish Flu didn’t originate in Spain, and wasn’t confined there,” Mumford offers as an example.

The same kinds of misleading, nondescriptive, and hard-to-remember names abound in the computer security realm, too. Malware, vulnerabilities, and hacking outfits that choose not to identify themselves are now named by the security researchers who discover them. Usually that means names are chosen more for marketing value than for conveying information.

And that’s where the security industry runs afoul of a key WHO guideline: Don’t choose names that cause “undue fear.”

'Undue fear'

The trend of naming vulnerabilities isn't that old. It started with Heartbleed in April of last year. The name was fitting; the bug bled data through the “heartbeat” function that kept connections open in a massively popular Web encryption platform. More than that, it was a vulnerability historic in terms of its scale and potential damage. It deserved a name.

That didn’t mean the name wasn’t also a marketing effort on the part of Codenomicon, the vulnerability detection tool company that discovered it. Codenomicon registered the domain “heartbleed.com” and designed what would become a ubiquitous bleeding-heart logo before it had notified all of the developers affected by the vulnerability.

Then came Shellshock, WinShock, Sandworm, and an entire legion of cute names derived from acronyms – Poodle, Ghost, Freak, and, last week, Venom. It seems almost impossible to announce a big vulnerability without a name, whether you want one or not. The consulting firm JAS Global Advisors expected a major Windows bug it made public in February to be known by its serial number, called a Common Vulnerabilities and Exposures number, or CVE. It still became known as JASBug.

Most often, those names aren't particularly telling when it comes to conveying what actually needs to be fixed.

"Sandworm wasn’t even a worm," says Dawson. In computer jargon, worms are specific types of malware. Sandworm referred to both a Russian hacking group and the vulnerability it frequently exploited.

Venom was a bit more descriptive; it stands for Virtualized Environment Neglected Operations Manipulation and is, in fact, a bug that affects the computing process called virtualization. But many other bugs affect that process. A more descriptive name, for example, could have mentioned that the bug was in virtual floppy drive code.

CrowdStrike declined to comment for this story.

“What ends up happening is named vulnerabilities get more attention regardless of how much they deserve it,” says Chris Eng, vice president of research at the Massachusetts cybersecurity firm Veracode. “The intuition is, if it’s branded, it’s more dangerous.”

Applying the WHO guidelines

Mr. Eng suggests that, in an ideal world, the industry could go back to the old days and refer to vulnerabilities by their Common Vulnerabilities and Exposures numbers. “They’re only eight numbers,” he says. “They aren’t that hard to remember. And the first four are the year.”

But he also acknowledges that the cat is out of the bag, and research companies are now accustomed to having their own individually named, marketable vulnerabilities. Even if they weren’t, there are times when it’s extremely useful to be able to discuss vulnerabilities without worrying about typing CVE-2009-1324 when you meant CVE-2009-1423.

This is where the WHO guidelines could be useful. 

WHO suggests short, pronounceable acronyms for names. Its prototypical disease name is SARS: memorable as an acronym without causing the same undue fear as a name like Venom. Venom and bugs like it, for instance, could just as easily be described as a Virtualization Escape Vulnerability, or VEV, says Dawson of Fortinet.

But unlike the cybersecurity industry, the healthcare profession has strong governing bodies such as WHO at its center, and hospitals don't see the same marketing value in a threatening disease name. Dawson says trade groups could play a similar role and dissuade the industry from racing to come up with the coolest names and marketing campaigns.

When it comes down to it, Dawson doubts the names are what most people in the industry want anyway. "Developers don't want logos," he says. "Marketing wants logos."