FDA presses medical device makers to OK good faith hacking
Kevin McDonald of the Mayo Clinic told the Food and Drug Administration last month that most of the medical devices his center tests are pretty much the same.
"It's just another crappy computer," said Mr. McDonald, the clinic's director of clinical information security, during an FDA workshop on medical device cybersecurity.
But as patient care becomes more dependent on computers and software, often putting patients' lives in the hands of machines, vulnerabilities in those systems are increasingly troubling for security researchers, government agencies, and many leading medical facilities such as the Mayo Clinic.
As a result, the FDA is pushing the burgeoning medical device industry to do something it has so far resisted – give security researchers permission to examine their products and search for security vulnerabilities so companies can quickly patch problems.
"Patient care is increasingly dependent on medical devices with vulnerable software," said Beau Woods of I Am the Cavalry, a group of researchers who work on issues where cybersecurity affects public safety. "Vulnerable, exposed medical devices are becoming a widespread problem across healthcare."
But even though the FDA is advocating for the industry to develop a mechanism for security researchers to look for bugs and report them, only two medical device manufacturers – Philips and Dräger – have published a coordinated vulnerability disclosure policy, which is an invitation for researchers to look for software flaws, as well as a public declaration of how the companies will handle reported vulnerabilities.
"The FDA is encouraging medical device manufacturers to take a proactive approach to cybersecurity management of their medical devices," said Suzanne Schwartz, director of emergency preparedness/operations and medical countermeasures for the Center for Devices and Radiological Health, a division of the FDA.
"Only when we work collaboratively and openly in a trusted environment, will we be able to best protect patient safety and stay ahead of cybersecurity threats," she said.
Traditional software and computer companies were initially reluctant to work with independent security researchers, too. Many tech firms believed that good faith research meant to expose and report vulnerabilities would open them up to security breaches.
Eventually, however, most tech companies came around to the idea that letting outside researchers examine products, which then led to companies releasing patches, was an efficient way to fix software flaws. Now, major tech companies such as Microsoft, Facebook, and Apple offer some kind of bug bounty program.
"Microsoft took about 15 years from sending cease-and-desist orders to researchers to giving out six-figure bug bounties,” said Josh Corman, a security researcher and member of I Am The Cavalry.
The issues in the medical device industry, Mr. Corman said, is that it's at "year zero" when it comes to cybersecurity. "We have to accept and acknowledge where they are starting from. But we can’t allow it to take 15 years, the consequences would be too dire."
Because the FDA recognizes that security flaws in medical devices are potentially life threatening, the agency has become increasingly proactive in warning hospitals about various device vulnerabilities. Last July, it issued a wake-up call to the industry with a safety warning advising health care organizations to stop using a line of drug pumps due to the cybersecurity risk.
The security flaw made it possible for a remote attacker to raise – or lower – the dosage to potentially fatal levels. There was no evidence that any patients had been harmed, which made the preemptive safety warning all the more extraordinary.
Much as software companies once did, medical device makers view bug disclosure programs as creating more problems than they solve for overall security, said Corman. From their point of view, he said, "if we invite researchers to look for bugs, they will find bugs."
The bad guys, Corman pointed out, are already looking for bugs, with or without the manufacturer's permission. And good faith researchers who find bugs hesitate to report them for fear of a lawsuit or prosecution under the Digital Millennium Copyright Act, which bars researchers from bypassing software copyright protections in order to conduct research.
But in October, the Library of Congress granted an exemption allowing independent security researchers to hack medical devices and automotive systems (among other Internet-connected devices). The three-year exemption takes effect in October 2016 and will protect researchers engaged in good faith research from DMCA prosecution by manufacturers.
The exemption stipulates that the research "must be conducted in a controlled setting designed to avoid harm to individuals or the public," and only applies to implantable medical devices (such as pacemakers and insulin pumps) and their corresponding monitoring devices.
Medical and auto industry groups remain opposed to the exemption, however, suggesting they aren't convinced that their products are susceptible to serious cybersecurity risks.
Katie Moussouris, chief policy officer for the firm HackerOne, which runs bug bounty programs, once compared the process of companies dealing with security vulnerabilities to the five stages of grief in the Kübler-Ross model – denial of the problem, followed by anger, bargaining, depression, and, finally, acceptance.
This process does not happen overnight, but Corman hopes to accelerate medical device vendors on their journey – "safer, sooner, together," as I Am the Cavalry likes to put it.
"The first thing [manufacturers] have to get over is they assume bugs are rare or sparse," he said. "They need to realize that bugs are dense."