MPMN recently spoke with Kevin Fu, software expert and associate professor of computer science at the University of Massachusetts Amherst, about the current risks associated with medical device software and how to mitigate them, as well as grumpy stakeholders and implant hacking. Fu focuses his research on improving the security and privacy of embedded systems, especially in medical implants.
MPMN: Medical device software has come under the microscope during the past year, thanks to factors such as the Institute of Medicine's report earlier this year on the 510(k) process. Do you think that medical device software is under-regulated?
Fu: Many of the regulations that affect software are 35 years old. There are also some guidance documents specific to software, but what I'm hearing from all stakeholders is, essentially, grumpiness. When you don't have a clear pathway to evaluate the safety and effectiveness of software, nobody's going to be happy. I think it's really important that there is consistency and good decision-making on the part of everyone from the manufacturers to the regulators to the healthcare providers and the patients--who are too often left out of the equation.
MPMN: What are the most concerning or troublesome issues currently affecting medical device software?
Fu: Software tends to give rise to three emergent problems. First, software breeds overconfidence. When we change a device from being purely mechanical to being run by software, we have a tendency to think it's magic and to forget about the risks that software brings. Second, software is not thoroughly testable. Many of the techniques we would use in evaluating the safety and effectiveness of a mechanical component, for example, do not immediately apply to software. One example is interpolation of testing results. If you are testing a piece of material and you take certain sample points, you can interpolate between them and make fairly educated decisions about the safety and effectiveness of the entire device. But if you test one piece of software, you've just tested that one piece of software, and the result says nothing about the rest of the software. Third, this overconfidence and difficulty of testing are compounded by the fact that software continues to flood into medical devices. It's doing good things, but at the same time, there's a tendency to overestimate the benefits and underestimate the risks.
MPMN: What can medical device manufacturers do in order to mitigate these risks?
Fu: To start, we need a more meaningful specification of requirements for the software. Variation in the quality of requirements specifications is at the root of many of the problems we see in adverse event reports. We also seem to be forgetting a lot of human factors issues. In many adverse events, there are clear violations of good human factors engineering practice. For example, there was an infusion pump on which the healthcare provider could enter a number that would control the amount of time that a drug was dripped into a patient. But there was no label of units on this particular entry system, so it, of course, invited errors. Unfortunately, a patient expired as a result of an overdose.

More meaningful specification, human factors engineering, and the adoption of modern software systems engineering technology are needed. One example of the latter is static analysis, a technique that helps identify flaws in software before it executes on a real system. While you're creating the software, you can more readily identify problems that would likely manifest as adverse events. Personally, I think we also need to start thinking about developing a safety net for security and privacy issues. Just as there's a tendency toward overconfidence in software, there's a tendency for manufacturers to think that if they have an embedded device, they don't need to worry about any of the problems of the Internet, like malware and computer viruses. That would be nice, but I don't think it's realistic. The industry needs to be careful about making too many assumptions about having complete immunity from cybersecurity threats.
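Static analysis means examining source code for flaws without running it. As a toy illustration only (a hypothetical pump API, with Python's standard `ast` module standing in for an industrial analyzer), the sketch below flags calls that pass a bare number with no unit keyword--the same kind of ambiguity behind the infusion-pump error Fu describes:

```python
import ast

# Hypothetical device code under analysis; set_infusion_time is an invented API.
SOURCE = '''
def program_pump(pump):
    # Ambiguous: a bare number with no units -- seconds, minutes, or hours?
    pump.set_infusion_time(30)
    pump.set_infusion_time(minutes=30)
'''

def find_unitless_calls(source, func_name="set_infusion_time"):
    """Return line numbers of calls that pass only positional numeric
    literals to func_name, with no unit-bearing keyword argument."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == func_name):
            bare = [a for a in node.args
                    if isinstance(a, ast.Constant)
                    and isinstance(a.value, (int, float))]
            if bare and not node.keywords:
                flagged.append(node.lineno)
    return flagged

print(find_unitless_calls(SOURCE))  # -> [4]: the unitless call is flagged
```

A real analyzer works the same way in spirit: it walks a representation of the program and reports patterns known to correlate with defects, so the flaw is caught at development time rather than at the bedside.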
MPMN: How real is the threat of malicious hacking and malware to software-controlled medical devices?
Fu: That's the million-dollar question. What we have right now are risks and what we don't know is when those risks will become tangible threats. Unfortunately for computer systems, change tends to come quickly and without much warning. We've seen the warning signs and researchers have demonstrated in the laboratory that it can be done. It could happen tomorrow or nothing may happen for years. One historical example to look at is the Tylenol cyanide poisonings in 1982. Until that event, few felt that there was a credible threat of tampering with pharmaceuticals. Software security risks can very quickly change, so it would be foolish to think that we don't need protection.
MPMN: What, then, should medical device OEMs be doing in relation to medical device security?
Fu: They need to include security and privacy properties at the specification stage of devices, and then the software engineering processes ought to follow a standard security development life cycle. Retrofitting a device after the fact to enhance security is going to be quite difficult; the issue is more: How can we protect the future devices that are coming down the pipeline? Also, how do you know when you have enough security? There are a lot of hard questions in this space that don't have answers today.
MPMN: In general, is there more action or regulation needed pertaining to medical device software?
Fu: There are three things that are outside, or on the outskirts of, manufacturers' control. Number one is better surveillance of software. It's extremely difficult to convince a healthcare provider to submit a report on a software problem; it takes time, and they're not experts in software, so they might not know how to report it. There needs to be a more convenient reporting mechanism for software issues, perhaps an off-the-shelf reporting structure. Clearer responsibility is also needed. Medical devices are no longer just systems; they're systems of systems. In the radiology context, for instance, you'll find a number of different computer systems connected together to take part in diagnostics or therapy. There's a lot of finger-pointing among the different vendors involved at the point of care, so if something goes wrong, no one party is currently responsible.
MPMN: Any final thoughts on medical device software that you'd like to offer up?
Fu: It's important to note that software behaves very differently from hardware. I'm confident that the industry will eventually find innovative ways to manage these kinds of risks. But right now, we're in kind of the second stage of the revolution for software-controlled medical devices.
For a more detailed discussion of these medical device software challenges, check out Fu's IOM-commissioned report, "Trustworthy Medical Device Software."