Black Hat 2014: Open Source Could Solve Medical Device Security

In a keynote at Black Hat 2014, cybersecurity thought leader Dan Geer suggests that open source could be the solution to software security challenges.

By Scott Sheaf

August 19, 2014

The keynote address at this year’s Black Hat security conference was delivered by Dan Geer, chief information security officer at the venture capital firm In-Q-Tel. In-Q-Tel works closely with the U.S. Intelligence Community, and Geer is known in the security industry as a visionary and thought leader. In his address, “Cybersecurity as Realpolitik,” Geer laid out a set of recommended policies that he believes will help mitigate the ever-growing problem of software security. As a medical device developer, I found his recommendations far-ranging and somewhat unrealistic, but, as keynote addresses should be, they were thought-provoking.

Dan Geer delivers a keynote at Black Hat USA 2014. 

On the topic of source code liability, Geer suggested that software developers, including medical device development companies, will eventually be held responsible for the trouble their software causes (or fails to prevent). I think it’s fair to say that it is impossible to guarantee a totally secure system; after all, you cannot prove a negative. Given enough time, most systems can be breached. So where does this potential liability end? What if my company has sloppy coding standards, skips code reviews, or uses a third-party software library that contains a vulnerability? Should hacking be considered foreseeable misuse?

Geer offered up a possible solution—open-source everything. “For better or poorer, the only two products not covered by product liability today are religion and software, and software should not escape for much longer,” Geer said. Geer outlined a three-clause strawman proposal he developed with colleague Poul-Henning Kamp for how software liability regulation could be structured:

1.) Determine whether damage was caused intentionally or willfully. “We are only trying to assign liability for unintentionally caused damage, whether that's sloppy coding, insufficient testing, cost cutting, incomplete documentation, or just plain incompetence. [This clause] moves any kind of intentionally inflicted damage out of scope. That is for your criminal code to deal with, and most already do,” Geer said.

2.) If you license software but allow the licensee full access to the source code and the ability to disable any functionality they choose, your liability is limited to a refund. “[This clause] is how to avoid liability: Make it possible for your users to inspect and chop out any and all bits of your software,” Geer said.

3.) In any other case, you are liable for whatever damage your software causes when it is used normally. As Geer outlined, “If you do not want to accept the information sharing...[you] must live with normal product liability, just like manufacturers of cars, blenders, chain-saws and hot coffee.”

But would something like this work? Geer believes it absolutely would in the long run. “In the short run, it is pretty certain that there will be some nasty surprises as badly constructed source code gets a wider airing,” he said. “The free and open source community will, in parallel, have to be clear about the level of care they have taken, and their build environments as well as their source code will have to be kept available indefinitely.”

In theory, opening up the source code gives security researchers, end users, and other medical device developers the opportunity to find and report vulnerabilities before they become an issue. This also offers the benefit of educating the community at large and essentially crowdsourcing safer medical devices. The obvious downsides are the loss of intellectual property (IP) protection and the very real risk of exposing fielded devices to a whole host of “zero-day” attacks, in which a vulnerability is discovered and exploited before it can be patched.

I do not believe we will ever see the day when FDA mandates public disclosure of safety-critical software source code. But what I do think we may see is either regulation or, at the very least, guidance that encourages vendors to publicly disclose part or all of a medical device’s design history file (DHF) for some period before the device is approved for sale. In the future, vendors will likely include security engineering as part of their design inputs, with artifacts such as network topologies, threat models, and encryption protocols becoming part of the DHF. This is an interesting middle ground that would protect IP but still give the medical device industry and security researchers a chance to comment on security-related aspects of the design.

There are no easy answers to the medical device security challenge. It will take a combination of small improvements that together raise the standards against which we as medical device developers measure the quality, safety, and security of our products.

Geer’s full keynote video and a full text transcript are available online.

Scott Sheaf is a senior software engineer at Battelle.
