November 9, 2016

The Diagnosis Is Automated, My Dear Watson

These three technologies, or lack thereof, are standing in the way of automated diagnostics.

Nigel Syrotuck

In August, a stumped group of Japanese doctors treating a 60-year-old woman turned to IBM's Watson for help, eventually reaching the diagnosis of a rare form of leukemia. Artificial intelligence is a natural fit for diagnostics: there are millions of medical papers available, with more released daily, and sifting through them to compile data is exactly the kind of work a processor excels at. There is just one catch: those papers are written in scientific English.

Thankfully, language interpretation is one of Watson's key strengths, and it is a rapidly growing field of study. Though automated diagnostics succeeded in studies as early as 2013 and 2014, we are only now starting to see the convergence of innovation required to make them genuinely useful.

Besides plain-language interpretation, innovation is also happening in two other important areas: machine learning and computational power.

Challenge 1) Computational power. Did you know there is as much (or by now, probably more) processing power in a MacBook charger as there was in the original Macintosh? Another fun fact: in 2013 a computer was able to simulate one second of human brain activity...the only issue was that it took 40 minutes. We can expect that simulation time to drop exponentially, but Moore's law is tapering off, so useful AIs may not be realistic with our current technology. There is promise, however, in other types of processors. A breakthrough in a different technology (such as quantum computing, though it may not be well suited to deep learning) could accelerate the advancement of AIs by enabling faster searches and different types of architectures that require far less finesse to develop.
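
To put rough numbers on that gap, here is a back-of-the-envelope sketch in Python. The 40-minutes-per-simulated-second figure comes from the 2013 result above; the two-year doubling period is an idealized Moore's-law assumption, not a measured trend.

```python
import math

# Back-of-the-envelope math on the 2013 result: 1 second of brain
# activity took 40 minutes of wall-clock time to simulate.
simulated_seconds = 1
wall_clock_seconds = 40 * 60                        # 40 minutes = 2,400 seconds
slowdown = wall_clock_seconds / simulated_seconds   # 2,400x slower than real time

# How many Moore's-law doublings would close that gap?
doublings = math.log2(slowdown)   # ~11.2 doublings
years = doublings * 2             # assuming one doubling every ~2 years (idealized)

print(f"Slowdown: {slowdown:.0f}x; doublings needed: {doublings:.1f}; "
      f"years at a 2-year doubling period: {years:.0f}")
```

Even under that optimistic doubling assumption, real-time whole-brain simulation sits a couple of decades out.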

Challenge 2) Plain Speech Interpretation. One of the clues Watson was given on Jeopardy was the following: "to push one of these paper products is to stretch established limits," to which it correctly answered "envelope." That is an amazing display of natural language processing. And it's not just Watson. Have you noticed that Google has recently started displaying more and more answers to your questions rather than simply providing links to websites? That's because it is trying to prioritize context over keywords--to show that it, too, can take plain speech as input, figure out what you're really trying to ask, and search through droves of data written in plain speech for the correct answer.
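
To see why keywords alone can't crack a clue like that, consider a toy sketch in Python: a naive word-overlap score never connects the clue to "envelope," because the two share no literal words. The candidate strings below are invented purely for illustration.

```python
# Toy keyword scorer: what fraction of the clue's words appear in a candidate?
def keyword_overlap(clue: str, candidate: str) -> float:
    clue_words = set(clue.lower().split())
    cand_words = set(candidate.lower().split())
    return len(clue_words & cand_words) / len(clue_words)

clue = "to push one of these paper products is to stretch established limits"

for candidate in ["envelope", "paper towel", "stationery products list"]:
    print(f"{candidate!r}: {keyword_overlap(clue, candidate):.2f}")

# 'envelope' scores 0.00 while the wrong candidates score higher on shared
# keywords. Only a system that understands the idiom "push the envelope"
# can bridge the clue and the answer -- that is the context-over-keywords gap.
```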

Challenge 3) Machine Learning. Watson is an amazing, powerful platform managed by what we can safely assume is a brilliant team at IBM. Likewise, Siri has a cutting-edge team at Apple, but also the added benefit of millions of regular users sending Apple data about how, what, and why they search for medical advice. Similarly, Google receives millions of searches per day relating to symptoms and medical issues. So far, neither Siri nor Google appears to be as useful as Watson for medical purposes, but I'd wager that big data from regular users will let them learn quickly and advance well beyond Watson in automated diagnostics.
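
As a rough illustration of how that stream of user data could drive learning, here is a minimal Python sketch that counts which conditions co-occur with which query words. The query log, the condition labels, and the idea of confirmed follow-up outcomes are all hypothetical; a real system would need vastly more data and far more careful modeling.

```python
from collections import Counter, defaultdict

# Hypothetical (query, later-confirmed condition) pairs, e.g. from follow-up data.
query_log = [
    ("chest pain shortness of breath", "angina"),
    ("chest pain after exercise", "angina"),
    ("chest pain heartburn after meals", "reflux"),
]

# Count how often each query word co-occurs with each confirmed condition.
word_to_condition = defaultdict(Counter)
for query, condition in query_log:
    for word in query.split():
        word_to_condition[word][condition] += 1

def rank_conditions(query: str) -> list:
    """Score conditions by how often the query's words co-occurred with them."""
    scores = Counter()
    for word in query.split():
        scores.update(word_to_condition.get(word, Counter()))
    return scores.most_common()

print(rank_conditions("chest pain when running"))
# [('angina', 4), ('reflux', 2)] -- every additional pair sharpens the ranking.
```

The point is simply scale: each new query-and-outcome pair sharpens the rankings, which is exactly the advantage millions of daily users confer.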

Bonus Challenge--Voice Recognition. Voice recognition has proven that it works well enough to be useful, and we can expect it to keep improving, but it belongs on this list because people are much better at speaking naturally about their search topics than at typing them out as keyword-based search engine entries. It's the difference between pecking out "back ache" on Google and getting hundreds of hits on everything from muscle cramps to spine cancer, versus explaining the full story of how you fell off a ladder as a child and how your back has slowly felt worse over the years. This is the advantage a doctor has over a machine: they are easy to talk to.

How Much Longer!? Short answer: hopefully within the next 10 years, but it's highly contingent on the technology. Once these challenges are overcome, automated diagnostic platforms should be operational, but there will still be major questions to answer--especially around regulatory adoption. For example, will this technology prove superior to human consultation for all conditions, or only some? FDA says software absolutely can be a medical device, but under which classification? What if an AI platform is coupled with a wellness device like a smartwatch? It's very difficult to predict the value of these technologies to the population's health and wellness because we don't know how much will fall on the consumer side of the law and how much will be regulated. If automated diagnostics are approved for direct-to-consumer use, then it's very likely every single person (who can afford one) will have a personal tricorder in the next decade. If not, then the most we can realistically hope to see from this innovation is a resource doctors use to help them do their jobs--much like aviation technology helps airline pilots--and one that will hopefully be a lot better than their current fallback: Wikipedia.

So, Should I Ditch my Doctor Now? Absolutely not. At least, not yet. One study found that voice searches across multiple voice recognition and search platforms (Google, Siri, S Voice, and Cortana) return "terrible" results when told things like "I am depressed" or "I am having a heart attack." Another study found that doctors are, on average, twice as accurate as 23 different symptom-checking apps currently on the market. The medical device industry has a lot to gain from these innovations, but it looks like it will still be a few years before this technology is stable, helpful, and trusted.

Nigel Syrotuck is a mechanical engineer at StarFish Medical, a medical device design company headquartered in Victoria, British Columbia.

[Image courtesy of STUART MILES/FREEDIGITALPHOTOS.NET]
