Jim Harper knows that if his mother, who lives across the country from him, hears his voice on the phone, she can tell within seconds whether he hasn't been getting enough sleep or is coming down with a cold. That, he told BIOMEDevice Boston attendees on Tuesday, is what makes him confident that health information is present in the vocal cords.
Many experienced doctors also tend to pick up on vocal cues that contribute to their characterization of a patient's condition, but Harper said this information typically isn't integrated into healthcare in any kind of formal way because it is subjective. Different people have different abilities to pick up on these signals and interpret them in a health context, he said.
Harper's company, Sonde Health, is developing a voice-based technology platform that uses machine intelligence to detect subtle changes in a user's voice that may signal a variety of health conditions, ranging from a sinus infection to depression. The company's app-based technology is just one example of how artificial intelligence is beginning to change the way the industry defines a medical device.
"Perceptual models, things like what your mother can do, I think are interestingly a great path to unlocking the potential for health and wellness," Harper said. "AI can take those perceptual cues in the acoustic sounds of what we are speaking, and also other signals not related to speech, and create objective, quantifiable measures that correlate with the best health measures that are in use today and put those perceptual cues back on a path of being clinically actionable so that we think differently about what is a medical device."
Even FDA is beginning to broaden its scope in terms of what constitutes a medical device, Harper noted.
"FDA is going down this path of really thinking about software, these algorithms, the tools, the processes we use to collect the acoustic data and analyze it," he said. "That software itself is a medical device, and when you start to regulate, when you start to create software itself as a medical device, we now begin to think about everyday devices that we use as being medical instruments."
For a voice-based technology like Sonde Health's platform, that means everything from a smartphone to a voice-enabled consumer device like the Amazon Echo or Google Home can be used as a health instrument. We've already seen a few other examples of this in the healthcare sector.
Returning to his mother as an example, Harper said he recently called her on a Saturday after not talking to her in about a week. That call revealed that she had been suffering from a bad cold for almost six days.
"Thankfully it wasn't serious, but there's a lot of things that voice-enabled, background-enabled detection of a change in respiratory health could have alerted me to call her at the beginning of the disease rather than having to wait for six days," Harper said.
Taking it a step further, such technology could be used to help doctors remotely monitor patients with a chronic disease by tracking changes in their vocal cues over time.
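One way such longitudinal tracking could work, purely as an illustration since the article does not describe Sonde Health's actual method, is a simple per-patient baseline check: compare each new daily reading of a vocal feature against the patient's own history and flag readings that drift beyond a statistical threshold. The function name, the sample readings, and the two-standard-deviation cutoff below are all hypothetical.

```python
from statistics import mean, stdev

def deviates_from_baseline(history, today, threshold=2.0):
    """Flag a vocal-feature reading that drifts from a patient's baseline.

    `history` is a list of past daily readings for one acoustic feature
    (e.g. average pitch in Hz); `today` is the newest reading. A z-score
    beyond `threshold` standard deviations flags the reading for review.
    This is an illustrative sketch, not Sonde Health's algorithm.
    """
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return False  # no variation on record; nothing to compare against
    z = (today - baseline) / spread
    return abs(z) > threshold

# Hypothetical daily pitch readings (Hz) for one patient.
history = [198, 201, 200, 199, 202]
flagged = deviates_from_baseline(history, 180)  # sudden drop in pitch
normal = deviates_from_baseline(history, 201)   # within usual range
print(flagged, normal)
```

A clinician's dashboard would act only on the flag, not on the raw audio, which fits the remote-monitoring scenario Harper describes.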
Sonde Health is partnering with chip manufacturers that do embedded signal processing to ensure that, even with passive technologies that are always listening in the background, like Alexa or Siri, no data ever has to go to the cloud and no voice recording is ever actually required to monitor these vocal biomarkers.
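The privacy claim rests on a standard edge-processing pattern: derive summary acoustic features from the audio buffer on the device, discard the recording, and transmit only the features. The sketch below illustrates that pattern with three textbook features (RMS energy, zero-crossing rate, and an autocorrelation pitch estimate); the function and feature choices are assumptions for illustration, not Sonde Health's published pipeline.

```python
import math

def extract_vocal_features(samples, sample_rate):
    """Compute summary acoustic features from a raw audio buffer.

    Illustrative stand-in for embedded signal processing: only the
    derived feature values, never the recording, would leave the device.
    """
    n = len(samples)
    # RMS energy: overall loudness of the frame.
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Zero-crossing rate: coarse proxy for voicing vs. noisiness.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr = crossings / (n - 1)
    # Autocorrelation pitch estimate over a 50-400 Hz search range.
    best_lag, best_corr = 0, 0.0
    for lag in range(sample_rate // 400, sample_rate // 50):
        corr = sum(samples[i] * samples[i - lag] for i in range(lag, n))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    pitch_hz = sample_rate / best_lag if best_lag else 0.0
    return {"rms": rms, "zcr": zcr, "pitch_hz": pitch_hz}

# Synthetic 200 Hz tone standing in for a quarter-second of speech.
rate = 8000
tone = [math.sin(2 * math.pi * 200 * t / rate) for t in range(rate // 4)]
features = extract_vocal_features(tone, rate)
# The raw buffer can now be discarded; only `features` need be kept.
print(features)
```

On dedicated signal-processing silicon this kind of computation runs continuously at negligible power, which is why the always-on, never-recorded model is plausible.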
"We see these things merging seamlessly in the next 12 to 18 months in a way that gives us the capacity to prevent misuse," Harper said. "For example, I have this on my phone, but what we don't want is for me to be able to sit across from you at dinner and be able to assess your health without your permission or without your knowledge."