Amazon's Alexa helps a newly diagnosed 15-year-old diabetes patient learn to cope with and manage his disease. A smartphone app uses machine learning to detect subtle but characteristic changes in a person's voice that may reveal an important health issue. Microsoft's HoloLens enables a product design firm to rapidly redesign and prototype a large-scale dermatology imaging system.
These are just a few of the things attendees learned about at the BIOMEDevice Boston Conference and Expo, as several speakers shared insights and specific case examples of how advanced technologies like voice-user interfaces, artificial intelligence, and augmented reality are proving valuable for medical device companies. In this story, we take a look at how one product design firm is using Amazon Echo and Google Home to inform product design for device companies and pharmaceutical companies developing combination products.
"At Worrell, we really saw Alexa and Google Home as what we're calling a Trojan horse in healthcare," said Brandon Bogdalek, a product development consultant at Worrell, a design firm focused on healthcare innovation. "It's something that you can order an Uber on, something you can order a pizza on, something you can ask questions to, but it's also something we can deliver healthcare through."
Alexa Boosts Patient Competence with Self-Injection Devices
In 2017, the design firm ran its first study using Alexa, the virtual assistant in the Amazon Echo and Amazon Echo Dot smart speakers, to train arthritis patients to give themselves steroid injections. The company wanted to take the paper-based instructions for use (IFU) that accompany self-injection devices and combine them with an Amazon Echo or Google Home system to see how doing so might affect a patient's confidence in using the device, as well as their ability to use it correctly.
The results were striking. On the first day, it took longer for patients to work through the instructions using both the written IFU and Alexa (about 15 minutes on average) than using the written IFU alone (six minutes, 34 seconds on average). But the patients who used only the written IFU committed 20 total use errors on that first day, compared with just six in the group that used both the written IFU and Alexa.
By the second day, both groups got through the instructions much more quickly (two minutes, 30 seconds each), yet use errors remained far higher in the group that used only the written instructions (32 versus eight).
"What's really funny is people are extremely confident when they go through the written IFU," Bogdalek said. The patients in the Alexa group, by contrast, rated their confidence level only as "confident," even though as a group they made far fewer mistakes while learning to use the device.
As it turns out, being able to speak to and get responses from somebody (even if that somebody is a robot) actually made users more competent at using the device at home, even if they didn't feel as confident about it as those who relied solely on written instructions.
Alexa Is More Than Just a Bot to Users
Next, the firm tested the feasibility of using voice assistance across different aspects of diabetes management. The study included 16 patients over a two-week period, and the team built custom Alexa skills for the Amazon Echo. They handled onboarding in person, visiting patients' homes to walk them through how to use the product, and conducted preliminary and exit interviews.
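Custom skills like these are typically implemented as handlers that receive a JSON request envelope from the Alexa service and return a spoken response. A minimal sketch of that pattern, written as a plain AWS Lambda-style handler (the `DailySurveyIntent` name and the prompt text here are hypothetical illustrations, not Worrell's actual skill):

```python
def build_response(speech_text, should_end_session=True):
    """Assemble a minimal Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": should_end_session,
        },
    }

def lambda_handler(event, context=None):
    """Route an incoming Alexa request envelope by request type."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # Keep the session open so the patient can respond.
        return build_response("Welcome to your daily wellness check-in.", False)
    if request["type"] == "IntentRequest":
        if request["intent"]["name"] == "DailySurveyIntent":  # hypothetical intent
            return build_response("Great. How did you eat today?", False)
    return build_response("Goodbye.")
```

In production such a handler would also read slot values from the intent and persist the patient's answers, but the routing structure is the same.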
In this study, recently diagnosed diabetes patients were asked to use the technology to have "heart to heart" conversations with Alexa about what they wanted their friends and family to understand about their disease and what they were going through.
The researchers found that patients were having genuine, meaningful conversations with Alexa about their disease. The downside was that Alexa still sounds very robotic, which raised a question: do Alexa and other voice-assisted technologies need to sound more human in order to have these types of conversations with patients and achieve the same impact?
"We don't think so," Bogdalek said. "We would like to see Alexa become a little bit more real [sounding] but not too real to be able to distinguish between a human and something that's a robot. We can still get that same outcome either way."
Patients also completed five-minute daily wellness surveys to track the dietary and lifestyle choices that affect their diabetes management. Bogdalek said participants' feedback on this portion of the study was interesting: it seemed to genuinely influence the choices they made, because they felt accountable to Alexa.
"People actually started feeling a little guilty to Alexa when they did this, 'man, I really shouldn't eat that cake,' or 'I really shouldn't overeat on my carbohydrates today because I'm going to have to talk to Alexa about that, and she's probably going to judge me, a little bit'," Bogdalek said. "The insight that came from that is Alexa is more than a bot to users and it actually can help influence a change in behavior."
The third task study patients completed involved daily messages labeled "My Story," in which patients shared their personal experiences with the disease.
"When I was first concerned about my health I had no idea what was happening," responded one study participant, a 15-year-old patient named Jake. "I was losing a lot of weight and I went to the doctor 'cause I was really thirsty, I wanted to drink a lot, never really wanted to run around."
"Again, patients really are comfortable speaking to Alexa and voicing their thoughts and their feelings," Bogdalek said. "Logistically, having a lengthy conversation with Alexa is challenging. There's actually a constraint on timing so you can only go 90 seconds with Alexa before she cuts you off without any notice. So you could be having what you would consider a deep-rooted conversation or an emotional conversation with Alexa and she just stops. And you're like 'well now what do I do? Do I go back and do that over again?' So those are some of the design considerations that we're thinking about as a design firm and we are trying to actively figure out ways to prompt patients to begin speaking and then at the end make sure that they know the conversation is finished."
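The abrupt cut-off Bogdalek describes is a platform constraint, but a skill can at least soften the edges of a conversation: it can keep the session open, re-prompt a patient who pauses, and close with an explicit sign-off so the user knows the exchange is finished. A minimal sketch of those two response shapes, assuming the standard Alexa Skills Kit response JSON (the prompt wording is a hypothetical illustration):

```python
def continue_conversation(prompt, reprompt):
    """Keep the Alexa session open and re-prompt the user if they
    pause, instead of letting the exchange end without notice."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": prompt},
            "reprompt": {
                "outputSpeech": {"type": "PlainText", "text": reprompt}
            },
            "shouldEndSession": False,
        },
    }

def finish_conversation(closing_text):
    """End the session with an explicit sign-off so the patient
    knows the conversation is over."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": closing_text},
            "shouldEndSession": True,
        },
    }
```

For example, a "My Story" skill might call `continue_conversation("Tell me more.", "Are you still there?")` after each patient turn and `finish_conversation("Thanks for sharing. We're done for today.")` at the end.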
For Product Development, Google Home Is the Winner
Researchers can't be in the field 24/7 conducting research to understand user needs, so Worrell has started shipping smart home devices to patients and asking them to complete voice-assisted tasks like those in the diabetes study. The firm then analyzes that information on the backend to identify trends over time, Bogdalek said.
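The article doesn't describe Worrell's backend, but identifying trends in longitudinal voice-survey data can be as simple as bucketing daily responses by week and averaging them. A minimal sketch under that assumption (the data shape and the 1-to-5 wellness score are hypothetical):

```python
from collections import defaultdict
from datetime import date
from statistics import mean

def weekly_trends(responses):
    """Group (date, score) survey responses by ISO week and average
    them, surfacing week-over-week trends in patient self-reports."""
    by_week = defaultdict(list)
    for day, score in responses:
        year, week, _ = day.isocalendar()
        by_week[(year, week)].append(score)
    return {wk: mean(scores) for wk, scores in sorted(by_week.items())}

# Hypothetical daily wellness self-ratings (1-5) from one patient
sample = [
    (date(2018, 5, 7), 3), (date(2018, 5, 8), 4), (date(2018, 5, 9), 5),
    (date(2018, 5, 14), 2), (date(2018, 5, 15), 3),
]
```

A real pipeline would pull these records from the skill's data store and segment by patient, but the aggregation step looks much like this.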
"If you take Alexa and you relate that to Workflow, this is really becoming what we think as the best practice in research and design," Bogdalek said.
So which is better for medical device development, Amazon Echo or Google Home?
"Actually, we've completely switched over to using Google Home now," Bogdalek said. "If you look at Amazon and Google as companies, Google is a data company, and Amazon is a product-based company. What we found with working with those two corporations directly, Amazon is more apt to try to sell you a Fire Stick and Google is more apt to try to figure out how you can better leverage your data. So it's been a transition over time, but Google seems to be the winner."