The role of sensors in medical devices is becoming increasingly important, especially in new technologies such as wearable devices that can diagnose patients and transmit information over long periods of time. Recent developments in sensor technologies are driving the trend toward the decentralization of healthcare by putting diagnostic instruments in the hands (or on the bodies) of patients. The move toward miniaturization in medical devices has also enabled sensors to be placed on laparoscopic instruments, providing surgeons with valuable haptic feedback.
Experts examined sensor advancements in the MD&M West 2020 conference sessions: Advances in Prolonged Field Care Enabled by Wearable Diagnostics on Tuesday, February 11; Preparing for Sensor Fusion, A Revolution in Patient-Centric Health Delivery, on Wednesday, February 12; and Overcoming Challenges in Haptic Feedback for Robotic Surgery Platforms on Wednesday, February 12.
“Sensor fusion is a combination of data from multiple sources,” said Srihari Yamanoor, president of Designbly. “It can be multiple sensors being used at the same time, it can be multiple sensors of different types measuring different parameters at different times, and it can also include sensors that are virtual.”
Having multiple data sources is critically important where patient safety is concerned, he said. “For example, you could try to predict with an accelerometer if a patient has fallen,” said Sai Yamanoor, IoT R&D applications engineer at Praxair. “You might see a sudden change in posture, but you don't know whether it was a fall, especially with older patients.” He explained that geriatric patients typically fall very slowly and just one sensor would not necessarily pick it up. “That's where sensor fusion can come into play,” he said. “If the patient has a device with multiple sensors, then you can tell that something has happened.”
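The multi-cue logic the Yamanoors describe can be sketched in a few lines. This is a hypothetical illustration, not a clinical algorithm: the function name, sensor cues, and threshold values are all invented for the example, and a real fall detector would be tuned and validated against patient data.

```python
# Hypothetical thresholds, chosen only for illustration.
IMPACT_G = 2.0           # accelerometer magnitude (in g) suggesting an impact
POSTURE_CHANGE_DEG = 60  # tilt change suggesting the wearer is now lying down
HEIGHT_DROP_M = 0.5      # barometer-estimated drop in height

def fused_fall_alert(accel_g, tilt_change_deg, height_drop_m):
    """Combine three independent sensor cues.

    A slow geriatric fall may never produce an impact spike, so no
    single cue is trusted on its own; any two together raise an alert.
    """
    cues = [
        accel_g >= IMPACT_G,
        tilt_change_deg >= POSTURE_CHANGE_DEG,
        height_drop_m >= HEIGHT_DROP_M,
    ]
    return sum(cues) >= 2

# A slow fall: no impact spike, but posture and height cues agree.
print(fused_fall_alert(accel_g=1.1, tilt_change_deg=75, height_drop_m=0.6))  # True
```

A lone accelerometer would miss this event entirely, since the 1.1 g reading never crosses the impact threshold; that is exactly the gap sensor fusion closes.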
But sensor fusion has its challenges, Sai Yamanoor said. One of them is making data more accurate. “I recently learned that the most accurate heart rate you can get is from wearing a sensor in a heart rate monitor closer to your ears, as opposed to a watch, or an appendage that hangs on your chest,” he said, noting that we have just scratched the surface when it comes to wearable devices. Srihari Yamanoor noted that AI could very well be applied to this effort.
Another challenge, especially when multiple sensors run on the same system, is noise. If the same noise source affects all the sensors, it becomes difficult to attribute the correlated noise to any particular sensor. An example of this, Sai Yamanoor said, is surgical robotics, with data points and data sources coming in from different places. The engineering trick, he said, is to correctly identify noise sources.
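One common way to distinguish system-wide, correlated noise from a fault in a single sensor is to correlate the residuals of co-located channels: disturbance that appears in every channel at once points to a shared external source. A minimal sketch, using only the standard library and made-up residual data:

```python
import math
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length signals."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Residuals (measurement minus expected value) from two co-located sensors.
# A shared interference term dominates both channels.
shared_noise = [0.5, -0.3, 0.8, -0.6, 0.2, -0.4]
ch_a = [n + 0.01 * i for i, n in enumerate(shared_noise)]
ch_b = [n - 0.01 * i for i, n in enumerate(shared_noise)]

# Strong cross-channel correlation points to a common external noise
# source rather than a fault in either individual sensor.
print(pearson(ch_a, ch_b) > 0.9)  # True
```

Correlation near 1.0, as here, suggests common-mode interference; uncorrelated residuals would instead point back at an individual device.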
Srihari Yamanoor cautioned that sensor fusion is not exactly a smartphone-versus-dumb-phone scenario. It is a union of different sensor types, he said. “It exists in some way, shape, or form already. Blood glucose monitoring is a good example, as there is clinical data, continuous glucose monitoring data, and there's also data that comes from testing infrequently, so that is a form of fusion.”
But his company wants to take sensor fusion even further, he said, explaining that what is new about this approach is that they are looking at data coming from different sources. There will be challenges in perfecting this technology, though, such as partnership development. “We have to face these Betamax versus VCR situations, where you have different sensors sitting on different data sources,” he said. “For example, you might have a wearable device that counts the number of steps you take, and you could also track how many steps you take on your smartphone, but they are manufactured by different companies. So how do you take the two sources and collaborate? There's a lot of stuff that needs to be worked out. And depending on the application, you might have a series of custom devices for a specific type of patient, so facing those challenges [is] going to take a few years.”
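As a toy illustration of the reconciliation problem he describes, suppose a wearable and a smartphone each report daily step counts, with gaps and disagreements. The merge rule below (take the larger count where both report a day, on the assumption that the phone undercounts when left behind) is purely hypothetical; choosing and validating such rules across vendors is exactly the unresolved work he points to.

```python
from datetime import date

# Hypothetical per-day step counts from two independently manufactured devices.
wearable = {date(2020, 2, 11): 8200, date(2020, 2, 12): 9100}
phone = {date(2020, 2, 11): 7800, date(2020, 2, 13): 4300}

def reconcile(a, b):
    """Naive fusion rule: where both sources report a day, keep the
    larger count; otherwise keep whichever count exists."""
    days = set(a) | set(b)
    return {d: max(a.get(d, 0), b.get(d, 0)) for d in sorted(days)}

print(reconcile(wearable, phone))
```

Even this trivial rule presupposes that both vendors expose their data in a common format, which is the partnership problem Yamanoor expects to take years to work out.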
However, both experts are optimistic about the future of sensor fusion in healthcare. Sai Yamanoor noted that sensor fusion is already being used in nonmedical applications, such as self-driving cars, and that the potential is there for medical devices. “Recently there was a startup that actually got approval for developing a device that's meant for surgery planning,” he said. “I think we are almost there in terms of using sensor fusion in an application related to medical devices.”
“You have to keep your eyes and ears open to new applications that can come from new technologies. The challenges are finding the right partner and identifying the right type of data source to find the most meaningful way to solve problems in the fastest possible way,” Sai Yamanoor concluded.
As with sensor fusion, good data, and a lot of it, is the key to making a successful wearable device. “As these wearable devices are becoming smaller and more powerful and allowing people to wear them for a much longer duration, they're going to have artificial intelligence systems that are going to be able to identify not just patients having issues, but they will be more personalized [to detect] when a patient is outside of their baseline,” said Steven Hansen, CEO and founder at Odin Technologies LLC.
He explained that these kinds of technologies will allow people to be referred to physicians or nurses for further treatment or diagnosis, or even summon help for critical patients. “These wearable devices are going to be able to extend the capabilities of a physician beyond where they are physically,” Hansen explained. “As hospitals are slowly becoming decentralized, you're having more at-home care, and you're going to start seeing a rise in the amount of wearable devices that are available that allow patients to monitor themselves,” he said.
Hansen believes there will be an increase in wearables in the military because of the way that warfare is changing—slowly starting to push towards smaller subsets of people that are doing very specialized missions. “These missions are often not on the front lines, but in enemy territories,” Hansen said, where it is difficult to evacuate personnel to hospitals. “You're having people that are going into combat and you need to be able to monitor and diagnose them. Because there isn’t a local healthcare system for these soldiers, they need to be monitored and cared for over days or weeks without advanced medical treatment,” he continued. “There's a real big push for additional wearable devices in this space.”
These products, especially those made for prolonged use, must be as small and lightweight as possible, as well as very robust. “Dust, heat, radiation, vibration, impact, water—all these kinds of things are present in austere environments, so the device has to have a level of robustness to be able to withstand the elements,” Hansen said.
And as we have seen before with technologies such as GPS, cell phones, and remotely controlled ovens that were developed by the military or NASA, wearable diagnostic device technology will be translated for consumer use as well, Hansen noted. “As the military finances new products, it can then take these products and translate them into the consumer space,” he said. “Military validation and use of technology is big in getting technologies off the ground,” Hansen explained.
Hansen said that sensors will also be used in daily living activities that will help people become invested in their wellness, which is extremely important for public health. “People are now asking themselves how many calories do I need each day? Is my heart really healthy? Is my heart rate too high? Is it too low? What's my blood pressure? All these things that help people in their daily lives to give better information on whether they’re becoming healthier and help these people improve their quality of living.”
Wearables will also help those patients who might be having minor concerns at home get the treatment they need faster. “It's not only going to help patients understand if they have an issue, but also help the physician identify what the problem is earlier, so they can give the patient the right treatment,” Hansen said. “This will also drive down cost, as well as lessen the time in diagnosing patients, because physicians will already have an inclination of what the issue is before the patient even shows up at the hospital.”
Of course, there are some challenges in developing wearables. Hansen said that one of the big hurdles is battery life for devices in both military and civilian use. “Can you get a device that's providing advanced diagnostics to continuously evaluate, diagnose, analyze, and transmit data without running out of power? The battery life has to be able to sustain diagnostic or information gathering, or even transmitting information over long periods of time,” he said.
Security is another issue. “If you're transmitting patient information over Bluetooth or Wi-Fi, the opportunity for someone to wirelessly access the device increases,” Hansen said. “And so now you not only have to worry about diagnosing the patient, but you [might also] have people with malicious intent getting into those wearables, maybe messing with the device itself or the data.”
As for challenges with the AI that many of these devices will use, Hansen said the big question is: can the device learn itself out of being safe? “If you have an artificial intelligence engine, you have to prove that regardless of what inputs you give it, it will never hurt somebody,” Hansen said. “That's really hard to do. And a lot of companies have been struggling to produce these medical devices because you can’t necessarily prove that whatever input the machine gets is going to produce a safe output.”
With two sensors aboard the Mars Curiosity rover, FUTEK Advanced Sensor Technology Inc. is well versed in designing and producing products that can survive harsh environments. One recent project the company took on was to create a sensor that would substantially improve a surgeon’s dexterity when performing laparoscopic or robotic procedures.
One of the drawbacks to minimally invasive surgeries is that, in using robotic instruments, surgeons lose some of the sense of touch. “Surgery is 50% sight and 50% haptics,” said Ebenezer Ferreira, project management—MEDTECH Programs, at FUTEK. “In traditional surgery where the surgeon is directly manipulating an instrument with his hand, he has 100% haptic feedback. When you move to laparoscopic surgery, the surgeon loses a little bit of the haptic feedback because of the dampening effect of the instrument. So, when you move towards minimally invasive robotic surgery or surgical robotics, you certainly have good visualization of the surgical site, but the surgeon has completely lost dexterity,” he continued. “It's very important for the surgeon to realize the texture or the stiffness of tissue of the internal organs during surgery and using haptic feedback sensors helps to bring the sense of touch to the instrument.”
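A haptic channel ultimately reduces to mapping the force measured at the instrument tip to an actuator command at the surgeon's console. The sketch below is a generic illustration of that mapping, not FUTEK's design; the linear scaling, deadband, and limit values are invented for the example.

```python
def haptic_command(tip_force_n, max_force_n=10.0, max_torque_nm=0.5):
    """Map a measured tip force (newtons) to a console motor torque (N*m).

    Linear scaling with saturation; a small deadband suppresses sensor
    noise so the surgeon does not feel a constant low-level buzz.
    """
    DEADBAND_N = 0.05
    if abs(tip_force_n) < DEADBAND_N:
        return 0.0
    torque = tip_force_n * (max_torque_nm / max_force_n)
    return max(-max_torque_nm, min(max_torque_nm, torque))

print(haptic_command(5.0))   # mid-range force: proportional torque
print(haptic_command(20.0))  # excessive force saturates at the torque limit
```

Clamping the output doubles as a safety measure: a faulty sensor reading can never command more torque than the console hardware is rated for.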
In developing their haptic feedback sensors, engineers at FUTEK had to overcome some serious challenges. The first was miniaturization. “These sensors are really tiny, just a few millimeters by a few millimeters,” said Ehsan Mokhberi, sales manager for strategic accounts at FUTEK. The sizes vary by project, but they are ultraminiature because they must fit inside an instrument envelope of no more than 10 to 15 mm.
The sensors must also be able to withstand many cycles of sterilization by autoclave. “One of the solutions is to have a hermetically sealed force sensor,” Ferreira said. The common way to achieve hermeticity is to weld the sensor parts, which has size limitations—the smaller the sensor, the more difficult it is to weld the parts. FUTEK’s patented monolithic process can survive several autoclave and autowash cycles without failure, Mokhberi said, because the sensor is only one piece, which prevents leakage.
Mokhberi went on to say that manufacturability also can be challenging. “There's a lot of companies that can do things in a laboratory, but when they want to scale it that becomes a really big headache,” he said.
He cited the example of semiconductor companies: many can design processors, but only a few can produce them at specific sizes, maintain specific performance, and then manufacture them cost-effectively in quantities of millions. “It's four different variables that you have to put in the equation,” he said. “Can you make a product commercial, can you bring it to a price point that makes sense, can you put it into a surgical instrument while maintaining performance, and can you make it reusable while maintaining its performance? And that is the whole challenge we've learned how to meet over the past 30 years.”
Above: A depiction of how early engagement with the sensor design company helps streamline design and cut sensor-integration and manufacturing costs. Figure and caption courtesy of FUTEK.
When asked what he hopes attendees will take away from the session at MD&M West, Ferreira said, “First is to educate them about what options are available, and second is trying to get them involved at the earlier design phase.” He encouraged medtech companies to get involved with FUTEK at the infancy of a product cycle to allow for streamlined, more effective integration and to cut down on both costs and the learning curve. “Because that translates into hard dollars for companies, and it becomes really expensive fast. So we try to familiarize them with our technology and engage at an early stage.”
Sai Yamanoor and Srihari Yamanoor presented Preparing for Sensor Fusion, A Revolution in Patient-Centric Health Delivery, on Wednesday, February 12, from 1:15 to 2 p.m., in Room 210A. Steven Hansen spoke on Advances in Prolonged Field Care Enabled by Wearable Diagnostics on Tuesday, February 11, from 1:15 to 2 p.m. in Room 210A. Overcoming Challenges in Haptic Feedback for Robotic Surgery Platforms was presented by Ebenezer Ferreira, Ehsan Mokhberi, and Thomas Bowles, director of quality assurance, on Wednesday, February 12, from 10:45 to 11:45 a.m., in Room 210A.