About three years ago, Royal Philips introduced an ultrasound transducer called Lumify that simply plugs into a tablet computer via a USB port and runs off an app-based interface. MD+DI at the time called it the birth of app-based ultrasound after speaking with Randy Hamlin, vice president and point-of-care business leader at Philips, to learn more about the design and regulatory challenges the company had to overcome to make the device a reality.
Now, the Amsterdam-based company has partnered with Innovative Imaging Technologies (IIT), based in Quebec, Canada, to bring to market the industry's first tele-ultrasound solution based on the Lumify portable ultrasound system and powered by IIT's Reacts collaborative platform. The combined technology enables clinicians around the world to connect in real time by turning a compatible smart device into an integrated tele-ultrasound solution, combining two-way audio-visual calls with live ultrasound streaming.
MD+DI spoke with Hamlin again for the inside scoop on this partnership and to find out how it could take both ultrasound care and training to a whole new level.
Since launching Lumify and gathering feedback from the technology's early adopters, Hamlin said it became clear that there was a need to bridge the gap between learning how to perform an ultrasound exam and actually being able to interpret the images.
"Actually understanding what you're looking at is a challenge for someone less trained," he said.
So Philips began exploring ways to close that learning gap and to connect Lumify users with experts remotely.
"As we started looking at how to address that problem we were introduced to the IIT team and some of the work they were doing already with technology that could make connections in the medical space, and it just appeared to be such a natural partnership that made sense to us," Hamlin said. "That started about two years ago."
Already the company has seen some interesting cases where Reacts Lumify has been put to use, both in the educational arena and in the clinical world, Hamlin said. On the educational side, for example, he said there are times when there is a geographical distance between the learner and the expert, and the combined technology has enabled that connection, making it seem as if the educator is in the room with the learner.
Then there are times when the technology has put a Reacts Lumify user in touch with an expert who can look at the ultrasound stream and provide a remote consult in that particular case.
Another interesting application is in what Hamlin referred to as the pre-hospital environment, such as an ambulance in transit to a hospital.
"Sometimes those transit times are quite extensive in rural areas and you can imagine being able to have a paramedic connect remotely in transit to the emergency room physician to get some guidance on patient care as they're being transported to the hospital," Hamlin said. "We see how this tool can be used to bridge the gap between someone who is an expert and can reach out and connect with someone who is less skilled, perhaps."
This is really an example of a broader trend that is happening not just in the clinical setting, but also on the design and development side of medtech.
"When I reflect back over the last couple of decades working in the industry and as we've developed products and solutions here at Philips we have a global footprint. We have engineering teams spread clear across the world who have different expertise in different areas," Hamlin said.
There was a time when teams worked in isolation and if collaboration was needed on a particular project, the company would have to physically bring someone in to be in the same room as the development team. "What we do now is we develop our solutions by connecting virtually the experts across Philips's network through video and audio and it enables us to do almost exactly what Reacts Lumify is doing in the medical space now. I can reach experts, it doesn't matter if they're in a different state or a different country, and I can bring them into that conversation."
The industry seems to have adopted this concept of virtual teams now, he said, which means finding the right talent, with the right skills, to bring the best solution to the project. The same is true of patient care, he said: there is a realization that patients can receive better care because doctors can't be experts at everything, but they can leverage technology to connect with the experts they need to provide better care for their patients.
"I think the smart device adoption within the medical community is an enabler for the realization that 'I can tap in now to clinical experts', and I think that's really catching on and I think it's going to continue now at a really fast pace," Hamlin said.
By integrating Reacts into the Lumify system, clinicians can begin their Reacts session with a face-to-face conversation on their Lumify ultrasound system. Then, users can switch to the front-facing camera on their smartphone or tablet to show the position of the probe. They can then share the Lumify ultrasound stream so that both parties are simultaneously viewing the live ultrasound image and probe positioning while discussing and interacting at the same time.
Hamlin said the key to getting a new product to be adopted in any industry is to make it easy to use, and extremely portable.
"We all have our smart devices, we want information wherever we're at," he said.
It wasn't all that long ago that "distance learning" or "remote consult" meant a wall-mounted camera that streamed footage to a room in another location with some sort of large monitor. That technology was large and clumsy, and users had to be in a specific location for it to work, not to mention the cost of such equipment. Mobile-based technologies like the Lumify with Reacts system represent a significant improvement over that earlier tech.
"Being able to integrate seamlessly into our ultrasound app this capability on smart devices is going to really change the game," Hamlin said. "It's not just connecting those three things at the same time, which is what we're doing, but it's doing it with devices that can be with the clinician no matter where they're at. It runs not just on WiFi but also on the cell networks, phones or tablets or laptops."