Michael Yip, director of the Advanced Robotics and Controls Lab at the University of California, San Diego (UCSD), will discuss such challenges and more in “Learning to Control and Plan Surgical Robots within the Body,” an April 7 presentation during MD&M BIOMEDigital.

Susan Shepard

April 2, 2021

3 Min Read
Image by Gerd Altmann from Pixabay

As surgical robots become smaller, more flexible, and more mechanically complex, new challenges arise, such as whether control of the devices should be left to the doctor or offloaded to a semi- or fully autonomous framework. Michael Yip, director of the Advanced Robotics and Controls Lab at the University of California, San Diego (UCSD), will discuss this and more in “Learning to Control and Plan Surgical Robots within the Body” during MD&M BIOMEDigital.

“Today’s robots cover an extensive range of procedures,” said Yip in an interview with MD+DI. They include teleoperated systems for minimally invasive surgery and diagnostic procedures, systems that carve bone to exactly fit orthopedic implants, semi-autonomous radiotherapy robots that coordinate beams of radiation to focus on and kill tumors, and more.

Yip’s team at UCSD is working on artificially intelligent surgical robots that leverage feedback from cameras, accelerometers, and other sensor technologies to infer what is going on in surgery and provide assistance to the surgeon. “This [assistance] comes in many forms, from virtual fixtures to augmented reality assistance, to recognition of critical anatomy to either avoid or be careful around,” he said.
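To give a sense of what a virtual fixture is, the following minimal Python sketch attenuates a surgeon's commanded tool motion as the tool approaches a no-go region around critical anatomy. It is an illustrative assumption of the general technique, not code or parameters from Yip's lab; the function name apply_virtual_fixture and values such as safe_radius are hypothetical.

    import numpy as np

    # Illustrative virtual fixture: damp the component of the commanded
    # velocity that points toward a spherical no-go region around anatomy.
    # All names and parameters here are assumptions for demonstration.
    def apply_virtual_fixture(tool_pos, commanded_vel, anatomy_center,
                              safe_radius, gain=1.0):
        to_anatomy = anatomy_center - tool_pos
        dist = np.linalg.norm(to_anatomy)
        if dist >= safe_radius or dist == 0.0:
            return commanded_vel  # outside the fixture's influence
        direction = to_anatomy / dist
        approach_speed = np.dot(commanded_vel, direction)
        if approach_speed <= 0.0:
            return commanded_vel  # already moving away from the anatomy
        # Attenuate the approaching component more strongly when closer.
        attenuation = gain * (1.0 - dist / safe_radius)
        return commanded_vel - attenuation * approach_speed * direction

    # Example: the surgeon commands 1 mm/s straight at a structure 5 mm away;
    # the fixture halves the approach speed.
    tool = np.array([0.0, 0.0, 0.0])
    vel = np.array([1.0, 0.0, 0.0])
    anatomy = np.array([5.0, 0.0, 0.0])
    print(apply_virtual_fixture(tool, vel, anatomy, safe_radius=10.0))

In a real system, the no-go region would come from the kind of anatomy recognition Yip describes, rather than a hand-placed sphere.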

His group is also working with the National Science Foundation, the NIH, and the U.S. Army on AI surgical robots that provide first response in remote locations and austere environments with limited connectivity for teleoperation from afar. “In these environments, we are teaching robots to perform life-saving procedures such as hemorrhage control and airway management so they can provide critical first response immediately, stabilize casualties, and wait for medical evacuation to arrive,” Yip explained.

However, because today’s robots are still inflexible, Yip said surgeons are required to work within the limits of the robot’s intelligence and capabilities. “Most of these robots have no situational awareness or understanding of anatomy,” he said, so the entire burden falls on the surgeon to interface with their robotic tools effectively.

Yip’s group aims to change that. “My research group is working to break these constraints so that the robots recognize and adapt to the surgeon, can understand the procedures being performed, have situational awareness, and essentially imbue themselves with clinical expertise, using AI and machine learning,” he said.

When asked what he hopes attendees will take away from his session, Yip noted that cameras already collect a great deal of information. “With machine learning and AI, they can tell you about what is going on in surgery, both regarding the tissues, as well as the robot and the surgeon themselves,” he said. “As researchers, my lab is trying to codify what doctors recognize and see into algorithms using machine learning and AI, so that we can teach the same intelligence to the robots.” By doing so, he explained, the quality of care can be standardized and raised among all patients.

“Learning to Control and Plan Surgical Robots within the Body” will be presented on April 7, from 10 to 11 a.m. Yip said technologists, researchers, and clinicians who want a view of the future of surgical robotics, as well as those who simply want to understand what AI and machine learning can do for their practice or product, could especially benefit from the session.

About the Author

Susan Shepard

Susan Shepard is a freelance contributor to MD+DI.

