Helping Robots Appreciate the Gesture


Thomas Blair

February 17, 2011


Industrial engineering graduate student Mithun Jacob works with a prototype of a robotic scrub nurse that can understand hand gestures. Photo courtesy of Purdue University and Mark Simmons. 

Nonverbal communication is a fundamental element of the human experience. It makes it possible to hail a cab. It can build (or tear apart) entire relationships. It can even be a matter of life or death, as when a surgeon uses hand gestures to communicate with a scrub nurse.

But what happens when robots, which are becoming more and more integrated into our everyday lives, enter the equation? Robots are not, as of right now, known for their intuitive abilities. How could a soulless hunk of metal grasp a mode of communication as distinctly human as hand gestures?

That’s the question Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University in West Lafayette, IN, is grappling with. He and his fellow researchers, Stephen Adams, D.V.M., and graduate students Mithun Jacob and Yu-Ting Li, are working on a robotic scrub nurse that responds to natural hand gestures, so that surgeons can interact with it as intuitively as they would with flesh-and-blood assistants. The researchers have built a prototype, which they have tested in cooperation with Purdue’s School of Veterinary Medicine.


The system is designed to allow a surgeon to, say, request a specific instrument from the robot using hand gestures. The robot reads the gestures with a camera, picks the instrument up with a robotic arm, and gives it to the surgeon.
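At its core, interpreting such a request is a classification problem: map an observed hand pose to an instrument. Here is a minimal sketch of that idea; the gesture vocabulary, feature vectors, and instrument names are invented for illustration and are not the Purdue team’s actual recognition pipeline:

```python
import math

# Hypothetical gesture vocabulary: each instrument request is represented
# by a small template feature vector (e.g., normalized finger extensions).
# These numbers are illustrative, not real training data.
GESTURE_TEMPLATES = {
    "scalpel":  [1.0, 0.0, 0.0, 0.0, 0.0],  # index finger extended
    "scissors": [1.0, 1.0, 0.0, 0.0, 0.0],  # index and middle extended
    "forceps":  [1.0, 1.0, 1.0, 1.0, 1.0],  # open hand
}

def classify_gesture(features, templates=GESTURE_TEMPLATES):
    """Return the instrument whose template is nearest (Euclidean
    distance) to the observed features -- nearest-neighbor matching."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda name: dist(features, templates[name]))

# A slightly noisy "open hand" observation maps to forceps.
print(classify_gesture([0.9, 1.0, 0.8, 0.95, 1.0]))  # forceps
```

In a real system the feature vector would come from the camera’s hand-tracking output and the classifier would be trained rather than hand-coded, but the gesture-to-instrument mapping follows this general shape.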

The technology is similar to that used in Microsoft’s Kinect video game system, which lets players control the action on their Xboxes via a motion-reading camera. Indeed, Wachs and his team are testing the Kinect and may incorporate it into their system.

Wachs’ prototype would not be the first robotic surgical assistant. But, he said in a telephone interview, it would be the first to recognize nonverbal hand gestures. While voice recognition technology is relatively mature, little work has been done in the more complex area of gesture recognition, even though it is an enormously significant piece of the puzzle.


“This is actually the way that surgeons are most proficient when they work,” Wachs said. “They use gestures…They use voice, as well, to communicate with the nurse, (and) there are at least two modalities, which is both voice and gestures, and sometimes only voice, and sometimes only gestures.”


Voice recognition is “one aspect of human-to-human communication, and if you only rely on that, you are short.”

Wachs and his team still face a number of challenges, including the fact that surgeons are too busy to be trained on a specific gesture vocabulary. Their robot will have to communicate with surgeons on a more natural level.


“You are forced to understand the behavior of the surgeon, the body behavior, the body language, the gestural language,” Wachs said. “You’re forced to understand that in order to be able to know what to do, or to interact with the surgeon to pass the surgical instruments.”


Preliminary testing has illustrated other areas where the system could be improved. Wachs said that he has plans to add voice-recognition capabilities and find a way to make the robot faster, to better match the performance of flesh-and-blood scrub nurses.


“Experienced scrub nurses are very fast,” Wachs said, “and they can also predict what is the next surgical instrument that the surgeon is going to need.”
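One simple way to sketch that kind of anticipation is a transition model built from instrument sequences in past procedures. The sequences and instrument names below are invented for illustration; this is not the team’s method, just a minimal sketch of the idea:

```python
from collections import Counter, defaultdict

def build_model(sequences):
    """Count how often each instrument follows another in past procedures."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(model, current):
    """Most frequently observed follower of `current`, or None if unseen."""
    followers = model.get(current)
    return followers.most_common(1)[0][0] if followers else None

# Invented example sequences of instrument requests.
history = [
    ["scalpel", "forceps", "suture"],
    ["scalpel", "forceps", "scissors"],
    ["scalpel", "retractor", "forceps", "suture"],
]
model = build_model(history)
print(predict_next(model, "scalpel"))  # forceps
```

An experienced nurse’s anticipation is of course far richer than frequency counts, but a robot could use something of this shape to stage the likely next instrument in advance.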


He made it clear that his goal is not to replace the human scrub nurse, but to augment the scrub nurse’s role by having the robot take over the duties of the surgical tech.


“The scrub nurse has a lot of expertise and deals with a lot of knowledge that we are not dealing with,” he said.

The headlines tend to go to other, flashier robotic systems that can actually perform surgery, such as Intuitive Surgical’s da Vinci Surgical System. Wachs acknowledged that the da Vinci is impressive, but he pointed out that it is expensive, requires extra training of surgical staff, and represents a wholesale transformation of the operating room.

Wachs said that his technology, in contrast, can be integrated into existing surgical infrastructure, will be cheaper, and will require far less training, making it better suited for use in outpatient and ambulatory settings. Wachs felt that, in the near future at least, surgical robots will be best utilized in a support role.


“You still want the expertise and the intuition of the expert, of the surgeon,” he said. “You still need it. At least for now, the best combination is teamwork.”


Wachs and his team are doing their best to ensure that, like any good teammate, robotic surgical assistants will be able to communicate effectively.

The Association for Computing Machinery has produced a video of the robotic prototype in action, which accompanies an article on gesture recognition technology, “Vision-Based Hand-Gesture Applications,” coauthored by Wachs with Mathias Kölsch of the Naval Postgraduate School and Helman Stern and Yael Edan of Ben-Gurion University. The article appears in the February issue of the ACM’s magazine, Communications of the ACM.
