Engineering student J.D. Yamokoski wears an LED-lit patch that the robot, built by Scott Banks (right), follows to take x-ray video. Photo courtesy of the University of Florida.
The device is meant to augment static images of patients' bones, muscles, and joints. Scott Banks, who created the robot, says he hopes that by merging the full-motion x-rays with computerized representations, orthopedic surgeons will be able to make better diagnoses, suggest more appropriate treatments, and get a clearer idea of postoperative successes and failures.
“Our goal is to come up with a way to observe and measure how joints are moving when people are actually using them,” Banks said. “We think this will be tremendously powerful, not only for research but also in the clinical setting.”
Orthopedic surgeons usually diagnose patients by touch, or by using magnetic resonance (MR) or computed tomography (CT) images. X-ray video is also used, says Banks, but the technology can provide only limited views.
Although all of these techniques can be effective, they do not work well with injuries that manifest themselves only when a joint is in motion, Banks says. These include, for example, injuries to the patella (kneecap) and to the shoulder. Surgeons are sometimes forced to operate just to diagnose these and other injuries.
After an operation, surgeons have few tools beyond the patient's experience to tell them whether a procedure worked as intended and whether it will forestall additional joint damage.
The current device consists of a 1-m-long robot arm that is also used in robotic-assisted surgeries and silicon chip manufacturing. Banks and his team reengineered the arm for their purposes. In the device's completed form, one robot will hold lightweight equipment capable of taking x-rays, while another robot holds the tracking sensor. Although the robots will be attached to a fixed base, there will be room for a person to move around and remain within the robot's visual reach. And in the future, said Banks, “we could put these robots on wheels.”
The patient wears an LED-lit patch on the affected body part, which could be any joint. Several cameras placed around the room and a networked computer command the robot to home in on and track the patch. Banks says that the tracking system is not yet accurate enough for video x-ray. For now he is using a standard video camera to test and fine-tune the sensors. To continue his work, Banks has applied for a grant from the National Institutes of Health.
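The article does not describe the control scheme, but camera-guided tracking of this kind is often closed with a simple proportional loop: the computer measures the patch's offset from where the robot is currently aimed and commands a fractional correction each cycle. The sketch below is a hypothetical illustration of that idea, not the Florida team's actual software; the function name, gain, and coordinates are all assumptions.

```python
def track_patch(patch_pos, robot_aim, gain=0.5, steps=50, tol=0.01):
    """Hypothetical proportional tracking loop.

    patch_pos, robot_aim: (x, y) positions in camera/image coordinates.
    Each cycle the robot's aim moves a fraction (`gain`) of the remaining
    error toward the LED patch, stopping once within `tol`.
    Returns the final aim point.
    """
    ax, ay = robot_aim
    px, py = patch_pos
    for _ in range(steps):
        # Error between where the robot is aimed and where the patch is seen.
        ex, ey = px - ax, py - ay
        if (ex * ex + ey * ey) ** 0.5 < tol:
            break  # close enough: patch is centered in the x-ray field
        # Proportional update: correct a fraction of the error per cycle.
        ax += gain * ex
        ay += gain * ey
    return ax, ay
```

With a gain below 1, the aim point converges smoothly on the patch rather than overshooting, which matters when the payload is imaging equipment moving near a patient.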
Mike Moser, an orthopedic surgeon working with Banks on the project, says he thinks the robot system would be very useful to surgeons. “The biggest thing that this technology could offer in treating orthopedic injuries is that it has the ability to visualize joint motion dynamically, as it changes,” he says. “I think this would be good for many different conditions of the shoulder, knee, elbow, and ankle. It could be extrapolated to almost any orthopedic injury or condition.”