Rethink Robotics: Juan Pablo Wachs
Introducing hand-gesture recognition to the surgical arena
Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation.
Both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, says Juan Pablo Wachs, associate professor of industrial engineering at Purdue.
Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.
The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot. At the same time, a robotic scrub nurse, which assists the surgeon and hands the proper surgical instruments to the doctor when needed, represents a potential new tool that might improve operating-room efficiency. Wachs has developed a prototype named Gestonurse in collaboration with faculty in Purdue’s School of Veterinary Medicine and the Indiana University School of Medicine.
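To make the idea concrete, here is a minimal, hypothetical sketch of one common approach to gesture-as-command recognition: classify a hand trajectory by comparing it against stored templates, in the spirit of simple template matchers such as the "$1" recognizer. All names, templates, and thresholds below are illustrative assumptions, not Gestonurse's actual algorithm.

```python
import math

N_POINTS = 16  # fixed resample length for every trajectory

def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N_POINTS):
    """Resample a trajectory to n evenly spaced points along its path."""
    interval = path_length(points) / (n - 1)
    out = [points[0]]
    d = 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            d = 0.0
        else:
            d += seg
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points):
    """Translate the centroid to the origin and scale to unit size."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    shifted = [(x - cx, y - cy) for x, y in points]
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def distance(a, b):
    """Mean point-to-point distance between two equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(trajectory, templates):
    """Return the command whose template trajectory is nearest."""
    candidate = normalize(resample(trajectory))
    return min(templates,
               key=lambda name: distance(candidate,
                                         normalize(resample(templates[name]))))

# Hypothetical command templates: a horizontal swipe advances the image
# series; a vertical swipe zooms. Real systems would store many more.
templates = {
    "next_image": [(x, 0.0) for x in range(10)],  # left-to-right swipe
    "zoom_in":    [(0.0, y) for y in range(10)],  # bottom-to-top swipe
}

# A noisy, roughly horizontal hand path is matched to "next_image".
print(classify([(0, 1), (2, 1.2), (4, 0.9), (6, 1.1), (8, 1.0)], templates))
```

In a deployed system the trajectory would come from a camera-based hand tracker rather than hard-coded points, and the template set would be tuned per surgeon, which speaks to Wachs's point about differing gesture preferences.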
"While it will be very difficult for a robot to achieve the same level of performance as an experienced nurse who has worked with the same surgeon for years, scrub nurses often have very limited experience with a particular surgeon, increasing the chances of misunderstandings, delays and sometimes mistakes in the operating room," Wachs says. "In that case, a robotic scrub nurse could be better."
Wachs began researching hand-gesture recognition several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where he was a research fellow and doctoral student, respectively. He notes that researchers at other institutions developing robotic scrub nurses have focused largely on voice recognition, whereas little has been done in the area of gesture recognition.
"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs says. "You want to use intuitive and natural gestures for the surgeon to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."
Other challenges include giving computers the ability to understand the context in which gestures are made and to discriminate between intended and unintended gestures.
"Say the surgeon starts talking to another person in the operating room and makes conversational gestures," Wachs says. "You don't want the robot handing the surgeon a hemostat."
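One simple way to gate out such incidental movements, sketched below under purely illustrative assumptions, is to accept a gesture only when the hand is inside a designated command zone and the pose is held deliberately; conversational gestures tend to be fast, brief, and outside that zone. The zone bounds and thresholds are hypothetical, not values from Wachs's system.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    x: float           # hand position in normalized image coordinates
    y: float
    duration_s: float  # how long the current pose has been held
    speed: float       # hand speed, normalized units per second

# Illustrative parameters: a command zone near the display, a minimum
# hold time, and a speed ceiling above which motion is treated as
# incidental (e.g., conversational gesturing).
ZONE = (0.6, 0.9, 0.2, 0.8)   # x_min, x_max, y_min, y_max
MIN_HOLD_S = 0.5
MAX_SPEED = 0.3

def is_intentional(obs: Observation) -> bool:
    """Accept a gesture only if it is in the zone and held deliberately."""
    x_min, x_max, y_min, y_max = ZONE
    in_zone = x_min <= obs.x <= x_max and y_min <= obs.y <= y_max
    deliberate = obs.duration_s >= MIN_HOLD_S and obs.speed <= MAX_SPEED
    return in_zone and deliberate

# A steady pose inside the zone is accepted; a quick wave outside it,
# like a surgeon's conversational gesture, is rejected.
print(is_intentional(Observation(0.7, 0.5, duration_s=0.8, speed=0.1)))
print(is_intentional(Observation(0.2, 0.5, duration_s=0.1, speed=1.5)))
```

Such a gate is only a first line of defense; richer context models (who the surgeon is facing, what phase the operation is in) would be layered on top so the robot is not handed its cue by a stray hemostat-shaped wave.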