Dr. Juan Wachs and Edgar J. Rojas Muñoz write for the College of Engineering on their work using augmented reality (AR) to guide surgeons on the battlefield.
AR is a leap forward from telestration, in which a remote mentor annotates a live video of the mentee’s operating field with lines and icons that encode surgical instructions. The local mentee views these annotations on a nearby display. Though effective, telestration requires mentees to constantly shift focus away from the operating field to the display and mentally remap the instructions onto the actual operating field, which can lead to cognitive overload and errors.
In AR telementoring, computer-generated 3D objects are superimposed onto the mentee’s field of view in real time. Most AR-based systems place tablets in a fixed position in the operating field between the patient and the mentee to display the medical guidance. These platforms are neither self-contained nor portable, and require multiple pieces of hardware, such as external cameras, screens, computers, and brackets.
In a project led by Industrial Engineering at Purdue and developed with support from the U.S. Department of Defense, our next-generation System for Telementoring with Augmented Reality (STAR) platform addresses these shortcomings by integrating an Augmented Reality Head-Mounted Display (ARHMD). An onboard camera transmits an image-stabilized view of the operating field to the mentor, who creates annotations representing surgical instructions through a touch interface, drawing incision lines, illustrating the placement of instruments, and so forth.
The head-mounted display enables the medical instructions to be perceived at the correct position and depth relative to the patient’s body. It allows mentees to see the surgical field directly and leaves their hands free, as the device is entirely self-contained on the head.