US hospitals are facing a severe shortage of registered nurses, which could lead to an increase in mortality rates. One solution to this challenge is to introduce a Robotic Scrub Nurse (RSN) into the Operating Room (OR) to free nurses from mundane, repetitive tasks such as instrument delivery and retrieval.
As an important building block for an RSN, this paper presents an accurate and robust surgical instrument recognition algorithm. Surgical instruments are often cluttered, occluded, and exhibit specular reflections, which poses a challenge for conventional recognition algorithms. A learning-through-interaction paradigm is proposed to tackle this challenge, combining computer vision with robot manipulation to achieve active recognition. The unknown instrument is first segmented out as a blob and its pose estimated; the RSN system then picks it up and presents it to an optical sensor in a determined pose. Finally, the unknown instrument is recognized with high confidence.
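The active-recognition loop described above can be sketched in simplified form. This is only an illustrative sketch, not the paper's implementation: all names (`Blob`, `estimate_pose`, `classify`, `recognize_actively`) are hypothetical, pose is reduced to a single in-plane angle, and the robot manipulation step is simulated by re-classifying at the canonical pose.

```python
# Hypothetical sketch of a learning-through-interaction recognition loop.
# In the real RSN system, segmentation, pose estimation, and manipulation
# operate on camera images and a physical robot arm.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Blob:
    """A segmented image region believed to contain one instrument."""
    pixels: List[Tuple[int, int]]
    angle: float  # pose simplified to one in-plane rotation (degrees)


def estimate_pose(blob: Blob) -> float:
    """Estimate the instrument's orientation from its segmented blob.
    Placeholder: the blob is assumed to carry its own angle."""
    return blob.angle


def classify(angle: float, label_scores: Dict[str, float]) -> Tuple[str, float]:
    """Passive recognition whose confidence degrades with pose mismatch.
    `label_scores` gives each candidate label's score at the canonical pose."""
    penalty = abs(angle) / 180.0  # worst when 180 degrees from canonical
    best = max(label_scores, key=label_scores.get)
    return best, label_scores[best] * (1.0 - penalty)


def recognize_actively(blob: Blob, label_scores: Dict[str, float],
                       threshold: float = 0.9) -> Tuple[str, float]:
    """If passive confidence is too low, 'pick up' the instrument and
    re-present it at the canonical pose (angle 0), then re-classify."""
    angle = estimate_pose(blob)
    label, conf = classify(angle, label_scores)
    if conf < threshold:
        # Robot manipulation step: present instrument at the canonical pose.
        label, conf = classify(0.0, label_scores)
    return label, conf
```

For example, an instrument seen 90 degrees off its canonical pose would score only half its best-case confidence passively; after the simulated re-presentation step it is classified at full confidence.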
Experiments were conducted to evaluate the performance of the proposed segmentation and recognition algorithms, respectively. The results show that the proposed patch-based segmentation algorithm and attention-based recognition algorithm greatly outperform their benchmarks, demonstrating the applicability and effectiveness of an RSN for accurate and robust surgical instrument recognition.
Zhou, Tian, and Juan P. Wachs. "Finding a Needle in a Haystack: Recognizing Surgical Instruments through Vision and Manipulation." In SPIE/IS&T Electronic Imaging, no. 9, pp. 37-45. IS&T, 2017. Best Student Paper.
Zhou, Tian, and Juan P. Wachs. “Needle in a haystack: Interactive surgical instrument recognition through perception and manipulation.” Robotics and Autonomous Systems 97 (2017): 182-192.