Daniela Chanci Arrubla, Wachs Awarded Best Paper from HFES 2020

Medical team preparing for sterile surgery, medical worker holds endoscopy device and looks at screen
MSIE Candidate, Daniela Chanci Arrubla
At this fall's Human Factors and Ergonomics Society (HFES) 2020 International Annual Meeting, Daniela Chanci Arrubla was honored with a Best Paper designation from the society.

Daniela Chanci Arrubla's paper, entitled "Correlation between gestures' qualitative properties and usability metrics," received the HCTG Best Student Paper Award at HFES 2020. She is studying under Professor Juan Wachs in the industrial engineering lab for Intelligent Systems and Assistive Technologies (ISAT). Her co-authors are Naveen Madapana and Glebys Gonzalez.

Chanci Arrubla is developing an autonomous, real-time touchless interface that lets surgeons control MRI images in the Operating Room (OR). Instead of relying on surgical assistants or nurses, the system provides surgeons with an optimized gesture and speech language to supplement the keyboard and mouse in the OR. Her work also finds applications in other human-computer interaction domains, such as interacting with machines and with social and assistive robots in restaurants, workplaces, and warehouses.

In the future, Chanci Arrubla intends to improve the accuracy of her algorithms and make the system more responsive to gestural and verbal instructions. Currently, the system recognizes hand gestures and limited speech, then presents the surgeon with a set of probable commands from which a final command is selected using an acknowledgment pad. Next steps include integrating into the current system an algorithm that provides smart assistance as the surgeon operates the MRI software.

Congratulations, Daniela and Dr. Wachs!

Abstract

The choice of the best gestures and commands for touchless interfaces is a critical step that determines the user satisfaction and overall efficiency of surgeon-computer interaction. In this regard, usability metrics such as task completion time, error rate, and memorability have long been regarded as potential criteria for determining the best gesture vocabulary. In addition, some previous works concerned with this problem have utilized qualitative measures to identify the best gesture. In this work, we hypothesize that there is a correlation between the qualitative properties of gestures (v) and their usability metrics (u). Therefore, we conducted an experiment with linguists to quantify the properties of the gestures. Next, a user study was conducted with surgeons, and the usability metrics were measured. Lastly, linear and non-linear regression techniques were used to find the correlations between u and v. Results show that usability metrics are correlated with the gestures' qualitative properties (R^2 = 0.4).
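To illustrate the kind of analysis the abstract describes, the sketch below fits an ordinary least-squares regression from gesture property vectors (v) to a usability metric (u) and reports the coefficient of determination R². All data here is synthetic and the variable names are illustrative assumptions; the actual study used measurements collected from linguists and surgeons.

```python
# Hedged sketch of the abstract's regression step: correlating gestures'
# qualitative properties (v) with a usability metric (u). Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)

n_gestures, n_props = 30, 4
V = rng.normal(size=(n_gestures, n_props))  # hypothetical property scores per gesture
true_w = np.array([0.8, -0.5, 0.3, 0.0])    # assumed ground-truth relationship
u = V @ true_w + rng.normal(scale=1.0, size=n_gestures)  # e.g. task completion time

# Ordinary least squares with an intercept, via numpy's lstsq
X = np.hstack([np.ones((n_gestures, 1)), V])
w, *_ = np.linalg.lstsq(X, u, rcond=None)

# Coefficient of determination R^2 = 1 - SS_res / SS_tot
u_hat = X @ w
ss_res = np.sum((u - u_hat) ** 2)
ss_tot = np.sum((u - u.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

In the paper, the same idea is applied with both linear and non-linear regression; an R² of 0.4 indicates a moderate correlation between the qualitative properties and the usability metrics.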