GestureAnalyzer: Visual Analytics for Pattern Analysis of Mid-Air Hand Gestures

Oct 4, 2014

Authors: Sujin Jang, Niklas Elmqvist, Karthik Ramani
Proceedings of the ACM Symposium on Spatial User Interaction, October 4-5, 2014, Honolulu, HI, USA

Understanding the intent behind human gestures is a critical problem in the design of gestural interactions. A common method to observe and understand how users express gestures is the elicitation study. However, such studies require time-consuming analysis of user data to identify gesture patterns, and human analysis cannot describe gestures in as much detail as data-based representations of motion features can. In this paper, we present GestureAnalyzer, a system that supports exploratory analysis of gesture patterns by applying interactive clustering and visualization techniques to motion tracking data. GestureAnalyzer enables rapid categorization of similar gestures and visual investigation of various geometric and kinematic properties of user gestures. We describe the system components and then demonstrate the system's utility through a case study on mid-air hand gestures obtained from elicitation studies.
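The abstract does not specify how gesture similarity is measured before clustering. As an illustrative sketch only (not the authors' implementation; the measure and all names here are assumptions), one common way to compare motion-tracking sequences of unequal length is dynamic time warping (DTW), whose pairwise distances could then feed a hierarchical clustering:

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two gesture trajectories,
    where each trajectory is a sequence of feature vectors (tuples).
    Illustrative sketch; not the measure used in GestureAnalyzer."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j]: minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # Euclidean distance between frames
            cost[i][j] = d + min(cost[i - 1][j],      # repeat a frame of b
                                 cost[i][j - 1],      # repeat a frame of a
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

# Two versions of the same motion performed at different speeds align at zero cost:
slow = [(0.0, 0.0), (0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
fast = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
print(dtw_distance(slow, fast))  # → 0.0
```

Because DTW tolerates differences in execution speed, gestures with the same spatial shape land close together, which is the kind of grouping an interactive hierarchical clustering view can then let an analyst refine.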


The GestureAnalyzer interface. (A) is a list of tasks loaded from the database. (B) shows a table of user IDs. (C) shows the animation of user gestures. (D) is a panel for interactive hierarchical clustering of gesture data; information about the currently selected task and cluster node appears at the bottom. (E) is a list of output clusters generated by the interactive hierarchical clustering. (F) provides a visual definition of a gesture feature. (G) shows a tree diagram of gesture clusters.



Sujin Jang is currently working at Motorola, Chicago, IL. He received his Ph.D. from the School of Mechanical Engineering at Purdue University in August 2017. His research at the C-Design Lab broadly involved human-computer interaction, visual analytics, machine learning, and robotics, and focused on creating methodologies and principles for the effective use of gestures in HCI. In particular, he developed methods to analyze and exploit human gestures through visual analytics integrating machine learning and information visualization; biomechanical arm fatigue analysis; a gestural user interface for human-robot interaction; and an interactive clustering and collaborative filtering approach for hand pose estimation. He has also served as a teaching assistant for ME 444: Computer-aided design and rapid prototyping, and received the Estus H. and Vashti L. Magoon Award for Teaching Excellence in 2015. [Personal Website][LinkedIn]