Publications

2018

Gonzalez, G., Madapana, N., Taneja, R., Zhang, L., Rodgers, R. B., & Wachs, J. P. (2018). Looking Beyond the Gesture: Vocabulary Acceptability Criteria for Gesture Elicitation Studies. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (in press).

Madapana, N., & Wachs, J. P. (2018). Hard Zero Shot Learning for Gesture Recognition. In 2018 24th International Conference on Pattern Recognition (ICPR) (in press).

Cabrera, Maria E, Voyles, R. M., & Wachs, J. P. (2018). Coherence in One-Shot Gesture Recognition for Human-Robot Interaction. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 75–76). ACM. Paper

Sanchez-Tamayo, N., & Wachs, J. P. (2018). Collaborative Robots in Surgical Research: a Low-Cost Adaptation. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 231–232). ACM. Paper

Zhou, T., Cha, J. S., Gonzalez, G. T., Wachs, J. P., Sundaram, C., & Yu, D. (2018). Joint Surgeon Attributes Estimation in Robot-Assisted Surgery. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 285–286). ACM. Paper

2017

Andersen, Dan, Popescu, V., Cabrera, M. E., Shanghavi, A., Mullis, B., Marley, S., … Wachs, J. P. (2017). An Augmented Reality-Based Approach for Surgical Telementoring in Austere Environments. Military Medicine, 182(suppl_1), 310–315. Paper

Madapana, N., & Wachs, J. (2017a). ZSGL: zero shot gestural learning. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (pp. 331–335). ACM. Paper

Cabrera, Maria E, Novak, K., Foti, D., Voyles, R., & Wachs, J. P. (2017). What makes a gesture a gesture? Neural signatures involved in gesture recognition. In Automatic Face & Gesture Recognition (FG 2017), 2017 12th IEEE International Conference on (pp. 748–753). IEEE. Paper

Madapana, N., & Wachs, J. P. (2017b). A semantical & analytical approach for zero shot gesture learning. In Automatic Face & Gesture Recognition (FG 2017), 2017 12th IEEE International Conference on (pp. 796–801). IEEE. Paper

Cabrera, Maria E, Sanchez-Tamayo, N., Voyles, R., & Wachs, J. P. (2017). One-Shot Gesture Recognition: One Step Towards Adaptive Learning. In Automatic Face & Gesture Recognition (FG 2017), 2017 12th IEEE International Conference on (pp. 784–789). IEEE. Paper

Cabrera, Maria Eugenia, & Wachs, J. P. (2017). A Human-Centered Approach to One-Shot Gesture Learning. Frontiers in Robotics and AI, 4, 8. Paper

Duerstock, B. S., Wachs, J. P., Zhang, T., & Williams, G. (2017). Multimodal image perception system and method. Paper

Velasquez, C. A., Mazhar, R., Chaikhouni, A., Zhou, T., & Wachs, J. P. (2017). Taxonomy of Communications in the Operating Room. In International Conference on Applied Human Factors and Ergonomics (pp. 251–262). Springer, Cham. Paper

Zhang, T., Duerstock, B. S., & Wachs, J. P. (2017). Multimodal Perception of Histological Images for Persons Who Are Blind or Visually Impaired. ACM Transactions on Accessible Computing (TACCESS), 9(3), 7. Paper

Zhang, T., Li, Y.-T., & Wachs, J. P. (2017). The Effect of Embodied Interaction in Visual-Spatial Navigation. ACM Transactions on Interactive Intelligent Systems (TiiS), 7(1), 3. Paper

Zhou, T., & Wachs, J. P. (2017a). Early prediction for physical human robot collaboration in the operating room. Autonomous Robots, 1–19. Paper

Zhou, T., & Wachs, J. P. (2017c). Finding a Needle in a Haystack: Recognizing Surgical Instruments through Vision and Manipulation. Electronic Imaging, 2017(9), 37–45. Paper

Zhou, T., & Wachs, J. P. (2017d). Needle in a haystack: Interactive surgical instrument recognition through perception and manipulation. Robotics and Autonomous Systems, 97, 182–192. Paper

2016

Andersen, Daniel, Popescu, V., Cabrera, M. E., Shanghavi, A., Gomez, G., Marley, S., … Wachs, J. (2016a). Virtual annotations of the surgical field through an augmented reality transparent display. The Visual Computer, 32(11), 1481–1498.

Andersen, Daniel, Popescu, V., Cabrera, M. E., Shanghavi, A., Gómez, G., Marley, S., … Wachs, J. P. (2016). Avoiding Focus Shifts in Surgical Telementoring Using an Augmented Reality Transparent Display. In MMVR (Vol. 22, pp. 9–14).

Andersen, Daniel, Popescu, V., Cabrera, M. E., Shanghavi, A., Gomez, G., Marley, S., … Wachs, J. P. (2016b). Medical telementoring using an augmented reality transparent display. Surgery, 159(6), 1646–1653.

Andersen, Daniel, Popescu, V., Lin, C., Cabrera, M. E., Shanghavi, A., & Wachs, J. (2016). A Hand-Held, Self-Contained Simulated Transparent Display. In Mixed and Augmented Reality (ISMAR-Adjunct), 2016 IEEE International Symposium on (pp. 96–101). IEEE.

Cabrera, Maria E, & Wachs, J. P. (2016). Embodied gesture learning from one-shot. In Robot and Human Interactive Communication (RO-MAN), 2016 25th IEEE International Symposium on (pp. 1092–1097). IEEE.

Duerstock, B. S., Wachs, J. P., & Jiang, H. (2016). Universal translator for recognizing nonstandard gestures.

Jacob, Mithun George, & Wachs, J. P. (2016). Optimal Modality Selection for Cooperative Human–Robot Task Completion. IEEE Transactions on Cybernetics, 46(12), 3388–3400.

Jiang, Hairong, Duerstock, B. S., & Wachs, J. P. (2016a). User-centered and analytic-based approaches to generate usable gestures for individuals with quadriplegia. IEEE Transactions on Human-Machine Systems, 46(3), 460–466.

Jiang, Hairong, Duerstock, B. S., & Wachs, J. P. (2016b). Variability Analysis on Gestures for People With Quadriplegia. IEEE Transactions on Cybernetics.

Jiang, Hairong, Wachs, J. P., & Duerstock, B. S. (2016). An optimized real-time hands gesture recognition based interface for individuals with upper-level spinal cord injuries. Journal of Real-Time Image Processing, 11(2), 301–314.

Jiang, Hairong, Zhang, T., Wachs, J. P., & Duerstock, B. S. (2016). Enhanced control of a wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction. Computer Vision and Image Understanding, 149, 21–31.

Li, J., Ye, D. H., Chung, T., Kolsch, M., Wachs, J., & Bouman, C. (2016). Multi-target detection and tracking from a single camera in Unmanned Aerial Vehicles (UAVs). In Intelligent Robots and Systems (IROS), 2016 IEEE/RSJ International Conference on (pp. 4992–4997). IEEE.

O’Hara, K., Sellen, A., & Wachs, J. (2016). Introduction to Special Issue on Body Tracking and Healthcare. Human–Computer Interaction, 31(3–4), 173–190.

Popescu, V. S., & Wachs, J. P. (2016). Simulated transparent display with augmented reality for remote collaboration.

Velasquez, C. A., Chaikhouni, A., & Wachs, J. P. (2016). Robotic Assistants in Operating Rooms in Qatar, Development phase. In Qatar Foundation Annual Research Conference Proceedings (Vol. 2016, p. ICTPP2886). HBKU Press Qatar.

Wachs, Juan, Mejail, M., Fishbain, B., & Alvarez, L. (2016). Special issue on real-time image and video processing for pattern recognition systems and applications. Journal of Real-Time Image Processing, 11(2), 247–249.

Wachs, Juan P. (2016). Embodied Interactions in Human-Machine Decision Making for Situation Awareness Enhancement Systems. Purdue University, West Lafayette, United States.

Zhou, T., Cabrera, M. E., Low, T., Sundaram, C., & Wachs, J. (2016). A comparative study for telerobotic surgery using free hand gestures. Journal of Human-Robot Interaction, 5(2), 1–28.

Zhou, T., Cabrera, M. E., & Wachs, J. P. (2016). A Comparative Study for Touchless Telerobotic Surgery. In Computer-Assisted Musculoskeletal Surgery (pp. 235–255). Springer, Cham.

Zhou, T., & Wachs, J. (2016). Early turn-taking prediction in the operating room. In 2016 AAAI Fall Symposium Series.

2015

Bechar, Avital, Nof, S. Y., & Wachs, J. P. (2015). A review and framework of laser-based collaboration support. Annual Reviews in Control, 39, 30–45.

Blekhman, A., Wachs, J. P., & Dori, D. (2015). Model-Based System Specification With Tesperanto: Readable Text From Formal Graphics. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 45(11), 1448–1458.

Jiang, Hairong, Hsu, C.-H., Duerstock, B. S., & Wachs, J. P. (2015). Determining natural and accessible gestures using uncontrolled manifolds and cybernetics. In Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on (pp. 4078–4083). IEEE.

Pereira, A., Wachs, J. P., Park, K., & Rempel, D. (2015). A user-developed 3-D hand gesture set for human–computer interaction. Human Factors, 57(4), 607–621.

Wachs, Juan. (2015). See-What-I-Do: Increasing mentor and trainee sense of co-presence in trauma surgeries with the STAR platform. Purdue University, West Lafayette.

Wachs, Juan P. (2015). Wisdom in Our Fingers—Or How Embodied Interaction Can Shape Future Work.

Zhou, T., Cabrera, M. E., & Wachs, J. P. (2015). Touchless telerobotic surgery: is it possible at all? In AAAI (pp. 4228–4230).

2014

Bechar, Avital, Nof, S. Y., & Wachs, J. P. (2014). Precision Collaboration and Advanced Integration Using Laser Systems and Techniques. Laser and Photonic Systems: Design and Integration, 165.

Gomez, L., Wachs, J. P., & Jacobo-Berlles, J. (2014). Guest Editorial-Special Issue on Robust Recognition Methods for Multimodal Interaction. Pattern Recognition Letters, 36, 187–188.

Jacob, Mithun George, & Wachs, J. P. (2014a). Context-based hand gesture recognition for the operating room. Pattern Recognition Letters, 36, 196–203.

Jacob, Mithun George, & Wachs, J. P. (2014b). Optimal modality selection for multimodal human-machine systems using RIMAG. In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on (pp. 2108–2113). IEEE.

Jiang, Hairong, Duerstock, B. S., & Wachs, J. P. (2014a). A machine vision-based gestural interface for people with upper extremity physical impairments. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 44(5), 630–641.

Jiang, Hairong, Duerstock, B. S., & Wachs, J. P. (2014b). An analytic approach to decipher usable gestures for quadriplegic users. In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on (pp. 3912–3917). IEEE.

Jiang, Hairong, Wachs, J. P., & Duerstock, B. S. (2014). Integrated vision-based system for efficient, semi-automated control of a robotic manipulator. International Journal of Intelligent Computing and Cybernetics, 7(3), 253–266.

Jiang, Hairong, Zhang, T., Wachs, J. P., & Duerstock, B. (2014). Autonomous performance of multistep activities with a wheelchair mounted robotic manipulator using body dependent positioning. In Workshop on Assistive Robotics for Individuals with Disabilities: HRI Issues and Beyond, IEEE/RSJ Int. Conf. on Intell. Robots and Systems.

Li, Y.-T. (2014). Embodied interaction with visualization and spatial navigation in time-sensitive scenarios.

Li, Y.-T., & Wachs, J. P. (2014a). A Bayesian approach to determine focus of attention in spatial and time-sensitive decision making scenarios. In AAAI-14 Workshop on Cognitive Computing for Augmented Human Intelligence.

Li, Y.-T., & Wachs, J. P. (2014b). HEGM: A hierarchical elastic graph matching for hand gesture recognition. Pattern Recognition, 47(1), 80–88.

Li, Y.-T., & Wachs, J. P. (2014c). Linking attention to physical action in complex decision making problems. In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on (pp. 1241–1246). IEEE.

Loescher, T., Lee, S. Y., & Wachs, J. P. (2014). An augmented reality approach to surgical telementoring. In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on (pp. 2341–2346). IEEE.

Parida, S., Cabrera, M. E., & Wachs, J. P. (2014). Dynamic Surgical Tool Tracking and Delivery System using Baxter Robot.

Wachs, Juan, & Dori, D. (2014). A Conceptual Model For Tool Handling In The Operation Room. In Qatar Foundation Annual Research Conference (p. ITPP0604).

Wachs, Juan P, Frenkel, B., & Dori, D. (2014). Operation room tool handling and miscommunication scenarios: An object-process methodology conceptual model. Artificial Intelligence in Medicine, 62(3), 153–163.

Zhang, T., Williams, G. J., Duerstock, B. S., & Wachs, J. P. (2014). Multimodal approach to image perception of histology for the blind or visually impaired. In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on (pp. 3924–3929). IEEE.

Zhong, H., Wachs, J. P., & Nof, S. Y. (2014). Telerobot-enabled HUB-CI model for collaborative lifecycle management of design and prototyping. Computers in Industry, 65(4), 550–562.

2013

Jacob, Mithun G, Li, Y.-T., & Wachs, J. P. (2013). Surgical instrument handling and retrieval in the operating room with a multimodal robotic assistant. In Robotics and Automation (ICRA), 2013 IEEE International Conference on (pp. 2140–2145). IEEE.

Jacob, Mithun George, Li, Y.-T., Akingba, G. A., & Wachs, J. P. (2013). Collaboration with a robotic scrub nurse. Communications of the ACM, 56(5), 68–75.

Jiang, Hairong, Wachs, J. P., & Duerstock, B. S. (2013). Integrated vision-based robotic arm interface for operators with upper limb mobility impairments. In Rehabilitation Robotics (ICORR), 2013 IEEE International Conference on (pp. 1–6). IEEE.

Jiang, Hairong, Wachs, J. P., Pendergast, M., & Duerstock, B. S. (2013). 3D joystick for robotic arm control by individuals with high level spinal cord injuries. In Rehabilitation Robotics (ICORR), 2013 IEEE International Conference on (pp. 1–5). IEEE.

Kölsch, M., Wachs, J., & Sadagic, A. (2013). Visual analysis and filtering to augment cognition. In International Conference on Augmented Cognition (pp. 695–702). Springer, Berlin, Heidelberg.

Li, Y.-T., Jacob, M., Akingba, G., & Wachs, J. P. (2013). A cyber-physical management system for delivering and monitoring surgical instruments in the OR. Surgical Innovation, 20(4), 377–384.

Li, Y.-T., & Wachs, J. P. (2013). Recognizing hand gestures using the weighted elastic graph matching (WEGM) method. Image and Vision Computing, 31(9), 649–657.

Nof, S. Y., Cheng, G. J., Weiner, A. M., Chen, X. W., Bechar, A., Jones, M. G., … others. (2013). Laser and photonic systems integration: Emerging innovations and framework for research and education. Human Factors and Ergonomics in Manufacturing & Service Industries, 23(6), 483–516.

Sadagic, A., Kölsch, M., Welch, G., Basu, C., Darken, C., Wachs, J. P., … others. (2013). Smart instrumented training ranges: bringing automated system solutions to support critical domain needs. The Journal of Defense Modeling and Simulation, 10(3), 327–342.

Wachs, Juan P, & Gomez, G. (2013). “Telementoring” en el quirófano: un nuevo enfoque en la formación médica [Telementoring in the operating room: a new approach in medical training]. Medicina (Buenos Aires), 73(6), 539–542.

Zhang, S. S., & Wachs, J. P. (2013). The Improvement and Application of Intelligence Tracking Algorithm for Moving Logistics Objects Based on Machine Vision Sensor. Sensor Letters, 11(5), 862–869.

Zhong, H., Wachs, J. P., & Nof, S. Y. (2013a). A collaborative telerobotics network framework with hand gesture interface and conflict prevention. International Journal of Production Research, 51(15), 4443–4463.

Zhong, H., Wachs, J. P., & Nof, S. Y. (2013b). HUB-CI model for collaborative telerobotics in manufacturing. IFAC Proceedings Volumes, 46(7), 63–68.

2012

Bechar, A., Nof, S., & Wachs, J. (2012). Laser systems in precision interactions and collaboration. PRISM Center Research Report, Purdue University, West Lafayette, Indiana. Available for download at https://engineering.purdue.edu/prism/publications.shtml.

Jacob, M., Cange, C., Packer, R., & Wachs, J. P. (2012). Intention, context and gesture recognition for sterile MRI navigation in the operating room. In Iberoamerican Congress on Pattern Recognition (pp. 220–227). Springer, Berlin, Heidelberg.

Jacob, M., Li, Y.-T., Akingba, G., & Wachs, J. P. (2012). Gestonurse: a robotic surgical nurse for handling surgical instruments in the operating room. Journal of Robotic Surgery, 6(1), 53–63.

Jacob, Mithun G, Li, Y.-T., & Wachs, J. P. (2012). Gestonurse: a multimodal robotic scrub nurse. In Human-Robot Interaction (HRI), 2012 7th ACM/IEEE International Conference on (pp. 153–154). IEEE.

Jacob, Mithun George, Wachs, J. P., & Packer, R. A. (2012). Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images. Journal of the American Medical Informatics Association, 20(e1), e183–e186.

Jiang, Hairong, Duerstock, B. S., & Wachs, J. P. (2012). Integrated gesture recognition based interface for people with upper extremity mobility impairments. Advances in Human Aspects of Healthcare, 546–555.

Jiang, Hairong, Wachs, J. P., & Duerstock, B. S. (2012). Facilitated gesture recognition based interfaces for people with upper extremity physical impairments. In Iberoamerican Congress on Pattern Recognition (pp. 228–235). Springer, Berlin, Heidelberg.

Ko, H. S., Wachs, J. P., & Nof, S. Y. (2012). Web-based Facility Monitoring by Facility Sensor Networks. In IIE Annual Conference. Proceedings (p. 1). Institute of Industrial and Systems Engineers (IISE).

Li, Y.-T., & Wachs, J. P. (2012). Hierarchical elastic graph matching for hand gesture recognition. In Iberoamerican Congress on Pattern Recognition (pp. 308–315). Springer, Berlin, Heidelberg.

Wachs, Juan P. (2012). Robot, pass me the scissors! How robots can assist us in the operating room. In Iberoamerican Congress on Pattern Recognition (pp. 46–57). Springer, Berlin, Heidelberg.

Wachs, Juan P, Jacob, M., Li, Y.-T., & Akingba, G. (2012). Does a robotic scrub nurse improve economy of movements? In Medical Imaging 2012: Image-Guided Procedures, Robotic Interventions, and Modeling (Vol. 8316, p. 83160E). International Society for Optics and Photonics.

2011

Jacob, Mithun George, Li, Y.-T., & Wachs, J. P. (2011). A gesture driven robotic scrub nurse. In Systems, Man, and Cybernetics (SMC), 2011 IEEE International Conference on (pp. 2039–2044). IEEE.

Wachs, Juan Pablo, Kölsch, M., Stern, H., & Edan, Y. (2011). Vision-based hand-gesture applications. Communications of the ACM, 54(2), 60–71.

2010

Matson, E. T., Leong, B., Nguyen, C. Q., Smith, A., & Wachs, J. P. (2010). Using autonomous robots to enable self-organizing broadband networks. In Control Automation and Systems (ICCAS), 2010 International Conference on (pp. 605–610). IEEE.

Nguyen, C. Q., Leong, B., Matson, E. T., Smith, A., & Wachs, J. P. (2010). AWARE: autonomous wireless agent robotic exchange. In International Conference on Intelligent Robotics and Applications (pp. 276–287). Springer, Berlin, Heidelberg.

Wachs, Juan, & Duerstock, B. (2010). An analytical framework to measure effective human machine interaction. Advances in Human Factors and Ergonomics in Healthcare, 611–621.

Wachs, Juan P. (2010). Gaze, posture and gesture recognition to minimize focus shifts for intelligent operating rooms in a collaborative support system. International Journal of Computers Communications & Control, 5(1), 106–124.

Wachs, Juan P, Kölsch, M., & Goshorn, D. (2010). Human posture recognition for intelligent vehicles. Journal of Real-Time Image Processing, 5(4), 231–244.

Wachs, Juan P, Stern, H. I., Burks, T., & Alchanatis, V. (2010). Low and high-level visual feature-based apple detection from multi-modal images. Precision Agriculture, 11(6), 717–735.

Wachs, Juan P, Vujjeni, K., Matson, E. T., & Adams, S. (2010). “A window on tissue”: Using facial orientation to control endoscopic views of tissue depth. In Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE (pp. 935–938). IEEE.

2009

Cheng, H., Kumar, R., Basu, C., Han, F., Khan, S., Sawhney, H., … others. (2009). An instrumentation and computational framework of automated behavior analysis and performance evaluation for infantry training. In Proceedings of I/ITSEC.

Goshorn, D., Wachs, J., & Kölsch, M. (2009). The Multi-level Learning and Classification of Multi-class Parts-Based Representations of US Marine Postures. In Iberoamerican Congress on Pattern Recognition (pp. 505–512). Springer, Berlin, Heidelberg.

Sadagic, A., Welch, G., Basu, C., Darken, C., Kumar, R., Fuchs, H., … others. (2009). New generation of instrumented ranges: Enabling automated performance analysis.

Wachs, J., Stern, H., Burks, T., & Alchanatis, V. (2009). Apple detection in natural tree canopies from multimodal images. In Proceedings of the 7th European Conference on Precision Agriculture, Wageningen, The Netherlands (Vol. 68, pp. 293–302).

Wachs, J. P., Stern, H., Edan, Y., Gillam, M., Handler, J., Feied, C., & Smith, M. (2009). A gesture-based tool for sterile browsing of radiology images (erratum to vol. 15, p. 321, 2008). Journal of the American Medical Informatics Association, 16(3), 284.

Wachs, Juan P, Goshorn, D., & Kolsch, M. (2009). Recognizing human postures and poses in monocular still images.

Wachs, Juan P, Stern, H. I., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2009). A novel hand gesture-based image browsing system for the operating room.

Wachs, Juan, Stern, H., Burks, T., & Alchanatis, V. (2009). Multi-modal registration using a combined similarity measure. In Applications of Soft Computing (pp. 159–168). Springer, Berlin, Heidelberg.

2008

Alchanatis, V., Wachs, J., Stern, H., Burks, T., & others. (2008). Multi-modal automatic registration of thermal-IR and RGB images of apple trees canopy. In Agricultural and biosystems engineering for a sustainable world. International Conference on Agricultural Engineering, Hersonissos, Crete, Greece, 23-25 June, 2008. European Society of Agricultural Engineers (AgEng).

Stern, H. I., Wachs, J. P., & Edan, Y. (2008a). Designing hand gesture vocabularies for natural interaction by combining psycho-physiological and recognition factors. International Journal of Semantic Computing, 2(1), 137–160.

Stern, H. I., Wachs, J. P., & Edan, Y. (2008b). Optimal consensus intuitive hand gesture vocabulary design. In Semantic Computing, 2008 IEEE International Conference on (pp. 96–103). IEEE.

Wachs, Juan P, Stern, H. I., Edan, Y., Gillam, M., Handler, J., Feied, C., & Smith, M. (2008). A gesture-based tool for sterile browsing of radiology images. Journal of the American Medical Informatics Association, 15(3), 321–323.

Wachs, Juan, Stern, H., & Edan, Y. (2008). A holistic framework for hand gestures design. In 2nd Annual Visual and Iconic Language Conference (pp. 24–34).

Wachs, Juan, Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2008). Real-time hand gesture interface for browsing medical images. International Journal of Intelligent Computing in Medical Sciences & Image Processing, 2(1), 15–25.

2007

Stern, H., Wachs, J., & Edan, Y. (2007a). A method for selection of optimal hand gesture vocabularies. In International Gesture Workshop (pp. 57–68). Springer, Berlin, Heidelberg.

Stern, H., Wachs, J., & Edan, Y. (2007b). An Analytic Approach for Optimal Hand Gestures. In Proceedings of GW2007, the 7th International Workshop on Gesture in Human-Computer Interaction and Simulation 2007, poster session (p. 12).

Wachs, Juan, Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2007a). Doctor-Computer Interface using Gestures. In Proceedings of GW2007, the 7th International Workshop on Gesture in Human-Computer Interaction and Simulation 2007, poster session (p. 16).

Wachs, Juan, Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2007b). Gestix: a doctor-computer sterile gesture interface for dynamic environments. In Soft Computing in Industrial Applications (pp. 30–39). Springer, Berlin, Heidelberg.

2006

Feied, C., Gillam, M., Wachs, J., Handler, J., Stern, H., & Smith, M. (2006). A real-time gesture interface for hands-free control of electronic medical records. In AMIA Annual Symposium Proceedings (Vol. 2006, p. 920). American Medical Informatics Association.

Stern, H. I., Wachs, J. P., & Edan, Y. (2006a). Human factors for design of hand gesture human-machine interaction. In Systems, Man and Cybernetics, 2006. SMC’06. IEEE International Conference on (Vol. 5, pp. 4052–4056). IEEE.

Stern, H. I., Wachs, J. P., & Edan, Y. (2006b). Optimal hand gesture vocabulary design using psycho-physiological and technical factors. In Automatic Face and Gesture Recognition, 2006. FGR 2006. 7th International Conference on (pp. 257–262). IEEE.

Wachs, Juan. (2006). Optimal hand gesture vocabulary design methodology for virtual robotic control. Ben-Gurion University.

Wachs, Juan, Shapira, O., & Stern, H. (2006). A Method to Enhance the “Possibilistic C-Means with Repulsion” Algorithm based on Cluster Validity Index. In Applied Soft Computing Technologies: The Challenge of Complexity (pp. 77–87). Springer, Berlin, Heidelberg.

Wachs, Juan, Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & Handler, J. (2006a). A real-time hand gesture interface for medical visualization applications. In Applications of Soft Computing (pp. 153–162). Springer, Berlin, Heidelberg.

Wachs, Juan, Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., & others. (2006b). A real-time hand gesture interface for a medical image guided system. In ISRACAS 2006, the Ninth Israeli Symposium on Computer-Aided Surgery, Medical Robotics, and Medical Imaging.
