Visuo-tactile AR for Enhanced Safety Awareness in HRI

We describe our approach for developing a multimodal AR system that combines visual and tactile cues to enhance the safety awareness of humans in human-robot interaction tasks. Motivated by the competition for attentional resources between maintaining safety and accomplishing a primary task, we employ multimodal cues that inform users about unsafe proximity to dangerous areas. The system augments the scene with visual output presented via AR glasses and with tactile stimuli produced by vibration motors embedded in a belt. The tactile belt allows users to focus their visual attention on a primary task while staying aware of safety. The visual representation that is additionally rendered into the scene provides visual grounding; this feedback benefits the user as well as external observers in training and supervision scenarios. We tested the system with informed and naive users to iterate on the design and to gain initial insights into the utility of our multimodal approach.
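To make the proximity-to-feedback idea concrete, the following Python sketch shows one plausible mapping from a user's distance to a hazard onto a vibration intensity and a belt-motor index pointing toward the hazard. It is purely illustrative: the thresholds, the number of motors, and all names (WARN_DIST, CRIT_DIST, belt_command, etc.) are assumptions for this example, not parameters of the actual system.

import math
from dataclasses import dataclass

WARN_DIST = 1.5    # metres: distance at which proximity feedback starts (assumed)
CRIT_DIST = 0.5    # metres: distance of maximum-intensity feedback (assumed)
NUM_MOTORS = 8     # vibration motors evenly spaced around the belt (assumed)

@dataclass
class Pose2D:
    x: float
    y: float
    heading: float  # radians, world frame

def proximity_intensity(distance: float) -> float:
    """Map distance to a hazard onto a vibration intensity in [0, 1]."""
    if distance >= WARN_DIST:
        return 0.0
    if distance <= CRIT_DIST:
        return 1.0
    return (WARN_DIST - distance) / (WARN_DIST - CRIT_DIST)

def belt_command(user: Pose2D, hazard_x: float, hazard_y: float):
    """Return (motor_index, intensity) for the motor facing the hazard."""
    dx, dy = hazard_x - user.x, hazard_y - user.y
    distance = math.hypot(dx, dy)
    intensity = proximity_intensity(distance)
    # Bearing of the hazard relative to the user's heading, mapped to a motor slot.
    bearing = (math.atan2(dy, dx) - user.heading) % (2 * math.pi)
    motor = round(bearing / (2 * math.pi) * NUM_MOTORS) % NUM_MOTORS
    return motor, intensity

if __name__ == "__main__":
    user = Pose2D(x=0.0, y=0.0, heading=0.0)
    print(belt_command(user, hazard_x=1.0, hazard_y=0.3))

A linear ramp between a warning and a critical distance is just one option; the same structure would accommodate other distance-to-intensity mappings.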


Publications

Matti Krüger, Michael Gienger, and Martin Weigel

Method and system for assisting a person in assessing an environment

Granted: US11328573B1 • Pending: EP3993454A1, JP2022074085A.


Matti Krüger, Martin Weigel, and Michael Gienger

Visuo-tactile AR for Enhanced Safety Awareness in Human-Robot Interaction

HRI 2020 Workshop “Virtual, Augmented and Mixed Reality for Human-Robot Interaction (VAM-HRI)”.

