Multi-Modal Perception and Behavior Adaptation Models for Human State Understanding and Interaction Improvement in Robotic Touch - Honda Research Institute USA

Huy Quyen Ngo and Rana Soltani Zarrin

AAAI Symposia - UR-RAD (Fall Symposium on Unifying Representations for Robot Application Development)

Robots that can physically interact with humans in a safe, comfortable, and intuitive manner can help in a variety of settings, including home, medical care, workplace, and social settings. However, users' perceptions greatly affect the acceptability and adoption of such robots. The ability of a system to understand the user's perception of the physical interaction, and to adapt the robot's behaviors based on user perception and interaction context, can facilitate the acceptability of these robots. In this paper we propose a perception-based interaction adaptation framework. One main component of this framework is a multi-modal perception model which is grounded in the existing literature and is intended to provide a quantitative estimation of the human state, defined as the perceptions of the physical interaction, using human, robot, and context information. This model is intended to be comprehensive enough to be usable in many physical Human-Robot Interaction (pHRI) scenarios. The estimated human state is fed to a context-aware behavior adaptation framework which recommends robot behaviors to improve the human state using a learned behavior cost model and an optimization formulation. We show the potential and feasibility of such a human state estimation model by evaluating a reduced model with data collected through a user study. Additionally, through feature analysis, we aim to shed light on future interaction designs for pHRI.
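The two-stage pipeline described above (multi-modal human state estimation followed by cost-based behavior selection) can be illustrated with a minimal sketch. Everything here is hypothetical: the linear state estimator, the feature and behavior choices, and the hand-written cost function are illustrative placeholders, not the authors' learned models.

```python
import numpy as np

def estimate_human_state(features: np.ndarray, weights: np.ndarray) -> float:
    """Map multi-modal features (human, robot, context) to a scalar
    perception score. A real model would be learned; this is a
    placeholder linear estimator."""
    return float(features @ weights)

def adapt_behavior(state: float, candidates: np.ndarray, cost_fn) -> np.ndarray:
    """Recommend the candidate behavior minimizing a cost given the
    estimated human state (brute force over a discrete candidate set,
    standing in for the paper's optimization formulation)."""
    costs = [cost_fn(state, b) for b in candidates]
    return candidates[int(np.argmin(costs))]

def example_cost(state: float, behavior: np.ndarray) -> float:
    """Illustrative cost: penalize contact force more when the
    estimated human state is poor, with a small speed penalty."""
    force, speed = behavior
    discomfort = max(0.0, 1.0 - state)  # larger when state is low
    return discomfort * force + 0.1 * speed

# Hypothetical inputs: three features and three candidate (force, speed) behaviors.
features = np.array([0.2, 0.5, 0.1])
weights = np.array([0.5, 0.8, 0.3])
state = estimate_human_state(features, weights)          # 0.53
behaviors = np.array([[1.0, 0.5], [0.5, 1.0], [0.2, 0.2]])
chosen = adapt_behavior(state, behaviors, example_cost)  # lowest-cost behavior
```

With these placeholder numbers the gentler, slower behavior `[0.2, 0.2]` wins, reflecting the framework's intent of adapting robot behavior toward interactions the human perceives more favorably.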
