Multi-Modal Perception-based Interaction Adaptation in Robotic Touch
Humanoids 2025
Robots that can physically interact with humans in a safe, comfortable, and intuitive manner can help in a variety of settings. In this paper we propose a perception-based interaction adaptation framework. One main component of this framework is a multi-modal perception model, grounded in the existing literature, that provides a quantitative estimate of the human state (defined as the human's perceptions of the physical interaction) from human, robot, and context information. The estimated human state is fed to a context-aware behavior adaptation framework that recommends robot behaviors to improve the human state, using a learned behavior cost model and an optimization formulation. We demonstrate the potential of such a human state estimation model by evaluating a reduced model through in-person user studies.
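As an illustration of how such a pipeline could be posed (a minimal sketch with assumed notation; the paper's actual formulation is not reproduced here), the two stages can be read as an estimation step followed by a constrained behavior selection step:

    \hat{s} = f_\phi(x_h, x_r, x_c), \qquad b^\ast = \arg\min_{b \in \mathcal{B}} \; C_\theta\big(b, \hat{s}, x_c\big)

where x_h, x_r, and x_c denote the human, robot, and context information, f_\phi is the multi-modal perception model producing the estimated human state \hat{s}, \mathcal{B} is the set of admissible robot behaviors, and C_\theta is the learned behavior cost model; all symbols here are assumptions chosen for illustration rather than the authors' notation.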