Human Motion Analysis
Research Area

Modeling, analysis, and prediction of human motion is an indispensable area of research in human-centered design and virtual prototyping in the automotive industry. The application of human simulation technology to the prediction of comfort, ergonomics, occupant safety, occupant package design, human-machine interaction, and other disciplines promises to overcome limitations imposed by experimentation with real human subjects or their mechanical surrogates. At HRI, we draw on research in robotics, biomechanics, computer vision, and motor control to develop technology that generates and analyzes human movement for various automotive applications, including ergonomics assessment in manufacturing, vehicle occupant package design, and physical human-machine interaction.

To generate human movements, we adopt a differential-kinematics approach from robotics to predict the posture and movement of humans performing required tasks. These algorithms generally require a low-dimensional description of motion computed from a set of observations, either with computer vision techniques or from wearable sensor networks such as inertial measurement units (IMUs). Earlier versions of these algorithms were developed for Honda's humanoid robotics applications. They have since been adapted to human kinematic and dynamic structures and realized in OpenSim, a widely used, open-source, user-extensible software system that lets users build models of musculoskeletal structures and create kinematic or dynamic simulations of movement. The generated motion drives a biomechanical simulation in OpenSim to predict various kinematic, dynamic, and energetic indicators of motion.
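
To make the differential-kinematics idea concrete, the following is a minimal sketch of damped least-squares differential inverse kinematics on a planar two-link chain. It is illustrative only: the two-link chain, link lengths, gains, and time step are placeholder assumptions rather than the models or parameters used in our tools, and in practice the resulting joint trajectory would be passed to a musculoskeletal simulation such as OpenSim rather than simply printed.

```python
import numpy as np

def forward_kinematics(q, l1=0.35, l2=0.30):
    """Hand position of an illustrative planar 2-link chain (e.g., shoulder-elbow in a plane)."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jacobian_2link(q, l1=0.35, l2=0.30):
    """Geometric Jacobian of the same planar 2-link chain."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def predict_reach(q0, target, dt=0.01, steps=300, gain=5.0, damping=1e-2):
    """Differential-kinematics posture prediction: drive the hand toward a task-space target
    by integrating joint velocities obtained from a damped least-squares Jacobian inverse."""
    q = np.array(q0, dtype=float)
    trajectory = [q.copy()]
    for _ in range(steps):
        error = target - forward_kinematics(q)            # Cartesian task error
        J = jacobian_2link(q)
        # Damped (singularity-robust) pseudoinverse: J^T (J J^T + lambda^2 I)^-1
        J_dls = J.T @ np.linalg.inv(J @ J.T + damping**2 * np.eye(2))
        dq = J_dls @ (gain * error)                        # joint velocities from task-space feedback
        q += dq * dt                                       # integrate one time step
        trajectory.append(q.copy())
    return np.array(trajectory)

if __name__ == "__main__":
    traj = predict_reach(q0=[0.2, 0.4], target=[0.45, 0.30])
    print("final hand position:", forward_kinematics(traj[-1]))
```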

Projects

We compute various musculoskeletal indicators of human performance while the driver operates a vehicle during normal and emergency maneuvers.
We have developed online algorithms to transfer motion from a human demonstrator to Honda's humanoid robot, ASIMO.
This project presents a control-theoretic approach to human pose estimation from a set of key feature points detected in depth image streams obtained from a time-of-flight imaging device (see the sketch after this list).
The goal of this project is to predict human movement using the latest wearable sensing technologies and patented motion prediction algorithms, and to perform biomechanical computations based on those predictions.
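
The sketch below illustrates the general idea behind closed-loop tracking of detected feature points: a kinematic model is servoed onto noisy keypoint detections with a damped Jacobian feedback law, yielding joint-angle estimates frame by frame. The planar two-link chain, the choice of elbow and hand as feature points, and all gains, frame rates, and noise levels are illustrative assumptions, not the project's actual model or parameters.

```python
import numpy as np

L1, L2 = 0.35, 0.30  # illustrative link lengths (upper arm, forearm)

def feature_points(q):
    """Model-predicted Cartesian positions of two key feature points (elbow, hand)
    on an illustrative planar 2-link chain, stacked into one vector."""
    elbow = np.array([L1 * np.cos(q[0]), L1 * np.sin(q[0])])
    hand = elbow + np.array([L2 * np.cos(q[0] + q[1]), L2 * np.sin(q[0] + q[1])])
    return np.concatenate([elbow, hand])

def stacked_jacobian(q):
    """Jacobian of the stacked feature-point vector with respect to the joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    J_elbow = np.array([[-L1 * s1, 0.0],
                        [ L1 * c1, 0.0]])
    J_hand = np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                       [ L1 * c1 + L2 * c12,  L2 * c12]])
    return np.vstack([J_elbow, J_hand])

def track_observations(q0, observations, dt=1 / 30.0, gain=8.0, damping=1e-2):
    """Closed-loop pose estimation: treat detected key points as reference inputs and
    servo the kinematic model onto them with a damped Jacobian feedback law."""
    q = np.array(q0, dtype=float)
    estimates = []
    for z in observations:                                 # z: detected keypoints for one frame
        error = z - feature_points(q)                      # Cartesian innovation
        J = stacked_jacobian(q)
        J_dls = J.T @ np.linalg.inv(J @ J.T + damping**2 * np.eye(J.shape[0]))
        q += (J_dls @ (gain * error)) * dt                 # integrate the estimated joint velocities
        estimates.append(q.copy())
    return np.array(estimates)

if __name__ == "__main__":
    # Synthetic "detections": feature points of a slowly flexing arm plus measurement noise.
    rng = np.random.default_rng(0)
    true_q = np.stack([np.linspace(0.2, 0.6, 90), np.linspace(0.4, 1.0, 90)], axis=1)
    detections = [feature_points(q) + rng.normal(scale=0.005, size=4) for q in true_q]
    q_est = track_observations(q0=[0.0, 0.0], observations=detections)
    print("final joint estimate:", q_est[-1], "true:", true_q[-1])
```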