Human movement prediction
The goal of this project is to predict human movement using the latest wearable sensing technologies and patented motion prediction algorithms, and to perform biomechanical computations based on those predictions.
Recovering human motion from a Cartesian description of operational tasks is a challenging problem in human movement science. A major difficulty is recovering a large number of degrees of freedom in redundant tasks while simultaneously optimizing various human performance measures and satisfying physical and physiological constraints. An effective solution to this problem has applications in vehicle occupant packaging, ergonomics, biomechanics, and man-machine interaction, among other areas.
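One common way to make this precise (a typical formulation we assume here for illustration, not necessarily the project's own) is to pose motion recovery as a constrained optimization over joint trajectories:

\[
\min_{q(t)} \; \sum_k w_k \, f_k\big(q(t), \dot{q}(t), \ddot{q}(t)\big)
\quad \text{s.t.} \quad
\Phi\big(q(t)\big) = x_{\text{task}}(t), \qquad
q_{\min} \le q(t) \le q_{\max},
\]

where \(q(t)\) are the joint angles of the human model, the \(f_k\) are weighted performance measures (e.g., effort or discomfort), \(\Phi\) maps a posture to the Cartesian task description, and the bounds encode joint-limit and related physiological constraints.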
In this project, human motion is predicted from observations of low-dimensional motion descriptors, which can be detected from various sensing modalities: wearable sensor networks such as IMUs, or visual processing of depth images obtained from 3D cameras such as the Kinect sensor. The algorithms we have developed take such task descriptions as input and recover the motion of an anatomically realistic, high-dimensional model of the human. These algorithms often account for kinematic and dynamic constraints that must be satisfied to execute the motion. A simple sketch of this kind of redundancy resolution is given below.
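The following Python sketch illustrates the general idea on a toy example; it is not the project's patented algorithm, and the model, link lengths, limits, and gains are all assumed for illustration. A planar three-link chain (3 degrees of freedom) must reach a 2-D target position (the low-dimensional task descriptor); the extra degree of freedom is resolved with a null-space term that pulls the posture toward a rest configuration, and joint limits are enforced by clamping.

import numpy as np

LINKS = np.array([0.30, 0.28, 0.25])          # link lengths (m), assumed
Q_REST = np.radians([20.0, 30.0, 20.0])       # preferred posture, assumed
Q_MIN, Q_MAX = np.radians([-10, 0, 0]), np.radians([150, 150, 150])

def forward_kinematics(q):
    """End-effector (x, y) of the planar chain for joint angles q."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINKS * np.cos(angles)),
                     np.sum(LINKS * np.sin(angles))])

def jacobian(q):
    """2x3 task Jacobian of the planar chain."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # column i: contribution of joint i to end-effector velocity
        J[0, i] = -np.sum(LINKS[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(LINKS[i:] * np.cos(angles[i:]))
    return J

def recover_posture(target, q0, iters=200, damping=1e-2, k_posture=0.1):
    """Damped least-squares IK with a null-space posture objective."""
    q = q0.copy()
    for _ in range(iters):
        err = target - forward_kinematics(q)          # task-space error
        J = jacobian(q)
        # damped pseudoinverse: J^T (J J^T + lambda^2 I)^-1
        J_pinv = J.T @ np.linalg.inv(J @ J.T + damping**2 * np.eye(2))
        # primary step tracks the task; secondary step, projected into the
        # Jacobian null space, nudges the posture toward Q_REST
        null_proj = np.eye(len(q)) - J_pinv @ J
        dq = J_pinv @ err + null_proj @ (k_posture * (Q_REST - q))
        q = np.clip(q + dq, Q_MIN, Q_MAX)             # joint-limit constraint
    return q

if __name__ == "__main__":
    q = recover_posture(target=np.array([0.45, 0.35]), q0=Q_REST)
    print("joint angles (deg):", np.degrees(q).round(1))
    print("reached:", forward_kinematics(q).round(3))

The same structure scales conceptually to the anatomically realistic case: the task descriptor comes from IMUs or depth imagery instead of a fixed target, the model has many more degrees of freedom, and the secondary objective and clamping are replaced by richer performance measures and kinematic and dynamic constraints.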