Faculty

Dinesh Manocha, Aniket Bera (CS/UMIACS), Jae Kun Shim (Kinesiology, Bioengineering)

Funding Agency

UMD Brain and Behavior Initiative

Year

2020

Description

Learning Age and Gender Adaptive Gait Motor Control-based Emotion Using Deep Neural Networks and Affective Modeling

Detecting and classifying human emotion is one of the most challenging problems at the confluence of psychology, affective computing, kinesiology, and data science. While previous studies have shown that human observers can perceive another person's emotions simply by observing physical cues (such as facial expressions, prosody, body gestures, and walking styles), this project aims to develop an automated, artificial-intelligence-based technique for perceiving human emotions from kinematic and kinetic variables, that is, from both the contextual and intrinsic qualities of human motion. The research will examine the role of age and gender in gait-based emotion recognition using deep learning. After collecting full-body gait data across age and gender groups in a motion-capture lab, Bera, Shim, and Manocha will employ an autoencoder-based, semi-supervised deep learning algorithm to learn perceived human emotions from walking style. They will then hierarchically pool the motions of individual joints in a bottom-up manner, following the kinematic chains of the human body, and couple these data with both perceived emotion (as judged by an external observer) and self-reported emotion.
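
For illustration only, the sketch below shows one plausible way to realize the bottom-up, chain-wise pooling and a shared autoencoder latent of the kind described above. It is not the project's actual model: the skeleton joint groupings, layer sizes, emotion label set, and the use of PyTorch are all assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation) of bottom-up hierarchical
# pooling over kinematic chains, with an autoencoder latent shared by a
# semi-supervised emotion classifier. All names and sizes are illustrative.
import torch
import torch.nn as nn

# Hypothetical 16-joint skeleton, grouped into kinematic chains.
CHAINS = {
    "left_leg":   [0, 1, 2],
    "right_leg":  [3, 4, 5],
    "left_arm":   [6, 7, 8],
    "right_arm":  [9, 10, 11],
    "spine_head": [12, 13, 14, 15],
}
JOINT_DIM = 3        # 3-D joint positions per frame (assumed)
NUM_EMOTIONS = 4     # e.g. happy, sad, angry, neutral (assumed label set)


class HierarchicalGaitAutoencoder(nn.Module):
    """Pool joints -> chains -> whole body, then decode (reconstruction)
    and classify emotion from the shared latent."""

    def __init__(self, hidden=32, latent=64):
        super().__init__()
        # One temporal encoder per kinematic chain.
        self.chain_rnns = nn.ModuleDict({
            name: nn.GRU(len(idx) * JOINT_DIM, hidden, batch_first=True)
            for name, idx in CHAINS.items()
        })
        self.to_latent = nn.Linear(hidden * len(CHAINS), latent)
        total_joints = sum(len(idx) for idx in CHAINS.values())
        self.decoder = nn.Linear(latent, total_joints * JOINT_DIM)  # mean pose
        self.emotion_head = nn.Linear(latent, NUM_EMOTIONS)

    def forward(self, gait):                      # gait: (batch, frames, joints, 3)
        chain_feats = []
        for name, idx in CHAINS.items():
            # Flatten the chain's joints per frame and summarize them over time.
            x = gait[:, :, idx, :].flatten(2)     # (batch, frames, len(idx)*3)
            _, h = self.chain_rnns[name](x)       # h: (1, batch, hidden)
            chain_feats.append(h.squeeze(0))
        body = torch.cat(chain_feats, dim=-1)     # bottom-up pooled body feature
        z = self.to_latent(body)
        recon = self.decoder(z)                   # reconstruct a mean pose
        logits = self.emotion_head(z)             # emotion prediction from latent
        return z, recon, logits


# Usage with random stand-in data: 8 clips, 120 frames, 16 joints.
model = HierarchicalGaitAutoencoder()
gaits = torch.randn(8, 120, 16, JOINT_DIM)
z, recon, logits = model(gaits)
```

In a semi-supervised setup of this kind, a reconstruction loss could be applied to every captured gait, while the emotion head would be trained only on the clips that carry observer-perceived or self-reported labels.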
