
Gait Recorded by Smartphone Reveals Your Emotion

Nov 12, 2015

Emotion detection aims to determine a person's affective state automatically, and it has immense potential in many areas, including health care, psychological assessment, and human-computer interaction. Most research to date has focused on nonverbal signals, which are considered extrinsic expressions of a person's intrapsychic state: traditional emotion detection relies on facial expressions or on linguistic and acoustic features in speech. Processing image and audio data, however, is inevitably computationally complex.

Dr. ZHU Tingshao's Computational CyberPsychology Lab (CCPL) at the Institute of Psychology of the Chinese Academy of Sciences has proposed a novel method for identifying human emotion from natural walking. The results indicate that emotion can be identified from gait data alone. Moreover, acceleration recorded at the ankle reveals emotion (angry/neutral/happy) better than acceleration recorded at the wrist.

Linear acceleration and gravity data were recorded by the acceleration sensors embedded in smartphones. Two rounds of experiments were conducted in a fixed rectangular area marked on the floor with red lines. After signing the consent form, each participant wore one smartphone on a wrist and another on an ankle, and stood at the starting line. In the first round, the participant walked naturally back and forth in the area for about two minutes, then reported his or her current emotional state (anger) on a scale from one to ten. The participant next watched film clips selected for emotion priming and, afterwards, walked naturally back and forth in the same area for another minute, just as before. Finally, the participant reported his or her current anger score and recalled the anger score felt while watching the film clips. The second round followed the same procedure, using happiness-priming clips and happiness scores.
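The article does not describe how the raw accelerometer streams were preprocessed. A common approach, sketched below in Python, is to split each recording into fixed-length windows and compute simple per-axis statistics as features; the file format, column names, window length, and feature set here are illustrative assumptions, not the study's actual pipeline.

```python
# A minimal sketch of windowing smartphone accelerometer recordings and
# extracting simple statistical features. The CSV layout, window length,
# overlap, and feature set are all assumptions made for illustration.
import numpy as np
import pandas as pd

WINDOW = 128   # samples per window (assumed; roughly 2.5 s at 50 Hz)
STEP = 64      # 50% overlap between consecutive windows (assumed)

def extract_features(csv_path: str) -> np.ndarray:
    """Turn one walking recording into a matrix of per-window features."""
    # Assumed CSV columns: x, y, z (linear acceleration axes).
    data = pd.read_csv(csv_path)[["x", "y", "z"]].to_numpy()
    rows = []
    for start in range(0, len(data) - WINDOW + 1, STEP):
        w = data[start:start + WINDOW]
        # Simple per-axis statistics: mean, std, min, max (12 features).
        rows.append(np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)]))
    return np.asarray(rows)
```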

In the experiment, all trained models (Random Tree, Random Forest, Support Vector Machine, Multilayer Perceptron, and Decision Tree) showed a marked difference in gait before and after watching the film clips. Identification accuracy on data from the ankle was higher than on data from the wrist. Among these models, the Support Vector Machine performed best: 90.31% accuracy for angry vs. neutral and 89.76% for happy vs. neutral. In the three-class setting, the accuracies for identifying angry, neutral, and happy were 85%, 78%, and 78%, respectively.
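As a rough illustration of how such a classifier might be trained and evaluated, the sketch below fits a Support Vector Machine with scikit-learn and scores it by cross-validation. The kernel, regularization constant, and 10-fold protocol are assumptions; the article reports only the model families compared and the final accuracies.

```python
# A hedged sketch of the classification step: training an SVM to separate,
# e.g., "angry" from "neutral" walking windows. Hyperparameters and the
# cross-validation protocol are illustrative assumptions.
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def evaluate_svm(X, y):
    """X: per-window gait features; y: emotion labels (e.g. 0=neutral, 1=angry)."""
    # Standardizing features before an RBF-kernel SVM is standard practice.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(model, X, y, cv=10)  # 10-fold CV (assumed)
    return scores.mean()
```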

This study was supported by the National High-tech R&D Program of China (2013AA01A606), the National Basic Research Program of China (2014CB744600), the Key Research Program of the Chinese Academy of Sciences (CAS) (KJZD-EWL04), and the CAS Strategic Priority Research Program (XDA06030800).
