Facial Expression-Based Emotion Classification using Electrocardiogram and Respiration Signals

Dilranjan Wickramasuriya¹, Mikayla K. Tessmer, Rose T. Faghih¹

  • ¹University of Houston

Details

Category

Poster Session

Theme

Health and Wellness Across the Lifespan

Sessions

12:15 - 14:15 | Wed 20 Nov | Upper Foyer Balcony | A1P-B

Poster Session - Health and Wellness Across the Lifespan 1

Abstract

Automated emotion recognition from physiological signals is an active research area. Many studies rely on self-reported emotion scores to generate classification labels, which can introduce labeling inconsistencies due to inter-subject variability. Facial expressions provide a more consistent means of generating labels. We generate labels by selecting time points at which subjects either displayed a visibly averse/negative reaction or laughed in video recordings. We then use a supervised learning approach to classify these emotional responses based on electrocardiogram (EKG) and respiration signal features in an experiment in which different movie/video clips were used to elicit feelings such as joy, disgust, and amusement. As features, we extract wavelet coefficient patches from EKG RR-interval time series and respiration waveform parameters. We use principal component analysis (PCA) for dimensionality reduction and support vector machines (SVMs) for classification, achieving an overall classification accuracy of 78.3%.
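The classification stage described in the abstract (PCA for dimensionality reduction followed by an SVM) can be sketched as below. This is a minimal illustration, not the authors' implementation: the wavelet-patch and respiration feature extraction is not reproduced, synthetic feature vectors stand in for the real EKG/respiration features, and all hyperparameters (number of components, kernel, C) are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 120 labeled segments, 64-dimensional feature vectors
# (e.g., flattened wavelet-coefficient patches plus respiration parameters).
X = rng.normal(size=(120, 64))
y = rng.integers(0, 2, size=120)   # 0 = averse/negative reaction, 1 = laughter
X[y == 1] += 0.8                   # inject class separation so the demo can learn

# PCA -> SVM pipeline mirroring the abstract (hyperparameters are assumed)
clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")
```

With real features, the PCA component count and SVM hyperparameters would typically be tuned via nested cross-validation rather than fixed as here.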
