Emotion recognition by correlating facial expressions and EEG analysis

Adrian R. Aguiñaga, Daniel E. Hernandez, Angeles Quezada, Andrés Calvillo Téllez

Research output: Contribution to journal › Article › peer-review


Abstract

Emotion recognition is a fundamental task that any affective computing system must perform to adapt to the user's current mood. Electroencephalography (EEG) analysis has gained prominence in the study of human emotions because of its non-invasive nature. This paper presents a two-stage deep learning model that recognizes emotional states by correlating facial expressions with brain signals. Most work on emotional-state analysis examines large signal segments, generally as long as the evoked potential lasts, so many unrelated phenomena may become involved in the recognition process. Unlike phenomena such as epilepsy, emotional responses have no clearly defined marker of when an event begins or ends. The novelty of the proposed model lies in using facial expressions as markers to improve the recognition process: a facial emotion recognition (FER) technique creates an identifier each time an emotional response is detected, and these identifiers are used to extract the EEG segments considered a priori relevant for the analysis. The proposed model was evaluated on the DEAP dataset.
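The marker-based segmentation the abstract describes can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function name, window length, and event timestamps are assumptions, and only the 128 Hz sampling rate and 32 EEG channels reflect DEAP's preprocessed format.

```python
import numpy as np

def extract_eeg_segments(eeg, fer_events_s, fs=128, window_s=1.0):
    """Cut fixed-length EEG windows starting at each FER-detected event.

    eeg          : array of shape (n_channels, n_samples)
    fer_events_s : times in seconds where the FER stage detected an emotion
    fs           : sampling rate in Hz (DEAP's preprocessed EEG is 128 Hz)
    window_s     : window length in seconds (illustrative choice)
    """
    win = int(window_s * fs)
    segments = []
    for t in fer_events_s:
        start = int(t * fs)
        if start + win <= eeg.shape[1]:  # skip events too close to the end
            segments.append(eeg[:, start:start + win])
    if segments:
        return np.stack(segments)  # shape: (n_events, n_channels, win)
    return np.empty((0, eeg.shape[0], win))

# Usage: a synthetic 32-channel, 10-second record with two FER events.
eeg = np.random.randn(32, 10 * 128)
segs = extract_eeg_segments(eeg, fer_events_s=[2.5, 7.0])
print(segs.shape)  # (2, 32, 128)
```

Each extracted window would then feed the second (EEG-based) stage of the recognition model, rather than analyzing the full evoked-potential segment.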

Original language: English
Article number: 6987
Journal: Applied Sciences (Switzerland)
Volume: 11
Issue number: 15
State: Published - 1 Aug 2021

Keywords

  • Affective computing
  • EEG
  • Emotions
  • FER
  • Machine learning
  • Neural networks
