
Automated feature extraction on AsMap for emotion classification using EEG

Ghaderpour E.
2022

Abstract

Emotion recognition using EEG has been widely studied to address the challenges associated with affective computing. Manual feature extraction from EEG signals tends to yield sub-optimal performance in learning models. Leveraging advances in deep learning as a tool for automated feature engineering, this work proposes a hybrid of manual and automatic feature extraction. The asymmetry in different brain regions is captured in a 2D map, termed the AsMap, computed from the differential entropy features of EEG signals. These AsMaps are then used to extract features automatically with a convolutional neural network model. The proposed feature extraction method is compared with differential entropy and with other feature extraction methods such as relative asymmetry, differential asymmetry and differential caudality. Experiments are conducted on the SJTU emotion EEG dataset and the DEAP dataset across classification problems with different numbers of classes. The results indicate that the proposed method yields higher classification accuracy, outperforming the other feature extraction methods, with a highest accuracy of 97.10% on a three-class problem using the SJTU emotion EEG dataset. This work also assesses the impact of window size on classification accuracy.
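The pipeline summarized above rests on differential entropy (DE) features and on asymmetry differences between paired electrodes. As a minimal sketch only (assuming the common Gaussian approximation for band-filtered EEG, under which DE reduces to 0.5·ln(2πeσ²); the synthetic channel data and the left/right pairing here are hypothetical, not the paper's electrode layout):

```python
import numpy as np

def differential_entropy(segment):
    """DE of a signal segment under the Gaussian assumption:
    0.5 * ln(2 * pi * e * sigma^2)."""
    var = np.var(segment)
    return 0.5 * np.log(2 * np.pi * np.e * var)

# Hypothetical left/right electrode pair: the right channel has
# higher variance, so its DE should be larger.
rng = np.random.default_rng(0)
left = rng.normal(0.0, 1.0, 200)
right = rng.normal(0.0, 2.0, 200)

# Differential asymmetry (DASM)-style feature: DE difference of a pair.
dasm = differential_entropy(left) - differential_entropy(right)
```

An AsMap, as described in the abstract, would arrange such pairwise asymmetry values from many electrode pairs into a 2D grid that a CNN can consume like an image.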
Arousal; Classification; Deep learning; Electroencephalogram; Emotion; Valence; Brain; Entropy; Neural Networks, Computer; Electroencephalography; Emotions
01 Journal publication::01a Journal article
Automated feature extraction on AsMap for emotion classification using EEG / Ahmed, M. Z. I.; Sinha, N.; Phadikar, S.; Ghaderpour, E.. - In: SENSORS. - ISSN 1424-8220. - 22:6(2022). [10.3390/s22062346]
Files attached to this item

File: Ahmed_Automated_2022.pdf (open access)
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 982.25 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1655300
Citations
  • PMC: 20
  • Scopus: 43
  • Web of Science: 37