
Hypercomplex multimodal emotion recognition from EEG and peripheral physiological signals

Lopez E.; Chiarantano E.; Grassucci E.; Comminiello D.

2023

Abstract

Multimodal emotion recognition from physiological signals is receiving increasing attention because, unlike behavioral reactions, such signals cannot be controlled at will and therefore provide more reliable information. Existing deep learning-based methods still rely on handcrafted features, failing to take full advantage of the learning ability of neural networks, and often adopt a single-modality approach, even though human emotions are inherently expressed in a multimodal way. In this paper, we propose a hypercomplex multimodal network equipped with a novel fusion module built on parameterized hypercomplex multiplications. By operating in a hypercomplex domain, these operations follow algebraic rules that make it possible to model latent relations among the learned feature dimensions, yielding a more effective fusion step. We classify valence and arousal from electroencephalogram (EEG) and peripheral physiological signals on the publicly available MAHNOB-HCI database, surpassing a state-of-the-art multimodal network. The code of our work is freely available at https://github.com/ispamm/MHyEEG.
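For readers unfamiliar with parameterized hypercomplex multiplications (PHM), the following is a minimal PyTorch sketch in the spirit of the fusion module described in the abstract, following the general PHM formulation W = Σ_i A_i ⊗ F_i. It is illustrative only: the class name, tensor names, and initialization are hypothetical, and it does not reproduce the authors' released code, which is available at the linked repository.

```python
import torch
import torch.nn as nn

class PHMLinear(nn.Module):
    """Parameterized hypercomplex multiplication (PHM) linear layer (sketch).

    The weight matrix is built as W = sum_i kron(A_i, F_i): the n x n
    matrices A_i learn the algebra multiplication rules, while the F_i
    blocks hold the filter weights, so latent relations among feature
    dimensions are encoded with roughly 1/n of the parameters of a
    standard linear layer.
    """

    def __init__(self, n: int, in_features: int, out_features: int):
        super().__init__()
        assert in_features % n == 0 and out_features % n == 0
        self.n = n
        # One n x n "rule" matrix per hypercomplex component.
        self.A = nn.Parameter(torch.randn(n, n, n))
        # One (out/n) x (in/n) weight block per component.
        self.F = nn.Parameter(0.02 * torch.randn(n, out_features // n, in_features // n))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum of Kronecker products yields the full (out x in) weight.
        W = sum(torch.kron(self.A[i], self.F[i]) for i in range(self.n))
        return x @ W.T


# Illustrative fusion of two modality embeddings (shapes are hypothetical):
eeg_emb = torch.randn(8, 64)    # batch of EEG features
peri_emb = torch.randn(8, 64)   # batch of peripheral-signal features
fusion = PHMLinear(n=4, in_features=128, out_features=128)
fused = fusion(torch.cat([eeg_emb, peri_emb], dim=1))  # shape (8, 128)
```

Because the A_i matrices are learned rather than fixed (as they would be for, say, quaternion algebra), the layer can adapt its multiplication rules to whatever cross-modal interactions the data exhibit, which is the intuition behind using PHM for fusion.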
2023 IEEE International Conference on Acoustics, Speech and Signal Processing Workshops, ICASSPW 2023
EEG; Hypercomplex Algebra; Hypercomplex Neural Networks; Multimodal Emotion Recognition
04 Publication in conference proceedings::04b Conference paper in volume
Hypercomplex multimodal emotion recognition from EEG and peripheral physiological signals / Lopez, E.; Chiarantano, E.; Grassucci, E.; Comminiello, D. - (2023), pp. 1-5. (Paper presented at the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing Workshops, ICASSPW 2023, held in Rhodes, Greece) [10.1109/ICASSPW59220.2023.10193329].
Files attached to this product

File: Lopez_Hypercomplex _2023.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 1.22 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1693887
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science: 0