
Multi-Stream 1D CNN for EEG Motor Imagery Classification of Limbs Activation / Avola, D.; Cinque, L.; Di Mambro, A.; Lanzino, R.; Pannone, D.; Scarcello, F.. - In: IEEE ACCESS. - ISSN 2169-3536. - 12:(2024), pp. 83940-83951. [10.1109/ACCESS.2024.3412710]

Multi-Stream 1D CNN for EEG Motor Imagery Classification of Limbs Activation

Avola D.; Cinque L.; Di Mambro A.; Lanzino R.; Pannone D.; Scarcello F.

2024

Abstract

Determining the motor intentions of an individual through the analysis of electroencephalograms (EEGs) is a challenging task that concurrently holds considerable potential for aiding subjects with motor dysfunctions. Moreover, thanks to recent advances in artificial intelligence models and EEG acquisition devices, such analyses can be carried out with ever higher accuracy. The latter aspect is of great importance, since the EEG analysis of subjects whose mental efforts are focused on moving limbs is frequently used for crucial tasks, including the control of interactive interfaces and prosthetic devices. In this paper, a novel multi-stream 1D Convolutional Neural Network (CNN) architecture is proposed. The input EEG signal is processed by four convolutional streams, which differ in the size of their convolutional kernels, thus allowing the extraction of information at different time scales. The resulting 1D feature maps are then fused together and provided to a dense classifier to identify which limb the subject intended to move. Comprehensive experiments conducted on the PhysioNet EEG Motor Movement/Imagery dataset, which remains the reference collection of data in this application context, demonstrate that the proposed model surpasses key state-of-the-art works in both cross-subject and intra-subject settings.
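The multi-scale extraction described in the abstract can be sketched as follows. This is a minimal NumPy illustration only: the specific kernel sizes, the uniform (averaging) filters, and the mean-pooling fusion are assumptions for illustration, not the published architecture, in which the filters are learned and the fused features feed a dense classifier.

```python
import numpy as np

def conv1d(signal, kernel):
    # Valid-mode 1D convolution (cross-correlation) of a single-channel signal.
    n, k = len(signal), len(kernel)
    return np.array([signal[i:i + k] @ kernel for i in range(n - k + 1)])

def multi_stream_features(signal, kernel_sizes=(3, 7, 15, 31)):
    # One stream per kernel size; larger kernels capture longer time scales.
    # Each stream's feature map is reduced by global average pooling, and the
    # per-stream descriptors are fused by concatenation (assumed fusion scheme).
    feats = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k  # placeholder filter; learned in the real model
        fmap = conv1d(signal, kernel)
        feats.append(fmap.mean())
    return np.array(feats)
```

In the actual model the fused representation is passed to a dense classifier that predicts which limb the subject intended to move; here the function simply returns the four-dimensional fused descriptor.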
1D Convolutional Neural Networks (1D CNNs); Brain modeling; Brain-Computer Interfaces (BCIs); Convolution; Convolutional neural networks; EEG Analysis; Electroencephalography; Feature extraction; Motor Imagery; Motors; Multi-Stream; Streams
01 Journal publication::01a Journal article
Files attached to this record

File: Avola_Multi-Stream_2024.pdf
Access: open access
Note: DOI: 10.1109/ACCESS.2024.3412710
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 3.32 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1713410
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0