Spatio-temporal transformers for decoding neural movement control / Candelori, Benedetta; Bardella, Giampiero; Spinelli, Indro; Ramawat, Surabhi; Pani, Pierpaolo; Ferraina, Stefano; Scardapane, Simone. - In: JOURNAL OF NEURAL ENGINEERING. - ISSN 1741-2560. - 22:1(2025), pp. 1-14. [10.1088/1741-2552/adaef0]
Spatio-temporal transformers for decoding neural movement control
Candelori, Benedetta (co-first author): Software;
Bardella, Giampiero (co-first author): Conceptualization;
Spinelli, Indro: Conceptualization;
Ramawat, Surabhi: Data Curation;
Pani, Pierpaolo: Funding Acquisition;
Ferraina, Stefano: Funding Acquisition;
Scardapane, Simone: Conceptualization
2025
Abstract
Objective. Deep learning tools applied to high-resolution neurophysiological data have progressed significantly, offering enhanced decoding, real-time processing, and readability for practical applications. However, the design of artificial neural networks to analyze neural activity in vivo remains a challenge, requiring a delicate balance between efficiency in low-data regimes and the interpretability of the results.
Approach. To address this challenge, we introduce a novel specialized transformer architecture to analyze single-neuron spiking activity. The model is tested on multi-electrode recordings from the dorsal premotor cortex of non-human primates performing a motor inhibition task.
Main results. The proposed architecture provides an early prediction of the correct movement direction, achieving accurate results no later than 230 ms after the Go signal presentation across animals. Additionally, the model can forecast whether the movement will be generated or withheld before an unattended stop signal is actually presented. To further understand the internal dynamics of the model, we compute the predicted correlations between time steps and between neurons at successive layers of the architecture; the evolution of these correlations mirrors findings from previous theoretical analyses.
Significance. Overall, our framework provides a comprehensive use case for the practical implementation of deep learning tools in motor control research, highlighting both the predictive capabilities and the interpretability of the proposed architecture.
File: Candelori_Spatio-temporal-transformers_2025.pdf
Access: open access
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 2.51 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.