
Decoding movement kinematics from EEG using an interpretable convolutional neural network

Borra D.; Mondini V.; Magosso E.; Muller-Putz G.R.

2023

Abstract

Continuous decoding of hand kinematics has recently been explored for the intuitive control of electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs). Deep neural networks (DNNs) are emerging as powerful decoders, owing to their ability to learn features automatically from lightly pre-processed signals. However, DNNs for kinematics decoding lack interpretability of the learned features and have so far been used only to build within-subject decoders, without testing other training approaches potentially beneficial for reducing calibration time, such as transfer learning. Here, we aim to overcome these limitations by using an interpretable convolutional neural network (ICNN) to decode 2-D hand kinematics (position and velocity) from EEG in a pursuit tracking task performed by 13 participants. The ICNN is trained using both within-subject and cross-subject strategies, and we also test the feasibility of transferring the knowledge learned on other subjects to a new one. Moreover, the network eases the interpretation of the learned spectral and spatial EEG features. Our ICNN outperformed most of the other state-of-the-art decoders, showing the best trade-off between performance, size, and training time. Furthermore, transfer learning improved kinematics prediction in the low-data regime. The network attributed the highest relevance for decoding to the delta band across all subjects, and to higher frequencies (alpha, beta, low-gamma) for a cluster of them; contralateral central and parieto-occipital sites were the most relevant, reflecting the involvement of sensorimotor, visual, and visuo-motor processing. The approach improved the quality of kinematics prediction from the EEG while at the same time allowing interpretation of the most relevant spectral and spatial features.
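This record does not detail the ICNN architecture or the transfer-learning procedure. Purely as an illustration of the kind of decoder the abstract describes (temporal convolutions acting as learnable spectral filters, followed by a depthwise spatial convolution over electrodes, regressing 2-D position and velocity), here is a minimal PyTorch sketch. All names (e.g., ICNNSketch), layer sizes, channel/sample counts, and settings are hypothetical assumptions, not the authors' actual model.

    # Hypothetical sketch (NOT the authors' ICNN): a compact CNN whose first
    # layer learns temporal (spectral) filters and whose depthwise spatial
    # layer learns one electrode weighting per temporal filter, so both can
    # be inspected after training. All sizes are illustrative assumptions.
    import torch
    import torch.nn as nn

    class ICNNSketch(nn.Module):
        def __init__(self, n_channels=61, n_samples=256,
                     n_filters=8, n_outputs=4):
            super().__init__()
            # Temporal convolution: each kernel is a learnable FIR filter
            # whose frequency response reveals the spectral bands used.
            self.temporal = nn.Conv2d(1, n_filters, (1, 33),
                                      padding=(0, 16), bias=False)
            # Depthwise spatial convolution: one spatial pattern per
            # temporal filter, interpretable as a scalp topography.
            self.spatial = nn.Conv2d(n_filters, n_filters, (n_channels, 1),
                                     groups=n_filters, bias=False)
            self.bn = nn.BatchNorm2d(n_filters)
            self.act = nn.ELU()
            self.pool = nn.AvgPool2d((1, 8))
            # Linear head regressing 4 values: x/y position and x/y velocity.
            self.head = nn.Linear(n_filters * (n_samples // 8), n_outputs)

        def forward(self, x):  # x: (batch, 1, n_channels, n_samples)
            x = self.temporal(x)               # -> (batch, F, channels, T)
            x = self.spatial(x)                # -> (batch, F, 1, T)
            x = self.pool(self.act(self.bn(x)))
            return self.head(x.flatten(1))     # -> (batch, 4)

Likewise, the cross-subject-to-new-subject transfer mentioned in the abstract could, under the common assumption of pretraining followed by fine-tuning (the actual protocol is not specified in this record), be sketched as:

    def fine_tune(model, loader, epochs=10, lr=1e-4):
        # Hypothetical fine-tuning step: the model is assumed pretrained on
        # pooled data from other subjects, then adapted with a small amount
        # of new-subject data (the "low-data regime").
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()  # regression loss on kinematic targets
        model.train()
        for _ in range(epochs):
            for eeg, kin in loader:  # eeg: (B, 1, C, T); kin: (B, 4)
                opt.zero_grad()
                loss_fn(model(eeg), kin).backward()
                opt.step()
        return model

After training, the learned temporal kernels can be examined in the frequency domain (e.g., via a Fourier transform) and the depthwise spatial weights plotted as scalp maps; this is the kind of spectral and spatial inspection the abstract refers to.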
Borra D., Mondini V., Magosso E., Muller-Putz G.R. (2023). Decoding movement kinematics from EEG using an interpretable convolutional neural network. COMPUTERS IN BIOLOGY AND MEDICINE, 165, 1-20 [10.1016/j.compbiomed.2023.107323].
Files in this record:

ComputersInBiolMedic_SuppMat_2023.pdf
Description: Supplementary Material
Type: Supplementary file
License: Open Access license. Creative Commons Attribution (CC BY)
Size: 675.67 kB
Format: Adobe PDF

ComputersInBiolMedic_2023-compresso.pdf
Description: Article
Type: Publisher's version (PDF)
License: Open Access license. Creative Commons Attribution (CC BY)
Size: 1.56 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/955929
Citations
  • PMC: 7
  • Scopus: 27
  • Web of Science: 22