
MI-BMInet: An Efficient Convolutional Neural Network for Motor Imagery Brain–Machine Interfaces With EEG Channel Selection

Wang, Xiaying; Hersche, Michael; Magno, Michele; Benini, Luca
2024

Abstract

A brain–machine interface (BMI) based on motor imagery (MI) enables the control of devices using brain signals while the subject imagines performing a movement. It plays a key role in prosthesis control and motor rehabilitation. To improve user comfort, preserve data privacy, and reduce the system's latency, a new trend in wearable BMIs is to execute algorithms on low-power microcontroller units (MCUs) embedded on edge devices to process the electroencephalographic (EEG) data in real time close to the sensors. However, most of the classification models presented in the literature are too resource-demanding for low-power MCUs. This article proposes an efficient convolutional neural network (CNN) for EEG-based MI classification that achieves comparable accuracy while being orders of magnitude less resource-demanding and significantly more energy-efficient than state-of-the-art (SoA) models. To further reduce the model complexity, we propose an automatic channel selection method based on spatial filters and quantize both weights and activations to 8-bit precision with negligible accuracy loss. Finally, we implement and evaluate the proposed models on leading-edge parallel ultralow-power (PULP) MCUs. The final two-class solution consumes as little as 30 μJ/inference with a runtime of 2.95 ms/inference and an accuracy of 82.51% while using 6.4× fewer EEG channels, becoming the new SoA for embedded MI-BMI and defining a new Pareto frontier in the three-way trade-off among accuracy, resource cost, and power usage.
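The abstract names two complexity-reduction techniques: channel selection driven by spatial filters, and 8-bit quantization of weights and activations. This record does not include the paper's implementation, so the sketch below only illustrates the general ideas under stated assumptions: common spatial patterns (CSP) as the spatial-filter source, channel ranking by filter-weight energy, and symmetric per-tensor int8 quantization. All function names (csp_filters, select_channels, quantize_int8) and parameters are hypothetical, not taken from the paper.

# Minimal illustrative sketch, NOT the paper's algorithm: CSP is assumed as
# the spatial-filter source, and int8 quantization is symmetric per-tensor.
import numpy as np
from scipy.linalg import eigh

def csp_filters(X_a, X_b, n_filters=4):
    """CSP spatial filters for a two-class MI problem.

    X_a, X_b: trials of shape (n_trials, n_channels, n_samples).
    Returns W with shape (n_filters, n_channels); each row is one filter.
    """
    def mean_norm_cov(X):
        C = np.mean([np.cov(trial) for trial in X], axis=0)
        return C / np.trace(C)

    C_a, C_b = mean_norm_cov(X_a), mean_norm_cov(X_b)
    # Generalized eigenproblem C_a w = lambda (C_a + C_b) w; eigenvalues ascend.
    eigvals, eigvecs = eigh(C_a, C_a + C_b)
    # The most discriminative filters sit at both ends of the spectrum.
    half = n_filters // 2
    n = len(eigvals)
    picks = np.r_[np.arange(half), np.arange(n - (n_filters - half), n)]
    return eigvecs[:, picks].T

def select_channels(W, n_keep=8):
    """Rank channels by their aggregate spatial-filter weight energy."""
    importance = np.sum(W ** 2, axis=0)            # one score per channel
    return np.sort(np.argsort(importance)[-n_keep:])

def quantize_int8(x):
    """Symmetric per-tensor int8 quantization of weights or activations."""
    scale = max(np.max(np.abs(x)), 1e-12) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale                                 # dequantize: q * scale

# Toy usage on random data: 20 trials per class, 64 channels, 500 samples.
rng = np.random.default_rng(0)
X_a = rng.standard_normal((20, 64, 500))
X_b = rng.standard_normal((20, 64, 500))
W = csp_filters(X_a, X_b)
print("kept channels:", select_channels(W, n_keep=10))
q, s = quantize_int8(rng.standard_normal((3, 3)))
print("int8 weights:\n", q, "\nscale:", s)

Ranking channels by filter-weight energy is one common heuristic for reducing electrode count; the paper may derive importance differently, and the toy random data above is only for demonstrating that the sketch runs.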
Wang, X., Hersche, M., Magno, M., & Benini, L. (2024). MI-BMInet: An Efficient Convolutional Neural Network for Motor Imagery Brain–Machine Interfaces With EEG Channel Selection. IEEE Sensors Journal, 24(6), 8835-8847. https://doi.org/10.1109/JSEN.2024.3353146

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/1004678

Citations
  • PMC: n/a
  • Scopus: 34
  • Web of Science: 26