Zanghieri, M., Benatti, S., Burrello, A., Kartsch, V., Conti, F., & Benini, L. (2020). Robust Real-Time Embedded EMG Recognition Framework Using Temporal Convolutional Networks on a Multicore IoT Processor. IEEE Transactions on Biomedical Circuits and Systems, 14(2), 244-256. doi: 10.1109/TBCAS.2019.2959160
Robust Real-Time Embedded EMG Recognition Framework Using Temporal Convolutional Networks on a Multicore IoT Processor
Zanghieri, Marcello; Benatti, Simone; Burrello, Alessio; Kartsch, Victor; Conti, Francesco; Benini, Luca
2020
Abstract
Hand movement classification via the surface electromyographic (sEMG) signal is a well-established approach to advanced Human-Computer Interaction. However, sEMG movement recognition has to deal with the long-term reliability of sEMG-based control, which is limited by the variability affecting the sEMG signal. Embedded solutions suffer a drop in recognition accuracy over time that makes them unsuitable for reliable gesture-controller design. In this paper, we present a complete wearable-class embedded system for robust sEMG-based gesture recognition, based on Temporal Convolutional Networks (TCNs). First, we developed a novel TCN topology (TEMPONet) and tested it on a benchmark dataset (Ninapro), achieving 49.6% average accuracy, 7.8% better than the current State-of-the-Art (SoA). Moreover, we designed an energy-efficient embedded platform based on GAP8, a novel 8-core IoT processor. Using our embedded platform, we collected a second, 20-session dataset to validate the system on a setup representative of the final deployment. We obtain 93.7% average accuracy with the TCN, comparable with a SoA SVM approach (91.1%). Finally, we profiled the performance of the network implemented on GAP8, using an 8-bit quantization strategy to fit the memory constraints of the processor. We reach a 4x lower memory footprint (460 kB) with an accuracy degradation of only 3%. We detail the execution on the GAP8 platform, showing that the quantized network executes a single classification in 12.84 ms within an energy budget of 0.9 mJ, making it suitable for long-lifetime wearable deployment.
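To make the TCN idea in the abstract concrete, the following is a minimal sketch of a dilated temporal convolution network of the kind TEMPONet belongs to. The block structure (conv -> batch norm -> ReLU), the channel counts, kernel sizes, dilation factors, the 16-channel/300-sample sEMG window, and the class count are illustrative assumptions for this sketch, not the published TEMPONet topology or the GAP8 deployment code.

```python
# Minimal TCN-style sketch for sEMG window classification (illustrative only).
import torch
import torch.nn as nn

class TemporalBlock(nn.Module):
    """One dilated 1-D convolution block: conv -> batch norm -> ReLU."""
    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
        super().__init__()
        # 'same'-style padding so the output keeps the input time length
        padding = (kernel_size - 1) * dilation // 2
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size,
                              padding=padding, dilation=dilation)
        self.bn = nn.BatchNorm1d(out_ch)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class TinyTCN(nn.Module):
    """Stack of temporal blocks with growing dilation, then a classifier head."""
    def __init__(self, n_channels=16, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            TemporalBlock(n_channels, 32, dilation=1),
            TemporalBlock(32, 64, dilation=2),
            TemporalBlock(64, 64, dilation=4),
            nn.AdaptiveAvgPool1d(1),          # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                     # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# Example: classify a batch of 300-sample windows from a 16-channel sEMG sensor.
model = TinyTCN(n_channels=16, n_classes=8)
logits = model(torch.randn(4, 16, 300))
print(logits.shape)                           # torch.Size([4, 8])
```

As a side note on the memory figures in the abstract, the 4x footprint reduction follows directly from storing weights as 8-bit integers instead of 32-bit floats (32/8 = 4), with only the quantization scale factors added on top.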
| File | Type | Access | License | Size | Format |
|---|---|---|---|---|---|
| Robust Real-Time.pdf | Postprint | Open access | Free open-access license | 4.34 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.