Title: | Online Learning and Classification of EMG-Based Gestures on a Parallel Ultra-Low Power Platform Using Hyperdimensional Computing |
Author(s): | Benatti S.; Montagna F.; Kartsch V.; Rahimi A.; Rossi D.; Benini L. |
Unibo author(s): | |
Year: | 2019 |
Journal: | |
Digital Object Identifier (DOI): | http://dx.doi.org/10.1109/TBCAS.2019.2914476 |
Abstract: | This paper presents a wearable electromyographic gesture recognition system based on the hyperdimensional computing paradigm, running on a programmable parallel ultra-low-power (PULP) platform. The processing chain includes efficient on-chip training, which leads to a fully embedded implementation with no need to perform any offline training on a personal computer. The proposed solution has been tested on 10 subjects in a typical gesture recognition scenario, achieving 85% average accuracy in the recognition of 11 gestures, which is aligned with the state of the art, with the unique capability of performing online learning. Furthermore, by virtue of the hardware-friendly algorithm and of the efficient PULP system-on-chip (Mr. Wolf) used for prototyping and evaluation, the energy budget required to run the learning part with 11 gestures is 10.04 mJ, and 83.2 μJ per classification. The system works with an average power consumption of 10.4 mW in classification, ensuring around 29 h of autonomy with a 100 mAh battery. Finally, the scalability of the system is explored by increasing the number of channels (up to 256 electrodes), demonstrating the suitability of our approach as a universal, energy-efficient wearable biopotential recognition framework. |
Final status date: | 2019-10-23T16:54:16Z |
Appears in categories: | 1.01 Journal article |
Files in this product:
File | Type | License | Access
---|---|---|---
PULP_HD.pdf | Postprint | License for free, open access | Open Access
PULP_HD_EDITORIAL.pdf | Publisher's version (PDF) | Restricted-access license | Administrator
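The hyperdimensional computing paradigm named in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the dimensionality, channel count, quantization scheme, and all function names below are illustrative assumptions. The general pattern is standard HD computing: each EMG channel is assigned a random bipolar hypervector, its quantized amplitude selects a level hypervector, the two are bound elementwise, the bound vectors are bundled across channels into a query hypervector, training bundles encoded samples into one prototype per gesture class, and classification picks the most similar prototype.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000          # hypervector dimensionality (typical order for HD computing)
N_CHANNELS = 4      # illustrative EMG channel count (the paper scales to 256)
N_LEVELS = 8        # illustrative quantization levels for channel amplitude

# Random bipolar (+1/-1) item memories: one hypervector per channel id
# and one per quantization level.
channel_hv = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hv = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(sample):
    """Bind each channel's id vector with its quantized-level vector
    (elementwise product), then bundle across channels by majority vote."""
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hv * level_hv[levels]      # one bound vector per channel
    return np.sign(bound.sum(axis=0) + 0.5)   # majority bundling, ties -> +1

def train(samples_by_class):
    """On-device-style training: bundle encoded samples into one
    prototype hypervector per gesture class (no gradient descent)."""
    return {c: np.sign(sum(encode(s) for s in samples) + 0.5)
            for c, samples in samples_by_class.items()}

def classify(prototypes, sample):
    """Nearest prototype by dot product (equivalent to Hamming
    distance for bipolar hypervectors)."""
    q = encode(sample)
    return max(prototypes, key=lambda c: prototypes[c] @ q)
```

Because training is a single bundling pass, adding a new labeled sample only updates one prototype, which is what makes fully embedded online learning cheap in this family of algorithms.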