SmartHand: Towards embedded smart hands for prosthetic and robotic applications

Benini L.
2021

Abstract

The sophisticated sense of touch of the human hand contributes significantly to our ability to safely, efficiently, and dexterously manipulate arbitrary objects in our environment. Robotic and prosthetic devices lack refined tactile feedback from their end-effectors, leading to counterintuitive and complex control strategies. To address this gap, tactile sensors have been designed and developed, but they are either expensive and not scalable or offer insufficient spatial and temporal resolution. This paper focuses on overcoming these issues by designing a smart embedded system, called SmartHand, that enables the acquisition and real-time processing of high-resolution tactile information from a hand-shaped multi-sensor array for prosthetic and robotic applications. We acquire a new tactile dataset consisting of 340,000 frames recorded while interacting with 16 everyday objects, plus an empty-hand condition, for a total of 17 classes. The design of the embedded system minimizes response latency in classification by deploying a small yet accurate convolutional neural network on a high-performance ARM Cortex-M7 microcontroller. Compared to related work, our model requires one order of magnitude less memory and 15.6× fewer computations while achieving similar inter-session accuracy and up to 98.86% and 99.83% top-1 and top-3 cross-validation accuracy, respectively. Experimental results for the designed prototype show a total power consumption of 505 mW and a latency of only 100 ms.
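As a rough illustration of the kind of model the abstract describes, the sketch below shows a minimal 17-class convolutional classifier for tactile frames in PyTorch. The class name TinyTactileCNN, the 32×32 input resolution, and all layer sizes are illustrative assumptions; the paper's actual SmartHand architecture is not reproduced here, only a small CNN of comparable spirit.

    # Minimal sketch of a small CNN tactile-frame classifier, loosely in the
    # spirit of the abstract. The 32x32 input resolution and all layer sizes
    # are illustrative assumptions, not the paper's actual architecture.
    import torch
    import torch.nn as nn

    class TinyTactileCNN(nn.Module):
        def __init__(self, num_classes: int = 17):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1),   # single-channel pressure frame
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 32x32 -> 16x16
                nn.Conv2d(8, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 16x16 -> 8x8
            )
            self.classifier = nn.Linear(16 * 8 * 8, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)
            return self.classifier(x.flatten(1))

    # Example: classify one batch of (assumed) 32x32 tactile frames.
    model = TinyTactileCNN()
    frames = torch.randn(4, 1, 32, 32)      # batch of 4 single-channel frames
    logits = model(frames)                  # shape: (4, 17)
    top3 = logits.topk(3, dim=1).indices    # top-3 predictions, as reported in the paper

In practice, a model of this size would typically be quantized to 8-bit integers and deployed on the Cortex-M7 through a toolchain such as CMSIS-NN or STM32Cube.AI; the specific deployment flow used in the paper is not restated here.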
2021
2021 IEEE Sensors Applications Symposium, SAS 2021 - Proceedings
pp. 1-6
SmartHand: Towards embedded smart hands for prosthetic and robotic applications / Wang X.; Geiger F.; Niculescu V.; Magno M.; Benini L.. - ELECTRONIC. - (2021), pp. 1-6. (Paper presented at the 2021 IEEE Sensors Applications Symposium, SAS 2021, held in Sweden in 2021) [10.1109/SAS51076.2021.9530050].
Wang X.; Geiger F.; Niculescu V.; Magno M.; Benini L.


Use this identifier to cite or link to this document: https://hdl.handle.net/11585/871048

Citations
  • PMC: n/a
  • Scopus: 4
  • Web of Science: 2