Tactile sensing is a crucial perception mode for robots and for human amputees who need to control a prosthetic device. Today, robotic and prosthetic systems still lack accurate tactile sensing, mainly because existing tactile technologies offer limited spatial and temporal resolution and are either expensive or not scalable. In this article, we present the design and implementation of a hardware-software embedded system called SmartHand, specifically designed to enable the acquisition and real-time processing of high-resolution tactile information from a hand-shaped multisensor array for prosthetic and robotic applications. During data collection, our system delivers a high throughput of 100 frames per second, which is 13.7x higher than previous related work. This has allowed the collection of a new tactile dataset of 340,000 frames recorded while interacting with 16 everyday objects over five different sessions; together with the empty hand, the dataset comprises a total of 17 classes. We propose a compact yet accurate convolutional neural network that requires one order of magnitude less memory and 15.6x fewer computations than related work without degrading classification accuracy. The top-1 and top-3 cross-validation accuracies on the collected dataset are 98.86% and 99.83%, respectively. We further analyze the intersession variability and obtain a best top-3 leave-one-out validation accuracy of 77.84%. We deploy the trained model on a high-performance ARM Cortex-M7 microcontroller, achieving an inference time of only 100 ms and thus minimizing the response latency. The overall measured power consumption is 505 mW. Finally, we fabricate a new control sensor and perform additional experiments to analyze sensor degradation and slip detection.
This work is a step forward in giving robotic and prosthetic devices a sense of touch by demonstrating the practicality of a smart embedded system that uses a scalable tactile sensor with embedded tiny machine learning.
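The abstract does not specify the network architecture or the resolution of the tactile array, so the following is purely an illustrative sketch, not the authors' model: it mocks the described pipeline (tactile frame in, compact CNN, one of 17 object classes out) in plain NumPy. The 32x32 input size, the 8 convolution filters, and the random weights are all hypothetical placeholders.

```python
import numpy as np

def conv2d(x, w, b, stride=1):
    """Naive valid-mode 2-D convolution: x is (H, W), w is (K, kh, kw), b is (K,)."""
    K, kh, kw = w.shape
    H, W = x.shape
    oh, ow = (H - kh) // stride + 1, (W - kw) // stride + 1
    out = np.zeros((K, oh, ow))
    for k in range(K):
        for i in range(oh):
            for j in range(ow):
                patch = x[i * stride:i * stride + kh, j * stride:j * stride + kw]
                out[k, i, j] = np.sum(patch * w[k]) + b[k]
    return out

rng = np.random.default_rng(0)
frame = rng.random((32, 32))                 # hypothetical tactile pressure frame
w = rng.standard_normal((8, 3, 3)) * 0.1     # 8 small filters (placeholder weights)
b = np.zeros(8)

feat = np.maximum(conv2d(frame, w, b, stride=2), 0.0)  # strided conv + ReLU
pooled = feat.mean(axis=(1, 2))              # global average pooling -> (8,)
fc_w = rng.standard_normal((17, 8)) * 0.1    # linear head over the 17 classes
logits = fc_w @ pooled
probs = np.exp(logits - logits.max())
probs /= probs.sum()                         # softmax over class scores
pred = int(np.argmax(probs))                 # index of the predicted class
```

A real deployment on a Cortex-M7 would instead run a trained, quantized version of such a network (e.g., via an embedded inference library), but the data flow per frame is the same as sketched here.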

Wang, X.Y., Geiger, F., Niculescu, V., Magno, M., Benini, L. (2022). Leveraging Tactile Sensors for Low Latency Embedded Smart Hands for Prosthetic and Robotic Applications. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 71, 1-14 [10.1109/TIM.2022.3165828].

Leveraging Tactile Sensors for Low Latency Embedded Smart Hands for Prosthetic and Robotic Applications

Benini, L
2022

Wang, XY; Geiger, F; Niculescu, V; Magno, M; Benini, L

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/904912
