
Always-ON visual node with a hardware-software event-based binarized neural network inference engine / Rusci, Manuele; Rossi, Davide; Flamand, Eric; Gottardi, Massimo; Farella, Elisabetta; Benini, Luca. - PRINT. - (2018), pp. 314-319. (Paper presented at the 15th ACM International Conference on Computing Frontiers, CF 2018, held in Ischia, Italy, May 08-10, 2018) [10.1145/3203217.3204463].

Always-ON visual node with a hardware-software event-based binarized neural network inference engine

Rusci, Manuele; Rossi, Davide; Flamand, Eric; Gottardi, Massimo; Farella, Elisabetta; Benini, Luca
2018

Abstract

This work introduces an ultra-low-power visual sensor node coupling event-based binary acquisition with Binarized Neural Networks (BNNs) to deal with the stringent power requirements of always-on vision systems for IoT applications. By exploiting in-sensor mixed-signal processing, an ultra-low-power imager generates a sparse visual signal of binary spatial-gradient features. The sensor output, packed as a stream of events corresponding to the asserted gradient binary values, is transferred to a 4-core processor when the amount of data detected after frame difference surpasses a given threshold. Then, a BNN trained with binary gradients as input runs on the parallel processor if meaningful activity is detected in a pre-processing stage. During the BNN computation, the proposed Event-based Binarized Neural Network model achieves a system energy saving of 17.8% with respect to a baseline system including a low-power RGB imager and a Binarized Neural Network, at the cost of only a 3% classification-performance drop in a real-life 3-class classification scenario. The energy reduction increases up to 8x in a long-term always-on monitoring scenario, thanks to the event-driven behavior of the processing sub-system.
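
For clarity, the following is a minimal C sketch of the event-driven gating described in the abstract: a popcount over the frame-difference bitmap counts the asserted gradient events, and the BNN inference on the parallel processor is invoked only when that count exceeds an activity threshold. All identifiers (EVENT_THRESHOLD, count_events, maybe_classify, bnn_classify) and the threshold value are illustrative assumptions, not the paper's actual firmware API.

#include <stdint.h>
#include <stddef.h>

#define EVENT_THRESHOLD 256  /* hypothetical per-frame activity threshold */

/* Count asserted binary-gradient events in the frame-difference
 * bitmap (1 bit per pixel, packed into 32-bit words). */
static size_t count_events(const uint32_t *diff, size_t n_words)
{
    size_t n = 0;
    for (size_t i = 0; i < n_words; i++)
        n += (size_t)__builtin_popcount(diff[i]);
    return n;
}

/* Gate the expensive BNN inference on detected activity: the parallel
 * cluster runs the network only when enough gradient events survive
 * frame differencing. */
int maybe_classify(const uint32_t *diff, size_t n_words,
                   int (*bnn_classify)(const uint32_t *, size_t))
{
    if (count_events(diff, n_words) < EVENT_THRESHOLD)
        return -1;                      /* no meaningful activity: stay idle */
    return bnn_classify(diff, n_words); /* run the BNN on the binary gradients */
}

This gating mirrors the pre-processing stage described above, which keeps the processing sub-system idle during long inactive periods and underlies the reported 8x energy reduction in long-term always-on monitoring.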
Published in: 2018 ACM International Conference on Computing Frontiers, CF 2018 - Proceedings
Pages: 314-319
Files in this product:

File: Always-ON Visual node.pdf
Description: Postprint article
Type: Postprint
Access: Open access
License: Free, open-access license
Size: 1.88 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/677193
Citations
  • PubMed Central: n/a
  • Scopus: 8
  • Web of Science: 7