
Energy-Quality Scalable Monocular Depth Estimation on Low-Power CPUs

Cipolletta, Antonio; Peluso, Valentino; Calimera, Andrea; Poggi, Matteo; Tosi, Fabio; Aleotti, Filippo; Mattoccia, Stefano
2022

Abstract

Recent advancements in deep learning have shown that inferring high-quality depth maps from a single image is now feasible and accurate thanks to convolutional neural networks (CNNs), but how to run such compute- and memory-intensive models on portable, low-power devices remains a concern. Dynamic energy-quality scaling is an interesting yet little-explored option in this field: it improves efficiency through opportunistic computing policies in which performance is boosted only when needed, yielding substantial energy savings on average. Implementing such a computing paradigm requires a scalable inference model, which is the target of this work. Specifically, we describe and characterize the design of an energy-quality scalable pyramidal network (EQPyD-Net), a lightweight CNN capable of modulating its computational effort at runtime with minimal memory resources. We describe the architecture of the network and the optimization flow, covering the key aspects that enable dynamic scaling: the optimized training procedures, the compression stage via fixed-point quantization, and the code optimization for deployment on commercial low-power CPUs adopted in the edge segment. To assess the effect of the proposed design knobs, we evaluated prediction quality on the standard KITTI dataset and energy and memory resources on an ARM Cortex-A53 CPU. The collected results demonstrate the flexibility of the proposed network and its energy efficiency. EQPyD-Net can be shifted across five operating points, ranging from a maximum accuracy of 82.2% at 0.4 Frame/J up to 92.6% energy savings with a 6.1% accuracy loss, while keeping a compact memory footprint of 5.2 MB for the weights and 38.3 MB (in the worst case) for processing.
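To illustrate the compression stage the abstract mentions, the sketch below shows a generic symmetric per-tensor fixed-point quantization scheme. All names, the bit-width, and the scaling policy are illustrative assumptions for exposition; they are not taken from the paper's actual implementation.

```python
# Hedged sketch: symmetric linear fixed-point quantization of a weight
# tensor, the general kind of compression step referred to in the abstract.
# Bit-width, per-tensor scaling, and function names are assumptions.

def quantize_fixed_point(weights, bits=8):
    """Map float weights to signed fixed-point integers sharing one scale."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 127 for 8-bit signed
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / qmax                # one scale for the whole tensor
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return [v * scale for v in q]

# Usage: the reconstruction error per weight is bounded by scale / 2.
weights = [0.31, -0.72, 0.05, 0.90, -0.44]
q, scale = quantize_fixed_point(weights, bits=8)
approx = dequantize(q, scale)
```

Storing 8-bit integers plus one scale per layer is what makes footprints like the 5.2 MB reported for the weights plausible relative to a 32-bit float model.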
Cipolletta, Antonio; Peluso, Valentino; Calimera, Andrea; Poggi, Matteo; Tosi, Fabio; Aleotti, Filippo; Mattoccia, Stefano


Use this identifier to cite or link to this document: https://hdl.handle.net/11585/820503

Citations
  • Scopus: 2
  • Web of Science: 2