Wang X., Magno M., Cavigelli L., Benini L. (2020). FANN-on-MCU: An Open-Source Toolkit for Energy-Efficient Neural Network Inference at the Edge of the Internet of Things. IEEE Internet of Things Journal, 7(5), 4403-4417. doi:10.1109/JIOT.2020.2976702
FANN-on-MCU: An Open-Source Toolkit for Energy-Efficient Neural Network Inference at the Edge of the Internet of Things
Wang X.; Magno M.; Cavigelli L.; Benini L.
2020
Abstract
The growing number of low-power smart devices in the Internet of Things is coupled with the concept of 'edge computing', which moves some of the intelligence, especially machine learning, toward the edge of the network. Enabling machine learning algorithms to run on resource-constrained hardware, typically on low-power smart devices, is challenging in terms of hardware (optimized and energy-efficient integrated circuits), algorithmic, and firmware implementations. This article presents FANN-on-MCU, an open-source toolkit built upon the fast artificial neural network (FANN) library to run lightweight and energy-efficient neural networks on microcontrollers based on both the ARM Cortex-M series and the novel RISC-V-based parallel ultralow-power (PULP) platform. The toolkit takes multilayer perceptrons trained with FANN and generates code targeted to low-power microcontrollers. This article also presents detailed analyses of energy efficiency across the different cores, together with the optimizations needed to handle different network sizes. Moreover, it provides a detailed analysis of parallel speedups and degradations due to parallelization overhead and memory transfers. Further evaluations include experimental results for three different applications using a self-sustainable wearable multisensor bracelet. The experimental results show a measured latency in the order of only a few microseconds and power consumption of a few milliwatts, while keeping the memory requirements below the limitations of the targeted microcontrollers. In particular, the parallel implementation on the octa-core RISC-V platform reaches a speedup of 22× and a 69% reduction in energy consumption with respect to a single-core implementation on Cortex-M4 for continuous real-time classification.
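The toolkit's workflow starts from a network trained with the standard FANN C library. As a rough illustration (not taken from the article), the minimal sketch below trains a small multilayer perceptron with FANN and saves it in the .net format that FANN-on-MCU consumes; the layer sizes, file names, and training parameters are hypothetical, and the subsequent code-generation step for the Cortex-M or PULP target is handled by the toolkit itself.

```c
/* Minimal sketch (not from the paper): train a multilayer perceptron with the
 * FANN C library and save it to a .net file, which the FANN-on-MCU toolkit can
 * then convert into C code for a low-power microcontroller. Layer sizes, file
 * names, and training parameters below are illustrative assumptions. */
#include "fann.h"

int main(void)
{
    /* Hypothetical 3-layer MLP: 16 inputs, 32 hidden neurons, 4 output classes. */
    struct fann *ann = fann_create_standard(3, 16, 32, 4);

    fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

    /* Train on a FANN-format data file (assumed name) until the error target
     * or the epoch limit is reached. */
    fann_train_on_file(ann, "train.data",
                       500,     /* max epochs */
                       10,      /* epochs between reports */
                       0.001f); /* desired error */

    /* The saved .net file is the input that the FANN-on-MCU generator turns
     * into inference code for ARM Cortex-M or PULP targets. */
    fann_save(ann, "network.net");
    fann_destroy(ann);
    return 0;
}
```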
File | Access | Type | License | Size | Format
---|---|---|---|---|---
FANN_journal3.pdf | Open access | Postprint | License for free, open access | 5.98 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.