Lamberti, L., Bellone, L., Macan, L., Natalizio, E., Conti, F., Palossi, D., et al. (2024). Distilling Tiny and Ultrafast Deep Neural Networks for Autonomous Navigation on Nano-UAVs. IEEE Internet of Things Journal, 11(20), 33269-33281. doi:10.1109/JIOT.2024.3431913.
Distilling Tiny and Ultrafast Deep Neural Networks for Autonomous Navigation on Nano-UAVs
Macan, L.; Conti, F.; Benini, L.
2024
Abstract
Nano-sized unmanned aerial vehicles (UAVs) are ideal candidates for flying Internet-of-Things smart sensors to collect information in narrow spaces. This requires ultrafast navigation under very tight memory/computation constraints. The PULP-Dronet convolutional neural network (CNN) enables autonomous navigation running aboard a nano-UAV at 19 frame/s, at the cost of a large memory footprint of 320 kB and with drone control in complex scenarios hindered by the disjoint training of collision avoidance and steering capabilities. In this work, we distill a novel family of CNNs with better capabilities than PULP-Dronet but a memory footprint reduced by up to 168× (down to 2.9 kB), achieving an inference rate of up to 139 frame/s; we collect a new open-source unified collision/steering dataset of 66k images for more robust navigation; and we perform a thorough in-field analysis of both PULP-Dronet and our tiny CNNs running on a commercially available nano-UAV. Our tiniest CNN, called Tiny-PULP-Dronet v3, navigates with a 100% success rate a challenging, never-seen-before path composed of a narrow obstacle-populated corridor and a 180° turn, at a maximum target speed of 0.5 m/s. In the same scenario, the state-of-the-art PULP-Dronet consistently fails despite having 168× more parameters.
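The abstract's kilobyte-scale footprint figures follow from storing very few parameters at reduced (8-bit) precision. The sketch below is purely illustrative and is not the Tiny-PULP-Dronet v3 architecture described in the paper: it assumes a hypothetical depthwise-separable CNN (class name `TinyNavCNN`, layer widths, and the two-output steering/collision head are all assumptions) and simply counts parameters to show how a navigation network can fit in a few kB at int8.

```python
# Illustrative sketch only: NOT the authors' Tiny-PULP-Dronet v3 model.
# It shows how a depthwise-separable CNN with very few channels, stored
# at 8-bit precision, lands in the single-digit-kilobyte range.
import torch
import torch.nn as nn


def dw_separable(in_ch: int, out_ch: int, stride: int = 1) -> nn.Sequential:
    """Depthwise 3x3 convolution followed by a pointwise (1x1) convolution."""
    return nn.Sequential(
        nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                  padding=1, groups=in_ch, bias=False),
        nn.BatchNorm2d(in_ch),
        nn.ReLU(),
        nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(),
    )


class TinyNavCNN(nn.Module):
    """Hypothetical tiny navigation CNN: one grayscale input image,
    two outputs (steering angle, collision probability)."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            dw_separable(1, 8, stride=2),
            dw_separable(8, 16, stride=2),
            dw_separable(16, 16, stride=2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 2)  # [steering, collision]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))


if __name__ == "__main__":
    model = TinyNavCNN()
    n_params = sum(p.numel() for p in model.parameters())
    # With 8-bit quantization, each parameter costs roughly one byte.
    print(f"{n_params} parameters ≈ {n_params / 1024:.1f} kB at int8")
    out = model(torch.randn(1, 1, 200, 200))  # dummy grayscale frame
    print(out.shape)  # torch.Size([1, 2])
```

Under these assumed layer widths the model has well under a thousand parameters, so even with quantization overheads the weights stay in the kilobyte range; the actual architecture, training, and quantization pipeline are detailed in the paper itself.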
File | Description | Type | License | Size | Format
---|---|---|---|---|---
Distilling_Tiny_and_Ultrafast_Deep_Neural_Networks_for_Autonomous_Navigation_on_Nano-UAVs.pdf | Publisher's version (open access) | Version of Record (PDF) | Creative Commons | 4.82 MB | Adobe PDF