
Stargate: Multimodal Sensor Fusion for Autonomous Navigation on Miniaturized UAVs

Kalenberg, K.; Muller, H.; Polonelli, T.; Schiaffino, A.; Niculescu, V.; Cioflan, C.; Magno, M.; Benini, L.
2024

Abstract

Autonomously navigating robots need to perceive and interpret their surroundings. Currently, cameras are among the most widely used sensors due to their high resolution and frame rates at relatively low energy consumption and cost. In recent years, cutting-edge sensors, such as miniaturized depth cameras, have demonstrated strong potential, specifically for nano-size unmanned aerial vehicles (UAVs), where low power consumption, lightweight hardware, and low computational demand are essential. However, cameras only work well under good lighting conditions, while depth cameras have a limited range. To maximize robustness, we propose to fuse a millimeter-form-factor 64-pixel depth sensor with a low-resolution grayscale camera. In this work, a nano-UAV learns to detect and fly through a gate with a lightweight autonomous navigation system based on two tinyML convolutional neural network models trained in simulation, running entirely onboard in 7.6 ms and with an accuracy above 91%. Field tests are based on the Crazyflie 2.1, featuring a total mass of 39 g. We demonstrate the robustness and potential of our navigation policy in multiple application scenarios, with a failure probability down to 1.2 × 10⁻³ crashes per meter, experiencing only two crashes over a cumulative flight distance of 1.7 km.
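To make the fusion idea concrete, below is a minimal PyTorch sketch of one way a two-branch network could combine a low-resolution grayscale frame with the 8 × 8 (64-pixel) depth map. This is an illustrative assumption only: the paper actually deploys two separate tinyML CNN models, and the class name FusionNavNet, the input resolutions, layer sizes, and output format here are hypothetical, not the published architecture.

# Hypothetical sketch of grayscale + 64-pixel depth fusion for gate-crossing
# navigation. All dimensions and the output semantics are assumptions, not
# the models published in the Stargate paper.
import torch
import torch.nn as nn

class FusionNavNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Camera branch: 1x160x160 grayscale frame (assumed resolution).
        self.cam = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (B, 32)
        )
        # Depth branch: 1x8x8 map from the 64-pixel time-of-flight sensor.
        self.depth = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (B, 16)
        )
        # Fusion head: concatenated features regress a steering command and
        # a gate-presence score (assumed output format).
        self.head = nn.Sequential(
            nn.Linear(32 + 16, 32), nn.ReLU(),
            nn.Linear(32, 2),  # [steering angle, gate confidence]
        )

    def forward(self, frame, depth_map):
        f = torch.cat([self.cam(frame), self.depth(depth_map)], dim=1)
        return self.head(f)

if __name__ == "__main__":
    net = FusionNavNet()
    frame = torch.rand(1, 1, 160, 160)   # grayscale camera frame
    depth_map = torch.rand(1, 1, 8, 8)   # normalized 8x8 depth readings
    print(net(frame, depth_map))         # tensor of shape (1, 2)

The late-fusion design (concatenating per-modality feature vectors) is one common way to let the depth branch compensate for poor lighting and the camera branch for out-of-range depth readings; a network small enough for onboard tinyML deployment would additionally be quantized for the target microcontroller.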
Kalenberg, K., Muller, H., Polonelli, T., Schiaffino, A., Niculescu, V., Cioflan, C., Magno, M., & Benini, L. (2024). Stargate: Multimodal Sensor Fusion for Autonomous Navigation on Miniaturized UAVs. IEEE Internet of Things Journal, 11(12), 21372-21390. https://doi.org/10.1109/JIOT.2024.3363036
Files in this record:
Attachments, if any, are not displayed.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11585/1004681

Citations
  • Scopus: 7