
A Sim-to-Real Deep Learning-based Framework for Autonomous Nano-drone Racing / Lamberti, Lorenzo; Cereda, Elia; Abbate, Gabriele; Bellone, Lorenzo; Morinigo, Victor Javier Kartsch; Barciś, Michał; Barciś, Agata; Giusti, Alessandro; Conti, Francesco; Palossi, Daniele. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - Electronic. - Early Access (2024), pp. 1-8. [10.1109/LRA.2024.3349814]

A Sim-to-Real Deep Learning-based Framework for Autonomous Nano-drone Racing

2024

Abstract

Autonomous drone racing competitions are a proxy to improve unmanned aerial vehicles' perception, planning, and control skills. The recent emergence of autonomous nano-sized drone racing imposes new challenges, as their ∼10 cm form factor heavily restricts the resources available onboard, including memory, computation, and sensors. This paper describes the methodology and technical implementation of the system that won the first international autonomous nano-drone racing competition: the “IMAV 2022 Nanocopter AI Challenge.” To achieve this goal, we developed a fully onboard deep learning approach for visual navigation trained only on simulation images. Our approach includes a convolutional neural network for obstacle avoidance, a sim-to-real dataset collection procedure, and a navigation policy that we selected, characterized, and adapted through simulation and actual in-field experiments. Our system ranked 1st among the six competing teams at the competition. In our best attempt, we traveled 115 m in the allotted 5-minute flight, never crashing while dodging static and dynamic obstacles. By sharing our knowledge with the research community, we aim to provide a solid groundwork to foster future development in this field.
Lamberti, Lorenzo; Cereda, Elia; Abbate, Gabriele; Bellone, Lorenzo; Morinigo, Victor Javier Kartsch; Barciś, Michał; Barciś, Agata; Giusti, Alessandro; Conti, Francesco; Palossi, Daniele
Files in this item:

File: Binder4.pdf
Type: Postprint
License: License for free open access
Size: 5.43 MB
Format: Adobe PDF
Under embargo until 04/01/2026

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11585/953210