
Sensor-fusion and deep neural networks for autonomous UAV navigation within orchards / Breslla K.; Bortolotti G.; Boini A.; Perulli G.; Morandi B.; Grappadelli L.C.; Manfrini L. - ELECTRONIC. - (2020), pp. 230-235. (Paper presented at the 3rd IEEE International Workshop on Metrology for Agriculture and Forestry, MetroAgriFor 2020, held at the University of Trento, Italy, in 2020) [10.1109/MetroAgriFor50201.2020.9277568].

Sensor-fusion and deep neural networks for autonomous UAV navigation within orchards

Bortolotti G.;Boini A.;Perulli G.;Morandi B.;Grappadelli L. C.;Manfrini L.
2020

Abstract

As the world's population grows, so does the demand for quality food. In recent years, rising demand and environmental factors have heavily influenced agricultural production, and automation and robotics for fruit and vegetable production and monitoring have become the new standard. This paper discusses an autonomous Unmanned Aerial Vehicle (UAV) able to navigate through orchard rows. The UAV comprises a flight controller (AP stack), a microcontroller for analog reading of different sensors, and an On-Board Computer (OBC). Pictures are taken by a camera and streamed over WiFi to a Ground Control Computer (GCC) running a convolutional neural network model. Based on prior training, the model outputs one of three directions: RIGHT, LEFT, or STRAIGHT. A moving average over multiple frames per second is computed and sent to a built-in Proportional-Integral-Derivative (PID) controller on the UAV. After error correction from this feedback, the controller sends the direction to the flight controller using the MAVLink protocol's radio channel overrides, thus performing autonomous navigation.
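The control loop outlined in the abstract (per-frame CNN class → moving average → PID correction → MAVLink RC channel override) can be sketched as follows. This is a minimal illustration only: the class-to-error mapping, PID gains, window size, and PWM mapping are assumptions for the sake of the example, not values from the paper.

```python
from collections import deque

# Assumed mapping of the model's three classes to a lateral error signal
# (LEFT/RIGHT read as drift off the row centre); not taken from the paper.
CLASS_ERROR = {"LEFT": -1.0, "STRAIGHT": 0.0, "RIGHT": 1.0}

class DirectionSmoother:
    """Moving average over the last `window` per-frame CNN predictions."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def update(self, label):
        self.buf.append(CLASS_ERROR[label])
        return sum(self.buf) / len(self.buf)

class PID:
    """Textbook discrete PID controller; gains here are placeholders."""
    def __init__(self, kp=0.8, ki=0.0, kd=0.0, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def to_rc_override(correction, center_us=1500, span_us=500):
    """Map a correction in [-1, 1] to an RC PWM value in microseconds
    (1000-2000 us is the conventional MAVLink RC servo range)."""
    c = max(-1.0, min(1.0, correction))
    return int(round(center_us + c * span_us))

# Simulated stream of per-frame predictions from the GCC-side model.
smoother = DirectionSmoother(window=5)
pid = PID(kp=0.8)
for label in ["STRAIGHT", "RIGHT", "RIGHT", "RIGHT", "RIGHT"]:
    error = smoother.update(label)     # averaged error ends at 0.8
correction = pid.step(error)           # 0.8 * 0.8 = 0.64
roll_pwm = to_rc_override(correction)  # 1500 + 0.64 * 500 = 1820
```

On the actual vehicle the resulting PWM value would be transmitted as a MAVLink RC_CHANNELS_OVERRIDE message (e.g. via pymavlink); that transport is omitted here to keep the sketch self-contained.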
2020
2020 IEEE International Workshop on Metrology for Agriculture and Forestry, MetroAgriFor 2020 - Proceedings
230
235
Breslla K.; Bortolotti G.; Boini A.; Perulli G.; Morandi B.; Grappadelli L.C.; Manfrini L.

Use this identifier to cite or link to this item: https://hdl.handle.net/11585/792177
Citations
  • Scopus: 1
  • Web of Science: 1