The Monocular Depth Estimation Challenge

Stefano Mattoccia; Matteo Poggi; Fabio Tosi; Youmin Zhang; Chaoqiang Zhao
2023

Abstract

This paper summarizes the results of the first Monocular Depth Estimation Challenge (MDEC) organized at WACV2023. This challenge evaluated the progress of self-supervised monocular depth estimation on the challenging SYNS-Patches dataset. The challenge was organized on CodaLab and received submissions from 4 valid teams. Participants were provided a devkit containing updated reference implementations for 16 State-of-the-Art algorithms and 4 novel techniques. The threshold for acceptance for novel techniques was to outperform every one of the 16 SotA baselines. All participants outperformed the baseline in traditional metrics such as MAE or AbsRel. However, pointcloud reconstruction metrics were challenging to improve upon. We found predictions were characterized by interpolation artefacts at object boundaries and errors in relative object positioning. We hope this challenge is a valuable contribution to the community and encourage authors to participate in future editions.
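For reference, the sketch below illustrates the two traditional depth metrics named in the abstract, MAE and AbsRel, using their standard definitions. It is an illustrative example only, not code from the challenge devkit or the official evaluation; the array shapes, synthetic data, and validity mask are assumptions for demonstration.

```python
# Minimal sketch (assumed standard definitions, not the MDEC devkit) of the
# two traditional depth metrics named in the abstract: mean absolute error
# (MAE) and absolute relative error (AbsRel). Assumes metric depth maps in
# metres and a validity mask marking pixels with ground-truth depth.
import numpy as np

def mae(pred: np.ndarray, gt: np.ndarray, mask: np.ndarray) -> float:
    """Mean absolute error over valid pixels, in metres."""
    return float(np.abs(pred[mask] - gt[mask]).mean())

def abs_rel(pred: np.ndarray, gt: np.ndarray, mask: np.ndarray) -> float:
    """Absolute relative error |pred - gt| / gt, averaged over valid pixels."""
    return float((np.abs(pred[mask] - gt[mask]) / gt[mask]).mean())

if __name__ == "__main__":
    # Synthetic example: shapes and noise level are illustrative assumptions.
    rng = np.random.default_rng(0)
    gt = rng.uniform(1.0, 80.0, size=(192, 640))      # "ground-truth" depth
    pred = gt * rng.normal(1.0, 0.05, size=gt.shape)  # noisy "prediction"
    mask = gt > 0                                     # valid-depth mask
    print(f"MAE: {mae(pred, gt, mask):.3f} m  AbsRel: {abs_rel(pred, gt, mask):.4f}")
```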
2023 IEEE/CVF Winter Conference on Applications of Computer Vision Workshops (WACVW), 2023, pp. 623-632
The Monocular Depth Estimation Challenge / Jaime Spencer; C. Stella Qian; Chris Russell; Simon Hadfield; Erich Graf; Wendy Adams; Andrew J. Schofield; James Elder; Richard Bowden; Heng Cong; Stefano Mattoccia; Matteo Poggi; Zeeshan Khan Suri; Yang Tang; Fabio Tosi; Hao Wang; Youmin Zhang; Yusheng Zhang; Chaoqiang Zhao. - ELECTRONIC. - (2023), pp. 623-632. (Paper presented at The Monocular Depth Estimation Challenge Workshop, held in Waikoloa, Hawaii (USA) on January 7, 2023) [10.1109/WACVW58289.2023.00069].
Jaime Spencer; C. Stella Qian; Chris Russell; Simon Hadfield; Erich Graf; Wendy Adams; Andrew J. Schofield; James Elder; Richard Bowden; Heng Cong; Stefano Mattoccia; Matteo Poggi; Zeeshan Khan Suri; Yang Tang; Fabio Tosi; Hao Wang; Youmin Zhang; Yusheng Zhang; Chaoqiang Zhao
Files in this record:
  • The Monocular Depth Estimation Challenge.pdf (open access). Description: Arxiv pre-print. Type: Preprint. License: free open-access license. Size: 10.17 MB. Format: Adobe PDF.
  • 01 (2).pdf (open access). Type: Postprint. License: free open-access license. Size: 9.45 MB. Format: Adobe PDF.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/906999
Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science (ISI): 2