
Real-Time RGB-D Camera Pose Estimation in Novel Scenes using a Relocalisation Cascade / Cavallari, Tommaso; Golodetz, Stuart; Lord, Nicholas; Valentin, Julien; Prisacariu, Victor; Di Stefano, Luigi; Torr, Philip H S. - In: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE. - ISSN 0162-8828. - Electronic. - 42:10(2020), pp. 2465-2477. [10.1109/TPAMI.2019.2915068]

Real-Time RGB-D Camera Pose Estimation in Novel Scenes using a Relocalisation Cascade

2020

Abstract

Camera relocalisation is a key computer vision problem. Common techniques match the current image against keyframes with known poses, directly regress the pose, or estimate the pose using camera-world correspondences. Regression forests have become popular for establishing correspondences. They are accurate, but previously needed to be trained offline on the target scene, preventing relocalisation in new environments. Recently, we showed how to adapt a pre-trained forest to a new scene online. The adapted forests perform similarly to offline forests, and can estimate the pose in near real time. Here, we extend this work to perform significantly better and run fully in real time: instead of simply accepting the pose produced by RANSAC, we score its final few hypotheses and select the best one; we chain several instances of our relocaliser (with different parameters) in a cascade, trying fast, less accurate relocalisation first, and falling back to slow, accurate relocalisation if needed; finally, we tune the hyperparameters of our relocalisers to achieve effective overall performance. We achieve state-of-the-art performance on the well-known 7-Scenes and Stanford 4 Scenes benchmarks. We visualise the internal behaviour of our forests, and show how to remove the need to train them offline on a generic scene.
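The cascade strategy described in the abstract (try a fast, less accurate relocaliser first, and fall back to slower, more accurate ones only if needed) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: all names here (`Relocaliser`, `cascade_relocalise`, `min_score`) are hypothetical, and real stages would operate on RGB-D frames via adapted regression forests and scored RANSAC hypotheses.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Sequence, Tuple

# Stand-in types for illustration only; the paper works with 6-DoF camera
# poses estimated from RGB-D frames.
Pose = Tuple[float, ...]
Frame = object

@dataclass
class Relocaliser:
    """One stage of the cascade: an estimator plus an acceptance threshold."""
    name: str
    estimate: Callable[[Frame], Tuple[Optional[Pose], float]]  # -> (pose, score)
    min_score: float  # accept this stage's pose only if its score reaches this

def cascade_relocalise(stages: Sequence[Relocaliser], frame: Frame) -> Optional[Pose]:
    """Run stages in order (fast first); return early on the first confident pose,
    otherwise fall through to slower stages and keep the last pose as a fallback."""
    fallback: Optional[Pose] = None
    for stage in stages:
        pose, score = stage.estimate(frame)
        if pose is not None and score >= stage.min_score:
            return pose      # confident result: no need to run slower stages
        if pose is not None:
            fallback = pose  # remember a low-confidence answer just in case
    return fallback
```

The per-stage `min_score` plays the role of the tuned hyperparameters the abstract mentions: loosening it makes the fast stage terminate the cascade more often, trading accuracy for speed.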
Cavallari, Tommaso; Golodetz, Stuart; Lord, Nicholas; Valentin, Julien; Prisacariu, Victor; Di Stefano, Luigi; Torr, Philip H S
Files in this item:

File: 08706568.pdf (open access)
Type: Postprint
Licence: free open-access licence
Size: 1.92 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/737458
Citations
  • PMC: 1
  • Scopus: 29
  • Web of Science: 32