This repository hosts the official implementation of the paper Fast Relative Pose Estimation using Relative Depth. It includes a 3-point minimal solver for relative pose implemented in PoseLib, a neural network (RelScaleNet) for estimating relative scales, and code to demonstrate their usage. We also include code for running evaluation on pre-computed SuperPoint+SuperGlue keypoints on ScanNet.
Jonathan Astermark · Yaqing Ding · Viktor Larsson · Anders Heyden
conda env create --file environment.yml
conda activate reldepth
wget -N -P weights https://vision.maths.lth.se/jastermark/relscalenet/weights/model_final.pth
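As a quick sanity check after downloading, the weights file can be inspected with PyTorch. This is a minimal sketch and only assumes the file is a standard PyTorch checkpoint (possibly wrapping a state_dict); see the demo notebook for how the model is actually constructed and loaded.

```python
import torch

# Load the downloaded RelScaleNet checkpoint on CPU, for inspection only.
ckpt = torch.load("weights/model_final.pth", map_location="cpu")

# The file may be a raw state_dict or a dict wrapping one (assumption).
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

# Print a few entries to confirm the download is intact.
for name, value in list(state_dict.items())[:5]:
    shape = tuple(value.shape) if hasattr(value, "shape") else type(value).__name__
    print(name, shape)
```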
The 3-point solver is implemented in PoseLib. To install it, run
git submodule update --init --recursive
cd PoseLib
pip install .
cd ..
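To verify that the bindings built and installed correctly, a quick import check can be run. This is a minimal sketch; it only assumes the package is importable under the name poselib, matching the pip install above.

```python
# Confirm that the PoseLib Python bindings are importable.
import poselib

print("poselib loaded from:", poselib.__file__)
```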
A simple demo on a single image pair is implemented in demo.ipynb.
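For orientation, the snippet below sketches how relative pose estimation is typically invoked through PoseLib's Python bindings. It uses the stock estimate_relative_pose call (5-point based) on synthetic correspondences as a stand-in; the exact entry point and options of the 3-point relative-depth solver added by this repository may differ, so refer to demo.ipynb for the actual call.

```python
import numpy as np
import poselib

# Synthetic stand-in for matched keypoints (N x 2 pixel coordinates);
# replace with real correspondences from an image pair.
rng = np.random.default_rng(0)
x1 = rng.uniform(0, 640, size=(100, 2))
x2 = rng.uniform(0, 640, size=(100, 2))

# Simple pinhole camera; replace with the actual intrinsics of each image.
cam = {"model": "PINHOLE", "width": 640, "height": 480,
       "params": [525.0, 525.0, 320.0, 240.0]}

# Stock PoseLib RANSAC-based relative pose estimation.
pose, info = poselib.estimate_relative_pose(
    x1, x2, cam, cam,
    {"max_epipolar_error": 1.0},  # RANSAC options
    {},                           # bundle adjustment options (defaults)
)

print("R =", pose.R)
print("t =", pose.t)
print("inliers:", info["num_inliers"])
```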
We provide code to evaluate RelScaleNet and our 3-point solver on ScanNet-1500 by following the steps below. Note that the results will differ slightly from those in the paper, as a different RANSAC implementation was used.
- Download the ScanNet-1500 images and pre-extracted SuperPoint+SuperGlue keypoints.
cd data
bash download_scannet.sh
cd ..
In the paper, we used SuperPoint+SuperGlue keypoints extracted at 640x480 resolution, which leads to matches with a lower inlier ratio. If you would like to use the same keypoints as in the paper, they are available here:
wget -N http://vision.maths.lth.se/viktor/posebench/relative/scannet1500_spsg_old.h5
- (Optional) Pre-compute RelScaleNet estimates for the SP+SG keypoints. The results are stored in an .h5 file; a sketch for inspecting it is given below this list of steps.
python precompute_relscalenet.py
- Run the notebook evaluate_relscalenet.ipynb to evaluate RelScaleNet on SP+SG keypoints on ScanNet.
- Run the notebook evaluate_solver.ipynb to evaluate our 3-point solver on SP+SG keypoints on ScanNet.
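After running precompute_relscalenet.py, the stored estimates can be inspected with h5py, as referenced in the optional step above. This is a minimal sketch: the output filename and the group/dataset layout inside the file are assumptions, so check precompute_relscalenet.py for the actual path and naming scheme.

```python
import h5py

# Hypothetical output path; precompute_relscalenet.py defines the real one.
with h5py.File("relscalenet_estimates.h5", "r") as f:

    def show(name, obj):
        # Print every dataset with its shape and dtype, without assuming
        # any particular group/dataset naming scheme.
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)

    f.visititems(show)
```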
If you find our method useful, please consider citing our paper.
@inproceedings{astermark2024fast,
author = {Astermark, Jonathan and
Ding, Yaqing and
Larsson, Viktor and
Heyden, Anders},
title = {Fast Relative Pose Estimation using Relative Depth},
booktitle = {3DV},
year = {2024}
}