DAC: Detector-Agnostic Spatial Covariances for Deep Local Features
Javier Tirado-Garín, Frederik Warburg, Javier Civera
3DV 2024

Setup

Dependencies can be installed via conda:

conda create -n dac python=3.10 --yes
conda activate dac
conda install pytorch torchvision pytorch-cuda=11.6 -c pytorch -c nvidia
conda install numpy scipy ffmpeg scikit-learn matplotlib tqdm plotly pillow pyyaml numba
pip install opencv-python einops kornia==0.6.9 open3d==0.16.0 tabulate omegaconf

or, more directly, via environment.yml (solving the environment may be very slow):

conda env create -f environment.yml
conda activate dac
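
As a quick, optional sanity check (this snippet is not part of the repository), the installation can be verified from Python:

# optional sanity check, not part of the repository:
# confirms that the core dependencies import and that CUDA is visible to PyTorch
import torch
import kornia
import open3d

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("kornia:", kornia.__version__)  # expected: 0.6.9
print("open3d:", open3d.__version__)  # expected: 0.16.0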

Datasets

To run the experiments, please download:

  • hpatches-sequences-release (direct link)

    [expected directory structure]
        HPATCHES
        └── hpatches-sequences-release
            ├── i_ajuntament
            ├── i_autannes
            .
            .
            .
  • TUM-RGBD freiburg1 (fr1) sequences

    [expected directory structure]
    TUM_RGBD
    ├── freiburg1_<name> # e.g. freiburg1_360
    │   ├── rgb
    │   ├── groundtruth.txt
    │   └── rgb.txt
    ├── ...
    .
    .
    .
  • KITTI odometry sequences (color images, calibration files and ground-truth poses).

    [expected directory structure]
    KITTI
    └── odometry
        └── dataset
            ├── poses
            │   ├── 00.txt
            │   ├── 01.txt
            .   .
            .   .
            .   .
            └── sequences
                ├── 00
                │   ├── image_2
                │   ├── calib.txt
                │   └── times.txt
                .
                .
                .

Absolute paths for each dataset can be modified in datasets/settings.py.
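
The exact variable names inside datasets/settings.py may differ; as an illustrative sketch (the names below are assumptions, not necessarily those used in the repository), the file is expected to hold the dataset roots as absolute paths:

# illustrative sketch of datasets/settings.py; the variable names are assumptions,
# not necessarily the ones used in the repository
HPATCHES_ROOT = "/absolute/path/to/HPATCHES/hpatches-sequences-release"
TUM_RGBD_ROOT = "/absolute/path/to/TUM_RGBD"
KITTI_ROOT = "/absolute/path/to/KITTI/odometry/dataset"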

Reproducibility

  • To reproduce the first experiment (uncertainty vs. matching accuracy):

    python run_matching.py --det <model_name>

    where <model_name> corresponds to one of the evaluated systems:
    superpoint / d2net / r2d2 / keynet

    Results will be saved in experiments/matching/results.
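
    As a rough illustration of the idea behind this experiment (not the repository's actual evaluation code, and with assumed array shapes), matched keypoints can be ranked by a scalar uncertainty derived from their predicted 2x2 spatial covariances, e.g. the trace, and accuracy then inspected on increasingly uncertain subsets:

    # rough illustration only (assumed shapes); not the repository's evaluation code
    import numpy as np

    def rank_by_uncertainty(covs: np.ndarray) -> np.ndarray:
        """covs: (N, 2, 2) spatial covariances of matched keypoints.
        Returns indices ordered from most to least certain (smallest trace first)."""
        scalar_uncertainty = np.trace(covs, axis1=1, axis2=2)  # (N,)
        return np.argsort(scalar_uncertainty)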

  • To reproduce the second experiment (geometry estimation):

    python run_geometry.py --dataset <dataset> --sequence <sequence> --detectors <model_names>

    <dataset> can be either: kitti / tum_rgbd
    <sequence> depends on the selected <dataset>:

    • when kitti, <sequence> can be:
      00 / 01 / 02
    • when tum_rgbd, <sequence> can be:
      freiburg1_xyz / freiburg1_rpy / freiburg1_360.

    <model_names> are the names of the systems to be evaluated; they can be supplied in batch, e.g.:
    --detectors superpoint d2net r2d2 keynet

    Results will be saved in experiments/geometry/results.

    Note: Since numba is used to speed up some computations, warnings may appear during compilation on the first run. The compilation results are then cached, so no warnings will appear in successive executions.
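
    The evaluation compares estimated camera poses against the ground-truth poses of each sequence. As a minimal sketch of standard pose-error metrics (assumed conventions, not necessarily the exact metrics computed by run_geometry.py):

    # minimal sketch of standard pose-error metrics (assumed conventions;
    # not necessarily the exact metrics used by run_geometry.py)
    import numpy as np

    def rotation_error_deg(R_est: np.ndarray, R_gt: np.ndarray) -> float:
        """Angle in degrees of the relative rotation between two 3x3 matrices."""
        cos_angle = (np.trace(R_est.T @ R_gt) - 1.0) / 2.0
        return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

    def translation_error(t_est: np.ndarray, t_gt: np.ndarray) -> float:
        """Euclidean distance between estimated and ground-truth translations."""
        return float(np.linalg.norm(t_est - t_gt))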

  • To reproduce the interpretability experiment (supp. material):

    python run_interpretability.py --save --det <model_name>

    where <model_name> corresponds to one of the evaluated systems:
    superpoint / d2net / r2d2 / keynet

    Results will be saved in experiments/interpretability/results.

  • To reproduce the validation of the EPnPU implementation (supp. material):

    # to use 2D and 3D synthetic noise
    python experiments/geometry/models/benchmarks/bench.py --do_acc_vs_npoints
    
    # to use 2D noise only
    python experiments/geometry/models/benchmarks/bench.py --do_acc_vs_npoints --only_2d_noise

    Results will be saved in experiments/geometry/models/benchmarks/results.
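
    For reference, this benchmark perturbs the synthetic observations with noise; the sketch below shows one way anisotropic 2D noise could be drawn from per-point covariances (an assumption about the setup, not the benchmark's exact code):

    # illustrative only: drawing anisotropic 2D noise from per-point covariances
    # (an assumption about the synthetic-noise setup, not the benchmark's exact code)
    import numpy as np

    rng = np.random.default_rng(seed=0)

    def add_2d_noise(pts_2d: np.ndarray, covs: np.ndarray) -> np.ndarray:
        """pts_2d: (N, 2) image points; covs: (N, 2, 2) covariance per point."""
        noise = np.stack([rng.multivariate_normal(np.zeros(2), c) for c in covs])
        return pts_2d + noise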

BibTex

If you find this code useful, please cite the following paper.

@inproceedings{tirado2023dac,
  title={DAC: Detector-Agnostic Spatial Covariances for Deep Local Features},
  author={Tirado-Gar{\'i}n, Javier and Warburg, Frederik and Civera, Javier},
  booktitle={International Conference on 3D Vision (3DV)},
  year={2024}
}

License and Acknowledgements

The folders detectors/d2net, detectors/keynet, detectors/r2d2 and detectors/superpoint contain code based on the original system repositories (D2Net, Key.Net, R2D2, SuperPoint), each with its own license. The rest of the repository is MIT-licensed.
