EINet

The official code and benchmark for our paper: Explicit Interaction for Fusion-Based Place Recognition.

This work has been accepted by IROS 2024 πŸŽ‰

Jingyi Xu, Junyi Ma, Qi Wu, Zijie Zhou, Yue Wang, Xieyuanli Chen, Wenxian Yu, Ling Pei*.


Installation

We follow the installation instructions of our codebase LCPR, which are also posted here.

  • Clone the repository, then create and activate a conda virtual environment
git clone git@github.com:BIT-XJY/EINet.git
cd EINet
conda create -n EINet python=3.8
conda activate EINet
  • Install other dependencies
pip install -r requirements.txt
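
After activation, a quick sanity check can confirm the interpreter matches the pinned version (a minimal sketch; the helper name is illustrative, and the actual dependency list lives in requirements.txt):

```python
import sys

def check_python(required=(3, 8)):
    """Return True if the running interpreter matches the pinned major.minor version."""
    return sys.version_info[:2] == required

if __name__ == "__main__":
    print("Python OK" if check_python() else "Warning: expected Python 3.8")
```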

Data Download

Note that the downloaded data should be organized as follows:

nuscenes
β”œβ”€ raw_data
β”‚    β”œβ”€ maps
β”‚    β”‚    β”œβ”€ ...
β”‚    β”œβ”€ samples
β”‚    β”‚    β”œβ”€ CAM_BACK
β”‚    β”‚    β”œβ”€ CAM_BACK_LEFT
β”‚    β”‚    β”œβ”€ CAM_BACK_RIGHT
β”‚    β”‚    β”œβ”€ CAM_FRONT
β”‚    β”‚    β”œβ”€ CAM_FRONT_LEFT
β”‚    β”‚    β”œβ”€ CAM_FRONT_RIGHT
β”‚    β”‚    β”œβ”€ LIDAR_TOP
β”‚    β”‚    β”œβ”€ RADAR_BACK_LEFT
β”‚    β”‚    β”œβ”€ RADAR_BACK_RIGHT
β”‚    β”‚    β”œβ”€ RADAR_FRONT
β”‚    β”‚    β”œβ”€ RADAR_FRONT_LEFT
β”‚    β”‚    β”œβ”€ RADAR_FRONT_RIGHT
β”‚    β”œβ”€ sweeps
β”‚    β”‚    β”œβ”€ CAM_BACK
β”‚    β”‚    β”œβ”€ CAM_BACK_LEFT
β”‚    β”‚    β”œβ”€ CAM_BACK_RIGHT
β”‚    β”‚    β”œβ”€ CAM_FRONT
β”‚    β”‚    β”œβ”€ CAM_FRONT_LEFT
β”‚    β”‚    β”œβ”€ CAM_FRONT_RIGHT
β”‚    β”‚    β”œβ”€ LIDAR_TOP
β”‚    β”‚    β”œβ”€ RADAR_BACK_LEFT
β”‚    β”‚    β”œβ”€ RADAR_BACK_RIGHT
β”‚    β”‚    β”œβ”€ RADAR_FRONT
β”‚    β”‚    β”œβ”€ RADAR_FRONT_LEFT
β”‚    β”‚    β”œβ”€ RADAR_FRONT_RIGHT
β”‚    β”œβ”€ v1.0-test
β”‚    β”‚    β”œβ”€ attribute.json
β”‚    β”‚    β”œβ”€ calibrated_sensor.json
β”‚    β”‚    β”œβ”€ ...
β”‚    β”œβ”€ v1.0-trainval
β”‚    β”‚    β”œβ”€ attribute.json
β”‚    β”‚    β”œβ”€ calibrated_sensor.json
β”‚    β”‚    β”œβ”€ ...
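
Before running the preprocessing scripts, it can save time to verify that the layout above is complete. A minimal sketch (the root path and helper name are illustrative, not part of the codebase):

```python
import os

# Expected sensor channels under samples/ and sweeps/ (from the tree above).
CHANNELS = [
    "CAM_BACK", "CAM_BACK_LEFT", "CAM_BACK_RIGHT",
    "CAM_FRONT", "CAM_FRONT_LEFT", "CAM_FRONT_RIGHT",
    "LIDAR_TOP",
    "RADAR_BACK_LEFT", "RADAR_BACK_RIGHT",
    "RADAR_FRONT", "RADAR_FRONT_LEFT", "RADAR_FRONT_RIGHT",
]

def missing_dirs(nuscenes_root):
    """Return the expected sub-directories that are absent under raw_data/."""
    expected = ["maps", "v1.0-test", "v1.0-trainval"]
    expected += [
        os.path.join(split, ch)
        for split in ("samples", "sweeps")
        for ch in CHANNELS
    ]
    raw = os.path.join(nuscenes_root, "raw_data")
    return [d for d in expected if not os.path.isdir(os.path.join(raw, d))]
```

An empty return value means the download matches the tree above.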

NUSC-PR

We propose the NUSC-PR benchmark, which splits the nuScenes dataset under both self-supervised and supervised learning schemes.

Self-supervised Data Preparation

  • Extract basic information from the nuScenes dataset, and split the queries and database for NUSC-PR.
cd NUSC-PR
cd self_supervised
python generate_basic_infos.py
python split_dataset.py
cd ..
  • The data structure for the self-supervised learning scheme should look like:
self_supervised_data
β”œβ”€ generate_basic_infos
β”‚    β”œβ”€ nuscenes_infos-bs.pkl
β”‚    β”œβ”€ nuscenes_infos-shv.pkl
β”‚    β”œβ”€ nuscenes_infos-son.pkl
β”‚    β”œβ”€ nuscenes_infos-sq.pkl
β”‚    β”œβ”€ nuscenes_infos.pkl
β”œβ”€ split_dataset
β”‚    β”œβ”€ all_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ bs_db_index_in_infos.npy
β”‚    β”œβ”€ bs_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ bs_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ shv_db_index_in_infos.npy
β”‚    β”œβ”€ shv_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ shv_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ son_db_index_in_infos.npy
β”‚    β”œβ”€ son_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ son_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ sq_db_index_in_infos.npy
β”‚    β”œβ”€ sq_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ sq_train_query_pos_neg_index_in_infos.pkl
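
The generated artifacts are plain pickle and NumPy files, so they can be inspected directly. A minimal sketch that reports the top-level type of each file (the helper name is illustrative; the internal layout of each artifact is defined by the generation scripts above):

```python
import os
import pickle

import numpy as np

def summarize_split(split_dir):
    """Print the top-level Python type of every .pkl / .npy artifact in a directory."""
    for name in sorted(os.listdir(split_dir)):
        path = os.path.join(split_dir, name)
        if name.endswith(".pkl"):
            with open(path, "rb") as f:
                obj = pickle.load(f)
        elif name.endswith(".npy"):
            obj = np.load(path, allow_pickle=True)
        else:
            continue
        print(f"{name}: {type(obj).__name__}")
```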

Supervised Data Preparation

  • Extract basic information from the nuScenes dataset, and split the queries and database for NUSC-PR.
cd supervised
python generate_basic_infos.py
python split_dataset.py
python select_pos_neg_samples_by_dis.py
python generate_selected_indicies.py
cd ..
cd ..
  • The data structure for the supervised learning scheme should look like:
supervised_data
β”œβ”€ generate_basic_infos
β”‚    β”œβ”€ nuscenes_infos-bs.pkl
β”‚    β”œβ”€ nuscenes_infos-shv.pkl
β”‚    β”œβ”€ nuscenes_infos-son.pkl
β”‚    β”œβ”€ nuscenes_infos-sq.pkl
β”‚    β”œβ”€ nuscenes_infos.pkl
β”œβ”€ generate_selected_indicies
β”‚    β”œβ”€ bs_db_index_in_infos.npy
β”‚    β”œβ”€ bs_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ bs_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ shv_db_index_in_infos.npy
β”‚    β”œβ”€ shv_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ shv_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ son_db_index_in_infos.npy
β”‚    β”œβ”€ son_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ son_train_query_pos_neg_index_in_infos.pkl
β”‚    β”œβ”€ sq_db_index_in_infos.npy
β”‚    β”œβ”€ sq_test_query_gt_index_in_infos.pkl
β”‚    β”œβ”€ sq_train_query_pos_neg_index_in_infos.pkl
β”œβ”€ select_pos_neg_samples_by_dis
β”‚    β”œβ”€ bs_test_query_gt_tokens.pkl
β”‚    β”œβ”€ bs_train_query_pos_neg_tokens.pkl
β”‚    β”œβ”€ shv_test_query_gt_tokens.pkl
β”‚    β”œβ”€ shv_train_query_pos_neg_tokens.pkl
β”‚    β”œβ”€ son_test_query_gt_tokens.pkl
β”‚    β”œβ”€ son_train_query_pos_neg_tokens.pkl
β”‚    β”œβ”€ sq_test_query_gt_tokens.pkl
β”‚    β”œβ”€ sq_train_query_pos_neg_tokens.pkl
β”œβ”€ split_dataset
β”‚    β”œβ”€ bs_db_sample_token.npy
β”‚    β”œβ”€ bs_db.npy
β”‚    β”œβ”€ bs_sample_token.npy
β”‚    β”œβ”€ bs_test_query_sample_token.npy
β”‚    β”œβ”€ bs_test_query.npy
β”‚    β”œβ”€ bs_train_query_sample_token.npy
β”‚    β”œβ”€ bs_train_query.npy
β”‚    β”œβ”€ bs_val_query_sample_token.npy
β”‚    β”œβ”€ bs_val_query.npy
β”‚    β”œβ”€ shv_db_sample_token.npy
β”‚    β”œβ”€ shv_db.npy
β”‚    β”œβ”€ shv_sample_token.npy
β”‚    β”œβ”€ shv_test_query_sample_token.npy
β”‚    β”œβ”€ shv_test_query.npy
β”‚    β”œβ”€ shv_train_query_sample_token.npy
β”‚    β”œβ”€ shv_train_query.npy
β”‚    β”œβ”€ shv_val_query_sample_token.npy
β”‚    β”œβ”€ shv_val_query.npy
β”‚    β”œβ”€ son_db_sample_token.npy
β”‚    β”œβ”€ son_db.npy
β”‚    β”œβ”€ son_sample_token.npy
β”‚    β”œβ”€ son_test_query_sample_token.npy
β”‚    β”œβ”€ son_test_query.npy
β”‚    β”œβ”€ son_train_query_sample_token.npy
β”‚    β”œβ”€ son_train_query.npy
β”‚    β”œβ”€ son_val_query_sample_token.npy
β”‚    β”œβ”€ son_val_query.npy
β”‚    β”œβ”€ sq_db_sample_token.npy
β”‚    β”œβ”€ sq_db.npy
β”‚    β”œβ”€ sq_sample_token.npy
β”‚    β”œβ”€ sq_test_query_sample_token.npy
β”‚    β”œβ”€ sq_test_query.npy
β”‚    β”œβ”€ sq_train_query_sample_token.npy
β”‚    β”œβ”€ sq_train_query.npy
β”‚    β”œβ”€ sq_val_query_sample_token.npy
β”‚    β”œβ”€ sq_val_query.npy
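
The four file-name prefixes above presumably correspond to the four nuScenes map locations (bs: boston-seaport, shv: singapore-hollandvillage, son: singapore-onenorth, sq: singapore-queenstown) — an assumption worth verifying against the generation scripts. A small sketch enumerating the expected `split_dataset` files per prefix:

```python
# Assumed mapping from file-name prefix to nuScenes map location.
LOCATIONS = {
    "bs": "boston-seaport",
    "shv": "singapore-hollandvillage",
    "son": "singapore-onenorth",
    "sq": "singapore-queenstown",
}

def split_files(prefix):
    """Expected split_dataset artifacts for one location prefix (names from the tree above)."""
    stems = ["db", "test_query", "train_query", "val_query"]
    files = [f"{prefix}_{s}.npy" for s in stems]
    files += [f"{prefix}_{s}_sample_token.npy" for s in stems]
    files.append(f"{prefix}_sample_token.npy")
    return sorted(files)
```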

TODO

  • Release the paper
  • Release the benchmark NUSC-PR code for EINet
  • Release the source code for EINet
  • Release our pretrained baseline model

Acknowledgement

We thank the authors of LCPR, ManyDepth, and AutoPlace for their pioneering open-source releases, which provide the codebase for this work.

License

This project is released under the MIT License.

