[ICCV 2023] Active Neural Mapping

Active Neural Mapping

Zike Yan, Haoxiang Yang, Hongbin Zha

Update

  • We have sped up the completeness evaluation through parallel computation. [2024.03.17]

Installation

Our environment has been tested on Ubuntu 18.04 (CUDA 10.2 with an RTX 2080 Ti) and Ubuntu 20.04 (CUDA 10.2/11.3 with an RTX 2080 Ti). PyTorch 1.12.1 is recommended to reproduce the results.

Clone the repo and create the conda environment:

git clone --recurse-submodules git@github.com:ZikeYan/activeINR.git && cd activeINR

# create conda env
conda env create -f environment.yml
conda activate activeINR

Install PyTorch by following the official instructions:

pip install torch==1.12.1+cu102 torchvision==0.13.1+cu102 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu102
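To verify that the CUDA build of PyTorch is picked up:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"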

Then install activeINR in editable mode:

pip install -e .

Preparation

Simulated environment

Habitat-lab and habitat-sim need to be installed for simulation. We use v0.1.7 (git checkout tags/v0.1.7) and install habitat-sim with the flag --with-cuda.
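If habitat-lab and habitat-sim are not already pinned to that version (the paths below assume both repos sit in the project root, e.g. as submodules), the tag can be checked out manually before installing:

cd habitat-lab && git checkout tags/v0.1.7 && cd ..
cd habitat-sim && git checkout tags/v0.1.7 && cd ..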

pip install -e habitat-lab
cd habitat-sim && python setup.py install --with-cuda

Data

To run active mapping in the simulated environment, the Gibson dataset for Habitat-sim and the Matterport3D dataset should be downloaded. The directory of the downloaded data should be specified in the config file activeINR/train/configs/gibson.json via the key root.
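For example, the entry might look like the following sketch (the flat placement of the key is an assumption; only the key name root comes from the text above):

{
  "root": "/path/to/habitat/data"
}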

Trained models

We adopt DDPPO for point-goal navigation. All pre-trained models can be found here. The model should be placed in activeINR/local_policy_models and specified in the config file activeINR/train/configs/gibson.json via the key planner.
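Analogously, a sketch of the planner entry (the checkpoint file name is a hypothetical placeholder):

{
  "planner": "activeINR/local_policy_models/ddppo_checkpoint.pth"
}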

Run

To run Active Neural Mapping on the Denmark scene of the Gibson dataset, run the following command:

python activeINR/train/vis_exploration.py --config activeINR/train/configs/gibson.json --scene_id Denmark

The logs will be saved in the ./activeINR/train/logs/ folder, including the recorded actions, the extracted mesh, checkpoints of the neural map, etc.

The mesh quality and the exploration coverage can be evaluated with the following scripts:

python activeINR/eval/eval_action.py --config activeINR/train/configs/gibson.json --scene_id Denmark --file "logs/final/gibson/Denmark/results/action.txt"

python eval/eval_mesh.py
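As a quick sanity check before the full evaluation, the recorded action file can be inspected directly. A minimal Python sketch, assuming one action token per line (the exact file format is an assumption):

# hypothetical sketch: count and summarize the recorded actions
from collections import Counter

with open("logs/final/gibson/Denmark/results/action.txt") as f:
    actions = [line.strip() for line in f if line.strip()]

print(len(actions), "actions recorded")
print(Counter(actions).most_common())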

TODO

The repo is still under construction; thanks for your patience.

  • Running with a live camera in ROS.
  • BALD implementation.
  • Loss landscape visualization.

Acknowledgement

Our code is partially based on iSDF and UPEN. We thank the authors for making their code publicly available.

Citation

@inproceedings{Yan2023iccv,
  title={Active Neural Mapping},
  author={Yan, Zike and Yang, Haoxiang and Zha, Hongbin},
  booktitle={Intl. Conf. on Computer Vision (ICCV)},
  year={2023}
}
