IR-MCL: Implicit Representation-Based Online Global Localization

Paper: https://arxiv.org/abs/2210.03113

Haofei Kuang · Xieyuanli Chen · Tiziano Guadagnino · Nicky Zimmerman · Jens Behley · Cyrill Stachniss

University of Bonn

Online localization demo

Abstract

Determining the state of a mobile robot is an essential building block of robot navigation systems. In this paper, we address the problem of estimating the robot’s pose in an indoor environment using 2D LiDAR data and investigate how modern environment models can improve gold-standard Monte-Carlo localization (MCL) systems. We propose a neural occupancy field (NOF) to implicitly represent the scene using a neural network. With the pretrained network, we can synthesize 2D LiDAR scans for an arbitrary robot pose through volume rendering. Based on the implicit representation, we can obtain the similarity between a synthesized and an actual scan as an observation model and integrate it into an MCL system to perform accurate localization. We evaluate our approach on five sequences of a self-recorded dataset and three publicly available datasets. We show that we can accurately and efficiently localize a robot using our approach, surpassing the localization performance of state-of-the-art methods. The experiments suggest that the presented implicit representation is able to predict more accurate 2D LiDAR scans, leading to an improved observation model for our particle filter-based localization.
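The observation model described above can be sketched in a few lines: each particle's pose is scored by how well the scan synthesized from the neural occupancy field at that pose matches the real scan. The snippet below is a minimal, generic illustration, not the repo's implementation; synthesize_scan is a hypothetical stand-in for rendering ranges from the NOF, and the Gaussian beam model and its sigma are assumptions. See the repository code for the actual implementation.

    # Minimal sketch of a scan-similarity observation model (illustration only).
    import numpy as np

    def beam_likelihood(z_real: np.ndarray, z_synth: np.ndarray, sigma: float = 0.1) -> float:
        """Gaussian per-beam likelihood, averaged in the log domain for stability."""
        log_w = -0.5 * ((z_real - z_synth) / sigma) ** 2
        return float(np.exp(np.mean(log_w)))

    def weight_particles(poses, z_real, synthesize_scan, sigma=0.1):
        """Normalized particle weights from the similarity of real and synthesized scans."""
        w = np.array([beam_likelihood(z_real, synthesize_scan(p), sigma) for p in poses])
        return w / (w.sum() + 1e-12)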

Dependencies

The code was tested on Ubuntu 20.04 with the following (a quick version check is sketched after this list):

  • Python 3.9
  • PyTorch 1.13.1 with CUDA 11.6
  • PyTorch Lightning 1.9.0
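If in doubt about the installed versions, a quick check like the following (a generic sketch, not part of this repo) confirms the environment matches the versions above:

    # Minimal environment check against the versions listed above.
    import torch
    import pytorch_lightning as pl

    print("torch:", torch.__version__)                 # expected 1.13.1 (+cu116)
    print("CUDA available:", torch.cuda.is_available())
    print("pytorch-lightning:", pl.__version__)        # expected 1.9.0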

Installation

  • Clone the repo:

    git clone https://github.com/PRBonn/ir-mcl.git
    cd ir-mcl
  • Prepare the python environment (Anaconda is recommended here):

    conda env create -f environment.yml

    or

    conda create --name irmcl python=3.9.13
    conda activate irmcl
    
    conda install -c conda-forge pybind11
    pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu116
    pip install pytorch-lightning tensorboardX
    pip install matplotlib scipy open3d
    pip install evo --upgrade --no-binary evo
  • Compile the motion model and resampling module

    cd ir-mcl/mcl && conda activate ir-mcl
    make -j4

Preparation

Datasets

Please refer to PREPARE_DATA to prepare the datasets.

Pre-trained Weights

The pre-trained weights are stored in the config folder and include (a loading sanity check is sketched after the list):

  • IPBLab dataset: config/ipblab_nof_weights.ckpt
  • Freiburg Building 079 dataset: config/fr079_nof_weights.ckpt
  • Intel Lab dataset: config/intel_nof_weights.ckpt
  • MIT CSAIL dataset: config/mit_nof_weights.ckpt
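To sanity-check that a downloaded checkpoint loads, a generic PyTorch inspection like the one below can be used (a sketch, not a documented API of this repo; the key names follow the usual PyTorch Lightning checkpoint layout):

    # Inspect a pre-trained NOF checkpoint: list its top-level keys and weight shapes.
    import torch

    ckpt = torch.load("config/ipblab_nof_weights.ckpt", map_location="cpu")
    print(list(ckpt.keys()))  # typically includes 'state_dict', 'hyper_parameters', ...
    for name, tensor in ckpt.get("state_dict", {}).items():
        print(name, tuple(tensor.shape))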

Run Experiments

Global Localization Experiments on IPBLab dataset

  • Pre-training the NOF on the IPBLab dataset (the train/eval/test splits of the IPBLab dataset are not available yet; they will be released after our dataset paper is published):

    cd ~/ir-mcl
    bash ./shells/pretraining/ipblab.sh
  • Global localization experiments

    cd ir-mcl
    python main.py --config_file ./config/global_localization/loc_config_{sequence_id}.yml
    # for example: python main.py --config_file ./config/global_localization/loc_config_test1.yml
  • Pose-tracking experiments (a batch-run sketch covering both experiment types follows this list)

    cd ir-mcl
    python main.py --config_file ./config/pose_tracking/loc_config_{sequence_id}.yml
    # for example: python main.py --config_file ./config/pose_tracking/loc_config_test1.yml
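Several sequences and both experiment types can be run back to back with a small wrapper like the one below (a convenience sketch, not part of the repo; the sequence id list is a placeholder and should match the config files actually present under ./config):

    # Run main.py for each sequence and experiment type via per-sequence config files.
    import subprocess

    SEQUENCE_IDS = ["test1"]  # placeholder; use the sequences available to you

    for experiment in ("global_localization", "pose_tracking"):
        for seq in SEQUENCE_IDS:
            config = f"./config/{experiment}/loc_config_{seq}.yml"
            subprocess.run(["python", "main.py", "--config_file", config], check=True)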

Observation Model Experiments

  • Train/Test (replace {dataset} with "fr079", "intel", or "mit"):
    cd ir-mcl
    bash ./shells/pretraining/{dataset}.sh
    # for example: bash ./shells/pretraining/intel.sh

Supplements for the Experimental Results

Due to space limitations in the paper, we provide some additional experimental results here.

Memory cost

We provide an ablation study comparing the memory cost of the occupancy grid map (OGM), the Hilbert map, and our neural occupancy field (NOF).

| Map type | Approx. memory | Loc. method | RMSE: location (cm) / yaw (degree) |
|---|---|---|---|
| OGM (5 cm grid size) | 4.00 MB | AMCL | 11.11 / 4.15 |
| OGM (5 cm grid size) | 4.00 MB | NMCL | 19.57 / 3.62 |
| OGM (5 cm grid size) | 4.00 MB | SRRG-Loc | 8.74 / 1.68 |
| OGM (10 cm grid size) | 2.00 MB | AMCL | 15.01 / 4.18 |
| OGM (10 cm grid size) | 2.00 MB | NMCL | 36.27 / 4.04 |
| OGM (10 cm grid size) | 2.00 MB | SRRG-Loc | 12.15 / 1.53 |
| Hilbert Map | 0.01 MB | HMCL | 20.04 / 4.50 |
| NOF | 1.96 MB | IR-MCL | 6.62 / 1.11 |
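As rough intuition for the figures above: an OGM's footprint grows with the mapped area divided by the squared cell size, while the NOF footprint is fixed by its number of network weights. The calculation below is an illustration under assumed values (map extent, one byte per grid cell, float32 weights), not numbers taken from the paper:

    # Illustrative memory comparison under assumed values (not from the paper).
    def ogm_memory_mb(width_m: float, height_m: float, res_m: float, bytes_per_cell: int = 1) -> float:
        """Occupancy grid map: one stored value per cell."""
        cells = (width_m / res_m) * (height_m / res_m)
        return cells * bytes_per_cell / 1e6

    def nof_memory_mb(num_params: int, bytes_per_param: int = 4) -> float:
        """Neural occupancy field: memory is fixed by the parameter count."""
        return num_params * bytes_per_param / 1e6

    print(ogm_memory_mb(100, 100, 0.05))  # 4.0 MB for an assumed 100 m x 100 m map at 5 cm
    print(nof_memory_mb(490_000))         # ~1.96 MB for ~0.49M float32 weights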

Ablation study on fixed particle numbers

We also study the performance of global localization when all methods use the same number of particles, fixed to 100,000. In the table below, all baselines and IR-MCL use 100,000 particles; the last row repeats the IR-MCL result from the previous table (default setting) for reference.

| Method | RMSE: location (cm) / yaw (degree) |
|---|---|
| AMCL | 11.56 / 4.12 |
| NMCL | 19.57 / 3.62 |
| HMCL | 20.54 / 4.70 |
| SRRG-Loc | 8.74 / 1.68 |
| IR-MCL | 6.71 / 1.11 |
| IR-MCL (default setting, for reference) | 6.62 / 1.11 |
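As background on fixing the particle count: MCL keeps a set of weighted pose hypotheses and resamples them in proportion to their weights while keeping the particle count constant. The snippet below is a generic systematic-resampling sketch with a fixed count of 100,000 particles; it is an illustration only, not the repo's compiled resampling module.

    # Generic systematic resampling with a fixed particle count (illustration only).
    import numpy as np

    def systematic_resample(particles: np.ndarray, weights: np.ndarray, rng=None) -> np.ndarray:
        """Draw len(particles) new particles in proportion to their weights."""
        rng = np.random.default_rng() if rng is None else rng
        n = len(particles)
        positions = (rng.random() + np.arange(n)) / n  # one stratified draw per slot
        cumulative = np.cumsum(weights / weights.sum())
        cumulative[-1] = 1.0                           # guard against rounding
        indices = np.searchsorted(cumulative, positions)
        return particles[indices]

    # Example with 100,000 (x, y, yaw) particles and random weights.
    rng = np.random.default_rng(0)
    particles = rng.uniform(-10.0, 10.0, size=(100_000, 3))
    weights = rng.random(100_000)
    particles = systematic_resample(particles, weights, rng)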

Citation

If you use this library for any academic work, please cite our original paper.

@article{kuang2023ral,
  author    = {Kuang, Haofei and Chen, Xieyuanli and Guadagnino, Tiziano and Zimmerman, Nicky and Behley, Jens and Stachniss, Cyrill},
  title     = {{IR-MCL: Implicit Representation-Based Online Global Localization}},
  journal   = {IEEE Robotics and Automation Letters (RA-L)},
  doi       = {10.1109/LRA.2023.3239318},
  year      = {2023},
  codeurl   = {https://github.com/PRBonn/ir-mcl},
}

Acknowledgment

This work has partially been funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101017008 (Harmony).
