RESLAM: A real-time robust edge-based SLAM system
Please note that RESLAM is a research project and its code is released without any warranty. RESLAM will most likely not be developed any further.
In this work, we present RESLAM, a robust edge-based SLAM system for RGBD sensors. Edges are more stable under varying lighting conditions than raw intensity values, which leads to higher accuracy and robustness in scenes where feature- or photoconsistency-based approaches often fail. Our results show that the method achieves the best trajectory accuracy on most of the sequences, indicating that edges are suitable for a wide variety of scenes.
If you use this work, please cite any of the following publications:
- RESLAM: A real-time robust edge-based SLAM system, Schenk Fabian, Fraundorfer Friedrich, ICRA 2019, pdf
- Combining Edge Images and Depth Maps for Robust Visual Odometry, Schenk Fabian, Fraundorfer Friedrich, BMVC 2017, pdf, video
- Robust Edge-based Visual Odometry using Machine-Learned Edges, Schenk Fabian, Fraundorfer Friedrich, IROS 2017, pdf, video
License
RESLAM is licensed under the GNU General Public License Version 3 (GPLv3).
If you want to use this software commercially, please contact us.
Building the framework
So far, the framework has only been built and tested on Linux; see the requirements below.
Requirements
Sophus is now part of this repository (in thirdparty/Sophus).
Building on Windows and backwards compatibility might be added in the future.
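The exact dependency list is defined by the project's CMake configuration. As a rough sketch for Ubuntu, the packages referenced elsewhere in this README (CMake, Eigen 3.3.X, Ceres) can typically be installed as follows; the OpenCV package is an assumption, and Pangolin (optional, see below) is usually built from source (https://github.com/stevenlovegrove/Pangolin):

sudo apt-get update
sudo apt-get install build-essential cmake libeigen3-dev libceres-dev   # Eigen/Ceres, see also Known Issues below
sudo apt-get install libopencv-dev                                      # assumption: OpenCV development headers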
Optional
Set the optional packages in the cmake-gui.
- Pangolin (for the graphical viewer)
Build commands
git clone https://github.com/fabianschenk/RESLAM
cd RESLAM
mkdir build
cd build
cmake ..
make -j
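To enable or disable the optional packages mentioned above (e.g. the Pangolin viewer), the CMake cache can be edited either with the cmake-gui or interactively from the build directory. The following is only a sketch; the actual option names are defined in the project's CMake files:

cd build
ccmake ..        # or: cmake-gui .. ; toggle the Pangolin-related option, then configure and generate
make -j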
Known Issues
Segmentation Fault with Ceres/Eigen
Some people report a problem with Ceres/Eigen. Please have a look at #2 and #3. Make sure that you have the latest (stable) Eigen version 3.3.X and that it matches the one used by Ceres.
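A quick way to check the installed Eigen version (a sketch assuming Eigen was installed system-wide via the package manager; the header path may differ on your system):

pkg-config --modversion eigen3                                                               # prints e.g. 3.3.7
grep -E "EIGEN_(WORLD|MAJOR|MINOR)_VERSION" /usr/include/eigen3/Eigen/src/Core/util/Macros.h # version macros in the headers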
Segmentation fault after repeated tracking losses
In some sequences, e.g. freiburg2_large_with_loop, there are depth maps that contain mostly invalid values. The problem is that the Kinect and most other RGBD sensors cannot reconstruct surfaces far away from the sensor (beyond roughly 6 m) due to their small baseline. In such cases, RESLAM does not work and might fail with a segmentation fault after repeated tracking losses. This issue will hopefully be fixed in the future.
How to reproduce the results from the paper
If you enable multi-threading, the results might differ slightly between runs because floating-point additions are not executed in the same order in every run!
TUM dataset
Download the sequence you want to test and specify the "associate.txt" file in the dataset_tumX.yaml settings file.
To generate an "associate.txt" file, first download the "associate.py" script from TUM RGBD Tools and then run
python associate.py DATASET_XXX/rgb.txt DATASET_XXX/depth.txt > associate.txt
in the folder where your dataset is located.
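A minimal sketch of this step, assuming a standard TUM RGB-D sequence layout and that "associate.py" has been downloaded into the sequence folder; the sequence name below is only an example, and the generated "associate.txt" is then referenced in the dataset_tumX.yaml settings file:

cd rgbd_dataset_freiburg1_desk                           # example sequence folder with rgb/, depth/, rgb.txt, depth.txt, groundtruth.txt
python associate.py rgb.txt depth.txt > associate.txt    # pairs color and depth images by timestamp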
In the "RESLAM" directory:
build/RESLAM config_files/reslam_settings.yaml config_files/dataset_tum1.yaml
To evaluate the absolute trajectory error (ATE) and the relative pose error (RPE), download the corresponding scripts from TUM RGBD Tools.
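A usage sketch for the TUM evaluation scripts evaluate_ate.py and evaluate_rpe.py; the name of the estimated trajectory file is a placeholder here, so substitute the trajectory file written by RESLAM:

python evaluate_ate.py groundtruth.txt estimated_trajectory.txt --verbose --plot ate.png
python evaluate_rpe.py groundtruth.txt estimated_trajectory.txt --fixed_delta --verbose --plot rpe.png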
Supported Sensors
Support for other sensors such as Orbbec Astra Pro and Intel RealSense can be adapted from REVO.