ZifaZhu / lmono

An Online SLAM System based on LiDAR-Monocular Camera Sensor Fusion.


lmono

LMONO-Fusion: An Online SLAM System based on LiDAR-Monocular Camera Sensor Fusion

This is a framework for a LiDAR-monocular camera fusion SLAM system. Visual information from a monocular camera assists scene recognition in LiDAR odometry and dense mapping. Automatic calibration between the LiDAR and a single camera is provided as well. The pre-print version of our paper is available [here]().
  • Laser odometry
  • Scene recognition / loop detection
  • Automatic and real-time calibration
  • 3D dense map with color information

Sep. 2021: This work has been submitted to RAL.

1. Prerequisites

1.1 Ubuntu and ROS

1.2 Ceres Solver

1.3 PCL
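
The README does not pin versions for these prerequisites. As a sketch, on Ubuntu with a desktop-full ROS installation (which already bundles PCL), the remaining dependency is Ceres Solver, which can be built from source following its standard instructions; the ROS distribution name (`melodic` here) is an assumption and should match your system:

```shell
# ROS desktop-full includes PCL and its ROS bindings (distro name is an assumption).
sudo apt-get update
sudo apt-get install ros-melodic-desktop-full

# Ceres Solver dependencies, then Ceres itself from source.
sudo apt-get install cmake libgoogle-glog-dev libgflags-dev \
    libatlas-base-dev libeigen3-dev libsuitesparse-dev
git clone https://ceres-solver.googlesource.com/ceres-solver
cd ceres-solver && mkdir build && cd build
cmake .. && make -j4
sudo make install
```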

2. Build

2.1 Clone repository

 cd ~/catkin_ws/src
 git clone https://github.com/bobocode/lmono.git
 cd ..
 catkin_make
 source ~/catkin_ws/devel/setup.bash

2.2 Download dataset and test rosbag

  • Download a KITTI sequence

  • To generate a rosbag file from the KITTI dataset, you may use the provided tool

    roslaunch aloam_velodyne kitti_helper.launch
    

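Before launching the helper, the dataset paths in `kitti_helper.launch` must point at your local KITTI copy. A minimal sketch of the relevant parameters, assuming the launch file matches the one shipped with A-LOAM (parameter names and example paths below are assumptions; check your copy of the file):

```xml
<launch>
    <node name="kittiHelper" pkg="aloam_velodyne" type="kittiHelper" output="screen">
        <!-- Root folder of the KITTI odometry dataset (example path, adjust to yours). -->
        <param name="dataset_folder" type="string" value="/data/KITTI/odometry/" />
        <!-- Sequence to convert, e.g. "00". -->
        <param name="sequence_number" type="string" value="00" />
        <!-- Write the published topics into a rosbag. -->
        <param name="to_bag" type="bool" value="true" />
        <param name="output_bag_file" type="string" value="/tmp/kitti_00.bag" />
        <!-- Delay between published frames. -->
        <param name="publish_delay" type="int" value="1" />
    </node>
</launch>
```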
2.3 Launch ROS in different terminals

  • Launch A-LOAM to obtain LiDAR measurements

    roslaunch aloam_velodyne aloam_velodyne_HDL_64.launch

    or

    roslaunch aloam_velodyne aloam_velodyne_HDL_32.launch
  • Launch the estimator node

    roslaunch monolio kitti_estimator_xx.launch

  • Play the rosbag

    rosbag play xxx.bag

  • If you want to enable loop detection

    roslaunch monolio kitti_loop_xx.launch

  • If you want to enable fusion mapping

    roslaunch monolio kitti_map_00.launch

3. Acknowledgements

Thanks to the authors of A-LOAM and VINS-MONO. A major part of the code in this repository is borrowed from their work.
