lmono
This is a framework for a LiDAR-monocular camera fusion system. Visual information from the monocular camera assists scene recognition in LiDAR odometry and dense mapping. Automatic calibration between the LiDAR and a single camera is provided as well. The pre-print version of our paper, *LMONO-Fusion: An Online SLAM System based on LiDAR-Monocular Camera Sensor Fusion*, is available [here]().
- laser odometry
- scene recognition/ loop detection
- automatic and real-time calibration
- 3D dense map with color information
Sep 2021: This work has been submitted to RAL.
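To give a feel for the loop-detection idea listed above, here is a toy, self-contained sketch (for illustration only; this is *not* the method used in this repository): each scan is summarised as a normalised histogram of point ranges, and two scans are flagged as a possible revisit when their histograms are close.

```python
import math

def range_histogram(points, n_bins=8, max_range=80.0):
    """Summarise a scan as a normalised histogram of point ranges."""
    hist = [0.0] * n_bins
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        b = min(int(r / max_range * n_bins), n_bins - 1)
        hist[b] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def is_loop(h1, h2, threshold=0.1):
    # L1 distance between normalised histograms; small distance
    # suggests the two scans may come from the same place.
    return sum(abs(a - b) for a, b in zip(h1, h2)) < threshold

scan = [(5.0, 0.0, 0.0), (30.0, 1.0, 0.2), (60.0, -2.0, 0.1)]
print(is_loop(range_histogram(scan), range_histogram(scan)))  # True
```

Real systems use far richer descriptors (e.g. visual bag-of-words as in VINS-Mono, or scan-context-style LiDAR descriptors), but the compare-a-compact-descriptor pattern is the same.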
1. Prerequisites
1.1 Ubuntu and ROS
- Ubuntu 64-bit 18.04
- ROS Melodic. ROS INSTALLATION
1.2 Ceres Solver
- Ceres. Ceres INSTALLATION
1.3 PCL
- PCL. PCL INSTALLATION
2. Build
2.1 Clone repository
```
cd ~/catkin_ws/src
git clone https://github.com/bobocode/lmono.git
cd ..
catkin_make
source ~/catkin_ws/devel/setup.bash
```
2.2 Download dataset and test rosbag
- Download a KITTI sequence
- To generate a rosbag file from the KITTI dataset, you may use the tool:
```
roslaunch aloam_velodyne kitti_helper.launch
```
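If you want to inspect the raw KITTI Velodyne scans before converting them, each `.bin` file is a flat array of little-endian float32 `(x, y, z, reflectance)` records. A minimal, self-contained parser sketch (the `demo.bin` file below is synthetic, not real KITTI data):

```python
import struct

def read_kitti_bin(path):
    """Parse a KITTI Velodyne .bin file: consecutive
    float32 records of (x, y, z, reflectance)."""
    points = []
    with open(path, "rb") as f:
        data = f.read()
    for off in range(0, len(data), 16):  # 4 floats * 4 bytes
        x, y, z, r = struct.unpack_from("<ffff", data, off)
        points.append((x, y, z, r))
    return points

# Demo with a synthetic two-point scan (not real KITTI data).
with open("demo.bin", "wb") as f:
    f.write(struct.pack("<ffff", 1.0, 2.0, 3.0, 0.5))
    f.write(struct.pack("<ffff", -1.0, 0.0, 4.0, 0.9))

pts = read_kitti_bin("demo.bin")
print(pts[0][:3])  # (1.0, 2.0, 3.0)
```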
2.3 Launch ROS in different terminals
- Launch A-LOAM to obtain LiDAR measurements:
```
roslaunch aloam_velodyne aloam_velodyne_HDL_64.launch
```
or
```
roslaunch aloam_velodyne aloam_velodyne_HDL_32.launch
```
- Launch the estimator node:
```
roslaunch monolio kitti_estimator_xx.launch
```
- Play the rosbag:
```
rosbag play xxx.bag
```
- If you want to enable loop detection:
```
roslaunch monolio kitti_loop_xx.launch
```
- If you want to enable fusion mapping:
```
roslaunch monolio kitti_map_00.launch
```
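The fusion-mapping step colours LiDAR points with image pixels; the geometry behind it is a standard pinhole projection. A minimal sketch, assuming extrinsics `R, t` and intrinsics `fx, fy, cx, cy` (the numeric values below are hypothetical, not this repository's calibration):

```python
def project_point(p_lidar, R, t, fx, fy, cx, cy):
    """Project a LiDAR point into the image plane.
    Returns pixel coordinates (u, v), or None if the point
    is behind the camera and cannot be coloured."""
    # Transform into the camera frame: p_cam = R @ p_lidar + t
    p_cam = [sum(R[i][j] * p_lidar[j] for j in range(3)) + t[i]
             for i in range(3)]
    if p_cam[2] <= 0:
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Identity extrinsics, simple intrinsics: a point 2 m ahead on the
# optical axis lands exactly at the principal point (cx, cy).
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(project_point([0.0, 0.0, 2.0], I, [0, 0, 0],
                    700.0, 700.0, 620.0, 180.0))  # (620.0, 180.0)
```

In the actual system the extrinsics come from the automatic LiDAR-camera calibration, and each projected point samples the image colour for the dense map.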
3. Acknowledgements
Thanks to the authors of A-LOAM and VINS-Mono. Major parts of this repository are borrowed from their work.