arijin / lidar-camera-calibration

This is code for calibrating one LIDAR and one camera. It also supports camera-only calibration.

Calibration for one LIDAR and one camera

This code is taken from an old version of Autoware; only the calibration part has been extracted.

Setup

This code has been tested on Ubuntu 18.04 with ROS Melodic. It should, in principle, also work with other ROS versions.

  • ROS must be installed first. See the install tutorial at https://www.ros.org/install/.
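
    To double-check which ROS distribution is active, you can run rosversion (part of a standard ROS install); on the tested setup it should print melodic:

    $rosversion -d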

  • Compile the code

    First, create a directory called catkin_usv under /home/${YOUR NAME}, and a src directory inside it, so that /home/${YOUR NAME}/catkin_usv/src exists. Then extract this repository into /home/${YOUR NAME}/catkin_usv/src. A scripted version of the whole setup is sketched after the build commands below.

    $cd /home/${YOUR NAME}/catkin_usv
    $catkin_make
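
    For reference, the whole workspace setup can be scripted roughly as follows. This is a sketch: it assumes you clone the repository from GitHub (https://github.com/arijin/lidar-camera-calibration), whereas the steps above describe extracting a downloaded copy, so adjust that step to match how you obtained the code.

    $mkdir -p /home/${YOUR NAME}/catkin_usv/src
    $cd /home/${YOUR NAME}/catkin_usv/src
    $git clone https://github.com/arijin/lidar-camera-calibration.git
    $cd /home/${YOUR NAME}/catkin_usv
    $catkin_make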

Note: the following errors may occur when compiling with catkin_make.

    1. Error from the autoware_msgs package:
    Could NOT find jsk_recognition_msgs (missing: jsk_recognition_msgs_DIR)

    Solution: the required ROS packages were probably not fully installed.

    $ sudo apt install ros-melodic-jsk-recognition-msgs
    2. Error during the make stage of catkin_make:
    fatal error: nlopt.hpp: No such file or directory

    Solution:

    $ sudo apt-get install libnlopt-dev 
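
    To avoid both errors up front, you can install the two dependencies (the same packages as above) before the first build:

    $ sudo apt install ros-melodic-jsk-recognition-msgs libnlopt-dev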

Usage

  • Preparation

    Prepare a chessboard with an 8x6 pattern and a pattern (square) size of 0.108m x 0.108m. Other pattern counts and sizes also work, but note: the sparser your point cloud is, the bigger the chessboard should be. For a 16-line LIDAR, the pattern size should be about 0.2m x 0.2m.

  • Calibration ROSBAG collection

    According to the official document ./doc/CalibrationToolkit_Manual.pdf, use 9 positions and 3 configurations with different tilt angles of the plane, then record the data.

    $rosbag record -O my_calibration_dataset ${Your Image Topic} ${Your Pointcloud Topic}
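
    A couple of optional sanity checks with standard ROS tools: before recording, confirm that both topics are actually being published, and afterwards verify that the bag contains them.

    $rostopic list
    $rostopic hz ${Your Image Topic}
    $rosbag info my_calibration_dataset.bag
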
  • Calibrate

    First, play the ROSBAG file my_calibration_dataset.bag (--pause starts playback paused and -l loops the bag).

    $rosbag play --pause -l my_calibration_dataset.bag

    Then, run the calibration toolkit.

    $cd /home/${YOUR NAME}/catkin_usv
    $source devel/setup.bash
    $cd devel/lib/calibration_camera_lidar
    $./calibration_toolkit

    Then, follow the steps in the official manual (./doc/CalibrationToolkit_Manual.pdf).

Quick Key

  • Translation : ↑, ↓, ←, →, PgUp, PgDn;
  • Rotation : a, d, w, s, q, e;
  • Projection mode switch : 1 for perspective projection, 2 for orthogonal projection;
  • In orthogonal projection mode : - or , for a smaller viewport, + or . for a larger viewport;
  • Point size : o for smaller, p for larger;
  • Line width : k for narrower, l for wider;
  • Change background color : b for color selection;
  • Change light color : n for color selection;
  • Clear screen : Delete.

Help

Tutorial video: https://www.youtube.com/watch?v=pfBmfgHf6zg

Tutorial document (HTML): https://gitlab.com/autowarefoundation/autoware.ai/autoware/-/wikis/Calibration

