Sensor_Fusion

This repository contains ROS source code for sensor fusion.

Overview

By fusing LiDAR and camera data, this project creates teacher (training) data sets for a monocular camera.
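The idea behind the teacher data is easiest to see as a projection: a LiDAR point expressed in the camera frame is projected onto the image plane, and its depth becomes the label for that pixel. The following is a minimal, self-contained sketch of that step (not the repository's actual code; the intrinsic values are placeholders, not the ZED's real calibration):

```cpp
// Minimal sketch: project a LiDAR point already expressed in the camera frame
// (z forward) into pixel coordinates with a pinhole model. The point's z value
// becomes the depth label for that pixel, which is the basic idea behind
// building teacher data for a monocular camera.
#include <cstdio>

struct Point3D { double x, y, z; };            // point in the camera frame
struct Intrinsics { double fx, fy, cx, cy; };  // hypothetical pinhole parameters

// Returns false if the point is behind the camera.
bool projectToPixel(const Point3D& p, const Intrinsics& K, double& u, double& v) {
    if (p.z <= 0.0) return false;
    u = K.fx * p.x / p.z + K.cx;
    v = K.fy * p.y / p.z + K.cy;
    return true;
}

int main() {
    Intrinsics K{700.0, 700.0, 640.0, 360.0};  // placeholder values, not ZED's real calibration
    Point3D p{1.0, 0.5, 4.0};                  // example LiDAR point in the camera frame
    double u, v;
    if (projectToPixel(p, K, u, v))
        std::printf("pixel (%.1f, %.1f) -> depth label %.2f m\n", u, v, p.z);
    return 0;
}
```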

Requirements

Hardware Spec

  • PC
    • OS: Ubuntu 16.04
    • Memory: 8 GB
    • CPU: Intel® Core™ i7-7700
    • GPU: GeForce GTX 1050 Ti
  • TX2
  • Robot
    • Sensors
      • SQ-LiDAR (Meiji Univ.)
      • ZED (Stereolabs)
      • AMU
    • Vehicle
      • Differential drive

How to Build

```
$ cd $HOME
$ cd catkin_ws/src
$ git clone git@github.com:Sadaku1993/sensor_fusion.git
$ cd ..
$ catkin_make
```

Calibration of SQ-LiDAR and ZED

Watch calibration
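Once the extrinsic calibration between the SQ-LiDAR and the ZED has been estimated, every LiDAR point can be mapped into the camera frame before projection. A minimal sketch of that transform using Eigen (the rotation and translation below are made-up placeholders, not a real calibration result):

```cpp
// Minimal sketch, not the repository's calibration code: once the LiDAR-camera
// extrinsics are known, every SQ-LiDAR point is mapped into the ZED camera frame
// with p_cam = R * p_lidar + t before projection. The values below are placeholders.
#include <iostream>
#include <Eigen/Geometry>

int main() {
    const double half_pi = 1.57079632679;

    // Extrinsic transform from the LiDAR frame to the camera frame (placeholder values).
    Eigen::Isometry3d T_cam_lidar = Eigen::Isometry3d::Identity();
    T_cam_lidar.rotate(Eigen::AngleAxisd(-half_pi, Eigen::Vector3d::UnitX()));
    T_cam_lidar.translation() = Eigen::Vector3d(0.05, -0.10, -0.02);  // metres

    Eigen::Vector3d p_lidar(5.0, 0.2, -0.3);        // a point measured by the LiDAR
    Eigen::Vector3d p_cam = T_cam_lidar * p_lidar;  // the same point in the camera frame
    std::cout << "camera-frame point: " << p_cam.transpose() << std::endl;
    return 0;
}
```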

Coloring LiDAR PointCloud Using ZED

Watch coloring
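Conceptually, each LiDAR point that projects inside the ZED image simply copies the color of the pixel it lands on. A minimal sketch, assuming OpenCV for the image container (the repository's node may organize this differently):

```cpp
// Minimal sketch: after a LiDAR point has been transformed into the camera frame
// and projected to pixel (u, v), the ZED color at that pixel is copied onto the
// point, producing a colored point cloud.
#include <cstdio>
#include <opencv2/core.hpp>

struct ColoredPoint { double x, y, z; unsigned char r, g, b; };

// Copies the BGR pixel at (u, v) onto the point if the projection falls inside the image.
bool colorizePoint(const cv::Mat& bgr, int u, int v, ColoredPoint& pt) {
    if (u < 0 || v < 0 || u >= bgr.cols || v >= bgr.rows) return false;
    const cv::Vec3b& px = bgr.at<cv::Vec3b>(v, u);
    pt.b = px[0]; pt.g = px[1]; pt.r = px[2];
    return true;
}

int main() {
    cv::Mat image(720, 1280, CV_8UC3, cv::Scalar(30, 60, 200));  // stand-in for a ZED frame
    ColoredPoint pt{1.0, 0.5, 4.0, 0, 0, 0};
    if (colorizePoint(image, 640, 360, pt))
        std::printf("point colored with RGB = (%d, %d, %d)\n", pt.r, pt.g, pt.b);
    return 0;
}
```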

DepthImage Using LiDAR Points

Watch depthimage
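To build the depth image, the projected points are rasterized into an image-sized buffer; when several points fall on the same pixel, the nearest one is kept. A minimal sketch, assuming OpenCV and hypothetical already-projected points (pixels with no LiDAR return stay at 0 and can be treated as unlabeled):

```cpp
// Minimal sketch, not the repository's actual node: projected LiDAR points are
// rasterized into a float depth image. When several points land on the same pixel,
// the nearest one wins (a simple z-buffer).
#include <cstdio>
#include <vector>
#include <opencv2/core.hpp>

struct Projected { int u, v; float depth; };  // pixel coordinates + range in metres

cv::Mat buildDepthImage(const std::vector<Projected>& pts, int width, int height) {
    cv::Mat depth(height, width, CV_32FC1, cv::Scalar(0.0f));
    for (const Projected& p : pts) {
        if (p.u < 0 || p.v < 0 || p.u >= width || p.v >= height || p.depth <= 0.0f) continue;
        float& d = depth.at<float>(p.v, p.u);
        if (d == 0.0f || p.depth < d) d = p.depth;  // keep the closest point per pixel
    }
    return depth;
}

int main() {
    std::vector<Projected> pts = {{640, 360, 4.2f}, {640, 360, 3.9f}, {10, 20, 12.5f}};
    cv::Mat depth = buildDepthImage(pts, 1280, 720);
    std::printf("depth at (640, 360) = %.2f m\n", depth.at<float>(360, 640));
    return 0;
}
```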

