nmssilva / augmented_perception

Semi-automatic detection, tracking and labelling of active targets for autonomous driving.

⚠ Instructions incomplete due to missing material ⚠

Augmented Perception package for the ATLASCAR2

This package was built under ROS Kinetic on Ubuntu 16.04.

Setup

  • Create a workspace
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
  • Clone this repository into the src folder
cd ~/catkin_ws/src
git clone https://github.com/nmssilva/augmented_perception.git
  • Install dependency libraries
sudo apt-get install libraw1394-11 libgtkmm-2.4-dev libglademm-2.4-dev libgtkglextmm-x11-1.2-dev libusb-1.0-0 libpcap-dev libpcap0.8-dev
catkin_make -C ~/catkin_ws
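Taken together, the setup steps above amount to the following shell session (a sketch; it assumes ROS Kinetic is already installed and sourced):

```shell
# Assumes ROS Kinetic is on the path: source /opt/ros/kinetic/setup.bash

# Create and build an empty catkin workspace.
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make

# Clone this package into the workspace.
cd ~/catkin_ws/src
git clone https://github.com/nmssilva/augmented_perception.git

# Install dependency libraries, then rebuild the workspace.
sudo apt-get install libraw1394-11 libgtkmm-2.4-dev libglademm-2.4-dev \
    libgtkglextmm-x11-1.2-dev libusb-1.0-0 libpcap-dev libpcap0.8-dev
catkin_make -C ~/catkin_ws
```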

Testing the package (playback bag)

  • Download the test bagfiles (missing link) ⚠
  • Extract the archive into the src folder
tar -xf bags.tar.gz -C ~/catkin_ws/src/image_labeling/image_labeling/
  • Source the workspace
source ~/catkin_ws/devel/setup.bash
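Once the workspace is sourced, a typical playback session looks like this (the bag filename is a placeholder; run `rosbag info` on the extracted files to see what they actually contain):

```shell
# Extract the downloaded archive into the package and source the workspace.
tar -xf bags.tar.gz -C ~/catkin_ws/src/image_labeling/image_labeling/
source ~/catkin_ws/devel/setup.bash

# Play a bag back; "example.bag" is a placeholder, not a real file name.
rosbag play ~/catkin_ws/src/image_labeling/image_labeling/bags/example.bag
```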

Acquire data on the car

  • Download and install the FlyCapture SDK for Ubuntu 16.04 (registration at ptgrey.com is required)
  • Turn on the car
  • Turn on the sensors in the power box
  • Connect the Ethernet cable
  • Disable Wi-Fi
  • Set a manual IP address of 192.168.0.1
  • Launch the drivers
roslaunch augmented_perception drivers.launch
  • The camera IP must be set manually using the flycap command
  • The devices are now ready. Use RViz or any other tool to visualize the laser and camera data.
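With the drivers up, data can be recorded for later labelling. A minimal sketch (the interface name and topic names below are assumptions; check the live topics with `rostopic list`):

```shell
# Confirm the manual IP is in place before launching the drivers
# (the interface name eth0 is an assumption).
ip addr show eth0 | grep -q "192.168.0.1" && echo "IP configured"

# Record camera and laser topics to a bag file for later playback.
rosbag record -O acquisition.bag /camera/image_raw /scan
```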

License

MIT License