Magellen / Autoware-1

Autonomous Vehicle Software

Autoware for Autonomous Vehicles

Software for urban autonomous driving, initially developed by Tier IV. This package has been customized for the egg vehicle. The following functions are supported:

  • 3D Localization
  • 3D Mapping
  • Path Following (under test)
  • Motion/Steering Control
  • Object Detection

Spec Recommendation

  • Number of CPU cores: 8
  • RAM size: 32GB
  • Storage size: 30GB

Requirements

  • ROS Jade (Ubuntu 14.04)
  • OpenCV 2.4.10 or higher
  • Qt 5.2.1 or higher
  • CUDA (Optional)
  • FlyCapture2 (Optional)
  • Armadillo (Optional)
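
To verify the installed versions against the requirements above, checks along the following lines can be used (these verification commands are a suggested addition, not part of the original instructions; exact package names may differ on your system):

$ pkg-config --modversion opencv          # expect 2.4.10 or higher
$ dpkg -s qtbase5-dev | grep '^Version'   # expect Qt 5.2.1 or higher
$ nvcc --version                          # only relevant if the optional CUDA support is used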

Install dependencies for Ubuntu 14.04 Jade

$ sudo apt-get install ros-jade-desktop-full ros-jade-nmea-msgs ros-jade-nmea-navsat-driver ros-jade-sound-play ros-jade-jsk-visualization ros-jade-grid-map ros-jade-gps-common
$ sudo apt-get install ros-jade-controller-manager ros-jade-ros-control ros-jade-ros-controllers ros-jade-gazebo-ros-control ros-jade-sicktoolbox ros-jade-sicktoolbox-wrapper ros-jade-joystick-drivers ros-jade-novatel-span-driver
$ sudo apt-get install libnlopt-dev freeglut3-dev qtbase5-dev libqt5opengl5-dev libssh2-1-dev libarmadillo-dev libpcap-dev gksu libgl1-mesa-dev libglew-dev
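
Before running any ROS commands, the Jade environment should be sourced in each new shell (a standard ROS setup step, included here for completeness; it is not listed in the original instructions):

$ source /opt/ros/jade/setup.bash
$ echo "source /opt/ros/jade/setup.bash" >> ~/.bashrc   # optional: source it automatically in future shells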

NOTE: Please do not install the ros-indigo-velodyne-pointcloud package. Please uninstall it if it is already installed.
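
To check whether the conflicting package is present and remove it, something along these lines should work (a suggested sketch; the package name follows the note above):

$ dpkg -l | grep velodyne-pointcloud
$ sudo apt-get remove ros-indigo-velodyne-pointcloud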

NOTE: The following packages are not supported in ROS Kinetic.

  • gazebo
  • orb slam
  • dpm ocv

How to Build

$ cd $HOME
$ git clone https://github.com/weisongwen/Autoware.git
$ cd ~/Autoware/ros/src
$ catkin_init_workspace
$ cd ../
$ ./catkin_make_release
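
Once the build completes, the workspace overlay can be sourced so that ROS finds the newly built packages (assuming the usual catkin devel layout under ~/Autoware/ros; this step is standard catkin practice and not part of the original instructions):

$ source ~/Autoware/ros/devel/setup.bash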

How to Start

$ cd $HOME/Autoware/ros
$ ./run

How to Use This for the Egg Vehicle

Please refer to the video (not available).

Public Road Demonstration

Steps:

    1. Start the LiDAR driver from https://github.com/weisongwen/15_velodyne. If you have any questions about the connection between the computer and the 3D LiDAR, refer to the same repository.
    2. Connect the egg vehicle to the computer by running the launch file startup.launch in the /home/wenws/Autoware/ros/src/computing/perception/localization/packages/ivactuator/launch directory (change the directory as needed), using the following command:
$ roslaunch startup.launch
    3. Run the ndt_matching-based localization from the Autoware UI.
    4. Run the egg vehicle control node in the /home/wenws/Autoware/ros/src/computing/perception/localization/packages/ndt_localizer/nodes/ndt_matching directory using the following command:
$ python eggvehiclecontrol.py
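
As a rough overview, the four steps above map onto separate terminals roughly as follows. This is only a sketch: the exact LiDAR driver launch command depends on the 15_velodyne repository, and the ivactuator and ndt_matching paths are the ones quoted above.

# Terminal 1: start the Velodyne LiDAR driver (see https://github.com/weisongwen/15_velodyne for the exact launch command)
# Terminal 2: bring up the egg vehicle actuator interface
$ cd /home/wenws/Autoware/ros/src/computing/perception/localization/packages/ivactuator/launch
$ roslaunch startup.launch
# Terminal 3: start the Autoware UI and run ndt_matching-based localization from it
$ cd $HOME/Autoware/ros
$ ./run
# Terminal 4: start the egg vehicle control node
$ cd /home/wenws/Autoware/ros/src/computing/perception/localization/packages/ndt_localizer/nodes/ndt_matching
$ python eggvehiclecontrol.py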

Research Papers for Reference

  1. S. Kato, E. Takeuchi, Y. Ishiguro, Y. Ninomiya, K. Takeda, and T. Hamada, "An Open Approach to Autonomous Vehicles", IEEE Micro, Vol. 35, No. 6, pp. 60-69, 2015.

Demo Videos Provided by Autoware

  • Public Road Demonstration

Instruction Videos

  • Quick Start
  • Loading Map Data
  • Localization with GNSS
  • Localization without GNSS
  • Mapping
  • Planning with ROSBAG
  • Planning with wf_simulator
  • Planning with Hybrid State A*
  • Data Processor for Bag File
  • Ftrace

License

BSD 3-Clause "New" or "Revised" License

