Davidnet / CarND-Capstone

Self-driving Car - System Integration

Udacity - Self-Driving Car Nanodegree

Team

Juan David Galvis jdgalviss@gmail.com

David Cardozo david@kiwicampus.com

Carlos Alvarez charlie@kiwicampus.com

John Betancourt john.betancourt93@gmail.com

Andres Rengifo anferesa239@gmail.com

Project Overview

In this project we develop a system that integrates multiple components to drive a car autonomously: drive-by-wire, waypoint generation, steering and throttle control, and traffic light classification. The system is first implemented in simulation and then on a real car (Carla, Udacity's self-driving car).

The following modules are implemented:

Traffic light Classification

Starting from an ssd_mobilenet_v1 model pretrained on the COCO dataset, we used data provided by Alex Lechner's group to apply transfer learning and obtain a deep neural network that detects and classifies traffic lights in images from both the simulator and Carla (Udacity's self-driving car). We use two models: one for real images and one for simulation images.
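
The inference step can be sketched as follows, assuming a frozen graph exported with the TensorFlow Object Detection API (which uses standard tensor names such as image_tensor:0). The graph file name, score threshold, and class-id mapping below are illustrative assumptions, not the exact code in this repository.

```python
# Minimal sketch of running one of the two frozen SSD MobileNet graphs.
# File name and class-id mapping are placeholders for illustration.
import numpy as np
import tensorflow as tf  # TensorFlow 1.x, as used with the TF Object Detection API

GRAPH_FILE = 'frozen_inference_graph_sim.pb'  # hypothetical path; swap for the real-image model on Carla

graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(GRAPH_FILE, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

sess = tf.Session(graph=graph)
image_tensor = graph.get_tensor_by_name('image_tensor:0')
scores = graph.get_tensor_by_name('detection_scores:0')
classes = graph.get_tensor_by_name('detection_classes:0')

def classify(image, threshold=0.5):
    """Return the class id of the highest-scoring detection, or None.
    `image` is an HxWx3 uint8 camera frame."""
    batch = np.expand_dims(image, axis=0)              # add batch dimension
    s, c = sess.run([scores, classes], feed_dict={image_tensor: batch})
    if s[0][0] < threshold:                            # detections are sorted by score
        return None
    return int(c[0][0])                                # label-map dependent, e.g. green/red/yellow ids
```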

Waypoint Updater

The main goal of this module is to publish the set of waypoints the car should follow; we reduced the number of waypoints generated per lane to 50 to lower the computational load. First, the algorithm finds the waypoint closest to the car and takes the next 50 waypoints from the base waypoints. Then, it checks for a stop signal; if one is present, it generates a deceleration waypoint set whose target velocity follows v(x) = 10 - 10*exp(-x^2 / 128), which produces a soft braking behavior. Finally, the resulting waypoint list is published on a ROS topic. A minimal sketch of this logic is shown below.
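
The sketch below illustrates the closest-waypoint search and the deceleration profile. The helper names, the (x, y) waypoint layout, and the 10 m/s cruise speed are illustrative assumptions; only the 50-waypoint look-ahead and the deceleration formula come from the description above.

```python
# Minimal sketch of the waypoint updater logic (not the exact ROS node).
import math

LOOKAHEAD_WPS = 50  # number of waypoints published ahead of the car

def closest_waypoint_index(car_x, car_y, base_waypoints):
    """Index of the base waypoint (x, y) nearest to the car's position."""
    def dist2(wp):
        return (wp[0] - car_x) ** 2 + (wp[1] - car_y) ** 2
    return min(range(len(base_waypoints)), key=lambda i: dist2(base_waypoints[i]))

def decel_velocity(distance_to_stop_line):
    """Soft-brake profile: v(x) = 10 - 10 * exp(-x^2 / 128)."""
    return 10.0 - 10.0 * math.exp(-(distance_to_stop_line ** 2) / 128.0)

def final_waypoints(car_x, car_y, base_waypoints, stop_x=None, stop_y=None):
    """Next LOOKAHEAD_WPS waypoints with target velocities; decelerate
    toward the stop line when a red light position is given."""
    start = closest_waypoint_index(car_x, car_y, base_waypoints)
    segment = base_waypoints[start:start + LOOKAHEAD_WPS]
    out = []
    for (x, y) in segment:
        if stop_x is None:
            v = 10.0                                  # assumed cruise speed, no red light ahead
        else:
            d = math.hypot(stop_x - x, stop_y - y)    # distance to the stop line
            v = decel_velocity(d)
        out.append((x, y, v))
    return out                                        # published as the final waypoint list
```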

DBW

The drive-by-wire node implements the controllers needed to make the vehicle follow the target waypoints. The brake and throttle are regulated by a classic PID controller whose reference is the linear velocity taken from the waypoint follower. The throttle control uses the sign of the control output to send the corresponding brake value when necessary. The steering control uses the angular and linear velocities from the waypoint follower's twist message to compute the corresponding steering angle.
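
A simplified sketch of the PID-based throttle/brake logic is shown below; the gains, output limits, vehicle mass, and wheel radius are placeholder values, and the brake-torque conversion is a rough illustration rather than the tuned controller running on Carla.

```python
# Minimal PID sketch for the throttle/brake control described above.
class PID(object):
    def __init__(self, kp, ki, kd, mn=-1.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.min, self.max = mn, mx
        self.int_val = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        """One control step; dt is the elapsed time in seconds (dt > 0)."""
        self.int_val += error * dt                       # integral term
        derivative = (error - self.last_error) / dt      # derivative term
        self.last_error = error
        u = self.kp * error + self.ki * self.int_val + self.kd * derivative
        return max(self.min, min(self.max, u))           # clamp to output limits


def throttle_brake(pid, target_v, current_v, dt,
                   vehicle_mass=1736.35, wheel_radius=0.2413):
    """Positive controller output drives the throttle; negative output is
    converted to a brake torque (N*m). Mass and radius are placeholder values."""
    u = pid.step(target_v - current_v, dt)
    if u >= 0.0:
        return u, 0.0                                    # throttle in [0, 1], no brake
    brake = abs(u) * vehicle_mass * wheel_radius         # rough torque estimate
    return 0.0, brake
```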

Results

By integrating these modules, the car follows the waypoints in the road's middle lane and stops at red lights.

To verify the performance of the traffic light classifier, we set the car to manual mode so that it could observe red, yellow, and green lights.

A video of the simulation is available at this link.

Udacity Instructions

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.

Please use one of the two installation options: native or Docker.

Native Installation

  • Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.

  • If using a Virtual Machine to install Ubuntu, use at least the following configuration:

    • 2 CPU
    • 2 GB system memory
    • 25 GB of free hard drive space

    The Udacity-provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using it.

  • Follow these instructions to install ROS

  • Install Dataspeed DBW

  • Download the Udacity Simulator.

Docker Installation

Install Docker

Build the docker container

docker build . -t capstone

Run the Docker container

docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone

Port Forwarding

To set up port forwarding, please refer to the instructions from Term 2.

Usage

  1. Clone the project repository

     git clone https://github.com/udacity/CarND-Capstone.git

  2. Install python dependencies

     cd CarND-Capstone
     pip install -r requirements.txt

  3. Make and run styx

     cd ros
     catkin_make
     source devel/setup.sh
     roslaunch launch/styx.launch

  4. Run the simulator

Real world testing

  1. Download the training bag that was recorded on the Udacity self-driving car.

  2. Unzip the file

     unzip traffic_light_bag_file.zip

  3. Play the bag file

     rosbag play -l traffic_light_bag_file/traffic_light_training.bag

  4. Launch your project in site mode

     cd CarND-Capstone/ros
     roslaunch launch/site.launch

  5. Confirm that traffic light detection works on real-life images

Other library/driver information

Outside of requirements.txt, the simulator grader and Carla use the following driver/library versions:

                  Simulator    Carla
  Nvidia driver   384.130      384.130
  CUDA            8.0.61       8.0.61
  cuDNN           6.0.21       6.0.21
  TensorRT        N/A          N/A
  OpenCV          3.2.0-dev    2.4.8
  OpenMP          N/A          N/A

We are working on a fix to line up the OpenCV versions between the two.

License

MIT License

