DanielDworakowski / flot

Algorithms for training an autonomous blimp robot through collision prediction.

Home Page: http://meetfibi.com

flōt

https://danieldworakowski.github.io/meetFibi/

Compared to other autonomous platforms, a blimp provides excellent maneuverability and safety for indoor environments due to its low inertia and inherent stability. Our main goal for the project is to build a lightweight blimp and implement a local collision avoidance algorithm, and later to combine it with a global planner. One of the main challenges of this project is the significant weight limitation placed on the blimp's payload: in the current design, the sensor payload is limited to a camera and sonar sensors. Several possible solutions were explored, including traditional methods involving mapping and object detection, as well as end-to-end learning for collision avoidance.
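To make the collision-prediction idea concrete, here is a minimal PyTorch sketch of a network that maps a single camera frame to a collision probability. The architecture, input resolution, and names are illustrative assumptions, not the model used in this repository.

# Illustrative only: a small collision-prediction CNN of the kind described above.
import torch
import torch.nn as nn

class CollisionNet(nn.Module):
    """Maps a single camera frame to a probability of imminent collision."""
    def __init__(self):
        super(CollisionNet, self).__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):
        # x: (batch, 3, H, W) camera frames scaled to [0, 1]
        h = self.features(x)
        h = h.view(h.size(0), -1)
        return torch.sigmoid(self.classifier(h))  # collision probability

if __name__ == "__main__":
    net = CollisionNet()
    frame = torch.rand(1, 3, 128, 128)  # dummy camera frame
    print(net(frame))                   # e.g. tensor([[0.5...]])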

Project Description

Given the run-time constraints and the desire to operate in new environments with minimal labeling, approaches involving algorithms such as SLAM or scene segmentation were avoided. The end-to-end learning approach, on the other hand, allows training data to be labeled automatically from the inputs available at collection time, and yields more expressive and generalizable features. An interesting aspect of the project is the use of simulated data to pre-train the network, which reduces the amount of real data required. Given the nature of the problem, the DAgger algorithm works well to alleviate distribution mismatch and helps improve the learned policy. In the future we plan to apply deep reinforcement learning, with methods similar to CAD2RL and Cognitive Mapping and Planning for Visual Navigation.
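As a reference for the data-aggregation step, below is a short sketch of a DAgger-style training loop. The helper functions (rollout, expert_label, train) are hypothetical placeholders, not interfaces from this codebase.

# Sketch of a DAgger-style loop (hypothetical helper names).
def dagger(policy, expert_label, rollout, train, iterations=10):
    """Iteratively aggregate data from the learner's own state distribution."""
    dataset = []
    for _ in range(iterations):
        # 1. Execute the current policy and record the observations it visits.
        observations = rollout(policy)
        # 2. Label those observations (expert action, or an automatic
        #    collision label derived at collection time).
        dataset += [(obs, expert_label(obs)) for obs in observations]
        # 3. Retrain the policy on the aggregated dataset.
        policy = train(policy, dataset)
    return policy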

The neural network navigating within an environment

Data collected based on https://arxiv.org/abs/1704.05588

An older version of the platform

Installed Software

  • Tensorflow
  • PyTorch
  • OpenAI Gym
  • ROS
  • AirSim

The Blocks environment is included as a packaged version. If a full install of AirSim and the Unreal Engine/Editor is required, visit: https://hub.docker.com/r/raejeong/robotics_ws/
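For a quick sanity check that the packaged Blocks environment is reachable, a minimal script along the lines below can grab a camera frame. It assumes the standard airsim Python client API and is not part of the project's tooling.

# Illustration only: fetch one RGB frame from a running AirSim instance.
import airsim
import numpy as np

client = airsim.MultirotorClient()   # connects to the simulator on localhost
client.confirmConnection()

# Request one uncompressed scene image from camera "0".
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
])
img = np.frombuffer(responses[0].image_data_uint8, dtype=np.uint8)
img = img.reshape(responses[0].height, responses[0].width, 3)  # BGR in recent AirSim versions
print(img.shape)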

Install

  1. Install docker
  2. Install nvidia-docker
  3. Clone this repo
  4. cd flot_ws
  5. Copy the LinuxNoEditor packaged environment into /home/user/workspace/SimulationEnvironments
  6. Edit /opt/ros/kinetic/etc/ros/python_logging.conf on the Raspberry Pi to remove logging (an example is sketched below)
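As a rough illustration of step 6, a python_logging.conf along the following lines routes Python-side ROS logging to a NullHandler and effectively silences it. The stock Kinetic file differs, so treat this as a template rather than a drop-in replacement.

# Illustration only: discard Python-side ROS log output.
[loggers]
keys=root

[handlers]
keys=nullHandler

[formatters]
keys=

[logger_root]
level=CRITICAL
handlers=nullHandler

[handler_nullHandler]
class=NullHandler
level=CRITICAL
args=()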

Run

docker build . -t flot_ws

docker stop flot; docker rm flot; nvidia-docker run -it --ipc=host --env="DISPLAY" --env="QT_X11_NO_MITSHM=1" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" -v $(pwd)/workspace:/home/user/workspace --privileged --net=host --name flot flot_ws

  • to run an additional terminal

docker exec -it flot bash

  • to kill the container

docker stop flot; docker rm flot

  • useful docker commands

docker system prune --all -f

docker save -o <image.tar> flot_ws

docker load -i <image.tar>

Data collection on the Raspberry Pi

  1. rosmaster --core on the Pi
  2. roslaunch blimp_control datacollect.launch on the Pi (check that the IP in the camera_stream.sh script is your IP)
  3. Copy the data to the host from ~/.ros/
  4. ffmpeg -i video.h264 image_%06d.png to extract the training frames
  5. python flot/workspace/tools/blimp_data_postprocessing.py --file <new-file>
  6. python flot/workspace/tools/curator/curator.py --path <new-file>

The data from step 2 will be saved in a timestamped folder under ~/.ros.

This data needs to be postprocessed: run blimp_data_postprocessing.py --files=<name of the folder in ~/.ros, e.g. 20180201_0203020>. The resulting out.csv will be written to that folder in ~/.ros.
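As an illustration of what the postprocessed data can look like, the sketch below pairs the frames extracted by ffmpeg with the rows of out.csv. The file layout and the frame-to-row matching are assumptions, so adjust them to the actual output of blimp_data_postprocessing.py.

# Sketch (not the repository's tool): pair extracted frames with out.csv rows.
import csv
import os

def load_samples(run_dir):
    """Yield (image_path, label_row) pairs for one ~/.ros run folder."""
    with open(os.path.join(run_dir, "out.csv")) as f:
        for i, row in enumerate(csv.DictReader(f)):
            # Assumes row i corresponds to the (i+1)-th frame from ffmpeg.
            image_path = os.path.join(run_dir, "image_%06d.png" % (i + 1))
            if os.path.exists(image_path):
                yield image_path, row

if __name__ == "__main__":
    for path, label in load_samples(os.path.expanduser("~/.ros/20180201_0203020")):
        print(path, label)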

Further postprocessing may be required.

