oridong / OPNet

Learning Unknown Space for Autonomous Navigation in Cluttered Environments

OPNet & krrt-planner with map-prediction

Video at: https://www.youtube.com/watch?v=Qb3ni_j0Dic

Preprint: Learning-based 3D Occupancy Prediction for Autonomous Navigation in Occluded Environments

https://arxiv.org/abs/2011.03981
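
For readers new to the idea, the sketch below illustrates the input/output contract of such a prediction network: a 3D CNN takes a partial (observed) occupancy grid and outputs per-voxel occupancy probabilities, including for occluded space. The architecture shown is a toy placeholder for illustration, not OPNet's actual network:

    # Illustrative only: a toy 3D occupancy-prediction network in PyTorch.
    # It shows the input/output contract (partial occupancy grid in,
    # per-voxel occupancy probability out); OPNet's real architecture differs.
    import torch
    import torch.nn as nn

    class ToyOccupancyPredictor(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv3d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv3d(16, 1, kernel_size=1),  # logits for "occupied"
            )

        def forward(self, partial_grid):
            # partial_grid: (batch, 1, D, H, W) voxel grid of observed occupancy
            return torch.sigmoid(self.net(partial_grid))

    if __name__ == "__main__":
        grid = torch.zeros(1, 1, 32, 32, 16)    # dummy partial occupancy grid
        probs = ToyOccupancyPredictor()(grid)   # predicted occupancy probabilities
        print(probs.shape)                      # torch.Size([1, 1, 32, 32, 16])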

This repository also contains a C++ implementation of the paper: Kinodynamic RRT*: Asymptotically Optimal Motion Planning for Robots with Linear Dynamics
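
At the core of that planner is an optimal state-to-state connection for linear systems: steering from x0 to x1 in time tau costs c(tau) = tau + (x1 - xbar(tau))^T G(tau)^(-1) (x1 - xbar(tau)), where xbar(tau) is the uncontrolled evolution of x0 and G(tau) is the weighted controllability Gramian. The sketch below evaluates this cost numerically for a 2D double integrator and picks the arrival time by grid search; the paper instead uses root-finding, and this is not the repository's C++ code:

    # Sketch: optimal connection cost from Kinodynamic RRT* (Webb & van den Berg)
    # for a linear system x' = Ax + Bu with control cost u^T R u, evaluated
    # numerically. Grid search over the arrival time replaces the paper's
    # polynomial root-finding; illustrative only.
    import numpy as np
    from scipy.linalg import expm

    def connection_cost(x0, x1, A, B, R, tau, steps=50):
        ts = np.linspace(0.0, tau, steps)
        BRBt = B @ np.linalg.inv(R) @ B.T
        # Weighted controllability Gramian G(tau), integrated by the trapezoid rule.
        integrand = np.array([expm(A * (tau - t)) @ BRBt @ expm(A.T * (tau - t)) for t in ts])
        G = np.trapz(integrand, ts, axis=0)
        d = x1 - expm(A * tau) @ x0          # deviation from the uncontrolled state
        return tau + d @ np.linalg.solve(G, d)

    def optimal_connection(x0, x1, A, B, R, taus=np.linspace(0.05, 5.0, 100)):
        costs = [connection_cost(x0, x1, A, B, R, t) for t in taus]
        i = int(np.argmin(costs))
        return taus[i], costs[i]             # best arrival time and its cost

    if __name__ == "__main__":
        # 2D double integrator: state [px, py, vx, vy], control = acceleration.
        A = np.zeros((4, 4)); A[0, 2] = A[1, 3] = 1.0
        B = np.zeros((4, 2)); B[2, 0] = B[3, 1] = 1.0
        R = np.eye(2)
        x0 = np.array([0.0, 0.0, 0.0, 0.0])
        x1 = np.array([2.0, 1.0, 0.0, 0.0])
        print(optimal_connection(x0, x1, A, B, R))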

Dependencies:

1. ROS (I am using Ubuntu 16.04 and ROS Kinetic; other versions may also be usable)

2. CUDA (I am using 10.2)

3. For branch "main" (inference using NVIDIA TensorRT):
    * TensorRT 7.0.0 + CUDA 10.2 (with the TensorRT ONNX libraries)

   If you want to run with PyTorch only, please refer to https://gitee.com/leewlz/opnet, branch "torch".

4. For branch "torch" (inference using a Python node with PyTorch):
    * Python > 2.7
    * numpy, IPython, tensorboardX
    * PyTorch (I am using 1.3.0; later versions are also usable)

Building:

The depth_sensor_simulator package in uav_simulator can be built with either GPU or CPU support to render the depth sensor measurements. By default, it is set to build with GPU in its CMakeLists:

    set(ENABLE_CUDA true)

Remember to change the 'arch' and 'code' flags according to your graphics card device.

For branch "torch":

  • Remember to change the model path in the init_param function of net_node.py (a sketch follows below)
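
For orientation, here is a minimal sketch of what that init_param step might look like, assuming a rospy node that loads serialized PyTorch weights; the parameter name and default path below are placeholders, not the repository's actual code:

    # Hypothetical sketch -- not the actual net_node.py. It only shows where the
    # model path is read and the network is loaded before inference.
    import rospy
    import torch

    def init_param():
        # Replace the default below with the location of your trained OPNet weights,
        # or set the (hypothetical) ~model_path ROS parameter at launch time.
        model_path = rospy.get_param("~model_path", "/path/to/opnet_model.pth")
        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        # Assumes the checkpoint was saved with torch.save(model, path); adjust if
        # only a state_dict was saved.
        model = torch.load(model_path, map_location=device)
        model.eval()
        return model, device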

Run Simulation:

1. roslaunch state_machine rviz.launch    (open rviz for visualization)
2. roslaunch state_machine bench_with_pred.launch    (generate the environment and start the simulator)
3. roslaunch state_machine bench_with_pred.launch, bench_aggres.launch, or bench_safe.launch    (run a test)

fig1
