RuslanAgishev / bev-net

Bird's-eye-view map construction for a mobile robot based on RGB and point cloud input

Bird's-Eye-View Networks (BEV-nets)

This repository contains forks of state-of-the-art (SOTA) works on local map construction for a mobile robot from sensory input. For relevant papers with code, please refer to the BEV-map construction Notion section.

An end-to-end architecture that directly extracts a bird's-eye-view semantic representation of a scene from image data captured by an arbitrary number of cameras.

Given a single color image captured from a driving platform, the model predicts the bird's-eye-view semantic layout of the road and other traffic participants.
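The core geometric idea behind mapping a single camera image to the ground plane can be illustrated with classical inverse perspective mapping. The sketch below is a minimal illustration, not code from any of the forked repositories: it assumes a pinhole camera with known intrinsics `K`, a flat ground plane, and a hypothetical camera height; the learned models in this repo replace this fixed geometry with trained networks.

```python
import numpy as np

def pixel_to_ground(u, v, K, cam_height=1.5):
    """Back-project a pixel onto a flat ground plane (inverse perspective mapping).

    Camera frame convention: x right, y down, z forward;
    the ground plane is y = cam_height below the camera center.
    Returns (lateral, forward) ground coordinates, or None if the
    viewing ray points at or above the horizon.
    """
    # Ray direction through the pixel in the camera frame.
    d = np.linalg.inv(K) @ np.array([u, v, 1.0])
    if d[1] <= 0:
        return None  # ray never intersects the ground
    # Scale the ray so its downward component reaches the ground.
    t = cam_height / d[1]
    p = t * d
    return p[0], p[2]

# Example with assumed intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
print(pixel_to_ground(320, 340, K))  # pixel below the horizon, on the optical axis
```

Applying this to every pixel below the horizon yields a dense (if distorted) ground-plane view; the learned approaches above instead predict BEV semantics directly, which handles occlusions and non-flat scenes better.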

Datasets: KITTI, Argoverse

Joint Perception and Motion Prediction for Autonomous Driving Based on Bird's Eye View Maps. In addition to semantic information, the model also predicts the motion direction of the cells on a local map from a sequence of lidar sweeps.
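The common first step shared by lidar-based BEV models is rasterizing a point cloud into a 2D grid in the ground plane. The snippet below is a minimal sketch of that step (the function name, ranges, and cell size are illustrative choices, not taken from the forked code), producing a binary occupancy grid rather than the learned multi-channel features the actual networks use.

```python
import numpy as np

def pointcloud_to_bev(points, x_range=(-25.0, 25.0), y_range=(-25.0, 25.0), cell=0.5):
    """Rasterize an (N, 3) lidar point cloud into a binary BEV occupancy grid.

    points  -- array of (x, y, z) coordinates in the sensor frame, meters
    x_range -- forward extent of the map, meters
    y_range -- lateral extent of the map, meters
    cell    -- grid resolution, meters per cell
    """
    nx = int(round((x_range[1] - x_range[0]) / cell))
    ny = int(round((y_range[1] - y_range[0]) / cell))
    grid = np.zeros((ny, nx), dtype=np.uint8)

    # Keep only points inside the map boundaries.
    mask = (
        (points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1])
        & (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1])
    )
    pts = points[mask]

    # Convert metric coordinates to integer cell indices and mark cells occupied.
    ix = ((pts[:, 0] - x_range[0]) / cell).astype(int)
    iy = ((pts[:, 1] - y_range[0]) / cell).astype(int)
    grid[iy, ix] = 1
    return grid
```

A motion-prediction model would consume a stack of such grids from consecutive sweeps (after ego-motion compensation) and output a per-cell displacement field alongside the semantic classes.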


Languages

- Jupyter Notebook 94.1%
- Python 5.8%
- CMake 0.1%
- Shell 0.0%
- Dockerfile 0.0%
- Makefile 0.0%