HariyaNobuki / CoverNet

Implementation of multi-modal path prediction - CoverNet with PyTorch and the nuScenes dataset (not completed)

CoverNet Implementation (WARN: not completed)

This repository contains an implementation of CoverNet and nuScenes dataset processing.

Phan-Minh, Tung, et al. "CoverNet: Multimodal Behavior Prediction Using Trajectory Sets." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020.

Setup

  1. Clone the repository: git clone https://github.com/hyerim-mmc/CoverNet.git

  2. Download the nuScenes full dataset (v1.0)

    • The dataset directory layout should be as follows:
    ${project_folder_name}
      |__data
        |__sets
          |__nuscenes
             |__maps
             |__samples
             |__sweeps
             |__v1.0-mini
             |__detection.json
    
  3. Download Map expansion

    • Extract the contents (folders basemap, expansion and prediction) to your nuScenes maps folder.
  4. Install nuscenes-devkit (pip install nuscenes-devkit); a quick sanity-check sketch follows this list

  5. Install PyTorch

  6. Install TensorBoard: pip install tensorboard or conda install tensorboard
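
Optional: once steps 2-5 are done, a short sanity check like the sketch below confirms that the devkit can read the dataset and the map expansion. This is a minimal sketch only; the v1.0-mini split, the ./data/sets/nuscenes path, and the singapore-onenorth map name simply follow the layout above and are examples, not requirements of this repository.

    # Sanity check that the nuScenes devkit can read the dataset laid out above.
    from nuscenes.nuscenes import NuScenes
    from nuscenes.map_expansion.map_api import NuScenesMap

    # dataroot must point at the folder containing maps/, samples/, sweeps/, v1.0-mini/.
    nusc = NuScenes(version='v1.0-mini', dataroot='./data/sets/nuscenes', verbose=True)
    print('samples:', len(nusc.sample))

    # The map expansion (basemap, expansion, prediction) provides the map API layers.
    nusc_map = NuScenesMap(dataroot='./data/sets/nuscenes', map_name='singapore-onenorth')
    print('lane records:', len(nusc_map.lane))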

Run

  1. Download the trajectory sets
  2. Write your own parsing/learning configuration in covernet_config.json (a rough sketch of how these pieces fit together follows this list)
  3. Run python train.py
    • Results will be saved in the result folder
    • Check TensorBoard results with tensorboard --logdir=./result/tensorboard and open the printed URL
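
For reference, the sketch below shows the pieces train.py is expected to wire together, using the reference CoverNet model shipped with the nuScenes devkit (nuscenes.prediction.models.covernet). The trajectory-set file name (epsilon_8.pkl) and the config keys read from covernet_config.json are illustrative assumptions, not this repository's exact interface.

    # Minimal sketch of a CoverNet training setup with the nuScenes devkit.
    # The file name 'epsilon_8.pkl' and the config keys below are assumptions for illustration.
    import json
    import pickle
    import torch
    from nuscenes.prediction.models.backbone import ResNetBackbone
    from nuscenes.prediction.models.covernet import CoverNet, ConstantLatticeLoss

    # Parsing/learning settings are kept in covernet_config.json.
    with open('covernet_config.json') as f:
        config = json.load(f)

    # The downloaded trajectory set is a fixed lattice of candidate trajectories (modes).
    with open('epsilon_8.pkl', 'rb') as f:
        trajectory_set = pickle.load(f)
    lattice = torch.Tensor(trajectory_set)

    backbone = ResNetBackbone('resnet50')               # CNN encoder for the rasterized map input
    model = CoverNet(backbone, num_modes=lattice.shape[0])
    loss_fn = ConstantLatticeLoss(lattice)               # classification loss over the trajectory set
    optimizer = torch.optim.Adam(model.parameters(), lr=config.get('learning_rate', 1e-4))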
