
Home Page: https://dyfcalid.github.io/LiDAR4D


LiDAR4D

LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis
Zehan Zheng, Fan Lu, Weiyi Xue, Guang Chen†, Changjun Jiang († Corresponding author)
CVPR 2024

Paper | Project Page

This repository is the official PyTorch implementation for LiDAR4D.

Changelog

2024-4-13: 📈 We updated the U-Net of LiDAR4D for better ray-drop refinement.
2024-4-5: 🚀 Code of LiDAR4D is released.
2024-4-4: 🔥 The preprint is available on arXiv, along with the project page.
2024-2-27: 🎉 Our paper is accepted to CVPR 2024.

Introduction

LiDAR4D is a differentiable LiDAR-only framework for novel space-time LiDAR view synthesis, which reconstructs dynamic driving scenarios and generates realistic LiDAR point clouds end-to-end. It adopts 4D hybrid neural representations and motion priors derived from point clouds for geometry-aware and time-consistent large-scale scene reconstruction.
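To give a feel for what a hybrid space-time representation does, here is a toy illustration of hash-encoding 4D (x, y, z, t) query points into learnable features. This is only a simplified sketch of the general multiresolution hash-encoding idea, not the paper's actual architecture (LiDAR4D uses tiny-cuda-nn and planar-grid features in its real implementation):

```python
import numpy as np

def hash_encode(coords, n_levels=4, table_size=2**14, feat_dim=2, seed=0):
    """Toy multiresolution hash encoding of 4D (x, y, z, t) points in [0, 1]^4.

    A much-simplified stand-in for a 4D hybrid neural representation;
    real implementations (e.g. tiny-cuda-nn) use learnable tables and
    trilinear interpolation, and run on the GPU.
    """
    rng = np.random.default_rng(seed)
    # One feature table per resolution level (frozen here; learnable in practice).
    tables = rng.standard_normal((n_levels, table_size, feat_dim)).astype(np.float32)
    primes = np.array([1, 2654435761, 805459861, 3674653429], dtype=np.uint64)
    feats = []
    for lvl in range(n_levels):
        res = 16 * 2**lvl                             # resolution doubles per level
        idx = np.floor(coords * res).astype(np.uint64)
        h = (idx * primes).sum(axis=-1) % table_size  # spatial-temporal hash
        feats.append(tables[lvl][h])
    return np.concatenate(feats, axis=-1)             # (N, n_levels * feat_dim)

# Query features for a batch of space-time points.
pts = np.random.default_rng(1).random((8, 4)).astype(np.float32)
features = hash_encode(pts)
print(features.shape)  # (8, 8)
```

The concatenated features would then feed a small MLP that predicts per-ray quantities such as depth and intensity.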

Getting started

Installation

git clone https://github.com/ispc-lab/LiDAR4D.git
cd LiDAR4D

conda create -n lidar4d python=3.9
conda activate lidar4d

# PyTorch
# CUDA 12.1
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
# CUDA 11.8
# pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu118
# CUDA <= 11.7
# pip install torch==2.0.0 torchvision torchaudio

# Dependencies
pip install -r requirements.txt

# Local compile for tiny-cuda-nn
git clone --recursive https://github.com/nvlabs/tiny-cuda-nn
cd tiny-cuda-nn/bindings/torch
python setup.py install

# compile packages in utils
cd utils/chamfer3D
python setup.py install

Dataset

KITTI-360 dataset (Download)

We use sequence 00 (2013_05_28_drive_0000_sync) for the experiments in our paper.

Download the KITTI-360 dataset (2D images are not needed) and put it into data/kitti360
(or use a symlink: ln -s DATA_ROOT/KITTI-360 ./data/kitti360/).
The folder tree is as follows:

data
└── kitti360
    └── KITTI-360
        ├── calibration
        ├── data_3d_raw
        └── data_poses

Next, run the KITTI-360 dataset preprocessing script (set DATASET and SEQ_ID first):

bash preprocess_data.sh

After preprocessing, your folder structure should look like this:

configs
├── kitti360_{sequence_id}.txt
data
└── kitti360
    ├── KITTI-360
    │   ├── calibration
    │   ├── data_3d_raw
    │   └── data_poses
    ├── train
    ├── transforms_{sequence_id}test.json
    ├── transforms_{sequence_id}train.json
    └── transforms_{sequence_id}val.json
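If you want to inspect the generated splits, a minimal loader might look like the following. The schema here (a top-level "frames" list with "file_path" and "transform_matrix" keys) is an assumption based on the common NeRF-style transforms format; check the files produced by preprocess_data.sh for the exact keys:

```python
import json

def load_transforms(path):
    """Load a transforms_*.json split file (assumed NeRF-style schema)."""
    with open(path) as f:
        meta = json.load(f)
    # Each frame is assumed to carry a file path and a pose matrix.
    return [(fr.get("file_path"), fr.get("transform_matrix"))
            for fr in meta.get("frames", [])]

# Example (path uses the placeholder naming from the tree above):
# train = load_transforms("data/kitti360/transforms_{sequence_id}train.json")
# print(len(train))  # number of training frames
```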

Run LiDAR4D

Set the corresponding sequence config path in --config, and optionally change the logging directory via --workspace. Remember to set an available GPU ID in CUDA_VISIBLE_DEVICES.
Run the following command:

# KITTI-360
bash run_kitti_lidar4d.sh
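For reference, the script essentially wraps a direct call to the training entry point. The snippet below is a hypothetical expansion: the entry-point name main_lidar4d.py and the workspace path are assumptions, so check run_kitti_lidar4d.sh for the real command and flags:

```shell
# Hypothetical direct invocation; verify names against run_kitti_lidar4d.sh.
CUDA_VISIBLE_DEVICES=0 python main_lidar4d.py \
    --config configs/kitti360_{sequence_id}.txt \
    --workspace log/kitti360_lidar4d_{sequence_id}
```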

Acknowledgment

We sincerely appreciate the great contributions of the following works:

Citation

Please use the following citation if you find our repo or paper helpful:

@inproceedings{zheng2024lidar4d,
  title     = {LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis},
  author    = {Zheng, Zehan and Lu, Fan and Xue, Weiyi and Chen, Guang and Jiang, Changjun},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2024}
}

License

All code within this repository is under the Apache License 2.0.
