tarashakhurana / 4d-occ-forecasting

CVPR 2023: Official code for "Point Cloud Forecasting as a Proxy for 4D Occupancy Forecasting"

Home Page: https://www.cs.cmu.edu/~tkhurana/ff4d/index.html

Training & inference time

wzzheng opened this issue

Thanks for the exciting work! Could you let me know the approximate training and inference time? How much memory does training take, and on what device? Thanks in advance!

Hi, thanks for your question! Training on nuScenes takes ~8 hours on 8 x NVIDIA RTX 3090s, using about 80% of each GPU's memory. Inference on the same GPU runs at about 3 Hz per batch.
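
For reference, here is a minimal sketch of how one might time a forward pass at this resolution. The toy network and input layout below are placeholders for illustration, not the repository's model:

```python
# Minimal timing sketch (not the repository's benchmarking code).
# The network here is a placeholder; only the grid resolution (700 x 700
# BEV with 45 height bins as channels) and batch size 2 come from the thread.
import time
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder stand-in for the forecasting network.
model = torch.nn.Sequential(
    torch.nn.Conv2d(45, 64, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(64, 45, kernel_size=3, padding=1),
).to(device).eval()

# Default test batch size is 2; 700 x 700 grid with 45 height bins.
x = torch.zeros(2, 45, 700, 700, device=device)

with torch.no_grad():
    for _ in range(3):              # warm-up iterations
        model(x)
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.time()
    n_iters = 10
    for _ in range(n_iters):
        model(x)
    if device == "cuda":
        torch.cuda.synchronize()
    elapsed = (time.time() - start) / n_iters

print(f"~{1.0 / elapsed:.1f} batches/sec for this placeholder forward pass")
```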

According to the default settings, the test batch size is 2. Could you give some hints about why the inference speed is slow?

Hi, the voxel grid is very large for the network to process (700 x 700 x 45). If I remember correctly, the network's forward pass over this grid took the most time during inference; the rendering step after the forward pass is not the biggest bottleneck.
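
For a rough sense of scale, here is a back-of-envelope sketch of the raw size of such a grid. The numbers are illustrative only; the real memory and compute cost also depends on feature channels, time steps, and intermediate activations:

```python
# Back-of-envelope sketch of why a 700 x 700 x 45 grid is expensive.
# Only the grid resolution and batch size 2 come from the thread; the rest
# is a simple float32 size calculation for illustration.
H, W, Z = 700, 700, 45
voxels = H * W * Z                       # 22,050,000 voxels per grid
mb_per_volume = voxels * 4 / 1e6         # float32 -> ~88 MB per occupancy volume
batch = 2                                # default test batch size

print(f"{voxels:,} voxels per grid, ~{mb_per_volume:.0f} MB per float32 volume")
print(f"batch of {batch}: ~{mb_per_volume * batch:.0f} MB of raw grids, "
      "before any feature channels or activations")
```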