ondrejbiza / sunds


Scene understanding datasets

SunDs is a collection of ready-to-use datasets for scene understanding tasks (3D object detection, semantic segmentation, NeRF rendering, ...). It provides:

  • An API to easily load datasets to feed into your ML models.
  • A collection of ready-to-use datasets.
  • Helper tools to create new datasets.

import sunds

# Load the train split of the lego scene, formatted for the NeRF task.
ds = sunds.load('kubric:nerf_synthetic/lego', split='train', task=sunds.tasks.Nerf())
for ex in ds:
  ex['ray_origin']  # Each example exposes ray fields such as 'ray_origin'.

To use sunds, see the documentation:

Load datasets

Some datasets are pre-processed and published in gs://kubric-public/tfds. You can stream them directly from GCS with:

sunds.load('kubric:nerf_synthetic/lego')

The kubric: prefix is just an alias for:

sunds.load('nerf_synthetic/lego', data_dir='gs://kubric-public/tfds')

For best performance, it's recommended to copy the data locally with gsutil:

pip install gsutil  # Only needed once

# Download the `nerf_synthetic_frames` and `nerf_synthetic_scenes` datasets
DATA_DIR=~/tensorflow_datasets/
mkdir -p $DATA_DIR
gsutil -m cp -r gs://kubric-public/tfds/nerf_synthetic_*/ $DATA_DIR

After the data has been copied locally, it can be loaded directly:

sunds.load('nerf_synthetic/lego')

If you copy the data to a folder other than ~/tensorflow_datasets/, you'll have to specify data_dir='/path/to/tfds/'.
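
A minimal sketch combining the options shown above (this assumes data_dir can be passed alongside split and task, and the path is just a placeholder for wherever you copied the data):

import sunds

# Load from a custom local directory instead of ~/tensorflow_datasets/.
# '/path/to/tfds/' is a placeholder for your own download location.
ds = sunds.load(
    'nerf_synthetic/lego',
    split='train',
    task=sunds.tasks.Nerf(),
    data_dir='/path/to/tfds/',
)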

This is not an official Google product.

About

License: Apache License 2.0


Languages

Language: Python 100.0%