GANimator: Neural Motion Synthesis from a Single Sequence

A motion generation model learned from a single example [SIGGRAPH 2022]

This repository provides a library for novel motion synthesis from a single example, as well as applications including style transfer, motion mixing, key-frame editing, and conditional generation. It is based on our work GANimator: Neural Motion Synthesis from a Single Sequence, published at SIGGRAPH 2022.

The library is still under development.

Prerequisites

This code has been tested under Ubuntu 20.04. Before starting, please configure your Anaconda environment by running:

conda env create -f environment.yaml
conda activate ganimator

Or you may install the following packages (and their dependencies) manually, for example with the pip command sketched after the list:

  • pytorch 1.10
  • tensorboard
  • tqdm
  • scipy
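
A minimal pip sketch, assuming a Python 3 environment where the PyTorch 1.10 wheels are available for your platform (the original instructions pin only the pytorch version, so the other packages are left unpinned here):

pip install torch==1.10.0 tensorboard tqdm scipy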

Quick Start

We provide several pretrained models for various characters. Download and extract them from Google Drive.

Novel motion synthesis

Run demo.sh. The results for Salsa and Crab Dance will be saved in ./results/pre-trained/{name}/bvh. The result after the foot contact fix will be saved as result_fixed.bvh.
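
For example, from the repository root, assuming the pretrained models have been extracted as described above:

bash demo.sh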

Applications and Evaluation

Under development.

Train from scratch

We provide instructions for retraining our model.

We include several example animations under the ./data directory.

Here is an example of training the crab dance animation:

python train.py --bvh_prefix=./data/Crabnew --bvh_name=Crab-dance-long --save_path={save_path}

You may specify the training device with --device=cuda:0, following PyTorch's device convention.
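
For example, to train the crab dance animation on the first GPU (the save path below is only a placeholder):

python train.py --bvh_prefix=./data/Crabnew --bvh_name=Crab-dance-long --save_path=./results/crab --device=cuda:0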

For a customized bvh file, specify the joint names that should be involved during generation and the contact joint names in ./bvh/skeleton_databse.py, then set the corresponding --bvh_prefix and --bvh_name parameters for train.py.
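
As a rough illustration of the kind of information involved, a sketch follows; the joint names, variable names, and layout here are hypothetical, so consult ./bvh/skeleton_databse.py itself for the actual format:

# Hypothetical sketch only -- the real structure of skeleton_databse.py may differ.
# Joints to include during generation for a custom character:
joint_names = ['Hips', 'Spine', 'LeftUpLeg', 'LeftLeg', 'LeftFoot',
               'RightUpLeg', 'RightLeg', 'RightFoot']
# Joints used for foot contact detection:
contact_names = ['LeftFoot', 'RightFoot']

A corresponding training invocation (with hypothetical paths and names) would then be:

python train.py --bvh_prefix=./data/MyCharacter --bvh_name=my-motion --save_path=./results/my-motion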

Acknowledgements

The code in models/skeleton.py is adapted from deep-motion-editing by @kfiraberman, @PeizhuoLi and @HalfSummer11.

Part of the code in the ./bvh directory is adapted from the work of Daniel Holden.

Some of the training examples are taken from Mixamo and Truebones.

Citation

If you use this code for your research, please cite our paper:

@article{li2022ganimator,
  author = {Li, Peizhuo and Aberman, Kfir and Zhang, Zihan and Hanocka, Rana and Sorkine-Hornung, Olga},
  title = {GANimator: Neural Motion Synthesis from a Single Sequence},
  journal = {ACM Transactions on Graphics (TOG)},
  volume = {41},
  number = {4},
  pages = {138},
  year = {2022},
  publisher = {ACM}
}
