
GAN-Avatar: Controllable Personalized GAN-based Human Head Avatar

Berna Kabadayi · Wojciech Zielonka · Bharat Lal Bhatnagar · Gerard Pons-Moll · Justus Thies

International Conference on 3D Vision (3DV), 2024

teaser

This is the PyTorch implementation of GAN-Avatar. For more details, please check our project page: https://ganavatar.github.io/

Getting Started

Clone the repo

git clone https://github.com/bernakabadayi/ganavatar
cd ganavatar

Set up the submodules recursively

mkdir eg3d
git submodule update --init --recursive

GAN-Avatar uses eg3d for finetuning and its submodule Deep3DFaceRecon_pytorch for data processing. Please follow their instructions to set them up.

Following the issue from eg3d, apply the patch below to fix the triplane initialization:

git apply patch/eg3d.patch

Download the pretrained eg3d model trained on FFHQ and put it inside the models/ folder.

Finetune eg3d with the following options:

conda activate eg3d

python eg3d/train.py --data=/wojtek_1gan --gpus=8 --batch=32 --cfg=ffhq --gamma=5 --snap=10 --outdir=training_runs_rebut --gen_pose_cond=False --neural_rendering_resolution_initial=128 --neural_rendering_resolution_final=128 --resume=/models/eg3d-fixed-triplanes-ffhq.pkl --metrics=none
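
Under the hood, eg3d reads the personalized data as an image folder plus a dataset.json whose labels are 25-dimensional camera parameters: a flattened 4x4 cam2world extrinsic followed by a flattened 3x3 intrinsic. A minimal sanity check of that layout before finetuning (the dataset path is hypothetical, following the command above):

import json

# Hypothetical path to the personalized dataset prepared for finetuning.
with open('wojtek_1gan/dataset.json') as f:
    labels = json.load(f)['labels']

for fname, cam in labels:
    # eg3d conditions on 25 values: 16 (4x4 cam2world) + 9 (3x3 intrinsics).
    assert len(cam) == 25, f'{fname}: expected 25 camera params, got {len(cam)}'
print(f'{len(labels)} frames with valid camera labels.')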

Generate frontal-looking images for the expression mapping network:

python gen_images_eg3d.py --args=/cfg/datagen/args_nf01_neck.yaml
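
This step renders images from the finetuned generator with a fixed, roughly frontal camera. A minimal sketch of the same idea using eg3d's documented pickle interface (the snapshot path and camera values are illustrative, not taken from the repository):

import pickle
import numpy as np
import torch

# Run from the eg3d code root so its pickled torch_utils/dnnlib modules resolve.
from camera_utils import LookAtPoseSampler

with open('training_runs_rebut/network-snapshot.pkl', 'rb') as f:  # hypothetical snapshot
    G = pickle.load(f)['G_ema'].cuda()

# Frontal camera: flattened 4x4 cam2world followed by flattened 3x3 intrinsics.
cam2world = LookAtPoseSampler.sample(np.pi / 2, np.pi / 2, torch.zeros(3).cuda(),
                                     radius=2.7, device='cuda')
intrinsics = torch.tensor([[4.2647, 0, 0.5], [0, 4.2647, 0.5], [0, 0, 1.0]], device='cuda')
c = torch.cat([cam2world.reshape(1, 16), intrinsics.reshape(1, 9)], 1)

z = torch.randn([1, G.z_dim]).cuda()
img = G(z, c)['image']  # [1, 3, H, W], values in [-1, 1]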

GAN-Avatar uses expression parameters to train the mapping network. Expression parameters can be extracted from the frontal images as follows:

python scripts/preprocess_mapping.py --indir=/frontal/img
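
Deep3DFaceRecon_pytorch regresses per-image BFM coefficients; the 64-dimensional expression block is the control signal for the mapping network. A hedged sketch of collecting those coefficients from its .mat outputs (the folder layout is hypothetical; the 'exp' key follows Deep3DFaceRecon's usual coefficient split):

import glob
import numpy as np
from scipy.io import loadmat

exps = []
for mat_path in sorted(glob.glob('frontal/img/coeffs/*.mat')):  # hypothetical output folder
    coeffs = loadmat(mat_path)
    exps.append(coeffs['exp'].reshape(-1))  # 64-dim BFM expression coefficients
exps = np.stack(exps)  # [N, 64] training targets for the mapping network
np.save('expressions.npy', exps)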

Train the mapping network:

python lib/mapping_train.py --args ../cfg/mapnet/args_train_nf01_neck.yaml
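
The mapping network is a small regressor from tracked expression parameters to the latent space of the finetuned generator, which is what makes the avatar controllable. A minimal sketch of that idea, assuming a 64-dim expression input and eg3d's 14 x 512 w+ latent (the layer sizes are illustrative, not the repository's exact configuration):

import torch
import torch.nn as nn

class ExpressionMappingNet(nn.Module):
    """Maps 64-dim expression coefficients to a w+ latent for the generator."""
    def __init__(self, exp_dim=64, num_ws=14, w_dim=512):
        super().__init__()
        self.num_ws, self.w_dim = num_ws, w_dim
        self.net = nn.Sequential(
            nn.Linear(exp_dim, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, num_ws * w_dim),
        )

    def forward(self, exp):  # exp: [B, 64]
        return self.net(exp).view(-1, self.num_ws, self.w_dim)  # [B, 14, 512]

Training then reduces to a simple regression loss (e.g., L1) between predicted latents and latents that reproduce the corresponding training frames.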

Test the mapping network:

python mapping_test.py --args ../../cfg/maptest/args_test_nf01_neck.yaml
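
At test time, driving the avatar is the composition: expression -> mapping network -> w+ latent -> generator synthesis. A hedged end-to-end sketch tying together the pieces above (G, the mapping network, and the camera c are assumed to be loaded as in the earlier snippets):

import torch

@torch.no_grad()
def drive_avatar(G, mapping_net, exp, c):
    # exp: [1, 64] expression coefficients; c: [1, 25] camera parameters.
    ws = mapping_net(exp)     # [1, 14, 512] predicted w+ latent
    out = G.synthesis(ws, c)  # eg3d synthesis pass, conditioned on the camera
    return out['image']       # [1, 3, H, W], values in [-1, 1]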

Run GAN-Avatar on your dataset

A sample dataset and an appearance model training JSON can be found here.

We provide scripts to process INSTA actors for GAN-Avatar training.

Your tracked mesh (i.e., FLAME) should be aligned with the eg3d marching-cube result located in models/. After obtaining the transformation matrix, run insta2ganavatar.py:

python scripts/insta2ganavatar.py
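
The repository does not spell out how the transformation matrix is obtained; a common choice is a least-squares similarity (Umeyama/Procrustes) fit over corresponding points on the FLAME mesh and the marching-cube mesh. A self-contained, purely illustrative sketch of that alignment:

import numpy as np

def umeyama(src, dst):
    """Least-squares similarity transform (scale, rotation, translation)
    mapping src points onto dst points; both arrays are [N, 3]."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against reflections
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / src_c.var(0).sum()
    t = mu_d - scale * R @ mu_s
    T = np.eye(4)
    T[:3, :3] = scale * R
    T[:3, 3] = t
    return T  # 4x4 matrix taking FLAME coordinates into eg3d space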

Pretrained models

If you need the pretrained models, please contact berna.kabadayi@tue.mpg.de.

Citation

Please cite us if you find this repository helpful for your project:

@misc{kabadayi2023ganavatar,
      title={GAN-Avatar: Controllable Personalized GAN-based Human Head Avatar}, 
      author={Berna Kabadayi and Wojciech Zielonka and Bharat Lal Bhatnagar and Gerard Pons-Moll and Justus Thies},
      year={2023},
      eprint={2311.13655},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Acknowledgments

Here are some great resources we benefited from:

License

This code and model are available for non-commercial scientific research purposes as defined in the LICENSE file. By downloading and using the code and model, you agree to the terms in the LICENSE. Please also check the eg3d license.
