huangyangyi / ELICIT

[ICCV 2023] One-shot Implicit Animatable Avatars with Model-based Priors

Home Page: https://huangyangyi.github.io/ELICIT/

Is there a way to train with lower VRAM consumption?

yutaokuyama opened this issue

Hi, thanks for your excellent research.
I would like to run a new training to compare your method with the one I am researching.
Due to limited computing resources, I would like to keep VRAM consumption to about 48 GB. Is it possible to do so through a configuration option?

Thank you in advance.

You can try using smaller N_samples and patch.size values in the config file to reduce VRAM consumption, for instance N_samples: 64 and size: 256. Note that performance is not guaranteed with this configuration.
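
For reference, those two settings would look something like this in the training config. This is a hypothetical YAML excerpt, not the project's actual file; the exact layout and surrounding keys depend on the config you are training with:

```yaml
# Hypothetical excerpt of a training config with reduced memory settings.
# Fewer samples per ray and a smaller rendered patch both lower peak VRAM,
# at some cost to rendering quality.
N_samples: 64   # samples taken along each camera ray (reduced)
patch:
  size: 256     # side length in pixels of the rendered patch (reduced)
```

As a rough guide, memory from per-ray network queries scales linearly with N_samples, while patch memory scales with the number of rendered pixels (size squared), so halving the patch size cuts that footprint by roughly a factor of four.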

Thank you for your response. I really appreciate it.