KinWaiCheuk / demucs_lightning

Demucs Lightning: A PyTorch Lightning version of Demucs with Hydra and TensorBoard features


Change Demucs precision training

lyndonlauder opened this issue · comments

Hi, thank you for this Lightning version of Demucs.

Can you tell me what precision Demucs trains with? Is it possible to change this to fp16/fp32? Could you please show me how to do this?

Thanks!

I have made a new commit; you can now enable half-precision training with the following command:

python train.py trainer.precision=16

By default, 32-bit precision is used.
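
Roughly speaking, the trainer.precision value from the Hydra config is just forwarded to the PyTorch Lightning Trainer. Here is a simplified sketch of that wiring (not the actual train.py in this repo; the config path and field names are simplified for illustration):

import hydra
from omegaconf import DictConfig
import pytorch_lightning as pl

@hydra.main(config_path="conf", config_name="config")
def main(cfg: DictConfig):
    # CLI overrides such as `trainer.precision=16` are merged into cfg by Hydra,
    # so cfg.trainer.precision is 16 when you pass that flag.
    trainer = pl.Trainer(**cfg.trainer)  # precision=16 enables mixed-precision (AMP) training
    # trainer.fit(model, datamodule)     # model/data setup omitted from this sketch

if __name__ == "__main__":
    main()

With precision=16, Lightning uses automatic mixed precision, so the master weights stay in 32-bit while most of the forward/backward computation runs in 16-bit.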

Please do a git pull origin master again to get the latest version and try it out. Feel free to report any problems you find!

Thank you very much @KinWaiCheuk

I currently have a Demucs model checkpoint that has been trained for 200 epochs with 32-bit precision. Is it possible to continue training from this checkpoint but with 16-bit precision?

I have updated the code once again. You can now specify resume_checkpoint on the CLI to choose which checkpoint to resume training from.
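
For example, to pick up your 200-epoch 32-bit checkpoint and continue with half precision, the command would look roughly like this (the checkpoint path is a placeholder; see the README for the exact syntax):

python train.py resume_checkpoint=path/to/your_checkpoint.ckpt trainer.precision=16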

You can have a look at this part of the README.md for more information.

Feel free to suggest more features or report bugs. Thanks for using it!