facebookresearch / mae

PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377


Is it possible to enable FP16 or TF32 in pretraining?

Wongboo opened this issue · comments


I have the same question; does anyone know?
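A sketch of how this is typically done in PyTorch, independent of this repo's training script: TF32 is a pair of global backend flags (effective on Ampere or newer GPUs), while FP16 uses automatic mixed precision (`autocast` plus `GradScaler`). The `model`, `loader`, and the `(loss, pred, mask)` return shape with `mask_ratio` below are assumptions modeled on MAE-style pretraining, not code from this repository.

```python
import torch

# TF32: opt-in backend flags; the training loop itself is unchanged.
# (Only takes effect on Ampere-or-newer GPUs; harmless to set elsewhere.)
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

def train_one_epoch(model, loader, optimizer, device="cuda"):
    """Hypothetical MAE-style pretraining loop with FP16 via AMP."""
    scaler = torch.cuda.amp.GradScaler()  # rescales loss to avoid FP16 underflow
    model.train()
    for samples in loader:
        samples = samples.to(device, non_blocking=True)
        optimizer.zero_grad()
        # autocast runs forward/loss in FP16 where it is numerically safe
        with torch.cuda.amp.autocast():
            # assumed forward signature: returns (loss, pred, mask)
            loss, _, _ = model(samples, mask_ratio=0.75)
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
```

Note that TF32 and AMP are independent: the flags above can be enabled on their own for a speedup with no loop changes, while FP16 requires the `autocast`/`GradScaler` pattern to stay numerically stable.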