facebookresearch / mae

PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377

Input size of 64 instead of 224

MorYona opened this issue · comments

Hi,
The data I'm working on consists of small images, and I'd like to improve the model's runtime by reducing the input resolution to 64×64 instead of 224×224.
Is there a pretrained model available with this architecture?
If not, what steps should I follow to train one myself?

Thanks
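
For context, most of the runtime saving from a smaller input comes from the reduced token count seen by the transformer: with MAE's default 16×16 patches, a 64×64 image yields only 16 tokens versus 196 at 224×224. A minimal sketch of that arithmetic (the 16-pixel patch size is assumed from the repo's defaults):

```python
# Patch-count arithmetic for ViT-style models such as MAE.
# patch_size=16 is assumed from the repo's default configuration.
def num_patches(img_size: int, patch_size: int = 16) -> int:
    assert img_size % patch_size == 0, "image size must be divisible by patch size"
    return (img_size // patch_size) ** 2

print(num_patches(224))  # 196 tokens at the standard resolution
print(num_patches(64))   # 16 tokens, far less work for the attention layers
```

Since self-attention cost grows quadratically with the token count, shrinking the input this way reduces compute substantially, but a model pretrained at 224×224 would still need its positional embeddings interpolated (or retraining) to accept the new grid.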