Batch size for pre-training
aries-young opened this issue · comments
aries-young commented
Hello, I wonder which batch size should be used for pre-training. In configs/pretrain.yaml, batch_size is 75, but the paper reports a batch size of 2880 for ViT-B.
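A common source of this kind of discrepancy is that a config's batch_size is per-GPU, while a paper reports the total (effective) batch size across all devices. The sketch below illustrates that arithmetic; the GPU count and the per-GPU/effective interpretation are assumptions for illustration, not values confirmed by the repo or the thread.

```python
def effective_batch_size(per_gpu: int, num_gpus: int, grad_accum_steps: int = 1) -> int:
    """Total samples per optimizer step across all GPUs.

    Assumes `per_gpu` is the per-device batch size (as is typical for
    distributed PyTorch training configs); this is an assumption here,
    not something stated in configs/pretrain.yaml itself.
    """
    return per_gpu * num_gpus * grad_accum_steps

# Illustrative example: a per-GPU batch of 75 on 8 GPUs yields
# 600 samples per optimizer step.
print(effective_batch_size(75, 8))  # 600
```

If the config value is indeed per-GPU, the paper's 2880 would correspond to the per-GPU value multiplied by the number of GPUs (and any gradient-accumulation steps) used in the original training run.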
PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation