salesforce / BLIP

PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation

Batch size for pre-training

aries-young opened this issue

Hello, I wonder which batch size should be used for pre-training. In configs/pretrain.yaml, batch_size is 75, but the paper reports a batch size of 2880 for ViT-B.
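One common explanation for this kind of discrepancy (an assumption here, not confirmed by the repo) is that the batch_size in the config is the per-GPU value, while the paper reports the global (effective) batch size across all GPUs and any gradient-accumulation steps. A minimal sketch of the arithmetic, with the GPU count and accumulation steps as placeholder assumptions:

```python
# Sketch: relating a per-GPU config batch size to the global batch size.
# Only per_gpu_batch_size comes from configs/pretrain.yaml; num_gpus and
# grad_accum_steps are assumed values, not taken from the repo or paper.
per_gpu_batch_size = 75    # from configs/pretrain.yaml
num_gpus = 8               # assumed; depends on the actual training setup
grad_accum_steps = 1       # assumed; 1 means no gradient accumulation

effective_batch_size = per_gpu_batch_size * num_gpus * grad_accum_steps
print(f"effective batch size: {effective_batch_size}")  # 600 under these assumptions
```

Under these assumptions, matching the paper's 2880 for ViT-B would require a different GPU count or accumulation setting than the placeholders above; the actual hardware configuration used for pre-training would need to come from the authors.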