albumentations-team / autoalbument

AutoML for image augmentation. AutoAlbument uses the Faster AutoAugment algorithm to find optimal augmentation policies. Documentation - https://albumentations.ai/docs/autoalbument/

Fails when the batch size is an odd number

golunovas opened this issue

It seems like the issue is coming from here.

As far as I understand, it requires an even batch size; otherwise, it fails here:

interpolated = alpha * real + (1 - alpha) * fake
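To make the failure concrete, here is a minimal sketch (not AutoAlbument's actual code) under the assumption that, as in Faster AutoAugment, each batch is split into a non-augmented half and an augmented half before the WGAN-GP-style interpolation; the names and shapes are illustrative.

import torch

def interpolate(real, fake):
    # One alpha per sample in the "real" half; broadcasts over channel and spatial dims.
    alpha = torch.rand(real.size(0), 1, 1, 1)
    return alpha * real + (1 - alpha) * fake

batch = torch.randn(7, 3, 32, 32)              # odd batch of 7 images
real, fake = batch[: 7 // 2], batch[7 // 2 :]  # halves of 3 and 4 samples

try:
    interpolate(real, fake)
except RuntimeError as err:
    print(err)  # size mismatch along dimension 0: 3 vs. 4

With an even batch, both halves have the same size, alpha broadcasts cleanly, and the interpolation succeeds.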

Hey, @golunovas

Could you please provide a use case for an odd batch_size? It seems that the best way to handle this problem is to add an explicit check to AutoAlbument that ensures batch_size is an even number before running the search phase. Otherwise, I think some problems related to having different numbers of augmented and non-augmented images may arise.

Well, I ran into that issue on the last batch of the epoch. I just ran a search with the generated search.yaml config, where drop_last wasn't set to true for the dataloader, and the dataset had an odd number of samples. But an odd batch size would lead to exactly the same issue. IMO, the easiest solution is to set drop_last to true by default and require an even batch size to be set in the config.
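For illustration only (this is plain PyTorch, not AutoAlbument's code): with drop_last=True, the DataLoader discards the final incomplete batch, so as long as batch_size itself is even, the interpolation never sees halves of different sizes.

import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(101, 3, 32, 32))  # 101 samples, not divisible by 16
loader = DataLoader(dataset, batch_size=16, drop_last=True)

print([images.shape[0] for (images,) in loader])  # [16, 16, 16, 16, 16, 16]; the trailing batch of 5 is dropped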

I have added the drop_last: True parameter to the config files created by autoalbument-create. Example configs now also contain this parameter. The fixed version, 0.0.4, has also been uploaded to PyPI.
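For reference, a generated search.yaml would then carry the parameter in its dataloader section. The fragment below is illustrative; the exact fields and defaults may differ between AutoAlbument versions.

data:
  dataloader:
    _target_: torch.utils.data.DataLoader
    batch_size: 16
    shuffle: True
    num_workers: 8
    pin_memory: True
    drop_last: True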