xyupeng / ContrastiveCrop

[CVPR 2022 Oral] Crafting Better Contrastive Views for Siamese Representation Learning


A little confusion about training

lenka844 opened this issue · comments

commented

Thanks again for releasing this brilliant work! During training, I found that if I resume from a saved checkpoint, the process restarts from that checkpoint but self.use_box goes back to False, i.e. the crop type reverts to random crop. I'm confused whether that will affect the rest of training. Or should I set warmup_epochs to zero when resuming, to make sure the crop type is the same as before? Thanks for your reply.

Hi,
train_set.use_box is determined here

train_set.use_box = epoch >= cfg.warmup_epochs + start_epoch

The checkpoint saves the epoch, and when training resumes the mode will continue as CCrop if the epoch is larger than warmup_epochs.

commented


Thanks for replying. I've noticed that part of the code. But I think it means that no matter at which epoch I resume, training has to go through the warmup epochs again before epoch >= cfg.warmup_epochs + start_epoch holds, which means the resumed run uses random crop instead of CCrop.
Let me give you an example: suppose training breaks down at epoch 150 and warmup_epochs is set to 100, so CCrop is already in use. When I resume, start_epoch in the code becomes 150, and the epoch in the formula epoch >= cfg.warmup_epochs + start_epoch also starts at 150, so the condition for CCrop is no longer met and training has to go through another warmup_epochs of random crop. I tried to verify this, and the problem really exists: after resuming, training cannot continue with CCrop. I was wondering whether it was a problem on my side.

I think you are correct. I have fixed it like this:

train_set.use_box = epoch >= cfg.warmup_epochs + 1

I think now the logic is correct.
Thank you very much for finding this problem :)
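The difference between the two conditions can be sketched with the numbers from the example in this thread (warmup_epochs = 100, training interrupted and resumed at epoch 150). The function names below are hypothetical, used only to contrast the old and fixed checks:

```python
warmup_epochs = 100  # cfg.warmup_epochs in the example above
start_epoch = 150    # restored from the checkpoint on resume

def use_box_old(epoch):
    # Original condition: the warmup window is shifted by start_epoch,
    # so a resumed run re-enters warmup and falls back to random crop.
    return epoch >= warmup_epochs + start_epoch

def use_box_fixed(epoch):
    # Fixed condition: depends only on the absolute epoch, so resuming
    # after warmup keeps using ContrastiveCrop.
    return epoch >= warmup_epochs + 1

print(use_box_old(150))    # False -> wrongly reverts to random crop
print(use_box_fixed(150))  # True  -> continues with ContrastiveCrop
```

With the fixed condition, a fresh run still spends the first warmup_epochs epochs on random crop, while a resumed run past warmup immediately continues with ContrastiveCrop.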

commented


Thanks for replying again!