WXinlong / DenseCL

Dense Contrastive Learning (DenseCL) for self-supervised representation learning, CVPR 2021 Oral.

Details about loss_lambda warmup

alohays opened this issue · comments

Thank you for your great work.
Could you share the implementation details or code for the loss_lambda warmup setting described in the DenseCL paper?

Hi, it is implemented as a hook, which I have uploaded to this repo. Please refer to hooks/densecl_warmup_hook.py for details.

For usage, you just need to add this to your config file:

```python
custom_hooks = [dict(type='DenseCLWarmupHook', start_iters=10000)]
```