Code and pretrained models for the paper "Distilling Localization for Self-Supervised Representation Learning"

DiLo Code and Pretrained Models

Distilling Localization for Self-Supervised Representation Learning
Nanxuan Zhao*, Zhirong Wu*, Rynson W.H. Lau, Stephen Lin

Dataloader

saliency_dataset.py is an example dataloader for adding the DiLo copy-and-paste augmentation.
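
For reference, below is a minimal sketch of the copy-and-paste idea: binarize a saliency map into an alpha matte and composite the salient foreground onto a random background. The function name and arguments are illustrative, not the actual interface of saliency_dataset.py.

```python
import random

import numpy as np
from PIL import Image


def copy_and_paste(image, saliency_mask, backgrounds, threshold=0.5):
    """Composite the salient foreground of `image` onto a random background.

    image:         PIL RGB training image
    saliency_mask: PIL grayscale image or ndarray, same spatial size as
                   `image`, e.g. predicted by RBD or BASNet
    backgrounds:   list of candidate PIL images to paste onto
    """
    # Pick a background and match the foreground image's size.
    bg = random.choice(backgrounds).convert("RGB").resize(image.size)

    mask = np.asarray(saliency_mask, dtype=np.float32)
    if mask.ndim == 3:   # collapse multi-channel masks to one channel
        mask = mask[..., 0]
    if mask.max() > 1.0:  # masks saved as 8-bit images
        mask = mask / 255.0

    # Binarize the saliency map and use it as an alpha matte.
    alpha = (mask > threshold).astype(np.float32)[..., None]
    fg = np.asarray(image, dtype=np.float32)
    composite = alpha * fg + (1.0 - alpha) * np.asarray(bg, dtype=np.float32)
    return Image.fromarray(composite.astype(np.uint8))
```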

The saliency estimation models we use, RBD and BASNet, can be obtained from their respective project pages.

Pretrained Models

| Base Model | Saliency Estimation Model | Download |
| ---------- | ------------------------- | -------- |
| MoCo       | RBD                       | model    |
| MoCo       | BASNet                    | model    |
| MoCo v2    | RBD                       | model    |
| MoCo v2    | BASNet                    | model    |
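
As a usage sketch, the backbone can typically be extracted from a MoCo-style checkpoint as below. The checkpoint filename, the `module.encoder_q.` key prefix, and the ResNet-50 backbone are assumptions based on the official MoCo release; adjust them to match the downloaded files.

```python
import torch
import torchvision.models as models

# Hypothetical filename; substitute one of the downloads from the table above.
ckpt = torch.load("dilo_mocov2_basnet.pth", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

# MoCo-style checkpoints prefix the query-encoder weights (assumed here);
# strip the prefix and drop the projection head before loading.
backbone_sd = {
    k[len("module.encoder_q."):]: v
    for k, v in state_dict.items()
    if k.startswith("module.encoder_q.") and "encoder_q.fc" not in k
}

model = models.resnet50()
missing, unexpected = model.load_state_dict(backbone_sd, strict=False)
print(missing, unexpected)  # only the final fc layer should be missing
```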

Citation

Please cite our paper if you use DiLo in your research or refer to the results reported in it.

@inproceedings{ZhaoAAAI2021, 
    author = {Nanxuan Zhao and Zhirong Wu and Rynson W.H. Lau and Stephen Lin}, 
    title = {Distilling Localization for Self-Supervised Representation Learning}, 
    booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence}, 
    year = {2021} 
}
