wangkai930418 / attndistill

Code for our paper "Attention Distillation: self-supervised vision transformer students need more guidance" (BMVC 2022).

Attention Distillation: self-supervised vision transformer students need more guidance (BMVC 2022)

Kai Wang, Fei Yang and Joost van de Weijer

Requirements

Please check the packages in your environment against "requirements.txt"; these scripts normally do not depend heavily on specific package versions.
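
For example, a minimal setup might look like the following (a sketch assuming a pip-based environment; conda works equally well):

```bash
# Optional: create and activate an isolated environment
python -m venv attndistill-env
source attndistill-env/bin/activate

# Install the packages listed in requirements.txt
pip install -r requirements.txt
```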

Reproducing

Please modify the data paths in "dino_teacher.sh" and "mugs_teacher.sh", then run the scripts with "bash xxx.sh", as sketched below.
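
For example (a sketch only; the exact names of the data-path variables differ per script, so check each script before running):

```bash
# 1. Edit dino_teacher.sh / mugs_teacher.sh so the data path points
#    to your local copy of the dataset.
# 2. Run the script for the teacher you want to distill from:
bash dino_teacher.sh   # DINO teacher
bash mugs_teacher.sh   # Mugs teacher
```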

Download

The teacher model checkpoints can be downloaded from the GitHub repositories of DINO (https://github.com/facebookresearch/dino) and Mugs (https://github.com/sail-sg/mugs).
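
For instance, the DINO ViT-S/16 pretrained checkpoint can be fetched as below (URL taken from the DINO repository at the time of writing; verify it there, and see the Mugs README for its checkpoint links):

```bash
# Download the DINO ViT-S/16 teacher checkpoint
wget https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth
```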

Others

If you have any questions, do not hesitate to contact me or open an issue.
