Meta-knowledge-Lab / DLB

Code for Paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization"


Self-Distillation from the Last Mini-Batch (DLB)

This is a PyTorch implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization", accepted at CVPR 2022.

The paper is available at https://arxiv.org/abs/2203.16172.

Run dlb.py to train with the proposed self-distillation method.
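The core idea of DLB is to regularize each mini-batch with soft targets produced by the model itself on the previous mini-batch. A minimal sketch of the distillation term (assuming the usual temperature-scaled KL formulation; function and parameter names here are illustrative, not the repo's actual API):

```python
import torch
import torch.nn.functional as F

def dlb_kd_loss(student_logits: torch.Tensor,
                last_batch_logits: torch.Tensor,
                T: float = 3.0) -> torch.Tensor:
    """KL divergence between the current predictions and the soft
    targets saved from the last mini-batch (teacher side is detached,
    so no gradient flows through the stored logits)."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(last_batch_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients by T^2, as in standard distillation
```

In training, this term would be added to the ordinary cross-entropy loss on the current batch; see dlb.py for the authors' actual implementation and hyperparameters.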

License: MIT

