deepzlk

deepzlk's repositories

MLCSD-Net

Multi-level Collaborative Self-Distillation Learning for Improving Adaptive Inference Efficiency

Language: Python · Stargazers: 1 · Issues: 1

anytime

Anytime Dense Prediction with Confidence Adaptivity (ICLR 2022)

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

MachineLearning

Real-world examples of machine learning algorithms

Language: Python · Stargazers: 0 · Issues: 0
Language: Shell · Stargazers: 0 · Issues: 0

pytorch-book

PyTorch tutorials and fun projects, including neural talk, neural style transfer, poem writing, and anime generation

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0

RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods

Language: Python · License: BSD-2-Clause · Stargazers: 0 · Issues: 0
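
For context, the distillation methods benchmarked in this repository all build on the classic Hinton-style objective: a cross-entropy term on the hard labels plus a temperature-softened KL term between student and teacher logits. A minimal PyTorch sketch of that baseline loss (names and default values here are illustrative, not taken from the repository):

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - alpha) * ce + alpha * kd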

Teacher-free-Knowledge-Distillation

Knowledge distillation: CVPR 2020 Oral, "Revisiting Knowledge Distillation via Label Smoothing Regularization"

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
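
The paper behind this repository argues that part of knowledge distillation's benefit behaves like label smoothing, so a hand-crafted "virtual teacher" can stand in for a trained one. A loose PyTorch sketch of that idea (the helper and its defaults are illustrative assumptions, not the repository's actual API):

import torch
import torch.nn.functional as F

def virtual_teacher_loss(student_logits, labels, correct_prob=0.9, alpha=0.5):
    num_classes = student_logits.size(1)
    # Standard cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    # Hand-crafted teacher distribution: `correct_prob` on the true class,
    # the rest spread uniformly over the remaining classes (label-smoothing-like).
    teacher = torch.full_like(student_logits,
                              (1.0 - correct_prob) / (num_classes - 1))
    teacher.scatter_(1, labels.unsqueeze(1), correct_prob)
    # KL divergence between the student distribution and the virtual teacher.
    kd = F.kl_div(F.log_softmax(student_logits, dim=1), teacher,
                  reduction="batchmean")
    return (1.0 - alpha) * ce + alpha * kd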