Kenneth Borup's repositories
knowledgeDistillation
PyTorch implementation of Hinton-style knowledge distillation, plus a base class that simplifies implementing other distillation methods.
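The Hinton distillation loss this repository implements can be sketched as follows. This is a minimal NumPy illustration of the standard formulation (temperature-scaled KL term plus a hard-label cross-entropy term), not the repository's actual API; the function and parameter names are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL(teacher_T || student_T), scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=1)) * T * T
    # Hard-target term: cross-entropy with the ground-truth labels.
    p = softmax(student_logits)
    hard = np.mean(-np.log(p[np.arange(len(labels)), labels]))
    # Weighted combination of the two terms.
    return alpha * soft + (1.0 - alpha) * hard
```

When the student matches the teacher exactly, the soft term vanishes and only the weighted hard-label term remains.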
self_distillation
Self-distillation with weighted ground-truth targets; experiments with ResNets and kernel ridge regression.
centered_kernel_alignment
Implementation of Centered Kernel Alignment (CKA)
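For reference, linear CKA between two feature matrices reduces to a short closed-form expression. The sketch below is a generic NumPy version of that formula, not necessarily this repository's interface.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between feature matrices X (n x p) and Y (n x q)."""
    # Center each feature dimension.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Linear CKA: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F).
    hsic = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, ord="fro")
    norm_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return hsic / (norm_x * norm_y)
```

CKA is bounded in [0, 1] and equals 1 when a representation is compared with itself, which makes it convenient for comparing layers of different widths.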
gaussian_process_self_distillation
Official implementation of Self-Distillation for Gaussian Processes
SeqSleepNet
PyTorch implementation of SeqSleepNet, with ensemble models and semi-supervised knowledge distillation.
DistillWeighted
Official code for the ICCV paper: Distilling from Similar Tasks for Transfer Learning on a Budget
Best-README-Template
An awesome README template to jumpstart your projects!
downloadArxiv
Download arXiv papers by title directly from the command line.
ds_project_template
Template repository for simple data science projects
PyTorch-Model-Compare
Compare neural networks by their feature similarity