
Self-Supervised Prototypical Transfer Learning for Few-Shot Classification

This repository contains the reference source code and pre-trained models (ready for evaluation) for our paper Self-Supervised Prototypical Transfer Learning for Few-Shot Classification.

Part of this work has been presented at the ICML 2020 Workshop on Automated Machine Learning.

(Figure: ProtoTransfer method illustration)
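For intuition only, the sketch below shows the prototype-based classification step that the method builds on: class prototypes are mean embeddings of the labeled support examples, and queries are scored by their distance to each prototype. This is an illustrative sketch, not the code shipped in this repository; the encoder, tensor shapes, and function name are assumptions.

```python
# Illustrative sketch of prototype-based few-shot classification
# (assumed encoder and shapes; not this repository's implementation).
import torch


def prototypical_logits(encoder, support_x, support_y, query_x, n_way):
    """Score queries by negative squared Euclidean distance to class prototypes.

    support_x: [n_way * k_shot, C, H, W] labeled support images
    support_y: [n_way * k_shot] integer class labels in [0, n_way)
    query_x:   [n_query, C, H, W] unlabeled query images
    """
    z_support = encoder(support_x)  # [n_way * k_shot, D]
    z_query = encoder(query_x)      # [n_query, D]

    # Prototype = mean embedding of each class's support examples.
    prototypes = torch.stack(
        [z_support[support_y == c].mean(dim=0) for c in range(n_way)]
    )                               # [n_way, D]

    # Negative squared distances act as logits over the n_way classes.
    return -torch.cdist(z_query, prototypes).pow(2)  # [n_query, n_way]
```

In ProtoTransfer, such a prototype-based classifier sits on top of an encoder that is first pre-trained with self-supervision on the source domain and then fine-tuned on the few labeled target examples; see the paper and the subdirectories below for the actual training procedures.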

Structure

omni-mini/

Contains instructions and all runnable code for ProtoTransfer & UMTRA for our Omniglot and mini-ImageNet experiments

cdfsl-benchmark/

Contains instructions, all runnable code and pre-trained models for ProtoTransfer & UMTRA for our CDFSL benchmark experiments

Setup

Instructions for setting up a Python environment to run our experiments can be found in omni-mini/setup. Dataset setup instructions are provided in omni-mini and cdfsl-benchmark.

Citation

If you find our code useful, please consider citing our work using the following BibTeX entry:

@article{medina2020selfsupervised,
    title="{Self-Supervised Prototypical Transfer Learning for Few-Shot Classification}",
    author={Carlos Medina and Arnout Devos and Matthias Grossglauser},
    journal={arXiv preprint arXiv:2006.11325},
    year={2020}
}

About

Official code for the paper "Self-Supervised Prototypical Transfer Learning for Few-Shot Classification"

Paper: https://arxiv.org/abs/2006.11325

License: MIT License


Languages

Python 63.3%, Jupyter Notebook 31.6%, Shell 4.4%, Dockerfile 0.7%