
RényiCL: Contrastive Learning with Skew Rényi Divergence

Introduction

This is the official PyTorch implementation of the NeurIPS 2022 paper RényiCL: Contrastive Representation Learning with Skew Rényi Divergence.

Results

| pretrain epochs | linear acc (%) | pretrain files | linear files | eval logs |
|:---:|:---:|:---:|:---:|:---:|
| 300 | 76.2 | ckpt | ckpt | txt |

Usage: Preparation

Install

The code has been tested with CUDA 11.3, PyTorch 1.11.0 and timm 0.4.9.
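An environment matching these versions can be set up with pip, for example (a sketch; the torchvision pairing and the CUDA 11.3 wheel index are assumptions, not pinned by this repo):

pip install torch==1.11.0 torchvision==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
pip install timm==0.4.9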

Usage: Self-supervised Pre-Training

For 100-epoch pre-training without multi-crop augmentation:

python main_renyicl.py \
  --ema-cos \
  --crop-min=.2 \
  --dist-url tcp://localhost:10002 \
  --epochs 100 \
  --multiprocessing-distributed \
  --world-size 1 \
  --rank 0 \
  --data /data/ImageNet/ \
  --outdir ../outdir/ \
  --trial renyicl_100ep

For 100-epoch pre-training with multi-crop augmentation:

python main_renyicl.py \
  --ema-cos \
  --crop-min=.2 \
  --dist-url tcp://localhost:10002 \
  --epochs 100 \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --data /data/ImageNet/ \
  --n_crops 6 \
  --outdir ../outdir/ \
  --trial renyicl_100ep_mc 
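
Both commands above launch single-node training. For multi-node training, this codebase should follow the MoCo v3 launch convention it builds on (an assumption; the hostname below is a placeholder): point --dist-url at the first node and set --world-size to the number of nodes, e.g. for two nodes:

# On the first node (node0.example.com is a placeholder address):
python main_renyicl.py \
  --ema-cos \
  --crop-min=.2 \
  --dist-url tcp://node0.example.com:10002 \
  --epochs 100 \
  --multiprocessing-distributed --world-size 2 --rank 0 \
  --data /data/ImageNet/ \
  --n_crops 6 \
  --outdir ../outdir/ \
  --trial renyicl_100ep_mc
# On the second node, run the same command with --rank 1.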

To reproduce the results in the main paper:

python main_renyicl.py \
  --ema-cos \
  --crop-min=.2 \
  --dist-url tcp://localhost:10002 \
  --epochs 300 \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --data /data/ImageNet/ \
  --outdir ../outdir/ \
  --trial renyicl_300ep_mc \
  --n_crops 6

This run achieves 76.2% top-1 accuracy under the ImageNet linear evaluation protocol.

To run the MoCo v3 baseline with multi-crop augmentation:

python main_mocov3.py \
  --ema-cos \
  --crop-min=.2 \
  --dist-url tcp://localhost:10002 \
  --epochs 100 \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --data /data/ImageNet/ \
  --outdir ../outdir/ \
  --trial mocov3_100ep_mc \
  --n_crops 6

This run achieves 73.5% top-1 accuracy under the ImageNet linear evaluation protocol.

Usage: Linear Classification

We use SGD with a batch size of 4096 for linear evaluation.

python main_lincls.py \
  --dist-url tcp://localhost:10002 \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --pretrained /tmp/trial/checkpoint_last.pth.tar \
  --data /data/ImageNet \
  --save_dir /tmp/trial/eval/
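
To reuse a pre-trained checkpoint outside the provided scripts, the backbone weights can be extracted along these lines (a minimal PyTorch sketch; it assumes the MoCo v3 checkpoint convention that this codebase builds on, i.e. weights stored under state_dict with module.base_encoder. key prefixes — verify against the loading code in main_lincls.py):

import torch
import torchvision.models as models

# Build a plain ResNet-50 and load only the pre-trained base encoder.
model = models.resnet50()
checkpoint = torch.load("/tmp/trial/checkpoint_last.pth.tar", map_location="cpu")

# Keep the base-encoder weights, strip the DDP/encoder prefix, and drop the
# projection head (the "fc" submodule in the assumed MoCo v3 convention).
prefix = "module.base_encoder."
backbone = {
    k[len(prefix):]: v
    for k, v in checkpoint["state_dict"].items()
    if k.startswith(prefix) and not k.startswith(prefix + "fc")
}

msg = model.load_state_dict(backbone, strict=False)
print(msg.missing_keys)  # only the classifier should be missing, e.g. ['fc.weight', 'fc.bias']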

Citation

@article{lee2022r,
  title={R\'enyiCL: Contrastive Representation Learning with Skew R\'enyi Divergence},
  author={Lee, Kyungmin and Shin, Jinwoo},
  journal={arXiv preprint arXiv:2208.06270},
  year={2022}
}
