zhangjb416 / Hierarchical-Augmentation

Learning Hierarchical Augmentation Invariance with Expanded Views

This is a PyTorch implementation of the paper Rethinking the Augmentation Module in Contrastive Learning: Learning Hierarchical Augmentation Invariance with Expanded Views. SimSiam is used as the contrastive learning backbone.
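For orientation, the sketch below shows the kind of SimSiam-style objective this setup builds on: a negative cosine similarity with a stop-gradient, applied pairwise when more than two augmented views of the same batch are available. This is a minimal illustration only; the `encoder`/`predictor` names and the pairwise treatment of extra views are assumptions, not this repository's actual API.

```python
# Minimal sketch of a SimSiam-style training step with multiple views.
# Names (`encoder`, `predictor`) and the pairwise loss over expanded
# views are assumptions made for illustration.
import torch
import torch.nn.functional as F

def neg_cosine(p, z):
    # SimSiam loss: negative cosine similarity with a stop-gradient on z.
    return -F.cosine_similarity(p, z.detach(), dim=-1).mean()

def simsiam_step(encoder, predictor, views):
    # `views` is a list of augmented versions of the same image batch;
    # with expanded views there may be more than the usual two.
    zs = [encoder(v) for v in views]   # projections
    ps = [predictor(z) for z in zs]    # predictions
    loss, pairs = 0.0, 0
    for i in range(len(views)):
        for j in range(len(views)):
            if i != j:
                loss = loss + neg_cosine(ps[i], zs[j])
                pairs += 1
    return loss / pairs
```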

Preparation

Install PyTorch and download the ImageNet dataset following the official PyTorch ImageNet training code.
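The training scripts expect the standard torchvision ImageFolder layout (train and val folders containing one subdirectory per class). The snippet below is a quick sanity check of that layout; the path is a placeholder.

```python
# Sketch only: verifying the dataset layout with torchvision's ImageFolder,
# which is what the standard PyTorch ImageNet pipeline expects.
import torchvision.datasets as datasets
import torchvision.transforms as transforms

train_dir = "/path/to/imagenet/train"  # placeholder path
dataset = datasets.ImageFolder(train_dir, transforms.ToTensor())
print(len(dataset), "training images in", len(dataset.classes), "classes")
```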

Unsupervised Pre-Training

Only multi-GPU DistributedDataParallel training is supported; single-GPU and DataParallel training are not supported.

To do unsupervised pre-training of a ResNet-50 model on ImageNet on an 8-GPU machine, run:

python main_train.py \
  -a resnet50 \
  --dist-url 'tcp://localhost:10001' --multiprocessing-distributed --world-size 1 --rank 0 \
  --fix-pred-lr \
  --data [your imagenet-folder with train and val folders]

The above command performs pre-training with a non-decaying predictor learning rate for 200 epochs.
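In SimSiam-style code, a non-decaying predictor learning rate is usually implemented by placing the predictor's parameters in their own optimizer parameter group and skipping that group when the cosine schedule is applied. The sketch below illustrates the pattern; the attribute names (`encoder`, `predictor`, `fix_lr`) are assumptions, not necessarily this repository's exact names.

```python
# Sketch of a fixed predictor learning rate, assuming SimSiam-like
# `model.encoder` / `model.predictor` attributes (names are assumed).
import math
import torch

def build_optimizer(model, init_lr, momentum=0.9, weight_decay=1e-4):
    param_groups = [
        {"params": model.encoder.parameters(), "fix_lr": False},
        {"params": model.predictor.parameters(), "fix_lr": True},
    ]
    return torch.optim.SGD(param_groups, init_lr,
                           momentum=momentum, weight_decay=weight_decay)

def adjust_learning_rate(optimizer, init_lr, epoch, total_epochs):
    # Cosine decay for the encoder; the predictor group keeps init_lr.
    cur_lr = init_lr * 0.5 * (1.0 + math.cos(math.pi * epoch / total_epochs))
    for group in optimizer.param_groups:
        group["lr"] = init_lr if group.get("fix_lr") else cur_lr
```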

Linear Classification

With a pre-trained model, to train a supervised linear classifier on frozen features/weights on an 8-GPU machine, run:

python main_lincls.py \
  -a resnet50 \
  --dist-url 'tcp://localhost:10001' --multiprocessing-distributed --world-size 1 --rank 0 \
  --pretrained [your checkpoint path]/checkpoint_0199.pth.tar \
  --data [your dataset-folder with train and val folders]
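For reference, frozen-feature linear evaluation typically loads only the encoder weights from the pre-training checkpoint and disables gradients for everything except the final classification layer. The sketch below shows that pattern for a ResNet-50; the checkpoint layout (a "state_dict" entry with a "module.encoder." prefix) follows common SimSiam-style conventions and is an assumption about this repository's checkpoints.

```python
# Sketch only: loading pre-trained encoder weights into a ResNet-50 and
# freezing everything but the linear classifier. The checkpoint layout
# (key "state_dict", "module.encoder." prefix) is assumed, not verified.
import torch
import torchvision.models as models

model = models.resnet50()
checkpoint = torch.load("checkpoint_0199.pth.tar", map_location="cpu")
state_dict = checkpoint["state_dict"]
encoder_weights = {
    k[len("module.encoder."):]: v
    for k, v in state_dict.items()
    if k.startswith("module.encoder.") and not k.startswith("module.encoder.fc")
}
msg = model.load_state_dict(encoder_weights, strict=False)
print("missing keys (expected to be just fc.*):", msg.missing_keys)

# Freeze all layers except the final fully connected classifier.
for name, param in model.named_parameters():
    param.requires_grad = name in ("fc.weight", "fc.bias")
```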

Transferring to Object Detection and Segmentation Tasks

Object detection and segmentation transfer follows the same procedure as MoCo; please see moco/detection.
