wadelucky / unsupervised-face-representation

Implementation for "Pre-training strategies and datasets for facial representation learning", ECCV 2022

Pre-training strategies and datasets for facial representation learning

This is the PyTorch implementation for the Facial Representation Learning (FRL) paper:

@inproceedings{bulat2022pre,
  title={Pre-training strategies and datasets for facial representation learning},
  author={Bulat, Adrian and Cheng, Shiyang and Yang, Jing and Garbett, Andrew and Sanchez, Enrique and Tzimiropoulos, Georgios},
  booktitle={ECCV},
  year={2022}
}

Model Zoo

We provide below some of the models trained in a self-supervised manner. More models will be added later on.

| data | backbone | url |
| --- | --- | --- |
| VGG | ResNet 50 | model |
| VGG (1M) | ResNet 50 | model |
| FPR-Flickr | ResNet 50 | model |
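
Below is a minimal sketch of how one of these checkpoints could be loaded into a torchvision ResNet-50 backbone. The filename, the `state_dict` key, and the `module.` prefix handling are assumptions, not guaranteed by this repo; inspect the downloaded file to confirm its actual layout.

```python
# Hedged sketch: loading a released self-supervised ResNet-50 checkpoint into a
# torchvision backbone. The filename, the "state_dict" key, and the "module."
# prefix are assumptions -- check the downloaded file's layout.
import torch
import torchvision.models as models

ckpt = torch.load("frl_vgg_resnet50.pth", map_location="cpu")  # hypothetical filename
state_dict = ckpt.get("state_dict", ckpt)

# Checkpoints saved from DataParallel/DistributedDataParallel models often
# prefix every key with "module."; strip it if present.
state_dict = {k[len("module."):] if k.startswith("module.") else k: v
              for k, v in state_dict.items()}

backbone = models.resnet50()  # randomly initialised ResNet-50
# strict=False tolerates keys (e.g. projection-head weights) with no match in the backbone.
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```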

Installation

To use the code, clone the repo and install the following packages:

git clone https://github.com/wadelucky/unsupervised-face-representation

Requirements

  • Python >= 3.8
  • Numpy
  • pytorch: install instructions
  • torchvision: conda install torchvision -c pytorch
  • apex: install instructions
  • OpenCV: pip install opencv-python
  • H5Py: conda install h5py
  • tensorboard: pip install tensorboard
  • pandas

Note: if you are using PyTorch > 1.10 and experience issues with apex, please see #1282. Alternatively, you can switch to PyTorch's native AMP (see the sketch below).
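
For reference, here is a minimal, self-contained sketch of a training step using native PyTorch AMP (`torch.cuda.amp`, available since PyTorch 1.6); the toy model, random data, and optimizer are placeholders, not objects from this repository.

```python
# Minimal native-AMP training step. The toy model, random data, and optimizer
# below are placeholders, not objects from this repository.
import torch
import torch.nn as nn

model = nn.Linear(128, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()

for _ in range(10):  # dummy loop over random batches
    x = torch.randn(32, 128, device="cuda")
    y = torch.randint(0, 10, (32,), device="cuda")
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():      # run the forward pass in mixed precision
        loss = criterion(model(x), y)
    scaler.scale(loss).backward()        # scale the loss to avoid fp16 gradient underflow
    scaler.step(optimizer)               # unscale gradients, then call optimizer.step()
    scaler.update()                      # adjust the loss scale for the next iteration
```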

Training

bash scripts/run.sh

On an HPC cluster, you may need to modify some of the hyperparameters and submit the job with:

sbatch face.sbatch

Before running the script, make sure to set the appropriate paths. The models released in the paper were trained using 64 K40 GPUs.

Preparing data

For instructions on obtaining and preparing the data, see DATASET.md.

Acknowledgement

We thank the authors of SwAV, MoCo, BYOL, and vissl for releasing their code, which our codebase builds upon.

License

MIT License

