
mixup: Beyond Empirical Risk Minimization


Mixup-CIFAR10

By Hongyi Zhang, Moustapha Cisse, Yann N. Dauphin, David Lopez-Paz.

Facebook AI Research

Introduction

Mixup is a generic and straightforward data augmentation principle. In essence, mixup trains a neural network on convex combinations of pairs of examples and their labels. By doing so, mixup regularizes the neural network to favor simple linear behavior in-between training examples.
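Concretely, for two training examples (x_i, y_i) and (x_j, y_j), mixup forms the virtual example x̃ = λ x_i + (1 − λ) x_j, ỹ = λ y_i + (1 − λ) y_j, where λ is drawn from a Beta(α, α) distribution. Below is a minimal PyTorch sketch of this idea; the function name and signature are illustrative and may differ from the code in this repository.

import numpy as np
import torch

def mixup_data(x, y, alpha=1.0):
    """Mix a batch with a shuffled copy of itself.

    Returns the mixed inputs, both sets of labels, and the mixing weight lam.
    """
    # lam ~ Beta(alpha, alpha); alpha <= 0 disables mixing
    lam = np.random.beta(alpha, alpha) if alpha > 0 else 1.0
    # Pair each example with a random partner from the same batch
    index = torch.randperm(x.size(0), device=x.device)
    mixed_x = lam * x + (1 - lam) * x[index]
    return mixed_x, y, y[index], lam

Since classification labels are one-hot, mixing the labels is equivalent to taking the same convex combination of the losses against the two original label sets, so no soft labels need to be materialized.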

This repository contains the implementation used for the results in our paper (https://arxiv.org/abs/1710.09412).

Citation

If you use this method or this code in your research, please cite:

@article{zhang2018mixup,
  title={mixup: Beyond Empirical Risk Minimization},
  author={Zhang, Hongyi and Cisse, Moustapha and Dauphin, Yann N. and Lopez-Paz, David},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=r1Ddp1-Rb},
}

Requirements and Installation

  • A computer running macOS or Linux
  • For training new models, you'll also need an NVIDIA GPU and NCCL
  • Python version 3.6
  • A PyTorch installation

Training

Use python train.py to train a new model. For example:

$ CUDA_VISIBLE_DEVICES=0 python train.py --lr=0.1 --seed=20170922 --decay=1e-4
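Here --lr, --seed, and --decay set the learning rate, random seed, and weight decay. The snippet below sketches how a single mixup training step could look; it reuses the mixup_data sketch from the Introduction, and the model, optimizer, and batch are toy stand-ins rather than the actual setup in train.py.

import torch
import torch.nn as nn

# Toy stand-ins: any classifier and any CIFAR-10-shaped batch would do
net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
optimizer = torch.optim.SGD(net.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-4)
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(128, 3, 32, 32)    # one batch of CIFAR-10-sized images
targets = torch.randint(0, 10, (128,))  # integer class labels

# mixup_data is the sketch from the Introduction above
mixed, y_a, y_b, lam = mixup_data(inputs, targets, alpha=1.0)
outputs = net(mixed)
# Convex combination of the losses against both original label sets
loss = lam * criterion(outputs, y_a) + (1 - lam) * criterion(outputs, y_b)

optimizer.zero_grad()
loss.backward()
optimizer.step()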

License

This project is CC-BY-NC-licensed.

Acknowledgement

The CIFAR-10 reimplementation of mixup is adapted from the pytorch-cifar repository by kuangliu.
