ngcthuong / pytorch-cifar

PyTorch on the CIFAR10 dataset with state-of-the-art neural networks and pre-trained weights


Train CIFAR10 with PyTorch

PyTorch on the CIFAR10 dataset with various up-to-date neural networks

Additional Updates

Training

  • Can be run from an IDE such as Visual Studio Code.
  • Can be run from the terminal with (see also the argument-handling sketch after this list):
	python main.py --network_name VGG16
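For reference, a minimal sketch of how the --network_name argument might be mapped to a model constructor. This is a hypothetical illustration, not the repo's actual main.py; it assumes argparse is used and that the models package follows the original KuangLiu repo, where VGG takes the variant name as a string and ResNet18 takes no arguments.

    # Hypothetical sketch only; the repo's actual main.py may differ.
    import argparse
    from models import VGG, ResNet18   # model definitions from the repo's models package

    def build_model(name):
        # Map the --network_name string to a model constructor (illustrative subset).
        if name.startswith('VGG'):
            return VGG(name)            # e.g. VGG('VGG16')
        if name == 'ResNet18':
            return ResNet18()
        raise ValueError('Unsupported network name: ' + name)

    parser = argparse.ArgumentParser(description='PyTorch CIFAR10 training')
    parser.add_argument('--network_name', default='VGG16', type=str)
    args = parser.parse_args()
    net = build_model(args.network_name)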

New features

  1. Added a file listing all supported network names

  2. Added checkpoint saving at every epoch

    • The best-accuracy epoch is saved at checkpoint/network/network_best.pth
    • Option to save the network at each epoch
    • The network will automatically resume training from the latest checkpoint (see the checkpointing sketch after this list)
    • Train from the beginning (i.e. epoch 0) with
    	python main.py --network_name VGG16 --resume False 
    
    • E.g., VGG11 checkpoints are saved at checkpoint/VGG11/VGG11_epoch#no_epoch.pth (where #no_epoch is the epoch number)
  3. Learning rate is now controlled by a fixed schedule (see item 5 below)

  4. Added tensorboardX support

    • Launch TensorBoard with
    	tensorboard --logdir=log_dir --host localhost --port 8088
    
    • Open browser at http://localhost:8088/
    • Only scalar values are logged: training/testing accuracy, loss, epoch, and learning rate
  5. Trained checkpoints are available. The learning rate is 0.1, 0.01, and 0.0001 over the epoch ranges [0, 25), [25, 50), and [50, 100), respectively (100 epochs in total). In the original repo, the author uses learning rates of 0.1, 0.01, and 0.001 for 150 epochs each (450 epochs in total). Sketches of the checkpointing, the schedule, and the tensorboardX logging follow this list.
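The checkpointing and resume behaviour in item 2 can be pictured roughly as below. This is a hedged sketch, not the repo's actual code; it assumes the checkpoint dict stores the state_dict, test accuracy, and epoch, and follows the file-name convention described above.

    # Hedged sketch of per-epoch checkpointing and resume (item 2); not the repo's exact code.
    # Assumed checkpoint layout: {'net': state_dict, 'acc': test accuracy, 'epoch': epoch}.
    import os
    import torch

    def save_checkpoint(net, acc, epoch, name, best=False):
        # Save under checkpoint/<name>/, matching the convention described above.
        state = {'net': net.state_dict(), 'acc': acc, 'epoch': epoch}
        os.makedirs('checkpoint/' + name, exist_ok=True)
        suffix = 'best' if best else 'epoch%d' % epoch
        torch.save(state, 'checkpoint/%s/%s_%s.pth' % (name, name, suffix))

    def resume_checkpoint(net, name):
        # Load the best checkpoint if present; return the epoch to resume from.
        path = 'checkpoint/%s/%s_best.pth' % (name, name)
        if not os.path.isfile(path):
            return 0                      # no checkpoint: train from epoch 0
        state = torch.load(path, map_location='cpu')
        net.load_state_dict(state['net'])
        return state['epoch'] + 1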
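The learning-rate schedule (items 3 and 5) and the scalar logging through tensorboardX (item 4) might look roughly like the helpers below; the helper names are illustrative, not taken from the repo.

    # Hedged sketch of the learning-rate schedule and tensorboardX scalar logging;
    # helper names are illustrative, not the repo's actual code.
    from tensorboardX import SummaryWriter

    def lr_for_epoch(epoch):
        # 0.1 for epochs [0, 25), 0.01 for [25, 50), 0.0001 for [50, 100)
        if epoch < 25:
            return 0.1
        if epoch < 50:
            return 0.01
        return 0.0001

    def set_lr(optimizer, lr):
        # Apply the scheduled learning rate to every parameter group.
        for group in optimizer.param_groups:
            group['lr'] = lr

    def log_epoch(writer, epoch, lr, train_loss, train_acc, test_loss, test_acc):
        # Only scalar values are logged, as noted in item 4.
        writer.add_scalar('train/loss', train_loss, epoch)
        writer.add_scalar('train/acc', train_acc, epoch)
        writer.add_scalar('test/loss', test_loss, epoch)
        writer.add_scalar('test/acc', test_acc, epoch)
        writer.add_scalar('lr', lr, epoch)

    # In the training loop: writer = SummaryWriter('log_dir'), then per epoch
    #   lr = lr_for_epoch(epoch); set_lr(optimizer, lr); ...; log_epoch(...)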

Todo

  1. Pre-trained weights for all networks
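Once pre-trained checkpoints are published, a saved network could be evaluated on the CIFAR10 test set roughly as follows. This sketch assumes the checkpoint dict stores the weights under the key 'net' (the actual layout depends on how main.py saves it) and uses the CIFAR10 normalization constants from the original repo.

    # Hedged sketch: evaluate a saved checkpoint on the CIFAR10 test set.
    # Assumes the checkpoint dict stores the model weights under the key 'net'.
    import torch
    import torchvision
    import torchvision.transforms as transforms
    from models import VGG                     # model definitions from the repo

    transform = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
    ])
    testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                           download=True, transform=transform)
    testloader = torch.utils.data.DataLoader(testset, batch_size=100, shuffle=False)

    net = VGG('VGG16')
    state = torch.load('checkpoint/VGG16/VGG16_best.pth', map_location='cpu')
    net.load_state_dict(state['net'])
    net.eval()

    correct = total = 0
    with torch.no_grad():
        for inputs, targets in testloader:
            outputs = net(inputs)
            correct += outputs.argmax(dim=1).eq(targets).sum().item()
            total += targets.size(0)
    print('Test accuracy: %.2f%%' % (100.0 * correct / total))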

Accuracy

Reported at the best test-accuracy epoch.

Model                     Train Acc.  Test Acc.   Ref. Acc.   Year
VGG11                     99.17%      90.92%      92.64%      2014
VGG13                     99.49%      90.73%      -           2014
VGG16                     99.42%      90.76%      -           2014
VGG19                     99.15%      90.15%      -           2014
GoogLeNet/Inception v1    99.93%      93.70%      -           2015
ResNet18                  99.76%      94.14%      93.02%      2015
ResNet34                  99.89%      94.60%      -           2015
ResNet50                  -           -           -           2015
ResNet101                 -           -           -           2015
ResNet152                 -           -           -           2015
PreActResNet18            -           -           95.11%      2016
PreActResNet34            -           -           -           2016
PreActResNet50            -           -           -           2016
PreActResNet101           -           -           -           2016
PreActResNet152           -           -           -           2016
DenseNet121               99.79%      94.26%      95.04%      2016
DenseNet161               -           -           -           2016
DenseNet169               -           -           -           2016
DenseNet201               -           -           -           2016
ResNeXt29(2x64d)          99.81%      94.25%      94.82%      2016
ResNeXt29(4x64d)          99.92%      94.04%      -           2016
ResNeXt29(8x64d)          -           -           -           2016
ResNeXt29(16x64d)         -           -           -           2016
ResNeXt29(32x4d)          99.85%      94.55%      94.73%      2016
SENet18                   99.55%      93.56%      -           2017
ShuffleNetG2              96.18%      90.32%      -           2017
ShuffleNetG3              96.29%      90.71%      -           2017
ShuffleNetG8              -           -           -           2017
ShuffleNetV2              -           -           -           2018
PNASNetA                  -           -           -           2017
PNASNetB                  -           -           -           2017
DPN26                     99.61%      93.86%      95.16%      2017
DPN92                     -           -           -           2017
MobileNet                 95.02%      89.83%      -           2017
MobileNetV2               97.12%      92.17%      94.43%      2018
EfficientNetB0            91.26%      87.85%      -           2019

Ref. accuracy is as reported in the original KuangLiu repo.


License: MIT License

