
MNIST-etude

This project is a collection of MNIST classification examples.

The goal is to get hands-on experience with various frameworks.

The emphasis is on covering a variety of frameworks and classification methods, rather than on achieving high performance (accuracy).

(semisup) stands for "semi-supervised learning."

  • tensorflow/
    • linear
      • Acc 91%
    • conv
      • Acc 95%
  • chainer/
    • linear
      • Acc 90.48% / 10 epoch
    • conv
      • Acc 97.78% / 10 epoch
    • vat (semisup)
      • Acc 93.3% / 10 epoch, 500 labels
  • keras/
    • linear
      • Acc 91.66% / 10 epoch
    • conv
      • Acc 97.42% / 10 epoch
    • rnn
      • Acc 96.68% / 10 epoch
    • Learning by Association (semisup)
  • pytorch/
    • linear
      • Acc 91.65% / 10 epoch
    • conv
      • Acc 97.74% / 10 epoch
    • rnn
      • Acc 95.18% / 10 epoch

Rules

  • Optimizers: SGD
  • Dataset: MNIST
    • 60k items for training
    • 10k items for testing
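
For concreteness, the shared setup might look like the following PyTorch sketch (torchvision is assumed for data loading; the repo's own per-framework scripts may differ):

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# MNIST: 60k training items, 10k test items
train_set = datasets.MNIST("data", train=True, download=True, transform=transforms.ToTensor())
test_set = datasets.MNIST("data", train=False, download=True, transform=transforms.ToTensor())
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))   # placeholder; see "Networks" below
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # plain SGD, per the rule above
criterion = nn.CrossEntropyLoss()

for x, y in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()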

Networks

Very simple architectures are adopted.

linear: only one linear (dense) layer

28x28 (raw Image)
== 784 (Flatten)
-> 10 (Linear)
-> 10 (Softmax)
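
A minimal PyTorch sketch of this network (names are illustrative, not taken from the repo; the softmax is folded into the cross-entropy loss):

import torch.nn as nn

class LinearNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()   # 28x28 -> 784
        self.fc = nn.Linear(784, 10)  # 784 -> 10

    def forward(self, x):
        # returns logits; softmax is applied inside nn.CrossEntropyLoss
        return self.fc(self.flatten(x))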

conv

28x28
== 1x28x28 (Resize)
-> 8x12x12 (Convolution(kernel=5, stride=2))
-> _ (elu)
-> 16x4x4 (Convolution(kernel=5, stride=2))
-> _ (elu)
== 256 (Flatten)
-> 10 (Linear)
-> 10 (Softmax)
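
The same architecture as a PyTorch sketch (module names are mine; shapes follow the diagram above):

import torch.nn as nn

class ConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, stride=2),   # 1x28x28 -> 8x12x12
            nn.ELU(),
            nn.Conv2d(8, 16, kernel_size=5, stride=2),  # 8x12x12 -> 16x4x4
            nn.ELU(),
        )
        self.fc = nn.Linear(16 * 4 * 4, 10)             # 256 -> 10

    def forward(self, x):
        h = self.features(x).flatten(start_dim=1)
        return self.fc(h)  # logits; softmax folded into the loss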

rnn

28x28 = (timestep, input)
-> 128 (LSTM(hidden_dim=128))
-> 10 (Linear)
-> 10 (Softmax)
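
A corresponding PyTorch sketch, reading each image as a sequence of 28 rows (an assumed interpretation; names are illustrative):

import torch.nn as nn

class RNNNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=28, hidden_size=128, batch_first=True)
        self.fc = nn.Linear(128, 10)

    def forward(self, x):
        x = x.view(-1, 28, 28)        # (batch, timestep=28, input=28)
        _, (h_n, _) = self.lstm(x)
        return self.fc(h_n[-1])       # classify from the last hidden state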

VAT (= Convolution + Virtual Adversarial Training)

Semi-supervised. The labeled items are trained with the simple CNN described above; all items (labeled and unlabeled) are used for the VAT regularization.
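
A rough sketch of the virtual adversarial loss term in PyTorch (one power-iteration step; the function name and the hyperparameters xi and eps are illustrative, not taken from the repo). It is added to the ordinary cross-entropy on the labeled batch:

import torch
import torch.nn.functional as F

def vat_loss(model, x, xi=1e-6, eps=2.0):
    # x: batch of shape (batch, 1, 28, 28), labeled or unlabeled
    with torch.no_grad():
        p = F.softmax(model(x), dim=1)            # current predictive distribution
    # random unit direction, refined by one power iteration
    d = torch.randn_like(x)
    d = d / (d.flatten(1).norm(dim=1).view(-1, 1, 1, 1) + 1e-8)
    d.requires_grad_()
    kl = F.kl_div(F.log_softmax(model(x + xi * d), dim=1), p, reduction="batchmean")
    grad = torch.autograd.grad(kl, d)[0]
    r_adv = eps * grad / (grad.flatten(1).norm(dim=1).view(-1, 1, 1, 1) + 1e-8)
    # smoothness: predictions should be stable under the adversarial perturbation
    return F.kl_div(F.log_softmax(model(x + r_adv.detach()), dim=1), p, reduction="batchmean")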
