# Simple neural network and ResNet performance comparison
- Python 3.6.7
- TensorFlow 1.12.0
- Keras 2.1.6-tf
| Path | Function |
| --- | --- |
| `SNN/snn_fashion_mnist.py` | Baseline model with the original structure and hyperparameters |
| `SNN/snn_fashion_mnist_lr_optimizer.py` | Compares the Momentum and Adam optimizers with different learning rates |
| `SNN/snn_fashion_mnist_dropout.py` | Compares different dropout rates |
| `SNN/snn_fashion_mnist_optimal.py` | Final model with the optimal settings |
- Comparing deep network architectures: a simple neural network and ResNet. [Done!]
- Using the Momentum and Adam optimizers with different learning rates. [Done!]
- Using dropout. [Done!]
- Using batch normalization. [Done!]
- Using different activation functions, including ReLU, tanh, leaky ReLU, and sigmoid. [Done!]
- Using data augmentation. [Done!]
- Using different optimizers such as Adagrad, Adadelta, Adam, RMSProp, and Momentum. [Done!]
- Using local response normalization. [Done!]
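The Momentum-vs-Adam comparison above follows the standard update rules. A minimal NumPy sketch of both (illustrative only, not the repo's TensorFlow code; hyperparameter defaults are the common textbook values):

```python
import numpy as np

def momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
    """Classical momentum: accumulate a velocity, then step along it."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates scale each step."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)          # bias correction for the mean
    v_hat = v / (1 - b2 ** t)          # bias correction for the variance
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy comparison: minimize f(w) = w^2 (gradient 2w) from w = 5.
w_mom, vel = 5.0, 0.0
w_adam, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    w_mom, vel = momentum_step(w_mom, 2 * w_mom, vel, lr=0.05)
    w_adam, m, v = adam_step(w_adam, 2 * w_adam, m, v, t, lr=0.05)
```

Both optimizers drive `w` toward 0 here; the learning-rate sweep in `snn_fashion_mnist_lr_optimizer.py` is the same idea applied to the network's weights.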
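The dropout experiments compare keep probabilities; the mechanism itself is simple. A sketch of inverted dropout in NumPy (assumed behaviour matching the usual train/test convention, not the repo's exact implementation):

```python
import numpy as np

def dropout(x, rate, training, rng):
    """Inverted dropout: at train time, zero a fraction `rate` of units and
    rescale the survivors so the expected activation is unchanged.
    At test time, return the input untouched."""
    if not training or rate == 0.0:
        return x
    keep = rng.random(x.shape) >= rate
    return x * keep / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones(1000)
y = dropout(x, rate=0.5, training=True, rng=rng)   # survivors become 2.0
z = dropout(x, rate=0.5, training=False, rng=rng)  # inference: unchanged
```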
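Batch normalization standardizes each feature over the mini-batch before a learned scale and shift. A NumPy sketch of the training-time forward pass (running statistics for inference are omitted for brevity):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature column over the batch, then scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(1).normal(5.0, 3.0, size=(256, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# Each column of y now has (approximately) zero mean and unit variance.
```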
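The activation functions compared above differ mainly in how they treat negative inputs. Their definitions in NumPy (the leaky-ReLU slope `alpha` is an assumption; the scripts may use a different value):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)          # zero for negative inputs

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)  # small negative slope instead of zero

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))    # squashes to (0, 1)

# tanh squashes to (-1, 1) and is available directly as np.tanh.
```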
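For data augmentation on 28x28 Fashion-MNIST images, small translations and horizontal flips are typical choices. A NumPy sketch (the exact transforms used by the scripts are an assumption):

```python
import numpy as np

def random_shift_flip(img, rng, max_shift=2):
    """Pad the image, take a random crop (a small translation),
    and flip left-right with probability 0.5."""
    p = max_shift
    padded = np.pad(img, p, mode="constant")
    dy, dx = rng.integers(0, 2 * p + 1, size=2)
    out = padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    if rng.random() < 0.5:
        out = out[:, ::-1]
    return out

rng = np.random.default_rng(0)
img = np.arange(28 * 28, dtype=np.float32).reshape(28, 28)
aug = random_shift_flip(img, rng)  # same shape, randomly transformed
```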
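Local response normalization (the AlexNet-style cross-channel variant) divides each activation by a power of the summed squares over neighbouring channels. A NumPy sketch for a `(H, W, C)` feature map, with the common default hyperparameters (not guaranteed to match `tf.nn.lrn` exactly):

```python
import numpy as np

def local_response_norm(x, depth_radius=2, bias=1.0, alpha=1e-4, beta=0.75):
    """Cross-channel LRN: each activation is damped by the energy of its
    channel neighbours within +/- depth_radius."""
    H, W, C = x.shape
    sq = x ** 2
    out = np.empty_like(x)
    for c in range(C):
        lo, hi = max(0, c - depth_radius), min(C, c + depth_radius + 1)
        denom = (bias + alpha * sq[:, :, lo:hi].sum(axis=2)) ** beta
        out[:, :, c] = x[:, :, c] / denom
    return out

x = np.ones((4, 4, 8), dtype=np.float32)
out = local_response_norm(x)  # slightly damped, since every neighbour is active
```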