wencoast / SimpleCNNandResNet

SimpleCNN and ResNet performance comparison

Home Page: https://github.com/wencoast/SimpleCNNandResNet

Deep_learning_1st_project

Simple neural network and ResNet performance comparison

Running Environment

  • Python 3.6.7
  • TensorFlow 1.12.0
  • Keras 2.1.6-tf

Running the Code for SNN

Path                                       Function
SNN/snn_fashion_mnist.py                   The original structure and hyperparameters
SNN/snn_fashion_mnist_lr_optimizer.py      Compare Momentum and Adam with different learning rates
SNN/snn_fashion_mnist_dropout.py           Compare different dropout rates
SNN/snn_fashion_mnist_optimal.py           The final model with optimal settings
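As a rough sketch of what these scripts build (the layer sizes, dropout rate, and optimizer below are illustrative assumptions, not the repo's actual hyperparameters), a minimal tf.keras Fashion-MNIST classifier with dropout might look like:

```python
import tensorflow as tf

# Minimal SNN-style Fashion-MNIST classifier (all hyperparameters here
# are assumptions for illustration, not the repo's tuned settings).
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 grayscale input
    tf.keras.layers.Dense(128, activation='relu'),    # single hidden layer
    tf.keras.layers.Dropout(0.3),                     # cf. snn_fashion_mnist_dropout.py
    tf.keras.layers.Dense(10, activation='softmax'),  # 10 Fashion-MNIST classes
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```

Training would then call `model.fit` on the arrays returned by `tf.keras.datasets.fashion_mnist.load_data()`.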

TODO List

  1. Comparison of deep network architectures: simple neural network vs. ResNet. [Done!]
  2. Using the Momentum and Adam optimizers with different learning rates. [Done!]
  3. Using dropout. [Done!]
  4. Using batch normalization. [Done!]
  5. Using different activation functions, including ReLU, tanh, Leaky ReLU, sigmoid, etc. [Done!]
  6. Using data augmentation. [Done!]
  7. Using different optimizers such as Adagrad, Adadelta, Adam, RMSProp, and Momentum. [Done!]
  8. Using local response normalization. [Done!]
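The Momentum-vs-Adam comparison in item 2 comes down to two different parameter-update rules. A NumPy sketch of both on a toy quadratic loss (the learning rates and decay constants here are illustrative, not the values the experiments used):

```python
import numpy as np

# Toy loss f(w) = ||w||^2, so the gradient is simply 2w.
def grad(w):
    return 2.0 * w

def run_momentum(w, lr=0.1, mu=0.9, steps=50):
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v + grad(w)   # accumulate velocity
        w = w - lr * v         # parameter update
    return w

def run_adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=50):
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first-moment estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment estimate
        m_hat = m / (1 - b1 ** t)        # bias correction
        v_hat = v / (1 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([1.0, -2.0])
w_mom = run_momentum(w0.copy())
w_adam = run_adam(w0.copy())
```

Both runs drive the parameters toward the minimum at the origin; sweeping `lr` over a grid is the kind of comparison `snn_fashion_mnist_lr_optimizer.py` performs on the real model.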
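Item 4's batch normalization standardizes each feature over the mini-batch and then applies a learnable scale and shift. A minimal NumPy sketch of the forward pass (inference-time running statistics and the backward pass are omitted):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Forward pass of batch normalization over a (batch, features) array."""
    mu = x.mean(axis=0)                   # per-feature batch mean
    var = x.var(axis=0)                   # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps) # standardize each feature
    return gamma * x_hat + beta           # learnable scale and shift

# Activations with an arbitrary mean/scale come out standardized.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))
y = batch_norm(x)
```

After normalization each feature column has (approximately) zero mean and unit variance, which is what stabilizes training in the deeper ResNet configurations.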

Languages

Language: Python 100.0%