
MNIST_onehiddenlayer

Table of Contents

  • Project description
  • Development Tools
  • Results
  • Conclusion
  • References

Project description

Comparing different non-linear activation functions (Sigmoid, Tanh, ReLU) on the MNIST dataset, using a feedforward network with a single hidden layer.
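A minimal sketch of such a model in PyTorch is shown below. The class name `FeedforwardNet`, the hidden size of 100, and the flattened 784-dimensional input are illustrative assumptions, not details taken from the notebook:

```python
import torch.nn as nn

class FeedforwardNet(nn.Module):
    """One-hidden-layer network; the activation is a constructor argument
    so the same architecture can be trained with Sigmoid, Tanh, or ReLU."""

    def __init__(self, input_dim=28 * 28, hidden_dim=100, output_dim=10,
                 activation=nn.ReLU):
        super().__init__()
        self.fc1 = nn.Linear(input_dim, hidden_dim)   # input -> hidden
        self.act = activation()                       # nn.Sigmoid, nn.Tanh, or nn.ReLU
        self.fc2 = nn.Linear(hidden_dim, output_dim)  # hidden -> 10 digit classes

    def forward(self, x):
        return self.fc2(self.act(self.fc1(x)))

# One model per activation, trained separately for the comparison
models = {name: FeedforwardNet(activation=act)
          for name, act in [("Sigmoid", nn.Sigmoid),
                            ("Tanh", nn.Tanh),
                            ("ReLU", nn.ReLU)]}
```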

Development Tools

  • PyTorch framework

Results

  • Sigmoid: iteration 3000, loss 0.7164, accuracy 83%
  • Tanh: iteration 3000, loss 0.1972, accuracy 94%
  • ReLU: iteration 2500, loss 0.0955, accuracy 95%
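The figures above are checkpoint readings of the training loss and test accuracy. Below is a hedged sketch of the kind of loop that produces such output, assuming a batch size of 100, SGD with learning rate 0.1, and a 500-iteration reporting interval; none of these hyperparameters are confirmed by the notebook:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_set = datasets.MNIST(root="./data", train=True, download=True,
                           transform=transforms.ToTensor())
test_set = datasets.MNIST(root="./data", train=False,
                          transform=transforms.ToTensor())
train_loader = DataLoader(train_set, batch_size=100, shuffle=True)
test_loader = DataLoader(test_set, batch_size=100)

# Swap nn.ReLU() for nn.Sigmoid() or nn.Tanh() to rerun the comparison
model = nn.Sequential(nn.Linear(28 * 28, 100), nn.ReLU(), nn.Linear(100, 10))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # assumed hyperparameter

iteration = 0
for epoch in range(5):
    for images, labels in train_loader:
        images = images.view(-1, 28 * 28)  # flatten 28x28 images to vectors
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        iteration += 1
        if iteration % 500 == 0:
            # Measure test-set accuracy at each checkpoint
            correct = total = 0
            with torch.no_grad():
                for images, labels in test_loader:
                    preds = model(images.view(-1, 28 * 28)).argmax(dim=1)
                    correct += (preds == labels).sum().item()
                    total += labels.size(0)
            print(f"Iteration {iteration}. Loss: {loss.item()}. "
                  f"Accuracy: {100 * correct // total}")
```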

Conclusion

ReLU achieved the highest accuracy; however, the difference from Tanh was small.

References

  • Practical Deep Learning with PyTorch course, which this project follows
