mohcinemadkour / Dropout-for-Deep-Learning-Regularization-explained-with-Examples

https://medium.com/@mohcine.madkour/dropout-for-deep-learning-regularization-explained-with-examples-dee81f0de35a


Tutorial: Dropout as Regularization and Bayesian Approximation

This tutorial aims to give readers a complete view of dropout: how to implement it (in PyTorch), how to use it, and why it is useful. In short, dropout can (1) reduce overfitting (so test results improve) and (2) provide an estimate of model uncertainty.

Please view my tutorial here.
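As a minimal sketch of the two points above (not the notebook's actual code; the MLP architecture, layer sizes, dropout probability, and the number of Monte Carlo samples are illustrative assumptions), the snippet below shows dropout used as a regularizer during training and reused at test time as MC dropout to estimate predictive uncertainty, in the spirit of Gal and Ghahramani [2]:

```python
import torch
import torch.nn as nn

# Hypothetical MLP with dropout between layers (illustrative, not the notebook's exact model).
class MLP(nn.Module):
    def __init__(self, in_dim=784, hidden=256, out_dim=10, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),          # randomly zeroes activations with probability p during training
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()

# (1) Regularization: dropout is active in train() mode and disabled in eval() mode.
model.train()   # dropout layers drop units (PyTorch rescales the kept units by 1/(1-p))
model.eval()    # dropout layers act as identity at evaluation time

# (2) Model uncertainty (MC dropout): keep dropout active at test time and
# average several stochastic forward passes; the spread approximates uncertainty.
@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    model.train()  # keeps dropout "on"; in a full script you would also freeze batch-norm statistics
    preds = torch.stack([torch.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)  # predictive mean and per-class std (uncertainty)

x = torch.randn(8, 784)            # dummy batch for illustration
mean, std = mc_dropout_predict(model, x)
```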

References

[1] Improving neural networks by preventing co-adaptation of feature detectors, G. E. Hinton, et al., 2012
[2] Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, Y. Gal, and Z. Ghahramani, 2016
[3] Dropout: A Simple Way to Prevent Neural Networks from Overfitting, N. Srivastava, et al., 2014

Languages

Language: Jupyter Notebook 100.0%