
Marginalized Denoising Autoencoders for Nonlinear Representations

Tensorflow implementation of the paper Marginalized Denoising Auto-encoders for Nonlinear Representations (ICML 2014). Conventional denoising autoencoders must be trained on many explicitly corrupted copies of the data, which makes training slow and computationally expensive. mDA avoids this by marginalizing out the corruption: the expected reconstruction loss over the noise distribution is computed (approximately) in closed form, so the model is effectively trained on infinitely many corrupted samples without ever corrupting the data. Earlier marginalized approaches achieved this only by stripping away the non-linearity or the latent representation; mDA removes those restrictions and is therefore a generalization of those works.
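To make the marginalization idea concrete, below is a minimal NumPy sketch (not taken from this repository) of the simplest case: a linear autoencoder under additive Gaussian corruption, where the expected loss has an exact closed form. The paper's contribution is handling nonlinear encoders, where no exact closed form exists, by approximating the expectation with a second-order Taylor expansion of the loss around the uncorrupted input. The function name and parameters here are illustrative only.

```python
# Hypothetical illustration, not part of mdA.py: for a linear map W and
# corruption x_tilde = x + eps with eps ~ N(0, sigma^2 I), the expected
# reconstruction loss marginalizes to a closed form, so no corrupted
# samples are ever drawn.
import numpy as np

def marginalized_linear_loss(X, W, sigma=0.1):
    """E_eps[ ||X - (X + eps) W||^2 ]
         = ||X - X W||^2 + n * sigma^2 * ||W||_F^2,
    where n is the number of rows (examples) in X."""
    recon = np.sum((X - np.dot(X, W)) ** 2)                 # loss on the clean input
    noise_term = X.shape[0] * sigma ** 2 * np.sum(W ** 2)   # marginalized corruption
    return recon + noise_term

# Sanity check: the closed form matches a Monte Carlo estimate over
# many explicitly sampled corruptions.
rng = np.random.RandomState(0)
X = rng.randn(100, 20)
W = 0.1 * rng.randn(20, 20)
mc = np.mean([np.sum((X - np.dot(X + 0.1 * rng.randn(*X.shape), W)) ** 2)
              for _ in range(2000)])
print(marginalized_linear_loss(X, W, sigma=0.1), mc)  # values agree closely
```

Note how the marginalized noise contributes a weight-penalty-like term; for nonlinear mappings this term instead depends on the curvature (Hessian) of the loss, which is what the paper's Taylor approximation captures.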

Requirements

  • Python 2.7
  • Tensorflow
  • NumPy

Run

To train the demo model:

python mdA.py 

Demo Results

First-layer filters learned during training:
(animated GIF of the first-layer filters)
Over the course of training, the filters progressively sharpen into specialized feature extractors.

References

  • Chen, Minmin, et al. "Marginalized denoising auto-encoders for nonlinear representations." International Conference on Machine Learning. 2014. [Paper]
  • Vincent, Pascal, et al. "Extracting and composing robust features with denoising autoencoders." Proceedings of the 25th International Conference on Machine Learning. ACM, 2008. [Paper]


License: MIT License

