Self-Normalizing Networks

Tutorials and implementations for "Self-normalizing networks" (SNNs) as suggested by Klambauer et al. (arXiv pre-print).

Versions

  • Python 3.5 and TensorFlow 1.1

Note for TensorFlow 1.4 users

TensorFlow 1.4 already ships the functions tf.nn.selu and tf.contrib.nn.alpha_dropout, which implement the SELU activation function and the suggested dropout variant.
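A hidden layer built from these functions might look as follows (a minimal sketch: the input shape, layer width, and initializer are illustrative assumptions, not taken from this repository):

```python
import tensorflow as tf  # assumes TensorFlow >= 1.4

x = tf.placeholder(tf.float32, shape=[None, 784])  # hypothetical input
keep_prob = tf.placeholder(tf.float32)             # dropout keep probability

# SNNs assume variance-scaling weight initialization with
# stddev sqrt(1/fan_in) so that the self-normalizing property holds.
init = tf.variance_scaling_initializer(scale=1.0, mode='fan_in')

h = tf.layers.dense(x, 256, kernel_initializer=init)  # pre-activations
h = tf.nn.selu(h)                                     # built-in SELU
h = tf.contrib.nn.alpha_dropout(h, keep_prob)         # fixed-point-preserving dropout
```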

Tutorials

  • Multilayer Perceptron (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

Keras versions of the CNN scripts are also provided in the repository.

Design novel SELU functions

  • How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (notebook); a minimal numeric sketch is shown below
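As a rough illustration of the idea (not the repository's notebook), the parameters can be found numerically from the closed-form Gaussian moments of the SELU; the function names and the SciPy-based approach below are my own assumptions, and the sketch is restricted to zero-mean N(0, 1) pre-activations, while the notebook handles the general case:

```python
# Solve for lambda and alpha so that SELU maps z ~ N(0, 1) to outputs
# with a chosen target mean and variance (the "fixed point").
import numpy as np
from scipy.special import erfc
from scipy.optimize import fsolve

def Phi(x):
    # Standard normal CDF.
    return 0.5 * erfc(-x / np.sqrt(2.0))

def selu_moments(lam, alpha):
    # Closed-form mean and variance of selu(z) for z ~ N(0, 1).
    mean = lam * (1.0 / np.sqrt(2.0 * np.pi)
                  + alpha * (np.exp(0.5) * Phi(-1.0) - 0.5))
    second = lam ** 2 * (0.5 + alpha ** 2 * (np.exp(2.0) * Phi(-2.0)
                                             - 2.0 * np.exp(0.5) * Phi(-1.0)
                                             + 0.5))
    return mean, second - mean ** 2

def equations(params, target=(0.0, 1.0)):
    lam, alpha = params
    mean, var = selu_moments(lam, alpha)
    return [mean - target[0], var - target[1]]

lam, alpha = fsolve(equations, x0=[1.0, 1.5])
print(lam, alpha)  # approx. 1.0507 and 1.6733 for the (0, 1) fixed point
```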

Basic Python functions to implement SNNs

are provided as code chunks here: selu.py
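For orientation, the core activation is only a few lines; below is a minimal NumPy sketch, independent of the repository's TensorFlow-based selu.py, using the standard parameter values for the zero-mean, unit-variance fixed point:

```python
import numpy as np

def selu(x):
    # Standard SELU parameters for the (0, 1) fixed point.
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * np.where(x > 0.0, x, alpha * (np.exp(x) - 1.0))

# Quick self-normalization check: stacking random SNN layers keeps the
# activations close to zero mean and unit variance.
rng = np.random.RandomState(0)
x = rng.randn(256, 128)
for _ in range(20):
    w = rng.randn(128, 128) / np.sqrt(128)  # stddev sqrt(1/n) initialization
    x = selu(x @ w)
print(x.mean(), x.var())  # both stay near 0 and 1
```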

Notebooks and code to produce Figure 1

are provided here: Figure1

Calculations and numeric checks of the theorems (Mathematica)

are provided as Mathematica notebooks in the repository.

UCI, Tox21 and HTRU2 data sets


License: GNU General Public License v3.0

