inejc / nnlib

:mortar_board: Minimal neural networks library developed for educational purposes

A pure Python and NumPy implementation of a neural networks library developed for educational purposes.

It focuses on readability rather than speed, and thus aims to provide easily understandable toy code, as opposed to real, production-grade libraries.

Gradient checks

All analytical gradients are checked against numerical approximations computed with a centered difference formula.
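
As an illustration of the idea (a minimal sketch, not nnlib's actual checking code; the numerical_gradient helper and the step size h are assumptions), a centered difference estimates each partial derivative by perturbing one entry of the input at a time:

import numpy as np

def numerical_gradient(f, x, h=1e-5):
    # Estimate df/dx element-wise with the centered difference
    # (f(x + h) - f(x - h)) / (2 * h).
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        i = it.multi_index
        original = x[i]
        x[i] = original + h
        f_plus = f(x)
        x[i] = original - h
        f_minus = f(x)
        x[i] = original  # restore the perturbed entry
        grad[i] = (f_plus - f_minus) / (2 * h)
        it.iternext()
    return grad

# Example: the analytical gradient of f(x) = sum(x ** 2) is 2 * x.
x = np.random.randn(4, 3)
analytical = 2 * x
numerical = numerical_gradient(lambda a: np.sum(a ** 2), x)
assert np.allclose(analytical, numerical)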

Currently implemented

  • Fully connected layer (FullyConnected)
  • ReLU activation (ReLU)
  • Softmax with cross entropy loss (SoftmaxWithCrossEntropy)
  • Stochastic gradient descent optimizer (SGD)

Future plans

  • Dropout (see the sketch after this list)
  • Batch normalization
  • Different initializations
  • Tanh, Sigmoid activations
  • Convolutional layer
  • Additional optimizers (RMSprop, Adagrad, Adam)
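
Of these, Dropout is the easiest to sketch. Below is a hypothetical, minimal inverted-dropout layer that is not part of nnlib; the forward()/backward() interface, the keep_prob parameter, and the train flag are all assumptions, since the library's internal layer API is not shown in this README.

import numpy as np

class Dropout:
    # Hypothetical layer, not part of nnlib: inverted dropout.

    def __init__(self, keep_prob=0.5):
        self.keep_prob = keep_prob
        self.mask = None

    def forward(self, x, train=True):
        if not train:
            # At test time, inverted dropout is the identity.
            return x
        # Zero out units with probability 1 - keep_prob and scale the
        # survivors by 1 / keep_prob so the expected activation is unchanged.
        self.mask = (np.random.rand(*x.shape) < self.keep_prob) / self.keep_prob
        return x * self.mask

    def backward(self, grad_output):
        # Gradients flow only through the units that were kept.
        return grad_output * self.mask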

Example usage

A computational graph that maintains the connectivity of the layers is called a Model (see model.py).

from nnlib import Model

model = Model()

New layers are added to the model with the add() method:

from nnlib.layers import FullyConnected, ReLU, SoftmaxWithCrossEntropy

model.add(FullyConnected(num_input_neurons=20, num_neurons=50))
model.add(ReLU())
model.add(FullyConnected(num_input_neurons=50, num_neurons=3))
model.add(SoftmaxWithCrossEntropy())

An optimizer needs to be passed to the compile() method to prepare everything for training:

from nnlib.optimizers import SGD

model.compile(SGD(lr=0.01))

To train the model, call the fit() method; to make predictions, call the predict_proba() or predict() method:

model.fit(X_train, y_train, batch_size=32, num_epochs=100)

y_probs = model.predict_proba(X_test)
y_pred = model.predict(X_test)
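
Putting the steps together, a complete run could look like the sketch below. Only the synthetic data (200 training samples, 20 features, 3 classes, shaped to match the layers above) is an assumption; the API calls are exactly those shown in this README.

import numpy as np

from nnlib import Model
from nnlib.layers import FullyConnected, ReLU, SoftmaxWithCrossEntropy
from nnlib.optimizers import SGD

# Toy data shaped to match the architecture above:
# 20 input features and 3 output classes.
X_train = np.random.randn(200, 20)
y_train = np.random.randint(0, 3, size=200)
X_test = np.random.randn(50, 20)

model = Model()
model.add(FullyConnected(num_input_neurons=20, num_neurons=50))
model.add(ReLU())
model.add(FullyConnected(num_input_neurons=50, num_neurons=3))
model.add(SoftmaxWithCrossEntropy())
model.compile(SGD(lr=0.01))

model.fit(X_train, y_train, batch_size=32, num_epochs=100)
y_pred = model.predict(X_test)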

License: MIT

