binkjakub / neural-networks

Implementation of neural networks in pure numpy

Neural Networks

This repository contains implementations of basic neural network architectures using NumPy only. The code was prepared for neural network classes at Wroclaw University of Science and Technology.

At its current state, the implementation provides the following modules:

  1. Architectures
  2. Layers
  3. Activations
  4. Losses
  5. Optimizers
  6. Initializers
NOTE: Architectures can easily be extended, or new ones added, by composing existing layers or implementing new ones.
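As a rough illustration of composing layers into an architecture, the sketch below defines a minimal `Sequential`-style container. The class and method names here are hypothetical and may differ from the actual modules under `src/nn`:

```python
import numpy as np

class Dense:
    """Hypothetical fully connected layer (names are illustrative only)."""
    def __init__(self, n_in, n_out):
        rng = np.random.default_rng(42)
        self.w = rng.standard_normal((n_in, n_out)) * 0.01
        self.b = np.zeros(n_out)

    def forward(self, x):
        return x @ self.w + self.b

class ReLU:
    """Element-wise rectified linear activation."""
    def forward(self, x):
        return np.maximum(x, 0.0)

class Sequential:
    """Composes layers by chaining their forward passes."""
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

# A new architecture is just a new composition of existing layers.
net = Sequential(Dense(4, 8), ReLU(), Dense(8, 2))
out = net.forward(np.random.default_rng(0).standard_normal((3, 4)))
print(out.shape)  # (3, 2)
```

A new layer type only needs to expose the same `forward` interface (and a matching `backward`, omitted here) to be usable inside such a composition.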

Implementation details

The implementation uses OOP and a computational-graph approach (with manual gradient flow), and was inspired by the article Nothing but NumPy: Understanding & Creating Neural Networks with Computational Graphs from Scratch. Several architectural concepts were also adopted from PyTorch. Within the scope of this project, several study experiments on the MNIST dataset were implemented to investigate neural networks.
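The "manual gradient flow" idea can be sketched as follows: each node caches what it needs during `forward`, and `backward` propagates gradients by hand-derived formulas. This is a minimal illustrative sketch, not the repository's actual code; the class names and the learning rate are assumptions:

```python
import numpy as np

class Linear:
    """Linear layer with hand-derived forward/backward passes."""
    def __init__(self, n_in, n_out):
        rng = np.random.default_rng(0)
        self.w = rng.standard_normal((n_in, n_out)) / np.sqrt(n_in)
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                         # cache input for the backward pass
        return x @ self.w + self.b

    def backward(self, grad_out):
        # Gradients of the loss w.r.t. parameters and the layer input.
        self.grad_w = self.x.T @ grad_out
        self.grad_b = grad_out.sum(axis=0)
        return grad_out @ self.w.T         # flows back to the previous node

class MSELoss:
    """Mean squared error; backward() starts the gradient flow."""
    def forward(self, pred, target):
        self.diff = pred - target
        return np.mean(self.diff ** 2)

    def backward(self):
        return 2.0 * self.diff / self.diff.size

# One manual training step on a toy example.
layer, loss_fn = Linear(3, 1), MSELoss()
x = np.array([[1.0, 2.0, 3.0]])
y = np.array([[1.0]])

pred = layer.forward(x)
loss = loss_fn.forward(pred, y)
layer.backward(loss_fn.backward())         # gradients flow loss -> layer
layer.w -= 0.01 * layer.grad_w             # vanilla SGD update (assumed lr)
layer.b -= 0.01 * layer.grad_b
new_loss = loss_fn.forward(layer.forward(x), y)
print(loss, "->", new_loss)                # loss decreases after the step
```

Chaining the `backward` calls in reverse layer order is exactly the manual counterpart of autograd in frameworks like PyTorch.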

Project structure

└── src
    ├── data_processing
    │   ├── notebooks
    │   └── scripts
    ├── datasets
    ├── experiments
    │   ├── notebooks
    │   ├── one_hidden_mnist
    │   ├── optimizers_mnist
    │   └── scripts
    ├── metrics
    ├── nn
    │   ├── activations
    │   ├── layers
    │   ├── losses
    │   ├── networks
    │   └── optimizers
    └── utils


Author: Jakub Binkowski
