ymanzi / neuralnetwork

Neural Network Library from scratch with Python


Neural Network Lib

Member: 🌜 Ymanzi 🌛

Challenge

Implement a Multilayer Perceptron Library from scratch

Structure


Perceptron (FeedForward)

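The original perceptron diagram was lost in conversion. As a rough substitute, here is a minimal NumPy sketch of the feed-forward pass through one dense layer; the function names and the choice of sigmoid are illustrative assumptions, not this library's actual API.

```python
import numpy as np

def sigmoid(z):
    # Squash pre-activations into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, bias):
    # One dense layer: affine transform followed by a non-linearity
    z = weights @ x + bias   # pre-activation z = Wx + b
    return sigmoid(z)        # activation a = sigma(z)

# Tiny example: 3 inputs -> 2 outputs
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W = rng.normal(size=(2, 3))
b = np.zeros(2)
print(forward(x, W, b))
```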

Backpropagation Equations

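The three equation images were lost in conversion. For reference, the standard backpropagation equations for a feed-forward network with weights W^l, biases b^l, pre-activations z^l = W^l a^{l-1} + b^l, activations a^l = σ(z^l), cost C, and output layer L are (the repository's own images may state them in a slightly different form):

```latex
% (1) Error at the output layer
\delta^{L} = \nabla_{a} C \odot \sigma'(z^{L})

% (2) Error propagated backward from layer l+1 to layer l
\delta^{l} = \left( (W^{l+1})^{\top} \delta^{l+1} \right) \odot \sigma'(z^{l})

% (3) Gradients of the cost with respect to the parameters
\frac{\partial C}{\partial b^{l}} = \delta^{l},
\qquad
\frac{\partial C}{\partial W^{l}} = \delta^{l} \, (a^{l-1})^{\top}
```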

Loss Functions Implemented

  • Cross Entropy
  • Mean Squared Error
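A minimal NumPy sketch of both losses, assuming one-hot targets and predicted probabilities for cross-entropy (these are standard textbook forms, not necessarily the exact reductions used in this library):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot targets, y_pred: predicted probabilities
    # eps guards against log(0)
    return -np.sum(y_true * np.log(y_pred + eps)) / y_true.shape[0]

def mean_squared_error(y_true, y_pred):
    # Average squared difference over every element
    return np.mean((y_true - y_pred) ** 2)

# Two samples, three classes
y_true = np.array([[1, 0, 0], [0, 1, 0]])
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(cross_entropy(y_true, y_pred))       # ~0.290
print(mean_squared_error(y_true, y_pred))  # ~0.033
```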

Activation Functions Implemented

  • Sigmoid
  • Tanh
  • ReLU
  • SoftMax
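Minimal NumPy versions of the four activations (sketches of the standard definitions; the library's own implementations may differ in detail):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([-1.0, 0.0, 2.0])
print(sigmoid(z), tanh(z), relu(z), softmax(z))
```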


Regularization Implemented

Regularization is a set of techniques that make slight modifications to the learning algorithm so that the model generalizes better. This in turn improves the model's performance on unseen data and helps avoid overfitting.

  • L1/L2 Regularization: L1/L2 regularization reduces the risk of overfitting by keeping the values of the weights and biases small (see the first sketch after this list).
  • Dropout: to apply dropout, we randomly select a subset of the units and clamp their output to zero, regardless of the input; this effectively removes those units from the model (see the second sketch below).
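First, a sketch of how an L1/L2 penalty enters the loss and the weight update; plain gradient descent and the parameter name lambda_ are assumptions for illustration, not this library's API:

```python
import numpy as np

def l1_penalty(weights, lambda_):
    # lambda * sum(|w|): encourages sparse weights
    return lambda_ * np.sum(np.abs(weights))

def l2_penalty(weights, lambda_):
    # lambda/2 * ||W||^2: pushes weights toward zero
    return 0.5 * lambda_ * np.sum(weights ** 2)

def l2_weight_update(weights, grad, lr, lambda_):
    # The L2 penalty adds lambda * W to the gradient ("weight decay")
    return weights - lr * (grad + lambda_ * weights)
```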

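Second, a sketch of (inverted) dropout applied to a layer's activations at training time; keep_prob is an assumed parameter name, and the 1/keep_prob rescaling keeps the expected output unchanged:

```python
import numpy as np

def dropout(activations, keep_prob=0.8, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    # Randomly zero out units, then rescale the survivors
    mask = rng.random(activations.shape) < keep_prob
    return (activations * mask) / keep_prob
```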

  • DropConnect: we apply dropout to the weights instead of the nodes (sketched below).

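And the same idea applied to the weight matrix rather than the units (again a hedged sketch with assumed names):

```python
import numpy as np

def dropconnect_forward(x, weights, bias, keep_prob=0.8, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    # Zero out individual weights (not whole units) for this forward pass
    mask = rng.random(weights.shape) < keep_prob
    return (weights * mask) @ x + bias
```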

