SameetAsadullah / Neural-Network-Implementation

A neural network implemented with a choice of activation functions (sigmoid, ReLU, leaky ReLU, softmax), optimizers (Gradient Descent, AdaGrad, RMSProp, Adam), and loss functions (cross-entropy loss, hinge loss, mean squared error (MSE)).
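
The repository's exact API is not shown here, but as a rough sketch of the pieces listed above, the activations, a loss, and an Adam-style parameter update might look like the following. All function names and signatures are illustrative assumptions, not the repo's actual interface:

```python
import numpy as np

# --- Activation functions (illustrative, not the repo's actual API) ---
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Subtract the row-wise max for numerical stability
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

# --- Cross-entropy loss over softmax probabilities ---
def cross_entropy(probs, labels, eps=1e-12):
    # probs: (batch, classes) predicted probabilities; labels: (batch,) class indices
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

# --- One Adam update step for a single parameter array ---
def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```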
