This repository contains a NumPy implementation of standard vanilla neural networks with many features, including batch normalization, dropout, and L1 and L2 regularisers. Available activations: ReLU, Softmax, Sigmoid.
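As a rough illustration of what such a NumPy implementation involves, here is a minimal sketch of the three listed activations plus an L2 penalty term. The function names and signatures below are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def relu(x):
    # elementwise max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # logistic function, squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # subtract the row-wise max for numerical stability before exponentiating
    shifted = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=-1, keepdims=True)

def l2_penalty(weights, lam):
    # L2 regulariser term added to the loss: (lam / 2) * sum of squared weights
    return 0.5 * lam * sum(np.sum(W ** 2) for W in weights)
```

An L1 regulariser would analogously sum `np.abs(W)`, and dropout and batch normalization would be applied between layers during the forward pass.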