Repositories under the relu-derivative topic:
A small walk-through showing why ReLU is nonlinear.
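A minimal sketch of the idea behind that walk-through (the helper name `relu` is illustrative): a linear function must satisfy additivity and homogeneity, and ReLU violates both, which is what makes it a useful activation.

```python
def relu(x):
    """Rectified linear unit: max(x, 0)."""
    return max(x, 0.0)

# Additivity would require relu(a + b) == relu(a) + relu(b) for all a, b.
a, b = 1.0, -1.0
print(relu(a + b))        # 0.0
print(relu(a) + relu(b))  # 1.0  -> additivity fails, so ReLU is nonlinear

# Homogeneity also fails for negative scalars:
print(relu(-2.0 * 3.0))   # 0.0
print(-2.0 * relu(3.0))   # -6.0
```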
Backward pass of the ReLU activation function for a neural network.
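The backward pass of ReLU can be sketched as follows (function names are illustrative, not taken from the repository): the upstream gradient flows through unchanged wherever the forward input was positive, and is zeroed elsewhere.

```python
import numpy as np

def relu_forward(x):
    # Cache the input so the backward pass knows where x was positive.
    return np.maximum(x, 0.0), x

def relu_backward(grad_out, cache):
    # Local derivative is 1 where the input was > 0 and 0 elsewhere,
    # so the upstream gradient is simply masked.
    return grad_out * (cache > 0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
out, cache = relu_forward(x)
grad = relu_backward(np.ones_like(x), cache)
# grad -> [0., 0., 0., 1., 1.]
```

Note that this convention assigns derivative 0 at exactly x = 0, which is the common choice in practice.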
Towards a regularity theory for ReLU networks (construction of approximating networks, ReLU derivative at zero, theory)
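The "ReLU derivative at zero" issue mentioned above comes from ReLU not being differentiable at the origin; a standard way to state it (a sketch, not taken from that work) is via the subdifferential:

```latex
\operatorname{ReLU}(x) = \max(x, 0), \qquad
\operatorname{ReLU}'(x) =
\begin{cases}
1, & x > 0,\\
0, & x < 0,
\end{cases}
\qquad
\partial \operatorname{ReLU}(0) = [0, 1].
```

Any value in $[0,1]$ is a valid subgradient at zero; most frameworks simply use $0$.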
Implemented the back-propagation algorithm for a neural network from scratch using tanh and ReLU derivatives, and performed experiments for learning purposes.
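A from-scratch backprop loop of this kind can be sketched as below. The network shape, learning rate, and variable names are assumptions for illustration, not the repository's actual code; the key pieces are the activation derivatives (tanh's uses the activation output, ReLU's uses the pre-activation input) and the layer-by-layer chain rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Activations and their derivatives.
def tanh(x):   return np.tanh(x)
def dtanh(y):  return 1.0 - y ** 2           # expects y = tanh(x)
def relu(x):   return np.maximum(x, 0.0)
def drelu(x):  return (x > 0).astype(x.dtype)

# Tiny two-layer network: x -> ReLU -> tanh -> scalar output, MSE loss.
X = rng.normal(size=(8, 3))
y = np.tanh(rng.normal(size=(8, 1)))         # targets in tanh's range
W1 = rng.normal(size=(3, 4)) * 0.5
W2 = rng.normal(size=(4, 1)) * 0.5

losses = []
for _ in range(200):
    # Forward pass.
    z1 = X @ W1
    h1 = relu(z1)
    z2 = h1 @ W2
    out = tanh(z2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass (chain rule, layer by layer).
    grad_out = 2.0 * (out - y) / len(X)
    grad_z2 = grad_out * dtanh(out)
    grad_W2 = h1.T @ grad_z2
    grad_h1 = grad_z2 @ W2.T
    grad_z1 = grad_h1 * drelu(z1)
    grad_W1 = X.T @ grad_z1

    # Plain gradient-descent update.
    W1 -= 0.05 * grad_W1
    W2 -= 0.05 * grad_W2
```

With a fixed seed the recorded loss should shrink over the run, which is a quick sanity check that the hand-written gradients are consistent with the forward pass.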