RoboticsClubIITJ / ML-DL-implementation

An implementation of ML and DL algorithms from scratch in Python using nothing but NumPy and Matplotlib.

Other variants of ReLU can be added in the script activation.py

parva-jain opened this issue

I'm willing to add other variants of the ReLU activation function to support a wider range of applications of this package.

@parva-jain Sure, you can work on it. Before that, could you please give a brief description of what you are going to add?

I have come across the Parametric ReLU (PReLU) and Scaled ELU (SELU) functions, but so far I have only been able to find their Keras implementations. More activation functions could also be added, such as the binary step function and the Swish function (a rough sketch of all four is below).
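For reference, here is a minimal NumPy sketch of the four functions mentioned, assuming activation.py exposes plain element-wise functions (the repo's actual API may differ, and PReLU's alpha is normally a learned parameter rather than the fixed default used here). The SELU constants are the standard ones from Klambauer et al. (2017).

```python
import numpy as np

def prelu(x, alpha=0.01):
    # Parametric ReLU: identity for x > 0, alpha * x otherwise.
    # alpha is usually learned during training; a fixed default is used here.
    return np.where(x > 0, x, alpha * x)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the fixed constants from Klambauer et al. (2017),
    # chosen so that activations self-normalize across layers.
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1))

def binary_step(x):
    # Binary step: 1 for non-negative inputs, 0 otherwise.
    return np.where(x >= 0, 1.0, 0.0)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 gives the SiLU variant.
    return x / (1 + np.exp(-beta * x))
```

If activation.py also provides derivatives for backpropagation, matching gradient functions would need to be added alongside these.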

@parva-jain Sure, go ahead, you can work on them. Make sure your code is correct and follows PEP 8 formatting.

LGTM too.