RoboticsClubIITJ / ML-DL-implementation

An implementation of ML and DL algorithms from scratch in Python using nothing but NumPy and Matplotlib.

Convert Activations from simple functions to classes and add gradient method.

rohansingh9001 opened this issue · comments

Currently, all the activations in activations.py are simple functions.

However, for future implementation of Neural Networks, we will also need derivatives methods of each of these
functions. You can have a look into loss_func.py for reference. There, each class represents a loss function and has both loss
and derivative methods. You have to implement something similar for activations.
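A minimal sketch of what such a conversion might look like, assuming the same class-based pattern described above for loss_func.py (the class name, `activation`, and `derivative` method names here are illustrative assumptions, not the repo's actual API):

```python
import numpy as np

# Hypothetical sketch of a class-based activation, mirroring the
# loss/derivative pattern described for loss_func.py.
class Sigmoid:
    @staticmethod
    def activation(x):
        # sigmoid(x) = 1 / (1 + e^(-x))
        return 1.0 / (1.0 + np.exp(-x))

    @staticmethod
    def derivative(x):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
        s = Sigmoid.activation(x)
        return s * (1.0 - s)

# Example usage on a NumPy array:
x = np.array([-1.0, 0.0, 1.0])
y = Sigmoid.activation(x)
dy = Sigmoid.derivative(x)
```

Grouping the function and its gradient in one class lets a future neural-network backward pass call `derivative` generically, without knowing which activation it is using.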

Converting all activations is not a requirement; you may implement only one, or as many as you prefer. Each implementation should be in a different PR. If you make a single PR implementing more than one function, I will only give you points for one easy contribution. So make sure you capitalise on this 😉.

Hey @rohansingh9001, I have raised a PR after converting the sigmoid function. Please review it and notify me if there are any changes to be made so that I can convert the rest accordingly.

Hey @rohansingh9001, @agrawalshubham01, could you reopen this issue since it's not complete, or should I open another issue for this?

@Abjcodes now you can contribute here.

Hey @rohansingh9001, @agrawalshubham01, I noticed there are still some activations that are yet to be converted. Should I raise multiple PRs or only one?

@Abjcodes raise a single PR, as it will be easier for us to take a close look at multiple models simultaneously.

Hey @kwanit1142 @rohansingh9001 @agrawalshubham01, please check the PR. Also, I need some clarification regarding the gradient of the Heaviside function. Is it okay to include DiracDelta from the sympy module?
I will fix the failing checks along with any changes that need to be made.
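For reference, the analytic derivative of the Heaviside step is the Dirac delta: zero everywhere except at x = 0, where it is unbounded. A pure-NumPy alternative to sympy's DiracDelta is to return the almost-everywhere derivative (all zeros). A hedged sketch, with the class and method names assumed to follow the same pattern as the other conversions:

```python
import numpy as np

# Hypothetical sketch: the true derivative of the Heaviside step is the
# Dirac delta, which is 0 for x != 0 and undefined (unbounded) at x = 0.
# Returning zeros keeps the implementation NumPy-only, avoiding sympy.
class Heaviside:
    @staticmethod
    def activation(x):
        # np.heaviside(x, 1.0): 0 for x < 0, 1 for x > 0, and 1.0 at x == 0
        return np.heaviside(x, 1.0)

    @staticmethod
    def derivative(x):
        # Almost-everywhere derivative: 0 for all x != 0.
        return np.zeros_like(np.asarray(x, dtype=float))
```

The trade-off is that a zero gradient blocks backpropagation through this unit entirely; that is inherent to the step function itself, which is one reason smooth activations like sigmoid are preferred when gradients are needed.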