Gradient-Descent

A white-box implementation of a gradient descent linear regression model.

Gradient descent is one of the most commonly used optimization algorithms for training machine learning models; it works by iteratively minimizing the error between actual and predicted results. It is also widely used to train neural networks.

What is Gradient Descent or Steepest Descent?

Gradient descent was first proposed by Augustin-Louis Cauchy in 1847, in the mid-19th century. It is one of the most commonly used iterative optimization algorithms in machine learning, applied to train both machine learning and deep learning models by finding a local minimum of a differentiable function.


The main objective of the gradient descent algorithm is to minimize the cost function iteratively. To achieve this, it repeats two steps:

  • Calculate the first-order derivative of the cost function to obtain the gradient (slope) at the current point.
  • Move in the direction opposite to the gradient, stepping away from the current point by alpha times the gradient, where alpha is the learning rate: a tuning parameter that controls the step size (see the sketch below).
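Concretely, each iteration applies the update w ← w − alpha · ∇J(w). A minimal sketch of this loop for linear regression on the mean squared error cost (the function and variable names here are illustrative, not taken from the repository's notebook):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, n_iters=1000):
    """Fit y ≈ X @ w + b by batch gradient descent on the MSE cost."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        y_pred = X @ w + b          # current predictions
        error = y_pred - y          # residuals at the current point
        # First-order derivatives (gradients) of the MSE cost
        dw = (2 / n_samples) * (X.T @ error)
        db = (2 / n_samples) * error.sum()
        # Step opposite to the gradient, scaled by the learning rate
        w -= alpha * dw
        b -= alpha * db
    return w, b
```

A typical call would look like `w, b = gradient_descent(X_train, y_train, alpha=0.01, n_iters=1000)`; too large an alpha can make the updates diverge, while too small an alpha makes convergence slow.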

What is a Cost Function?

The cost function measures the error between the actual values and the values predicted at the current position, expressed as a single real number.
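For linear regression the usual choice is the mean squared error (MSE); a small sketch, assuming the same NumPy-style data as in the loop above:

```python
import numpy as np

def mse_cost(X, y, w, b):
    """Mean squared error between predictions X @ w + b and targets y."""
    residuals = (X @ w + b) - y
    return np.mean(residuals ** 2)
```

Tracking this value across iterations is a simple way to check that gradient descent is actually converging.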

Output


From this output we can see that the scikit-learn model performs best, while the gradient descent model performs worst.

About

A white-box implementation of a gradient descent linear regression model.

License: Apache License 2.0


Languages

  • Jupyter Notebook 97.2%
  • Python 2.8%