108mk / Gradient_Descent

As part of optimization techniques for ML, I implemented the gradient descent technique with a least-squares function as the objective. I have supplemented it with initial data visualization via a correlation matrix and pairplots (via seaborn).


Gradient_Descent for 'Least Square Method'

As part of optimization techniques for ML, I implemented the gradient descent technique to optimize a least-squares objective function. The goal is to predict a person's height from their weight (and gender) via linear regression. I have supplemented the analysis with data visualization using a correlation matrix and pairplots (via seaborn).

Data-set I: Height prediction from weight (and gender).


Model: Linear Regression.
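As a sketch of how the model and objective fit together, a minimal gradient-descent fit of the linear model might look like the following. The synthetic height/weight/gender data, the step size, and the iteration count are all illustrative assumptions, since the repository's actual data set is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the data set: weight in kg, gender as a
# 0/1 indicator, height in cm (synthetic, for illustration only).
n = 200
weight = rng.normal(70, 12, n)
gender = rng.integers(0, 2, n)
height = 110 + 0.8 * weight + 12 * gender + rng.normal(0, 4, n)

X = np.column_stack([np.ones(n), weight, gender])  # design matrix with intercept
y = height

def loss(theta):
    """Least-squares objective f(theta) = ||X theta - y||^2 / (2n)."""
    r = X @ theta - y
    return (r @ r) / (2 * n)

def grad(theta):
    """Gradient of the least-squares objective."""
    return X.T @ (X @ theta - y) / n

theta = np.zeros(3)
gamma = 1e-4              # small fixed step size, chosen for this sketch
for _ in range(5000):
    theta -= gamma * grad(theta)
```

After the loop, `theta` holds the intercept, weight, and gender coefficients of the fitted linear model.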

Data Visualization: PairPlot and Correlation plot


Convergence Analysis: Two different learning rates ($\gamma$) give drastically different convergence.

With $\gamma = 0.1$ (a trial learning rate):


With $\gamma = \frac{1}{L}$, where $L$ is the smoothness constant of the least-squares objective (the Lipschitz constant of its gradient):

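The gap between the two runs can be reproduced in a few lines. For $f(\theta) = \frac{1}{2n}\|X\theta - y\|^2$ the Hessian is $X^\top X / n$, and $L$ is its largest eigenvalue; gradient descent decreases $f$ monotonically for $\gamma \le 1/L$ and diverges for $\gamma > 2/L$. The sketch below uses un-normalized synthetic data (an assumption; the repository's actual data and feature scaling may differ), on which $\gamma = 0.1$ is far above $2/L$ and blows up, while $\gamma = 1/L$ decreases steadily:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
weight = rng.normal(70, 12, n)          # synthetic stand-in data
gender = rng.integers(0, 2, n)
height = 110 + 0.8 * weight + 12 * gender + rng.normal(0, 4, n)
X = np.column_stack([np.ones(n), weight, gender])
y = height

# Smoothness constant L = largest eigenvalue of the Hessian X^T X / n
H = X.T @ X / n
L = np.linalg.eigvalsh(H).max()

def run_gd(gamma, iters):
    """Run gradient descent and record the loss at each iterate."""
    theta = np.zeros(3)
    losses = []
    for _ in range(iters):
        r = X @ theta - y
        losses.append((r @ r) / (2 * n))
        theta -= gamma * X.T @ r / n
    return losses

stable = run_gd(1.0 / L, iters=200)   # guaranteed monotone decrease
unstable = run_gd(0.1, iters=20)      # here 0.1 > 2/L, so the loss explodes
```

Plotting `stable` and `unstable` against the iteration index reproduces the qualitative picture above: the trial rate is only safe when the features happen to be scaled so that $0.1 \le 2/L$, whereas $\gamma = 1/L$ adapts to the data.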

About


License: MIT License


Languages

Jupyter Notebook 99.6%, Python 0.4%