c-c-c-c / optimizers.numpy


Various Optimizers based on Gradient Descent

  • Final update: 2018. 12. 15.
  • All rights reserved © Il Gu Yi 2018

Educational Purpose

  • Implementation of various optimization algorithms based on gradient descent
  • Uses only NumPy; no deep learning frameworks such as TensorFlow
  • Low-level implementation of each algorithm
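In that spirit, the core update rule can be written with NumPy alone. The sketch below is illustrative (it is not code from the notebooks): plain gradient descent on a one-dimensional quadratic, where the gradient is supplied analytically.

```python
import numpy as np

# Illustrative sketch, not the repository's code:
# minimize f(w) = (w - 3)^2 with plain gradient descent.
def grad(w):
    return 2.0 * (w - 3.0)  # analytic gradient of (w - 3)^2

w = np.array(0.0)   # initial parameter
lr = 0.1            # learning rate
for _ in range(100):
    w = w - lr * grad(w)  # the gradient-descent update rule

print(w)  # converges toward the minimum at w = 3
```

Every optimizer in the repository's family (momentum, RMSProp, Adam, ...) is a variation on that single update line.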

Getting Started

Prerequisites

  • Python 3.6
    • numpy, matplotlib
  • Jupyter notebook
  • OS X and Linux (not validated on Windows, but it will probably work)

Contents

Linear Regression using Gradient Descent

Optimization of Beale Function using Various Gradient Descent Algorithms

Results

Optimization of Linear Regression using Various Gradient Descent Algorithms

(Animation: regression_all.gif)
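As a rough sketch of what the linear-regression experiment involves (synthetic data and hyperparameters here are my own assumptions, not taken from the notebooks): fit y = w·x + b by batch gradient descent on the mean squared error, using only NumPy.

```python
import numpy as np

# Hypothetical example, not the repository's code: linear regression
# on synthetic data, y = 2x + 1 plus small Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0 + 0.01 * rng.normal(size=100)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    error = (w * x + b) - y
    # gradients of the loss 0.5 * mean(error^2)
    dw = np.mean(error * x)
    db = np.mean(error)
    w -= lr * dw
    b -= lr * db

print(w, b)  # approaches the true slope 2 and intercept 1
```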

Optimization of Beale Function using Various Gradient Descent Algorithms

(Animation: all_test_optimizers.gif)
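For reference, the Beale function and one way to descend it (a sketch under my own assumptions about starting point and learning rate, not the repository's code): the function, its analytic gradient, and a plain gradient-descent loop. Its global minimum is f(3, 0.5) = 0; the fancier optimizers differ only in how they turn the gradient into an update.

```python
import numpy as np

# Beale function: f(x, y) = (1.5 - x + xy)^2 + (2.25 - x + xy^2)^2
#                         + (2.625 - x + xy^3)^2, minimum at (3, 0.5).
def beale(x, y):
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(x, y):
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * (2 * x * y) + 2 * t3 * (3 * x * y ** 2)
    return np.array([dx, dy])

p = np.array([1.0, 1.0])  # starting point (an arbitrary choice)
lr = 1e-3                 # small step size keeps plain GD stable here
for _ in range(5000):
    p = p - lr * beale_grad(p[0], p[1])

print(p, beale(p[0], p[1]))  # loss should have dropped from its start
```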


Author

Il Gu Yi

About

License: Apache License 2.0


Languages

Language: Jupyter Notebook 100.0%