TwiggyDaniels / HigherOrderOptimizationMethods


Iowa State University COMS 578X Fall 2019


HigherOrderOptimizationMethods

This project examines the implementation and convergence of several optimization algorithms: Gradient Descent, Heavy-Ball, Conjugate Gradient Descent, Nesterov Accelerated Gradient Descent, FISTA, and Barzilai-Borwein. All step sizes are determined either by an inexact backtracking line search or by an exact line search via golden-section search. The methods are compared across multiple vector sizes (5, 10, and 50), and each method's parameters were tuned via a limited grid search.
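
As a point of reference, the sketch below shows the simplest of these configurations, gradient descent with a backtracking (Armijo) line search, applied to a small quadratic. It is a minimal illustration and not the code from this repository; the function names, the test problem, and all constants are assumptions made for the example.

# Minimal sketch (not this repository's implementation): gradient descent on a
# quadratic f(x) = 0.5 x^T A x - b^T x with a backtracking (Armijo) line search.
import numpy as np

def backtracking_line_search(f, grad, x, direction, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * direction) > fx + c * alpha * gx.dot(direction):
        alpha *= rho
    return alpha

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with step sizes chosen by backtracking."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        direction = -g                       # steepest-descent direction
        step = backtracking_line_search(f, grad, x, direction)
        x = x + step * direction
    return x

if __name__ == "__main__":
    n = 5                                    # one of the vector sizes compared here (5, 10, 50)
    rng = np.random.default_rng(0)
    M = rng.standard_normal((n, n))
    A = M.T @ M + n * np.eye(n)              # symmetric positive definite
    b = rng.standard_normal(n)

    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b

    x_star = gradient_descent(f, grad, np.zeros(n))
    print("||A x* - b|| =", np.linalg.norm(A @ x_star - b))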

Prerequisites

Python 3.6+

pip3 (or another package manager)

Recommended:

A Linux distribution

Installing

A step-by-step series of examples that tells you how to get a development environment running.

Install the necessary Python packages

pip3 install -r requirements.txt

Running

All experiments are currently split by vector size. Please note that they are subject to change as the repository is cleaned up.

To perform a grid search for a vector size (replace # with the vector size: 5, 10, or 50):

python3 size_n#_search.py

To run a set of experiments with tuned parameters for a vector size:

python3 size_n#_test.py

TODO

  • Extensive documentation overhaul (mainly comments on the methods; the helpers are mostly documented already)
  • Consolidate the experiments into notebooks
  • Improve directory structure
