taoyao1221 / GOSE

A local-minima-finding algorithm that can escape from saddle points in one step

Gradient descent with One-Step Escaping (GOSE)

This repository contains PyTorch code implementing the one-step escape method (a negative curvature descent step) from the paper Saving Gradient and Negative Curvature Computations: Finding Local Minima More Efficiently. We combine the one-step escape method with Adam (Adam: A Method for Stochastic Optimization) for training deep networks.
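To illustrate the idea of a negative curvature descent step, here is a minimal self-contained sketch (not the repository's implementation) on the toy saddle f(x, y) = x² − y²: at the saddle point the gradient vanishes and plain gradient descent stalls, but one step along the Hessian's most negative eigenvector escapes immediately. The function names (`negative_curvature_step`, etc.) are illustrative assumptions; the paper's method avoids full eigendecompositions, which are used here only for clarity.

```python
import numpy as np

def f(p):
    # Toy saddle: f(x, y) = x^2 - y^2, with a saddle point at (0, 0).
    x, y = p
    return x * x - y * y

def grad(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def hessian(_):
    # Hessian of f is constant: diag(2, -2); one negative eigenvalue.
    return np.diag([2.0, -2.0])

def negative_curvature_step(p, eta=0.5):
    """One escape step: if the Hessian has a negative eigenvalue, move
    along its eigenvector (sign aligned against the gradient); otherwise
    fall back to an ordinary gradient descent step."""
    eigvals, eigvecs = np.linalg.eigh(hessian(p))
    if eigvals[0] >= 0:
        return p - eta * grad(p)      # no negative curvature here
    v = eigvecs[:, 0]                 # direction of most negative curvature
    if grad(p) @ v > 0:               # resolve the sign ambiguity of v
        v = -v
    return p + eta * v

saddle = np.array([0.0, 0.0])
escaped = negative_curvature_step(saddle)
# f decreases after the escape step even though grad(saddle) is exactly zero.
```

In the repository this kind of escape step is interleaved with Adam updates: Adam handles ordinary descent, and the negative curvature step is used where the gradient alone gives no progress.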


