There are 30 repositories under the second-order-optimization topic.
PyHessian is a PyTorch library for second-order-based analysis and training of neural networks (a Hessian-vector-product sketch appears after this list)
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
A C++ interface to formulate and solve linear, quadratic, and second-order cone problems.
An implementation of the PSGD Kron second-order optimizer for PyTorch
Distributed K-FAC Preconditioner for PyTorch
FEDL: Federated Learning algorithm using TensorFlow (Transactions on Networking 2021)
This repository implements FEDL using PyTorch
PyTorch implementation of the Hessian-free optimizer
Implementation of PSGD optimizer in JAX
Hessian-based stochastic optimization in TensorFlow and Keras
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers; see the closure-based training-loop sketch after this list.
Federated learning using PyTorch. Second-order optimization for federated learning (IEEE Transactions on Parallel and Distributed Systems 2022)
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
Subsampled Riemannian trust-region (RTR) algorithms
Prototyping of matrix-free Newton methods in Julia
An efficient and easy-to-use Theano implementation of the stochastic Gauss-Newton method for training deep neural networks.
Newton’s second-order optimization methods in Python; see the Newton-step sketch after this list.
The repository contains code to reproduce the experiments from our paper "Error Feedback Can Accurately Compress Preconditioners".
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning"
A curated list of resources for second-order stochastic optimization
Modular optimization library for PyTorch. I changed the API a lot, and that will soon be pushed here 💀
NG+: A new second-order optimizer for deep learning
Sophia optimizer, further projected towards flat areas of the loss landscape
A collection of second-order optimizers and experiments in JAX
Concepts and algorithms in core learning theory
Regularization, Bayesian model selection, and k-fold cross-validation for model selection
Discussion of the advantages and disadvantages of AdaHessian, a state-of-the-art second-order method, over first-order methods on a non-convex optimization problem (digit classification on the MNIST database using ResNet18). @ EPFL
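For readers new to the topic, here is a minimal, self-contained sketch of the Hessian-vector-product (HVP) primitive that Hessian-based tools such as PyHessian build on: power iteration over HVPs estimates the top eigenvalue of the loss Hessian without ever forming the Hessian explicitly. The tiny linear model and the loop structure are illustrative assumptions, not any library's actual API.

```python
import torch

model = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)

params = [p for p in model.parameters() if p.requires_grad]
# Backward pass with create_graph=True so the gradient itself can be
# differentiated again (double backprop).
grads = torch.autograd.grad(loss, params, create_graph=True)

# Power iteration: repeatedly apply the Hessian to a unit vector; the
# Rayleigh quotient v^T H v converges to the top Hessian eigenvalue.
v = [torch.randn_like(p) for p in params]
norm = torch.sqrt(sum((vi ** 2).sum() for vi in v))
v = [vi / norm for vi in v]
for _ in range(20):
    gv = sum((g * vi).sum() for g, vi in zip(grads, v))
    hv = torch.autograd.grad(gv, params, retain_graph=True)  # H @ v
    eigenvalue = sum((h * vi).sum() for h, vi in zip(hv, v)).item()  # v^T H v
    norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
    v = [h / (norm + 1e-12) for h in hv]

print(f"estimated top Hessian eigenvalue: {eigenvalue:.4f}")
```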
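Most second-order PyTorch optimizers follow the closure-based torch.optim interface, because a single step may need to re-evaluate the loss several times (e.g. for a line search). Below is a minimal sketch of that loop using the built-in torch.optim.LBFGS as a stand-in; whether a given package in this list is a drop-in replacement on the same loop is an assumption to check against its README.

```python
import torch

model = torch.nn.Linear(10, 1)
x, y = torch.randn(64, 10), torch.randn(64, 1)
opt = torch.optim.LBFGS(model.parameters(), lr=0.1)

def closure():
    # The optimizer may call this several times per step, hence the closure.
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss

for step in range(10):
    loss = opt.step(closure)
    print(step, loss.item())
```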
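Finally, a minimal sketch of a full Newton step, the update the Newton-method repositories above revolve around: solve H d = g for the exact Hessian H and gradient g, then take w <- w - d. The Rosenbrock objective is an illustrative choice, not taken from any of the repositories.

```python
import torch

def f(w):
    # 2-D Rosenbrock function; minimum at (1, 1).
    return (1 - w[0]) ** 2 + 100 * (w[1] - w[0] ** 2) ** 2

w = torch.zeros(2)
for _ in range(15):
    g = torch.autograd.functional.jacobian(f, w)  # gradient, shape (2,)
    H = torch.autograd.functional.hessian(f, w)   # Hessian, shape (2, 2)
    # Pure Newton update; in practice damping or a line search is needed
    # once H can be indefinite.
    w = w - torch.linalg.solve(H, g)

print(w)  # converges to tensor([1., 1.])
```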