There are 23 repositories under the optimizer topic.
Linux system optimizer and monitoring tool - https://oguzhaninan.github.io/Stacer-Web
torch-optimizer -- a collection of optimizers for PyTorch (see the usage sketch after this list)
On the Variance of the Adaptive Learning Rate and Beyond
An implementation of React v15.x that optimizes for small script size
GLSL optimizer based on Mesa's GLSL compiler. Formerly used in Unity for mobile shader optimization.
Virtual-machine Translation Intermediate Language
Portfolio optimization and back-testing.
The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training”
Heimer is a simple cross-platform mind map, diagram, and note-taking tool written in Qt.
Scour - An SVG Optimizer / Cleaner
An Artifact optimizer for Genshin Impact.
Linux Optimizer
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)
Effortless plug-and-play optimizer to cut model training costs by 50%; a new optimizer that is 2x faster than Adam on LLMs.
RyTuneX is a Windows optimizer built with the WinUI 3 framework, designed to improve the performance of Windows 10 and 11 devices.
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for PyTorch (see the minimal sketch after this list)
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for efficient deployment on constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
A tool to automate and optimize DraftKings and FanDuel lineup construction.
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
Implementation of the Adan (ADAptive Nesterov momentum algorithm) optimizer in PyTorch
Writing eBPF programs with Elixir!
A Honkai Star Rail optimizer, relic scorer, damage calculator, and various other tools for building and gearing characters
Code for "Adam-mini: Use Fewer Learning Rates To Gain More" (https://arxiv.org/abs/2406.16793)
Optimizer, LR scheduler, and loss function collections in PyTorch
Explore energy-efficient dataflow scheduling for neural networks.
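Several of the repositories above (torch-optimizer, TorchOpt, Adan, Adam-mini, pytorch-optimizer) expose drop-in replacements for the optimizers in torch.optim. Below is a minimal usage sketch for the torch-optimizer entry, assuming the package is importable as torch_optimizer and that RAdam is among its exported optimizers; the model, data, and hyperparameters are placeholder choices, not values recommended by the repository:

```python
# Hedged sketch: swapping a torch-optimizer optimizer into a standard
# PyTorch training loop. Assumes `pip install torch_optimizer` and that
# RAdam is exported by the package; model and data are toy placeholders.
import torch
import torch_optimizer as optim

model = torch.nn.Linear(10, 1)                      # toy model
optimizer = optim.RAdam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

x = torch.randn(32, 10)                             # dummy batch
y = torch.randn(32, 1)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

The same loop works with any other optimizer from such a collection by swapping only the constructor call.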
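The Lookahead entry summarizes its rule as "k steps forward, 1 step back": an inner optimizer takes k fast steps, then the slow weights move toward the fast weights by a factor alpha and the fast weights are reset to the slow ones. A minimal sketch of that rule as a wrapper around any torch.optim optimizer follows; the class name, constructor arguments, and defaults are illustrative assumptions, not the linked repository's exact API:

```python
import torch

class Lookahead:
    """Minimal sketch of the Lookahead rule: k fast steps, then one slow
    interpolation step (not the linked repository's exact interface)."""

    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.base = base_optimizer
        self.k = k
        self.alpha = alpha
        self.step_count = 0
        # Slow weights start as detached copies of the current (fast) parameters.
        self.slow_weights = [
            [p.clone().detach() for p in group["params"]]
            for group in base_optimizer.param_groups
        ]

    def zero_grad(self):
        self.base.zero_grad()

    def step(self):
        self.base.step()                        # one fast step of the inner optimizer
        self.step_count += 1
        if self.step_count % self.k == 0:       # every k steps ...
            for group, slow_group in zip(self.base.param_groups, self.slow_weights):
                for p, slow in zip(group["params"], slow_group):
                    # slow <- slow + alpha * (fast - slow); fast <- slow
                    slow += self.alpha * (p.data - slow)
                    p.data.copy_(slow)
```

Wrapping, for example, torch.optim.Adam(model.parameters()) in this class and calling zero_grad()/step() as usual reproduces the k-forward, one-back schedule.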