
[BMM 24-25] HippoTrainer: Gradient-Based Hyperparameter Optimization

Home page: https://intsystems.github.io/hippotrainer/

Repository: https://github.com/intsystems/hippotrainer

HippoTrainer

Gradient-Based Hyperparameter Optimization for PyTorch πŸ¦›

Built on PyTorch, inspired by Optuna.

HippoTrainer is a PyTorch-compatible library for gradient-based hyperparameter optimization, implementing cutting-edge algorithms that leverage automatic differentiation to efficiently tune hyperparameters.
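
Concretely, these methods treat hyperparameter tuning as a bilevel optimization problem,

$$
\min_{\lambda}\ \mathcal{L}_{\mathrm{val}}\bigl(w^{*}(\lambda)\bigr)
\quad \text{s.t.} \quad
w^{*}(\lambda) = \operatorname*{arg\,min}_{w}\ \mathcal{L}_{\mathrm{train}}(w, \lambda),
$$

where $w$ are the model weights and $\lambda$ are the hyperparameters. The algorithms listed below differ in how they approximate the hypergradient $\nabla_{\lambda}\,\mathcal{L}_{\mathrm{val}}(w^{*}(\lambda))$.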

πŸ“¬ Assets

  1. Technical Meeting 1 - Presentation
  2. Technical Meeting 2 - Jupyter Notebook
  3. Technical Meeting 3 - Jupyter Notebook
  4. Documentation
  5. Tests
  6. Blog Post

πŸš€ Features

  • Algorithm Zoo: T1-T2, Neumann, HOAG, DrMAD
  • PyTorch Native: Direct integration with torch.nn.Module (see the sketch after this list)
  • Memory Efficient: Checkpointing & implicit differentiation
  • Scalable: From laptop to cluster with PyTorch backend
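
To make the PyTorch-native idea concrete, here is a minimal, self-contained sketch of one-step unrolled hypergradient descent on a weight-decay hyperparameter (the T1-T2 idea listed below). All names here are illustrative; this is plain PyTorch, not HippoTrainer's actual API:

```python
import torch
import torch.nn as nn

# Illustrative sketch (not HippoTrainer's API): tune weight decay by gradient.
model = nn.Linear(10, 1)
log_wd = torch.tensor(0.0, requires_grad=True)  # log weight decay, a differentiable "hyper-tensor"

x_tr, y_tr = torch.randn(32, 10), torch.randn(32, 1)
x_val, y_val = torch.randn(32, 10), torch.randn(32, 1)

params = dict(model.named_parameters())

def train_loss(p):
    pred = torch.func.functional_call(model, p, (x_tr,))
    reg = log_wd.exp() * sum(w.pow(2).sum() for w in p.values())
    return nn.functional.mse_loss(pred, y_tr) + reg

# One unrolled SGD step, keeping the graph so gradients flow back to log_wd.
lr = 0.1
grads = torch.autograd.grad(train_loss(params), list(params.values()), create_graph=True)
new_params = {k: w - lr * g for (k, w), g in zip(params.items(), grads)}

# Differentiate the validation loss at the updated weights w.r.t. the hyperparameter.
val_loss = nn.functional.mse_loss(torch.func.functional_call(model, new_params, (x_val,)), y_val)
(hypergrad,) = torch.autograd.grad(val_loss, log_wd)
with torch.no_grad():
    log_wd -= 0.01 * hypergrad  # one hyperparameter update
```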

πŸ“œ Algorithms

  • T1-T2 (Paper): One-step unrolled optimization
  • Neumann (Paper): Implicit differentiation via truncated Neumann series (sketched after this list)
  • HOAG (Paper): Implicit differentiation via conjugate gradient
  • DrMAD (Paper): Memory-efficient piecewise-linear backpropagation
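
For a taste of the implicit-differentiation variants: the hypergradient involves an inverse-Hessian-vector product $H^{-1} v$, which the Neumann method approximates with the truncated series $H^{-1} \approx \alpha \sum_{k=0}^{K} (I - \alpha H)^{k}$. Below is a hedged sketch of that approximation, valid under the assumption $\lVert I - \alpha H \rVert < 1$; it is our illustration, not the library's internals:

```python
import torch

def neumann_inverse_hvp(train_loss, params, v, num_terms=20, alpha=0.01):
    # Approximate H^{-1} v via the truncated Neumann series
    #   H^{-1} = alpha * sum_k (I - alpha*H)^k   (valid when ||I - alpha*H|| < 1),
    # using only Hessian-vector products: the Hessian is never materialized.
    grads = torch.autograd.grad(train_loss, params, create_graph=True)
    p = [vi.clone() for vi in v]    # current term (I - alpha*H)^k v, starting at k = 0
    acc = [vi.clone() for vi in v]  # running partial sum of the series
    for _ in range(num_terms):
        hvp = torch.autograd.grad(grads, params, grad_outputs=p, retain_graph=True)
        p = [pi - alpha * hi for pi, hi in zip(p, hvp)]
        acc = [ai + pi for ai, pi in zip(acc, p)]
    return [alpha * ai for ai in acc]
```

HOAG instead solves $Hx = v$ with conjugate gradient, and DrMAD avoids storing the full training trajectory by replacing it with a piecewise-linear interpolation, trading exactness for memory.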

🀝 Contributors

  • Daniil Dorin (Basic code writing, Final demo, Algorithms)
  • Igor Ignashin (Project wrapping, Documentation writing, Algorithms)
  • Nikita Kiselev (Project planning, Blog post, Algorithms)
  • Andrey Veprikov (Tests writing, Documentation writing, Algorithms)
  • We welcome contributions!

πŸ“„ License

HippoTrainer is MIT licensed. See LICENSE for details.
