lrhammond / hypergrad

Simple and extensible hypergradient for PyTorch [modified for Stackelberg RL]

Home Page: https://mosko.tokyo/hypergrad


hypergrad


Simple and extensible hypergradient for PyTorch

Installation

First, install torch and the matching torchvision build following the official PyTorch instructions. Then,

pip install hypergrad

Methods

Implicit hypergradient approximation (via approximate inverse Hessian-vector products)

Implementation of these methods can be found in hypergrad/approximate_ihvp.py
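To illustrate the idea behind these methods (not the library's actual API), here is a minimal sketch of one standard way to approximate an inverse Hessian-vector product: a truncated Neumann series. The function name `neumann_ihvp` and the toy matrices are hypothetical, chosen only for demonstration; the library operates on PyTorch models rather than explicit matrices.

```python
import numpy as np

def neumann_ihvp(H, v, lr=0.4, k=100):
    """Approximate H^{-1} v with the truncated Neumann series
    lr * sum_{i=0}^{k} (I - lr*H)^i v,
    which converges when the eigenvalues of lr*H lie in (0, 2)."""
    p = v.copy()    # running term (I - lr*H)^i v
    acc = v.copy()  # partial sum of the series
    for _ in range(k):
        p = p - lr * (H @ p)  # multiply by (I - lr*H) without forming it
        acc += p
    return lr * acc

# Toy positive-definite "Hessian" and vector; in practice H is never
# materialized and H @ p is replaced by a Hessian-vector product.
H = np.array([[2.0, 0.0], [0.0, 1.0]])
v = np.array([1.0, 1.0])
print(neumann_ihvp(H, v))  # ≈ H^{-1} v = [0.5, 1.0]
```

In a real implicit-differentiation setting, the matrix-vector product `H @ p` is computed with automatic differentiation (e.g. `torch.autograd.grad` with `create_graph=True`), so the Hessian is never stored.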

Citation

To cite this repository,

@software{hypergrad,
    author = {Ryuichiro Hataya},
    title = {{hypergrad}},
    url = {https://github.com/moskomule/hypergrad},
    year = {2023}
}

hypergrad is developed as a part of the following research projects:

@inproceedings{hataya2023nystrom,
    author = {Ryuichiro Hataya and Makoto Yamada},
    title = {{Nystr\"om Method for Accurate and Scalable Implicit Differentiation}},
    booktitle = {AISTATS},
    year = {2023},
}

About

License: MIT


Languages: Python 100.0%