MultiLR

A method for assigning separate learning rate schedulers to different parameter groups in a model. Pull requests are welcome.
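Parameter groups here are PyTorch's native mechanism for giving subsets of a model's weights their own optimizer settings. For reference, an optimizer with two groups might be set up like this (plain PyTorch, nothing specific to this repository):

import torch
from torch import nn

# A toy two-layer model; each layer will get its own parameter group.
model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))

# Each dict is one parameter group with its own base learning rate.
optimizer = torch.optim.SGD([
    {"params": model[0].parameters(), "lr": 0.1},
    {"params": model[1].parameters(), "lr": 0.01},
])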

Usage

Write a lambda function for each parameter group; each lambda takes the optimizer and constructs that group's scheduler.

scheduler = MultiLR(optimizer,
                    [lambda opt: torch.optim.lr_scheduler.StepLR(opt, step_size=10, gamma=0.5),
                     lambda opt: torch.optim.lr_scheduler.LinearLR(opt, start_factor=0.25, total_iters=10)])
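
Putting it together, a minimal training-loop sketch. It assumes MultiLR matches scheduler i to parameter group i and exposes the standard PyTorch scheduler step() method; the import path multilr is a placeholder for wherever the class lives in this repository:

# Hypothetical import path; adjust to this repository's actual module layout.
from multilr import MultiLR

# Using the optimizer and scheduler constructed above:
for epoch in range(20):
    # ... forward pass, loss.backward(), optimizer.step(), optimizer.zero_grad() ...
    scheduler.step()  # advance every underlying scheduler by one step

With this configuration, the first group's learning rate is halved every 10 epochs (StepLR), while the second group's ramps linearly from 0.25x its base rate to the full rate over the first 10 epochs (LinearLR).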

About

A method for assigning separate learning rate schedulers to different parameter groups in a model.

License: MIT License


Languages

Language: Python 100.0%