refactor `lr_scheduler`
whr94621 opened this issue · comments
The way we configured `lr_scheduler` before is a little messy:
- It should be configured under `optimizer_configs`, since this part is actually related to optimization. There are also too many options under `training_configs`.
- The `noam` scheduler lacks a `d_model` option.
- There are no docs for this part.
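For context on the `d_model` point: the noam schedule scales the whole learning-rate curve by the model dimension, so the scheduler cannot compute the rate without it. A minimal sketch (the function name, defaults, and `factor` argument here are illustrative, not the actual implementation in this repo):

```python
def noam_lr(step, d_model=512, warmup_steps=4000, factor=1.0):
    """Noam schedule: lr = factor * d_model^-0.5 * min(step^-0.5, step * warmup^-1.5).

    Rises linearly during warmup, then decays with the inverse square
    root of the step. `d_model` scales the entire curve, which is why
    the scheduler needs it as an option.
    """
    step = max(step, 1)  # avoid division by zero at step 0
    return factor * d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
```

With the defaults above, the rate peaks at `step == warmup_steps` and decays afterwards; doubling `d_model` shrinks every rate by a factor of `sqrt(2)`.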
So I have pushed a new branch with my refactoring of this part. Feel free to add comments below.