whr94621 / NJUNMT-pytorch


refactor `lr_scheduler`

whr94621 opened this issue · comments

The way we configured `lr_scheduler` before is a bit messy:

  • We should configure it under optimizer_configs, as this part is actually related to optimization; there are also already too many options under training_configs (see the sketch after this list)
  • Lack of a d_model option for the noam scheduler
  • Lack of docs
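
To make the proposal concrete, here is a rough sketch of what I have in mind; the key names and the helper function are illustrative and may not match the branch exactly. The noam schedule scales the learning rate by `d_model ** -0.5`, which is why the scheduler needs to know the model dimension.

```python
# Hypothetical config layout after the refactor: lr_scheduler is nested
# under optimizer_configs rather than training_configs. Key names are
# illustrative, not necessarily those used in the branch.
optimizer_configs = {
    "optimizer": "adam",
    "learning_rate": 2.0,
    "lr_scheduler": {
        "schedule_method": "noam",
        "d_model": 512,          # model dimension, required by the noam schedule
        "warmup_steps": 4000,
    },
}


def noam_lr(step, d_model=512, warmup_steps=4000, scale=1.0):
    """Noam schedule from "Attention Is All You Need":

    lr = scale * d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
    """
    step = max(step, 1)  # avoid step ** -0.5 blowing up at step 0
    return scale * d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
```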

So I have pushed a new branch with my refactoring of this part. Feel free to add comments below.