ming024 / FastSpeech2

An implementation of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech"

Fluctuating training loss

299792459b opened this issue

[Screenshot of loss curves]

Red is the training loss; orange is the validation loss.

Other datasets show the same behavior.

The training loss fluctuates up and down drastically. Does anyone know why this is happening? Thank you.

Try lowering the learning rate.

Which parameter in the config is actually the learning rate?
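
As far as I can tell, this implementation does not expose a fixed learning rate in the config; the effective LR is produced by a Transformer-style (Noam) warm-up schedule, so "lowering the learning rate" means changing the warm-up/annealing settings or the model hidden size. Below is a minimal sketch of such a schedule; the parameter names (`d_model`, `warm_up_step`, `anneal_steps`, `anneal_rate`) are assumptions for illustration, so check the optimizer code and `train.yaml` in this repo for the exact keys.

```python
import numpy as np

def noam_lr(step, d_model=256, warm_up_step=4000, anneal_steps=(), anneal_rate=0.3):
    """Illustrative Noam-style schedule (values here are examples, not taken
    from this repo's config). The LR rises linearly for warm_up_step steps,
    then decays proportionally to step**-0.5, optionally shrunk by
    anneal_rate after each step listed in anneal_steps."""
    lr = np.power(d_model, -0.5) * min(
        np.power(step, -0.5),
        step * np.power(warm_up_step, -1.5),
    )
    for s in anneal_steps:
        if step > s:
            lr *= anneal_rate
    return lr

# The peak LR is hit at step == warm_up_step; raising warm_up_step (or
# d_model) lowers that peak, which is one way to reduce the learning rate.
for step in (100, 4000, 50000, 300001):
    print(step, noam_lr(step, anneal_steps=(300000,)))
```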