calclavia / DeepJ

A deep learning model for style-specific music generation.

Home Page: https://arxiv.org/abs/1801.00887


Is it normal for the loss to increase a lot while training?

DefinitlyEvil opened this issue · comments

Epoch 60/1000
10954/10954 [==============================] - 659s - loss: 0.0475
Epoch 61/1000
10954/10954 [==============================] - 658s - loss: 0.0474
Epoch 62/1000
10954/10954 [==============================] - 658s - loss: 0.0473
Epoch 63/1000
10954/10954 [==============================] - 659s - loss: 0.0471
Epoch 64/1000
10954/10954 [==============================] - 658s - loss: 0.0472
Epoch 65/1000
10954/10954 [==============================] - 658s - loss: 0.0612
Epoch 66/1000
10954/10954 [==============================] - 659s - loss: 0.0958
Epoch 67/1000
10954/10954 [==============================] - 657s - loss: 0.0877

You can see that around epoch 66 the loss increased a lot. Does this mean hours of training were wasted? Sorry, I am a beginner at this... :(
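For what it's worth, a sudden loss spike like this is often caused by an optimizer step that overshoots (e.g. a too-large learning rate or an unusual batch), not necessarily by wasted training. A minimal toy sketch, using plain SGD on f(w) = w² (not DeepJ's actual model or optimizer), shows how step size alone decides whether the loss shrinks or blows up:

```python
def sgd_quadratic(lr, steps=20, w0=1.0):
    """Run plain SGD on f(w) = w^2 and return the loss at each step."""
    w = w0
    losses = []
    for _ in range(steps):
        losses.append(w * w)
        w -= lr * 2 * w  # gradient of w^2 is 2w
    return losses

stable = sgd_quadratic(lr=0.1)     # small step: loss shrinks every epoch
diverging = sgd_quadratic(lr=1.1)  # step overshoots the minimum: loss grows
```

In a real run the spike usually recovers after a few epochs, which matches the loss dropping from 0.0958 back toward 0.0877 here.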

The script shut down automatically after a few epochs, and when I restarted training it showed a loss around 0.047 again.

Hmm, that shouldn't happen. It could be over-training. The script stops automatically when the loss does not decrease.
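The "stops when loss does not decrease" behavior described above is classic early stopping with a patience window. A minimal sketch of that logic (an illustration of the general technique, not DeepJ's actual code):

```python
def should_stop(loss_history, patience=3):
    """Early-stopping check: return True when the best loss in the last
    `patience` epochs is no better than the best loss seen before them."""
    if len(loss_history) <= patience:
        return False
    best_before = min(loss_history[:-patience])
    return min(loss_history[-patience:]) >= best_before

# The spike above would trigger a stop with patience=3:
# 0.06, 0.0958, 0.0877 never beat the earlier best of 0.047.
should_stop([0.050, 0.047, 0.047, 0.060, 0.0958, 0.0877], patience=3)
```

This also explains why a restart resumes near 0.047: if the script checkpoints only the best weights, the spike epochs are never saved.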

And I restarted it xD

Oh, by the way, a lot of MIDI files contain raw embellishment notes such as trills (unlike scores exported from notation software), but those can confuse the network, so I think embellishment notes should be filtered out.
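One simple preprocessing heuristic for this is to drop notes shorter than some duration threshold, since trills and ornaments are typically much shorter than melodic notes. A sketch, assuming notes are represented as (pitch, start_sec, duration_sec) tuples and using an illustrative 0.1 s cutoff (not a value from DeepJ):

```python
def drop_embellishments(notes, min_duration=0.1):
    """Filter out very short notes (likely trills/ornaments).

    `notes` is a list of (pitch, start_sec, duration_sec) tuples;
    the 0.1 s threshold is a guess and would need tuning per dataset.
    """
    return [n for n in notes if n[2] >= min_duration]

melody = [(60, 0.0, 0.50),   # quarter-ish note, kept
          (62, 0.50, 0.04),  # trill note, dropped
          (61, 0.54, 0.04),  # trill note, dropped
          (64, 0.58, 0.50)]  # kept
drop_embellishments(melody)  # keeps only the two long notes
```

The downside is that fast but intentional passages (e.g. runs of sixteenth notes at high tempo) would also be removed, so a tempo-relative threshold may work better in practice.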