Are the pretrained weights frozen while training?
prateekmalhotra-hover opened this issue · comments
Hi! Great work! I just wanted to inquire in more detail whether, while training, you're freezing the old weights. Thanks!
Of course not.
I think the pretrained weights should be frozen (at least initially), or some of the advantage they provide will be lost. For instance, the images below show training/validation loss curves (on a small segmentation dataset) for three different scenarios. The best performance came from freezing the pretrained weights.
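For reference, here's a minimal sketch of how freezing a pretrained part of a model looks in PyTorch (assuming the repo uses PyTorch; the `backbone`/`head` names below are illustrative, not from this repo):

```python
import torch
import torch.nn as nn

# Hypothetical model: a pretrained "backbone" plus a freshly initialized "head".
class SegModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, 2, 1)

    def forward(self, x):
        return self.head(self.backbone(x))

model = SegModel()

# Freeze the pretrained part so its weights receive no gradient updates.
for p in model.backbone.parameters():
    p.requires_grad = False

# Hand the optimizer only the parameters that are still trainable.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

A common variant is to unfreeze the backbone again after a few epochs (with a lower learning rate) for full fine-tuning.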
Has the author already provided pretrained weights?