ternaus / TernausNet

UNet model with VGG11 encoder pre-trained on Kaggle Carvana dataset

Home Page: https://arxiv.org/abs/1801.05746

Are the pretrained weights frozen while training?

prateekmalhotra-hover opened this issue · comments

Hi! Great work! I just wanted to ask in more detail whether the pretrained encoder weights are kept frozen during training. Thanks!

Of course not.
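To make that default setup concrete, here is a minimal sketch of end-to-end fine-tuning. It assumes the UNet11 class from this repo's unet_models.py (which wraps torchvision's ImageNet-pretrained VGG11 features as its encoder) and a pretrained=True constructor flag; nothing is frozen, so every parameter is passed to the optimizer.

```python
import torch
from unet_models import UNet11  # model definition assumed to come from this repository

# Default behaviour described above: the pretrained VGG11 encoder is NOT frozen,
# so the encoder and decoder are fine-tuned together, end to end.
model = UNet11(pretrained=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
```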

I think the pretrained weights should be frozen (at least initially), or some of the advantage they provide will be lost. For instance, the curves below show training/validation loss on a small segmentation dataset for three scenarios; the best performance came from keeping the pretrained encoder frozen (see the sketch after the list).

  1. Frozen weights on pretrained vgg11 encoder:
     [loss curves: ternaus_frozen]

  2. Unfrozen weights on pretrained encoder:
     [loss curves: DRIVE_ternausnet_loss_curves]

  3. Randomly initialized (pretrained=False):
     [loss curves: DRIVE_ternausnet_loss_curves_pt_false]

Has the author already provided pretrained weights?