qubvel / segmentation_models.pytorch

Segmentation models with pretrained backbones. PyTorch.

Are SMP pretrained encoders' weights frozen by default? (since there's no parameter for that)

neotod opened this issue · comments

For SM Keras, there's a parameter for the encoder that you can set when you build the model class; it's documented here -> https://segmentation-models.readthedocs.io/en/latest/tutorial.html#fine-tuning

I wonder if such a thing exists in SMP (the PyTorch version) too, because not freezing the encoder's weights will ruin them during training.
I want to train only my decoder while the encoder's weights stay frozen (i.e., fine-tuning).
Are encoders' weights frozen by default?

I used this code to freeze the weights explicitly, and judging by GPU memory usage, they are not frozen by default.

# Freeze all encoder parameters so they receive no gradient updates
for param in model.encoder.parameters():
    param.requires_grad = False
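Beyond setting `requires_grad = False`, it also helps to pass only the trainable parameters to the optimizer. Here is a minimal, self-contained sketch of that pattern; `TinySegModel` is a hypothetical stand-in for an SMP model (which likewise exposes `model.encoder` and `model.decoder` submodules), not the library's actual API:

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for an SMP model: it has an
# "encoder" and a "decoder" submodule, mirroring SMP's structure.
class TinySegModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
        self.decoder = nn.Conv2d(8, 1, kernel_size=1)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinySegModel()

# Freeze the encoder, exactly as in the snippet above.
for param in model.encoder.parameters():
    param.requires_grad = False

# Hand the optimizer only the trainable (decoder) parameters, so the
# frozen encoder weights are skipped and no optimizer state is kept for them.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in model.parameters() if not p.requires_grad)
```

One caveat: freezing parameters does not stop BatchNorm layers in the encoder from updating their running statistics; calling `model.encoder.eval()` during training handles that separately if you need fully fixed encoder behavior.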

Just ran into this :D Would be great to have a line or two in the API docs about this.