Why freeze all bn (affine) layers?
lilichu opened this issue
Detectron.pytorch/lib/modeling/ResNet.py
Line 76 in 8315af3
```python
def _init_modules(self):
    # Freeze all bn (affine) layers !!!
    self.apply(lambda m: freeze_params(m) if isinstance(m, mynn.AffineChannel2d) else None)
```
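For context, here is a minimal, self-contained sketch of the pattern the snippet uses. The `AffineChannel2d` and `freeze_params` definitions below are illustrative re-implementations, not the actual Detectron.pytorch code: the repo's `AffineChannel2d` is a per-channel affine layer (`y = weight * x + bias`) that stands in for a frozen BatchNorm, and `freeze_params` is assumed to simply disable gradients on a module's parameters.

```python
import torch
import torch.nn as nn

class AffineChannel2d(nn.Module):
    """Illustrative stand-in for mynn.AffineChannel2d: a fixed per-channel
    affine transform that replaces BatchNorm at inference-style training."""
    def __init__(self, num_features):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x):
        return x * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)

def freeze_params(m):
    """Assumed behaviour of the repo's freeze_params helper:
    stop gradient flow to every parameter of module m."""
    for p in m.parameters():
        p.requires_grad = False

# Apply the same pattern as the snippet: freeze only the affine (ex-BN) layers.
model = nn.Sequential(nn.Conv2d(3, 8, 3), AffineChannel2d(8))
model.apply(lambda m: freeze_params(m) if isinstance(m, AffineChannel2d) else None)

# The conv parameters stay trainable; the affine parameters are frozen.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['0.weight', '0.bias']
```

`nn.Module.apply` visits every submodule recursively, so the lambda's `isinstance` check is what limits the freeze to the affine layers while leaving convolutions trainable.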