SoftwareGift / fixup

Fixup initialization implementation

Wide Residual Network with optional Fixup initialization

This code implements Fixup initialization as an option for a standard Wide ResNet. If BatchNorm and Fixup are both enabled, Fixup initialization is used together with the standard structure of the residual block.
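For orientation, the sketch below shows what a Fixup-style residual block without BatchNorm can look like, following the rules from the Fixup paper (zero-init of the last layer in each branch, depth-dependent rescaling of the first layer, and learned scalar biases and a multiplier). It is a minimal illustration, not the code from this repository; names such as FixupBasicBlock and num_blocks are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FixupBasicBlock(nn.Module):
    """Illustrative Fixup residual block (no BatchNorm).

    Fixup rules applied here:
      * scalar biases before each conv and activation, a scalar multiplier after conv2;
      * conv1 initialized with He init, then rescaled by L^(-1/(2m-2)) with
        m = 2 layers per branch, i.e. by num_blocks ** -0.5;
      * conv2 initialized to zero, so each block starts as the identity.
    """
    def __init__(self, in_planes, planes, stride, num_blocks):
        super().__init__()
        self.bias1a = nn.Parameter(torch.zeros(1))
        self.conv1 = nn.Conv2d(in_planes, planes, 3, stride, 1, bias=False)
        self.bias1b = nn.Parameter(torch.zeros(1))
        self.bias2a = nn.Parameter(torch.zeros(1))
        self.conv2 = nn.Conv2d(planes, planes, 3, 1, 1, bias=False)
        self.scale = nn.Parameter(torch.ones(1))
        self.bias2b = nn.Parameter(torch.zeros(1))
        # 1x1 conv shortcut only when the shape changes.
        self.shortcut = None
        if stride != 1 or in_planes != planes:
            self.shortcut = nn.Conv2d(in_planes, planes, 1, stride, bias=False)
        # Fixup initialization.
        nn.init.kaiming_normal_(self.conv1.weight, mode='fan_out', nonlinearity='relu')
        self.conv1.weight.data.mul_(num_blocks ** (-0.5))
        nn.init.zeros_(self.conv2.weight)

    def forward(self, x):
        out = self.conv1(x + self.bias1a)
        out = F.relu(out + self.bias1b)
        out = self.conv2(out + self.bias2a)
        out = out * self.scale + self.bias2b
        identity = x if self.shortcut is None else self.shortcut(x + self.bias1a)
        return F.relu(out + identity)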

Usage example:

python train.py --layers 40 --widen-factor 10 --batchnorm False --fixup True
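Here --layers 40 --widen-factor 10 selects a WRN-40-10, and --batchnorm False --fixup True trains it without BatchNorm, relying on Fixup initialization instead.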

Acknowledgment

Wide Residual Network by Sergey Zagoruyko and Nikos Komodakis

Fixup Initialization: Residual Learning Without Normalization by Hongyi Zhang, Yann N. Dauphin, Tengyu Ma

The Fixup implementation was originally introduced by Andy Brock

WRN code by xternalz
