Densenet: wrong structure of transition layer
crclz opened this issue · comments
According to the original DenseNet implementation, the transition layer should be BN-ReLU-Conv-Pool, but the code in this repository is BN-Conv-Pool. The ReLU is missing, which may hurt the accuracy of the model.
The DenseNet transition layer (from the paper's author):
https://github.com/liuzhuang13/DenseNet/blob/cf511e4add35a7d7a921901101ce7fa8f704aee2/models/densenet.lua#L37-L52
This repo: pytorch-cifar100/models/densenet.py, lines 47 to 58 at commit 2149cb5
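For reference, a minimal sketch of a transition layer with the BN-ReLU-Conv-Pool order used in the author's Lua implementation. The class and attribute names here are illustrative, not taken from this repository:

```python
import torch
import torch.nn as nn

class Transition(nn.Module):
    """Hypothetical corrected transition layer: BN-ReLU-Conv-Pool."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.down_sample = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),  # the ReLU this repo's version omits
            nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
            nn.AvgPool2d(kernel_size=2, stride=2),
        )

    def forward(self, x):
        return self.down_sample(x)
```

A 1×1 convolution then compresses the channel count while the 2×2 average pooling halves the spatial resolution, e.g. an input of shape (1, 8, 32, 32) maps to (1, 4, 16, 16) with `Transition(8, 4)`.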
By the way, the description in the paper may itself be misleading:
"The transition layers used in our experiments consist of a batch normalization layer and an 1×1 convolutional layer followed by a 2×2 average pooling layer."