BatchNorm handling issue during inference
lucasjinreal opened this issue · comments
MagicSource commented
When running inference on a single image (batch size = 1), I got this error:
mobilenetv3.py", line 199, in forward
out = self.hs3(self.bn3(self.linear3(out)))
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/batchnorm.py", line 76, in forward
exponential_average_factor, self.eps)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/functional.py", line 1619, in batch_norm
raise ValueError('Expected more than 1 value per channel when training, got input size {}'.format(size))
ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 1280])
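The error message itself says "when training": in training mode, BatchNorm needs more than one value per channel to compute batch statistics, so a batch of one fails, while eval mode uses the stored running statistics instead. A minimal sketch reproducing this (standalone, not the repository's model):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(1280)
x = torch.randn(1, 1280)  # batch size 1, as in the traceback

# Training mode: batch statistics cannot be computed from a single
# sample, so this raises the ValueError from the traceback.
bn.train()
try:
    bn(x)
except ValueError as e:
    print(e)

# Eval mode: running mean/var are used, so batch size 1 is fine.
bn.eval()
out = bn(x)
print(out.shape)  # torch.Size([1, 1280])
```

So one thing to check first is whether `model.eval()` was called before inference.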
zihaozhang9 commented
I had the same problem. How to solve this problem?
CConory commented
You can refer to https://arxiv.org/pdf/1905.02244.pdf: there is no BN in the SE module.
When AdaptiveAvgPool reduces the feature map to channels×1×1 and the batch size is one, batch normalization fails.
So simply removing the BN from the SE module fixes it.
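A BN-free SE block matching the paper's description would look roughly like this; the class names and reduction factor here are illustrative, not necessarily what this repository uses:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HSigmoid(nn.Module):
    """Hard-sigmoid activation used in MobileNetV3."""
    def forward(self, x):
        return F.relu6(x + 3.0) / 6.0

class SEModule(nn.Module):
    """Squeeze-and-Excitation block without BatchNorm (names illustrative).

    After AdaptiveAvgPool2d the feature map is C x 1 x 1; with BN removed,
    nothing here depends on the batch size, so batch size 1 works.
    """
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
            HSigmoid(),
        )

    def forward(self, x):
        # Channel-wise reweighting: scale x by the learned gate.
        return x * self.fc(self.avg_pool(x))

x = torch.randn(1, 64, 7, 7)  # batch size 1 is fine
out = SEModule(64)(x)
print(out.shape)  # torch.Size([1, 64, 7, 7])
```

Note that the traceback above actually points at `self.bn3` in the classifier head, so depending on where the failure occurs, `model.eval()` may still be needed even after the SE module is fixed.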
Hao commented
Same question here; interesting problem.
zihaozhang9 commented
I have received your email and will review it promptly.