Inconsistent parameters
iris0329 opened this issue
Hi,

As you mentioned in issue #17:
> For the Number of Parameters we used the built-in functions of PyTorch, like so:
>
> `sum(p.numel() for p in model.parameters() if p.requires_grad)`
>
> For the FLOPs we used this package: https://github.com/sovrasov/flops-counter.pytorch

*Originally posted by @TiagoCortinhal in #17 (comment)*
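For reference, here is a minimal, self-contained sketch of that built-in count applied to SalsaNext (the import path below is my assumption; adjust it to wherever the model class is defined in this repository):

```python
# Minimal sketch of the built-in PyTorch parameter count quoted above.
# Assumption: adjust this import to wherever SalsaNext is defined in your checkout.
from SalsaNext import SalsaNext

model = SalsaNext(nclasses=20)

# Count only trainable parameters.
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('Trainable parameters: {:,} (~{:.2f} M)'.format(n_params, n_params / 1e6))
```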
Following your advice, I also calculated the FLOPs and parameters myself. What is strange is that the parameter count I get is 6.71 M instead of the 6.73 M reported in the paper, while the FLOPs match the paper exactly.
Here is my code:
```python
import torch
from ptflops import get_model_complexity_info

from SalsaNext import SalsaNext  # assumption: adjust to your local module path

with torch.cuda.device(0):
    model = SalsaNext(nclasses=20)
    macs, params = get_model_complexity_info(model, (5, 64, 2048), as_strings=True,
                                             print_per_layer_stat=True, verbose=True)
    print('{:<30} {:<8}'.format('Computational complexity: ', macs))
    print('{:<30} {:<8}'.format('Number of parameters: ', params))

# Computational complexity:     62.84 GMac   (1 MAC = 2 FLOPs)
# Number of parameters:         6.71 M
```
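In case it helps to localize the 0.02 M gap, a quick cross-check (same import assumption as above) would be to request exact numbers from ptflops instead of rounded strings and compare them with the raw `numel` sum:

```python
from ptflops import get_model_complexity_info

from SalsaNext import SalsaNext  # assumption: adjust to your local module path

model = SalsaNext(nclasses=20)

# as_strings=False returns exact numbers instead of two-decimal strings.
_, params = get_model_complexity_info(model, (5, 64, 2048), as_strings=False,
                                      print_per_layer_stat=False, verbose=False)

numel = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('ptflops parameter count:', params)
print('numel sum              :', numel)
print('difference             :', numel - params)
```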
Do you have any suggestions for reproducing the parameter count reported in the paper?
Best,
Iris