Lyken17 / pytorch-OpCounter

Count the MACs / FLOPs of your PyTorch model.

Is the matmul operation automatically counted?

GostInShell opened this issue · comments

Is a torch.matmul operation used inside a module's forward pass automatically counted?

The matmul counter implemented here

```python
def counter_matmul(input_size, output_size):
```

seems to be used in

```python
def onnx_counter_matmul(diction, node):
```
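For context, a matmul MAC counter with that signature can be sketched as follows. This is a simplified stand-in, not necessarily THOP's exact implementation: it assumes `input_size` is the shape of the left operand and `output_size` is the shape of the result.

```python
def counter_matmul(input_size, output_size):
    # For A @ B with A of shape input_size = (..., n, m) and the
    # result of shape output_size = (..., n, p), each output element
    # needs m multiply-accumulates, so total MACs = prod(input_size) * p.
    macs = 1
    for d in input_size:
        macs *= d
    return macs * output_size[-1]
```

For example, multiplying an `(8, 16)` matrix by a `(16, 32)` matrix yields an `(8, 32)` result and `8 * 16 * 32 = 4096` MACs.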

Does this mean the FLOPs of matmul are only counted when profiling an ONNX model, rather than a PyTorch one?

Yes, it does.
The ONNX model counter works on operators, while the PyTorch model counter works on layers. You will get similar results with both, provided the relevant operators and layers have been implemented.
THOP's PyTorch model counter will not count a bare matmul call, but the matmul FLOPs are included when matmul is performed inside a supported layer (such as a convolution layer).

Thanks a lot! That perfectly solved my confusion!