Will torch.matmul be regarded as zero_ops?
DavideHe opened this issue · comments
DavideHe commented
When I count MACs with thop, I find that the MACs of torch.matmul are reported as 0, even though it is a very heavy operation in the self-attention of a Transformer:
import torch
import torch.nn as nn

class M(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        # attention-style score matrix: x @ x^T
        out = torch.matmul(x, x.transpose(-1, -2))
        print(out.shape)
        return out
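As I understand it, thop counts MACs by attaching hooks to nn.Module submodules, so functional calls like torch.matmul inside forward are not seen unless a custom counter is registered. As a sanity check, here is a small sketch of what the matmul in the example above should contribute; matmul_macs is a hypothetical helper, not part of thop:

```python
from functools import reduce

def matmul_macs(shape_a, shape_b):
    """MAC count for a (batched) matmul A @ B.

    shape_a: (..., n, k), shape_b: (..., k, m).
    Each of the batch * n * m output elements needs k multiply-accumulates.
    """
    assert shape_a[-1] == shape_b[-2], "inner dimensions must match"
    *batch, n, k = shape_a
    m = shape_b[-1]
    batch_size = reduce(lambda a, b: a * b, batch, 1)
    return batch_size * n * m * k

# Attention scores Q @ K^T with batch=8, heads=12, seq_len=512, head_dim=64:
print(matmul_macs((8, 12, 512, 64), (8, 12, 64, 512)))  # 1610612736, ~1.6 GMACs
```

So a single score matmul at these (assumed) sizes is about 1.6 GMACs per forward pass, which thop silently drops if it treats torch.matmul as zero_ops.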