Lyken17 / pytorch-OpCounter

Count the MACs / FLOPs of your PyTorch model.

Different results when reusing the defined nn.LeakyReLU()

csbhr opened this issue · comments

Thank you for sharing this excellent tool.

I ran into a problem when calculating the MACs of a model in which a single defined nn.LeakyReLU() instance is reused. A minimal example:

import torch
import torch.nn as nn
from thop import profile

class MyModel1(nn.Module):
    def __init__(self):
        super(MyModel1, self).__init__()
        act = nn.LeakyReLU()
        self.m1 = nn.Sequential(
            nn.Conv2d(32, 32, 3, 1, 1),
            act)
        self.m2 = nn.Sequential(
            nn.Conv2d(32, 32, 3, 1, 1),
            act)

    def forward(self, x):
        x = self.m1(x)
        x = self.m2(x)
        return x

class MyModel2(nn.Module):
    def __init__(self):
        super(MyModel2, self).__init__()
        self.m1 = nn.Sequential(
            nn.Conv2d(32, 32, 3, 1, 1),
            nn.LeakyReLU())
        self.m2 = nn.Sequential(
            nn.Conv2d(32, 32, 3, 1, 1),
            nn.LeakyReLU())

    def forward(self, x):
        x = self.m1(x)
        x = self.m2(x)
        return x

if __name__ == '__main__':
    model = MyModel1()
    input = torch.randn(1, 32, 256, 256)
    macs, params = profile(model, inputs=(input,))
    print('Model1: ', macs, params)
    # Output: Model1:  1228931072.0 18496.0

    model = MyModel2()
    input = torch.randn(1, 32, 256, 256)
    macs, params = profile(model, inputs=(input,))
    print('Model2: ', macs, params)
    # Output: Model2:  1216348160.0 18496.0

These two models perform exactly the same operations, yet the reported MACs differ. Is something wrong here?
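My guess is that the profiler registers a counting hook on each module object and stores the running count on the module itself, then sums the counts over the module tree. A shared activation instance is then penalized twice: its counter accumulates both forward calls, and the traversal visits it once per parent container. A minimal plain-Python sketch of that suspected mechanism (FakeModule, run_forward, and dfs_count are illustrative stand-ins, not thop's actual API):

```python
class FakeModule:
    """Stand-in for an nn.Module carrying a thop-style op counter."""
    def __init__(self, ops_per_call):
        self.ops_per_call = ops_per_call
        self.total_ops = 0  # accumulated by the "hook" on every call

    def __call__(self):
        # Mimics a profiling forward-hook: add this call's ops to the
        # counter stored on the module object.
        self.total_ops += self.ops_per_call

def run_forward(model):
    # Simulate one forward pass: every submodule in every container runs.
    for seq in model:
        for m in seq:
            m()

def dfs_count(model):
    # Naive hierarchical sum over each container's children; a shared
    # child is visited once per parent.
    return sum(m.total_ops for seq in model for m in seq)

# Shared activation (like MyModel1): one instance in both containers.
shared = FakeModule(ops_per_call=100)
model1 = [[FakeModule(0), shared], [FakeModule(0), shared]]
run_forward(model1)

# Separate activations (like MyModel2): one fresh instance per container.
model2 = [[FakeModule(0), FakeModule(100)], [FakeModule(0), FakeModule(100)]]
run_forward(model2)

print(dfs_count(model1))  # 400: 2 calls accumulated x 2 visits
print(dfs_count(model2))  # 200: the true cost
```

If this is the mechanism, it would explain why MyModel1 reports the *higher* number even though both models do the same work.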

Looking forward to your reply.

This looks like the same problem as #162.
Please check it.