pplntech / pytorch_model_summary

PyTorch model summary: reports parameter counts, memory usage, FLOPs, and more.


PyTorch model summary

Summarizes a PyTorch model layer by layer: parameter counts, inference memory usage, MAdd (multiply-adds), and more.
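Per-layer summaries like the one below are typically collected by registering forward hooks on each leaf module and running one dummy forward pass. The sketch below illustrates that mechanism only; it is not this repo's actual API, and the `summarize` helper is a hypothetical name:

```python
# Illustrative sketch of the hook mechanism behind model-summary tools.
# Not this repo's API: `summarize` is a made-up helper for demonstration.
import torch
import torch.nn as nn

def summarize(model, input_size):
    rows, hooks = [], []

    def make_hook(name):
        def hook(module, inputs, output):
            rows.append({
                "name": name,
                "output_shape": tuple(output.shape),
                # parameters owned directly by this module (not its children)
                "params": sum(p.numel() for p in module.parameters(recurse=False)),
            })
        return hook

    # hook only leaf modules, so each row is a single concrete layer
    for name, module in model.named_modules():
        if len(list(module.children())) == 0:
            hooks.append(module.register_forward_hook(make_hook(name)))

    with torch.no_grad():
        model(torch.zeros(1, *input_size))  # dummy forward pass fires the hooks

    for h in hooks:
        h.remove()
    return rows

# A tiny stand-in network; torchvision's resnet50() would work the same way.
net = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 4 * 4, 10),
)
for row in summarize(net, (3, 4, 4)):
    print(row["name"], row["output_shape"], row["params"])
```

Hooks fire in forward-pass order, which is why the real tool's table lists layers in execution order rather than definition order.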

Using ResNet-50 as an example:

                           module name   input shape  output shape  parameter quantity inference memory(MB)         MAdd duration percent
0                         conv1_Conv2d     3 224 224    64 112 112                9408               3.06MB  235,225,088           26.32%
1                      bn1_BatchNorm2d    64 112 112    64 112 112                 128               3.06MB    3,211,264            0.95%
2                            relu_ReLU    64 112 112    64 112 112                   0               3.06MB      802,816            0.61%
3                    maxpool_MaxPool2d    64 112 112    64  56  56                   0               0.77MB    1,605,632            1.64%
4                layer1.0.conv1_Conv2d    64  56  56    64  56  56                4096               0.77MB   25,489,408            0.34%
5             layer1.0.bn1_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.16%
6                layer1.0.conv2_Conv2d    64  56  56    64  56  56               36864               0.77MB  231,010,304            2.47%
7             layer1.0.bn2_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.23%
8                layer1.0.conv3_Conv2d    64  56  56   256  56  56               16384               3.06MB  101,957,632            0.68%
9             layer1.0.bn3_BatchNorm2d   256  56  56   256  56  56                 512               3.06MB    3,211,264            0.94%
10                  layer1.0.relu_ReLU   256  56  56   256  56  56                   0               3.06MB      802,816            0.53%
11        layer1.0.downsample.0_Conv2d    64  56  56   256  56  56               16384               3.06MB  101,957,632            1.12%
12   layer1.0.downsample.1_BatchNorm2d   256  56  56   256  56  56                 512               3.06MB    3,211,264            0.89%
13               layer1.1.conv1_Conv2d   256  56  56    64  56  56               16384               0.77MB  102,559,744            0.61%
14            layer1.1.bn1_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.20%
15               layer1.1.conv2_Conv2d    64  56  56    64  56  56               36864               0.77MB  231,010,304            2.50%
16            layer1.1.bn2_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.24%
17               layer1.1.conv3_Conv2d    64  56  56   256  56  56               16384               3.06MB  101,957,632            0.68%
18            layer1.1.bn3_BatchNorm2d   256  56  56   256  56  56                 512               3.06MB    3,211,264            0.87%
19                  layer1.1.relu_ReLU   256  56  56   256  56  56                   0               3.06MB      802,816            0.47%
20               layer1.2.conv1_Conv2d   256  56  56    64  56  56               16384               0.77MB  102,559,744            0.87%
21            layer1.2.bn1_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.19%
22               layer1.2.conv2_Conv2d    64  56  56    64  56  56               36864               0.77MB  231,010,304            2.50%
23            layer1.2.bn2_BatchNorm2d    64  56  56    64  56  56                 128               0.77MB      802,816            0.17%
24               layer1.2.conv3_Conv2d    64  56  56   256  56  56               16384               3.06MB  101,957,632            0.65%
25            layer1.2.bn3_BatchNorm2d   256  56  56   256  56  56                 512               3.06MB    3,211,264            0.82%
26                  layer1.2.relu_ReLU   256  56  56   256  56  56                   0               3.06MB      802,816            0.41%
27               layer2.0.conv1_Conv2d   256  56  56   128  56  56               32768               1.53MB  205,119,488            1.30%
28            layer2.0.bn1_BatchNorm2d   128  56  56   128  56  56                 256               1.53MB    1,605,632            0.39%
29               layer2.0.conv2_Conv2d   128  56  56   128  28  28              147456               0.38MB  231,110,656            2.52%
30            layer2.0.bn2_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.19%
31               layer2.0.conv3_Conv2d   128  28  28   512  28  28               65536               1.53MB  102,359,040            0.83%
32            layer2.0.bn3_BatchNorm2d   512  28  28   512  28  28                1024               1.53MB    1,605,632            0.70%
33                  layer2.0.relu_ReLU   512  28  28   512  28  28                   0               1.53MB      401,408            0.21%
34        layer2.0.downsample.0_Conv2d   256  56  56   512  28  28              131072               1.53MB  205,119,488            1.47%
35   layer2.0.downsample.1_BatchNorm2d   512  28  28   512  28  28                1024               1.53MB    1,605,632            0.68%
36               layer2.1.conv1_Conv2d   512  28  28   128  28  28               65536               0.38MB  102,660,096            0.33%
37            layer2.1.bn1_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.12%
38               layer2.1.conv2_Conv2d   128  28  28   128  28  28              147456               0.38MB  231,110,656            1.98%
39            layer2.1.bn2_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.19%
40               layer2.1.conv3_Conv2d   128  28  28   512  28  28               65536               1.53MB  102,359,040            0.50%
41            layer2.1.bn3_BatchNorm2d   512  28  28   512  28  28                1024               1.53MB    1,605,632            0.44%
42                  layer2.1.relu_ReLU   512  28  28   512  28  28                   0               1.53MB      401,408            0.14%
43               layer2.2.conv1_Conv2d   512  28  28   128  28  28               65536               0.38MB  102,660,096            0.70%
44            layer2.2.bn1_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.18%
45               layer2.2.conv2_Conv2d   128  28  28   128  28  28              147456               0.38MB  231,110,656            1.43%
46            layer2.2.bn2_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.18%
47               layer2.2.conv3_Conv2d   128  28  28   512  28  28               65536               1.53MB  102,359,040            0.62%
48            layer2.2.bn3_BatchNorm2d   512  28  28   512  28  28                1024               1.53MB    1,605,632            0.48%
49                  layer2.2.relu_ReLU   512  28  28   512  28  28                   0               1.53MB      401,408            0.15%
50               layer2.3.conv1_Conv2d   512  28  28   128  28  28               65536               0.38MB  102,660,096            0.47%
51            layer2.3.bn1_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.17%
52               layer2.3.conv2_Conv2d   128  28  28   128  28  28              147456               0.38MB  231,110,656            1.20%
53            layer2.3.bn2_BatchNorm2d   128  28  28   128  28  28                 256               0.38MB      401,408            0.12%
54               layer2.3.conv3_Conv2d   128  28  28   512  28  28               65536               1.53MB  102,359,040            0.79%
55            layer2.3.bn3_BatchNorm2d   512  28  28   512  28  28                1024               1.53MB    1,605,632            0.69%
56                  layer2.3.relu_ReLU   512  28  28   512  28  28                   0               1.53MB      401,408            0.18%
57               layer3.0.conv1_Conv2d   512  28  28   256  28  28              131072               0.77MB  205,320,192            0.69%
58            layer3.0.bn1_BatchNorm2d   256  28  28   256  28  28                 512               0.77MB      802,816            0.23%
59               layer3.0.conv2_Conv2d   256  28  28   256  14  14              589824               0.19MB  231,160,832            1.12%
60            layer3.0.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
61               layer3.0.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.32%
62            layer3.0.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.63%
63                  layer3.0.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.07%
64        layer3.0.downsample.0_Conv2d   512  28  28  1024  14  14              524288               0.77MB  205,320,192            1.41%
65   layer3.0.downsample.1_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.71%
66               layer3.1.conv1_Conv2d  1024  14  14   256  14  14              262144               0.19MB  102,710,272            0.29%
67            layer3.1.bn1_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
68               layer3.1.conv2_Conv2d   256  14  14   256  14  14              589824               0.19MB  231,160,832            0.90%
69            layer3.1.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
70               layer3.1.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.33%
71            layer3.1.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.44%
72                  layer3.1.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.07%
73               layer3.2.conv1_Conv2d  1024  14  14   256  14  14              262144               0.19MB  102,710,272            0.62%
74            layer3.2.bn1_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.19%
75               layer3.2.conv2_Conv2d   256  14  14   256  14  14              589824               0.19MB  231,160,832            2.00%
76            layer3.2.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
77               layer3.2.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.33%
78            layer3.2.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.43%
79                  layer3.2.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.07%
80               layer3.3.conv1_Conv2d  1024  14  14   256  14  14              262144               0.19MB  102,710,272            0.39%
81            layer3.3.bn1_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
82               layer3.3.conv2_Conv2d   256  14  14   256  14  14              589824               0.19MB  231,160,832            1.57%
83            layer3.3.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.16%
84               layer3.3.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.32%
85            layer3.3.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.42%
86                  layer3.3.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.08%
87               layer3.4.conv1_Conv2d  1024  14  14   256  14  14              262144               0.19MB  102,710,272            0.41%
88            layer3.4.bn1_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
89               layer3.4.conv2_Conv2d   256  14  14   256  14  14              589824               0.19MB  231,160,832            1.09%
90            layer3.4.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.25%
91               layer3.4.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.66%
92            layer3.4.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.76%
93                  layer3.4.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.10%
94               layer3.5.conv1_Conv2d  1024  14  14   256  14  14              262144               0.19MB  102,710,272            0.42%
95            layer3.5.bn1_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
96               layer3.5.conv2_Conv2d   256  14  14   256  14  14              589824               0.19MB  231,160,832            0.84%
97            layer3.5.bn2_BatchNorm2d   256  14  14   256  14  14                 512               0.19MB      200,704            0.12%
98               layer3.5.conv3_Conv2d   256  14  14  1024  14  14              262144               0.77MB  102,559,744            0.44%
99            layer3.5.bn3_BatchNorm2d  1024  14  14  1024  14  14                2048               0.77MB      802,816            0.55%
100                 layer3.5.relu_ReLU  1024  14  14  1024  14  14                   0               0.77MB      200,704            0.15%
101              layer4.0.conv1_Conv2d  1024  14  14   512  14  14              524288               0.38MB  205,420,544            1.00%
102           layer4.0.bn1_BatchNorm2d   512  14  14   512  14  14                1024               0.38MB      401,408            0.44%
103              layer4.0.conv2_Conv2d   512  14  14   512   7   7             2359296               0.10MB  231,185,920            1.63%
104           layer4.0.bn2_BatchNorm2d   512   7   7   512   7   7                1024               0.10MB      100,352            0.16%
105              layer4.0.conv3_Conv2d   512   7   7  2048   7   7             1048576               0.38MB  102,660,096            0.31%
106           layer4.0.bn3_BatchNorm2d  2048   7   7  2048   7   7                4096               0.38MB      401,408            0.62%
107                 layer4.0.relu_ReLU  2048   7   7  2048   7   7                   0               0.38MB      100,352            0.04%
108       layer4.0.downsample.0_Conv2d  1024  14  14  2048   7   7             2097152               0.38MB  205,420,544            0.61%
109  layer4.0.downsample.1_BatchNorm2d  2048   7   7  2048   7   7                4096               0.38MB      401,408            0.62%
110              layer4.1.conv1_Conv2d  2048   7   7   512   7   7             1048576               0.10MB  102,735,360            0.35%
111           layer4.1.bn1_BatchNorm2d   512   7   7   512   7   7                1024               0.10MB      100,352            0.16%
112              layer4.1.conv2_Conv2d   512   7   7   512   7   7             2359296               0.10MB  231,185,920            0.78%
113           layer4.1.bn2_BatchNorm2d   512   7   7   512   7   7                1024               0.10MB      100,352            0.16%
114              layer4.1.conv3_Conv2d   512   7   7  2048   7   7             1048576               0.38MB  102,660,096            0.94%
115           layer4.1.bn3_BatchNorm2d  2048   7   7  2048   7   7                4096               0.38MB      401,408            1.17%
116                 layer4.1.relu_ReLU  2048   7   7  2048   7   7                   0               0.38MB      100,352            0.04%
117              layer4.2.conv1_Conv2d  2048   7   7   512   7   7             1048576               0.10MB  102,735,360            0.33%
118           layer4.2.bn1_BatchNorm2d   512   7   7   512   7   7                1024               0.10MB      100,352            0.16%
119              layer4.2.conv2_Conv2d   512   7   7   512   7   7             2359296               0.10MB  231,185,920            0.78%
120           layer4.2.bn2_BatchNorm2d   512   7   7   512   7   7                1024               0.10MB      100,352            0.17%
121              layer4.2.conv3_Conv2d   512   7   7  2048   7   7             1048576               0.38MB  102,660,096            0.29%
122           layer4.2.bn3_BatchNorm2d  2048   7   7  2048   7   7                4096               0.38MB      401,408            0.63%
123                 layer4.2.relu_ReLU  2048   7   7  2048   7   7                   0               0.38MB      100,352            0.05%
124                  avgpool_AvgPool2d  2048   7   7  2048   1   1                   0               0.01MB      100,352            0.05%
125                          fc_Linear          2048          1000             2049000               0.00MB    4,095,000            0.66%
=========================================================================================================================================
total parameters quantity: 25,557,032
total memory: 109.69MB
total MAdd: 8,219,737,624
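The per-layer numbers can be sanity-checked by hand. For the conv1 row (a 7x7 convolution, 3 to 64 channels, no bias, producing a 64x112x112 float32 output), the arithmetic below reproduces the table's values; the MAdd convention (multiplies plus adds per output element) is inferred from the table, not taken from this repo's source:

```python
# Back-of-the-envelope check of the conv1 row, no torch required.
c_in, c_out, k = 3, 64, 7          # 7x7 conv, 3 -> 64 channels, no bias
h_out = w_out = 112                # stride-2 conv: 224x224 -> 112x112

params = c_out * c_in * k * k               # weight tensor only
out_elems = c_out * h_out * w_out           # elements in the output feature map
madd = out_elems * (2 * c_in * k * k - 1)   # k*k*c_in multiplies + (k*k*c_in - 1) adds each
mem_mb = out_elems * 4 / 1024 ** 2          # float32 output, in MB

print(params)            # 9408, matches "parameter quantity"
print(madd)              # 235225088, matches the MAdd column
print(round(mem_mb, 2))  # 3.06, matches "inference memory(MB)"
```

So "inference memory" here is the size of each layer's output activation, not a cumulative total, which is why rows with the same output shape report the same value.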

Languages: Python 100.0%