sail-sg / poolformer

PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)

Home Page: https://arxiv.org/abs/2111.11418

How to measure MACs?

DoranLyong opened this issue · comments

Hi, thanks for your nice work :)
I also watched your recorded presentation from the conference.

I want to apply PoolFormer in my work. May I ask how you measured the MACs of the architectures introduced in your paper?
If it's not too much trouble, could you also share your measurement code?

Hi @DoranLyong ,

Thanks for your attention.

I originally counted the MACs of PoolFormer using code similar to this code.

However, I found it more convenient, and accurate enough, to use the package fvcore. An example is shown in misc/mac_count_with_fvcore.py. I will update the arXiv version with the MAC results measured by fvcore soon.

Thanks for your kind response and code example!

I checked your code and got confused about why you said the FLOPs reported by fvcore are actually MACs.
I looked into the difference between FLOPs and MACs and found a similar issue.

My understanding is that FLOPs count additions and multiplications separately, while one MAC covers a multiply and an add together.
For example, for the expression ax + b, the FLOP count is 2 (one multiply and one add), but the MAC count is 1.

Formally, $\text{MACs} = \frac{1}{2} \times \text{FLOPs}$.
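The relation above can be made concrete with a toy count for a dense layer. This is a hand-written illustration under the usual convention (one MAC = one multiply plus one add, bias ignored), not code from the repo:

```python
def dense_macs(in_features: int, out_features: int) -> int:
    """Multiply-accumulate count of a bias-free dense layer."""
    return in_features * out_features

def dense_flops(in_features: int, out_features: int) -> int:
    """FLOP count: each MAC is one multiply plus one add, so 2x the MACs."""
    return 2 * dense_macs(in_features, out_features)

# A 512 -> 1000 classifier head: 512,000 MACs but 1,024,000 FLOPs.
print(dense_macs(512, 1000), dense_flops(512, 1000))
```

Whichever layer type you plug in, the ratio stays fixed at FLOPs = 2 × MACs.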

When I checked the code in the misc directory, I expected a division by 2, but there isn't one.

Could you let me know why the FLOP count here actually means MACs?

Hi @DoranLyong ,

This is a common misnomer in many computer vision papers: "FLOPs" in those papers actually means MACs. For example, ResNet-50 actually has 8.2G FLOPs and 4.1G MACs.

The package fvcore follows this common convention as well. You can check it by specifying the model in misc/mac_count_with_fvcore.py:

import timm
model = timm.models.resnet50()
# or
# from torchvision.models import resnet50
# model = resnet50()

The output of 4.1G actually means MACs.
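You can also sanity-check a figure like this by hand. Below is a back-of-envelope sketch (my own, not from the repo) using the standard conv MAC formula, MACs = k² · C_in · C_out · H_out · W_out with bias ignored, applied to ResNet-50's stem convolution (7×7, 3→64 channels, stride 2 on a 224×224 input, giving a 112×112 output):

```python
def conv_macs(k: int, c_in: int, c_out: int, h_out: int, w_out: int) -> int:
    """MACs of a conv layer: kernel area * in channels * out channels * output pixels."""
    return k * k * c_in * c_out * h_out * w_out

# ResNet-50 stem: 7x7 kernel, 3 -> 64 channels, 112x112 output feature map.
stem = conv_macs(7, 3, 64, 112, 112)
print(f"{stem / 1e9:.3f} GMACs for the stem conv")
```

Summing this formula over all layers of ResNet-50 lands near 4.1G, which is why the widely quoted "4.1 GFLOPs" is really 4.1 GMACs (8.2 GFLOPs).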

Got it! =)

Thank you very much! I learned something new thanks to you.

You are welcome :)