
FLOPS in ResNet50_ImageNet #94

Open
miltonmondal opened this issue Jun 14, 2022 · 2 comments
Labels: question (Further information is requested)

miltonmondal commented Jun 14, 2022

I'm getting 4.12B FLOPs using your code, whereas almost all research papers report 4.09B FLOPs for this configuration (the default PyTorch pretrained model, 76.15% test accuracy).

Can you please either modify the code or explain the reason for the 0.03B increase in FLOPs?
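
For context, here is a minimal sketch of how such a number is usually obtained with ptflops; the torchvision model constructor and the 224×224 input resolution are assumptions based on the standard ImageNet setup rather than details from this issue:

```python
import torchvision.models as models
from ptflops import get_model_complexity_info

# ImageNet-pretrained ResNet-50 evaluated at the usual 224x224 resolution
model = models.resnet50(pretrained=True)

macs, params = get_model_complexity_info(
    model,
    (3, 224, 224),
    as_strings=True,
    print_per_layer_stat=False,
)
print(f"MACs:   {macs}")    # ptflops reports multiply-accumulates (GMac)
print(f"Params: {params}")
```

Whether this prints closer to 4.12G or 4.09G comes down to which operations the tool counts, which is what the replies below discuss.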

jkhu29 commented Nov 15, 2022

fvcore reports 4.09G, but it also prints the following:

Skipped operation aten::batch_norm 53 time(s)
Skipped operation aten::max_pool2d 1 time(s)
Skipped operation aten::add_ 16 time(s)
Skipped operation aten::adaptive_avg_pool2d 1 time(s)

Perhaps those papers ignore the computation of some of the operators.
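
For reference, a minimal sketch of the fvcore measurement described above, assuming the torchvision ResNet-50 and fvcore's FlopCountAnalysis; the skipped-operation warnings quoted above are emitted while this count runs:

```python
import torch
import torchvision.models as models
from fvcore.nn import FlopCountAnalysis

model = models.resnet50(pretrained=True).eval()
inputs = torch.randn(1, 3, 224, 224)

# fvcore traces the model and sums the cost of supported operators only;
# aten::batch_norm, aten::max_pool2d, aten::add_ and aten::adaptive_avg_pool2d
# are skipped, as the warnings above indicate.
flops = FlopCountAnalysis(model, inputs)
print(f"{flops.total() / 1e9:.2f} G")  # ~4.09 per the comment above
```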

sovrasov added the question (Further information is requested) label on Nov 16, 2022

sovrasov (Owner) commented

@jkhu29 you're right! ptflops also counts batch norms and poolings as non-zero ops, which is why it outputs slightly larger numbers than expected.
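
If matching the papers' convention is needed, one option (assuming a ptflops version that exposes the ignore_modules argument of get_model_complexity_info) is to exclude those layer types from the count. A sketch:

```python
import torch.nn as nn
import torchvision.models as models
from ptflops import get_model_complexity_info

model = models.resnet50(pretrained=True)

# Excluding batch norm and pooling modules should bring ptflops' figure
# closer to the 4.09G reported by fvcore and quoted in most papers.
macs, params = get_model_complexity_info(
    model,
    (3, 224, 224),
    as_strings=True,
    print_per_layer_stat=False,
    ignore_modules=[nn.BatchNorm2d, nn.MaxPool2d, nn.AdaptiveAvgPool2d],
)
print(macs, params)
```

Note this does not cover the residual aten::add_ ops, which are not modules, so a small difference may remain.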
