
GFLOPs #4

Open
amazing-cc opened this issue Jul 4, 2023 · 3 comments

Comments

amazing-cc commented Jul 4, 2023

I implemented DistilPose via mmpose 1.x and found that the GFLOPs don't match what the paper reported.

Input shape: (1, 3, 256, 192)
Flops: 2.724G
Params: 5.413M

2023/07/04 18:03:50 - mmengine - INFO - arch table

+---------------------------+----------------------+-----------+--------------+
| module | #parameters or shape | #flops | #activations |
+---------------------------+----------------------+-----------+--------------+
| model | 5.413M | 2.724G | 19.353M |
| backbone | 0.325M | 1.016G | 6.488M |
| backbone.conv1 | 1.728K | 21.234M | 0.786M |
| backbone.conv1.weight | (64, 3, 3, 3) | | |
| backbone.bn1 | 0.128K | 1.573M | 0 |
| backbone.bn1.weight | (64,) | | |
| backbone.bn1.bias | (64,) | | |
| backbone.conv2 | 36.864K | 0.113G | 0.197M |
| backbone.conv2.weight | (64, 64, 3, 3) | | |
| backbone.bn2 | 0.128K | 0.393M | 0 |
| backbone.bn2.weight | (64,) | | |
| backbone.bn2.bias | (64,) | | |
| backbone.layer1 | 0.286M | 0.879G | 5.505M |
| backbone.layer1.0 | 75.008K | 0.23G | 1.966M |
| backbone.layer1.1 | 70.4K | 0.216G | 1.18M |
| backbone.layer1.2 | 70.4K | 0.216G | 1.18M |
| backbone.layer1.3 | 70.4K | 0.216G | 1.18M |
| head.tokenhead | 5.088M | 1.708G | 12.865M |
| head.tokenhead.keypoin… | (1, 17, 192) | | |
| head.tokenhead.pos_emb… | (1, 256, 192) | | |
| head.tokenhead.patch_t… | 0.59M | 0.151G | 49.152K |
| head.tokenhead.patch_… | (192, 3072) | | |
| head.tokenhead.patch_… | (192,) | | |
| head.tokenhead.transfo… | 4.444M | 1.557G | 12.816M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.transf… | 0.37M | 0.13G | 1.068M |
| head.tokenhead.mlp_head | 1.349K | 32.64K | 85 |
| head.tokenhead.mlp_he… | 0.384K | 16.32K | 0 |
| head.tokenhead.mlp_he… | 0.965K | 16.32K | 85 |
+---------------------------+----------------------+-----------+--------------+
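As a sanity check, the backbone.conv1 row can be reproduced by hand. Assuming the stem is a stride-2 3x3 convolution (a guess inferred from the weight shape (64, 3, 3, 3) and the 0.786M activation count, which equals 64 × 128 × 96), the counter's arithmetic works out as:

```python
# backbone.conv1: weight shape (64, 3, 3, 3), input 256x192.
# Assuming stride 2 (inferred from the 0.786M activations), the output
# feature map is 128x96.
out_ch, in_ch, k = 64, 3, 3
out_h, out_w = 256 // 2, 192 // 2

activations = out_ch * out_h * out_w            # 786432 -> "0.786M" in the table
macs = out_ch * in_ch * k * k * out_h * out_w   # counters of this family report MACs
print(f"{activations / 1e6:.3f}M activations, {macs / 1e6:.3f}M flops")
# -> 0.786M activations, 21.234M flops, matching the table row
```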

Can you tell me why this happens?


yshMars commented Jul 4, 2023

We provide a tool for measuring GFLOPs in our code; it's in tools -> test_flops.sh.
You can run the script and it will display detailed information about the GFLOPs. Then you can compare it with your reimplemented version.
Sorry, I've been busy recently and can't look into this these days. If you find anything, please tell me in this issue or contact me via email. Thanks a lot.

amazing-cc (Author) commented

Thanks for the prompt reply, I did what you said. It looks like the difference mainly comes from the transformer module, and I'm sure I haven't changed anything in that piece of code (TokenPose).

[screenshot: FLOPs comparison between the two versions]

Can you give me some advice on what causes this difference?
Could it be a PyTorch version issue (torch 1.7.0 vs. torch>=1.11.0)? Or does the old version of MMPose not count accurately enough?
Thanks again.


yshMars commented Jul 4, 2023

I assume the difference might be caused by an update to mmcv.
If you check get_flops.py, you can see that the function get_model_complexity_info does all the GFLOPs computation work, and it comes from mmcv: it lives in mmcv/cnn/utils/flops_counter.py.
You might have to check the history of get_model_complexity_info to find out what changed.
