GFLOPs #4
Comments
We provide a tool for measuring GFLOPs in our code; see tools/test_flops.sh.
Thanks for the prompt reply; I did what you suggested. It looks like the difference mainly comes from the transformer module, and I am sure I have not changed anything in that piece of code (TokenPose). Can you give me any advice on what might cause this difference?
I assume that the difference might be caused by the update of mmcv.
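If that is the suspicion, posting the exact package versions from both environments would help narrow it down. A minimal check using the standard `__version__` attributes (nothing project-specific):

```python
# Print the versions of the OpenMMLab packages involved, so the two
# environments that produce different GFLOPs can be compared directly.
# Note: mmengine only exists in the MMPose 1.x environment.
import mmcv
import mmengine
import mmpose

print('mmcv:    ', mmcv.__version__)
print('mmengine:', mmengine.__version__)
print('mmpose:  ', mmpose.__version__)
```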
I implemented DistilPose with MMPose 1.x and found that the GFLOPs do not match what the paper reports:
Input shape: (1, 3, 256, 192)
Flops: 2.724G
Params: 5.413M
2023/07/04 18:03:50 - mmengine - INFO - arch table
+---------------------------+----------------------+-----------+--------------+
| module                    | #parameters or shape | #flops    | #activations |
+---------------------------+----------------------+-----------+--------------+
| model                     | 5.413M               | 2.724G    | 19.353M      |
| backbone                  | 0.325M               | 1.016G    | 6.488M       |
| backbone.conv1            | 1.728K               | 21.234M   | 0.786M       |
| backbone.conv1.weight     | (64, 3, 3, 3)        |           |              |
| backbone.bn1              | 0.128K               | 1.573M    | 0            |
| backbone.bn1.weight       | (64,)                |           |              |
| backbone.bn1.bias         | (64,)                |           |              |
| backbone.conv2            | 36.864K              | 0.113G    | 0.197M       |
| backbone.conv2.weight     | (64, 64, 3, 3)       |           |              |
| backbone.bn2              | 0.128K               | 0.393M    | 0            |
| backbone.bn2.weight       | (64,)                |           |              |
| backbone.bn2.bias         | (64,)                |           |              |
| backbone.layer1           | 0.286M               | 0.879G    | 5.505M       |
| backbone.layer1.0         | 75.008K              | 0.23G     | 1.966M       |
| backbone.layer1.1         | 70.4K                | 0.216G    | 1.18M        |
| backbone.layer1.2         | 70.4K                | 0.216G    | 1.18M        |
| backbone.layer1.3         | 70.4K                | 0.216G    | 1.18M        |
| head.tokenhead            | 5.088M               | 1.708G    | 12.865M      |
| head.tokenhead.keypoin…   | (1, 17, 192)         |           |              |
| head.tokenhead.pos_emb…   | (1, 256, 192)        |           |              |
| head.tokenhead.patch_t…   | 0.59M                | 0.151G    | 49.152K      |
| head.tokenhead.patch_…    | (192, 3072)          |           |              |
| head.tokenhead.patch_…    | (192,)               |           |              |
| head.tokenhead.transfo…   | 4.444M               | 1.557G    | 12.816M      |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.transf…    | 0.37M                | 0.13G     | 1.068M       |
| head.tokenhead.mlp_head   | 1.349K               | 32.64K    | 85           |
| head.tokenhead.mlp_he…    | 0.384K               | 16.32K    | 0            |
| head.tokenhead.mlp_he…    | 0.965K               | 16.32K    | 85           |
+---------------------------+----------------------+-----------+--------------+
Can you tell me why this happens?
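For reference, a minimal sketch of how a table like the one above can be produced in MMPose 1.x with mmengine's complexity analysis; the config path below is a placeholder, not a file from this repository:

```python
# Sketch, assuming MMPose 1.x: build the model from a config and report
# FLOPs/params with mmengine's complexity analysis.
from mmengine.analysis import get_model_complexity_info
from mmpose.apis import init_model

# Placeholder path; replace with the actual DistilPose config being tested.
config_file = 'configs/body_2d_keypoint/my_distilpose_config.py'

model = init_model(config_file, device='cpu')
model.eval()

# Same input shape as in the log above; the batch dimension is added internally.
analysis = get_model_complexity_info(model, input_shape=(3, 256, 192))
print('Flops: ', analysis['flops_str'])
print('Params:', analysis['params_str'])
print(analysis['out_table'])  # per-module breakdown similar to the table above
```

Comparing the per-module rows from both environments (the tools/test_flops.sh output and the 1.x table above) is the quickest way to see which module accounts for the gap.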