
How to generate the superpoint.pt? #2

Open
liubamboo opened this issue Dec 10, 2020 · 6 comments
@liubamboo
Hello, your project is great, but I have a question: is superpoint.pt from the PyTorch project or the C++ project? Could you tell me how to generate it?

liubamboo mentioned this issue Dec 12, 2020
@ChanWoo25 (Owner)

Thanks for your attention! :)

"superpoint.pt" is not much different from the "superpoint_v1.pth" provided by Magicleap. I think it would be helpful if you refer to this LINK.

Simply put, you can load the weights with PyTorch and save the model with the .pt extension; it doesn't involve much C++.
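For example, a minimal sketch of that conversion via `torch.jit.trace`. A tiny stand-in network is used here instead of the real SuperPointNet (whose definition and `superpoint_v1.pth` weights are assumed to come from the Magicleap demo), so the sketch is self-contained:

```python
# Minimal sketch: convert PyTorch weights (.pth) into a TorchScript .pt
# that LibTorch can load. TinyNet is a hypothetical stand-in network.
import torch
import torch.nn as nn

class TinyNet(nn.Module):  # stand-in for SuperPointNet
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, 3, padding=1)

    def forward(self, x):
        return torch.relu(self.conv(x))

model = TinyNet()
# Real case: model.load_state_dict(torch.load("superpoint_v1.pth"))
model.eval()  # freeze dropout/batchnorm before tracing

example = torch.rand(1, 1, 120, 160)  # SuperPoint expects 1-channel images
traced = torch.jit.trace(model, example)
traced.save("superpoint_traced.pt")  # loadable from C++ via torch::jit::load

out = torch.jit.load("superpoint_traced.pt")(example)
print(tuple(out.shape))  # (1, 8, 120, 160)
```

The key points are loading the state dict into a Python model first and calling `.eval()` before tracing; the saved file can then be opened from C++ with `torch::jit::load`.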

@liubamboo (Author)

@ChanWoo25 Thank you! I solved it by using torch.jit.trace to convert the "superpoint_v1.pth" provided by Magicleap into a new model, and it works. But now I have another question. I ran CreateVoca to generate a DBoW vocabulary (yml format) and put it into superpoint_slam, but it doesn't work: it stops in ComputeBoW(). When I use the vocabulary shipped with superpoint_slam, it works, so I think my DBoW vocabulary has some bug. Have you tried replacing the DBoW vocabulary in SuperPoint_SLAM?

And my DBoW yml file looks like this:
vocabulary:
k: 10
L: 5
scoringType: 0
weightingType: 0
nodes:
- { nodeId:1, parentId:0, weight:3.1172121710917007e+00,
descriptor:"0.0354999 0.125961 -0.0899333 0.00940716 -0.0404284 -0.0681934 -0.000698927 -0.129488 0.0596507 -0.0352885 -0.100353 -0.00107607 0.0115445 0.0383281 -0.00665728 0.0467515 -0.198072 0.0997283 0.00653879 0.0282129 0.156231 -0.142805 0.0929327 -0.137059 0.0668409 0.150995 -0.0693844 -0.0766955 -0.041515 -0.153341 -0.0530757 0.0540285 -0.084946 0.0180105 0.121748 -0.0676764 0.230421 0.0762116 0.0669327 -0.0890834 0.126961 -0.188613 0.157426 0.0674022 0.0543198 0.00510156 -0.0718658 -0.0548448 0.0952988 0.0044786 -0.153464 0.0371513 0.109263 0.0892266 -0.0309564 0.0759547 -0.0795075 0.0361993 -0.0254162 0.0726163 -0.0482989 -0.113146 -0.0129798 -0.021902 -0.0136797 0.0957096 -0.201533 -0.0887019 0.151989 -0.171577 0.0205965 -0.0746266 -0.0182118 0.0777944 0.186204 -0.018468 0.194916 -0.0438617 0.0290822 0.0368869 -0.048466 0.0319289 -0.0238049 0.0643619 0.0388753 0.060765 0.118967 -0.128794 0.0865447 0.00288163 -0.0940577 0.0105423 0.0632482 0.0407168 -0.0778853 0.0066329 0.0849079 -0.167564 0.0577636 -0.0248432 -0.0139746 0.110944 0.0570377 -0.0205937 -0.0931026 -0.00960186 0.00720341 -0.0533386 0.121489 0.116467 -0.0546351 0.115026 0.215079 -0.223696 -0.268556 -0.22581 0.131816 -0.0447868 -0.00238234 -0.0351309 0.0907687 0.0915385 -0.0322984 0.0394833 -0.0734853 -0.0545587 -0.0266285 0.137623 0.0620815 -0.0788311 0.0871121 -0.194911 -0.123108 -0.126513 0.140195 -0.0601778 -0.0877871 -0.15231 0.00645607 0.0714905 0.159278 -0.110403 0.151996 -0.0206717 -0.0113557 0.028608 -0.186627 0.176118 -0.068964 0.0219737 -0.0434809 0.134475 0.0660336 0.197536 -0.0774245 -0.169559 0.0735524 -0.117962 0.0240644 0.00481006 -0.212233 -0.125976 -0.038999 -0.141939 -0.0296288 -0.022527 -0.0990938 0.0819112 0.0307572 -0.101368 0.155772 -0.106299 0.0846459 -0.115195 -0.0944857 -0.0163783 0.00181647 0.0297214 -0.0214813 0.15572 0.148555 -0.0451894 -0.0243046 -0.0647202 0.090212 0.0562619 0.103249 -0.0505745 -0.00872183 0.0909672 -0.076492 0.130413 0.0233149 0.100497 -0.0491502 
-0.031341 -0.0269015 -0.144976 -0.0639115 -0.0625203 -0.0354729 0.181546 0.109445 0.00274729 0.0505961 -0.149337 -0.0849629 -0.0537802 0.064498 -0.111422 0.0521305 -0.153909 0.0649416 0.0308576 0.106565 -0.143596 -0.175881 -0.103822 0.0445059 0.0837604 -0.123117 -0.0301544 0.0444908 0.0924708 0.150015 0.0405757 -0.00521247 0.107609 0.0918586 -0.0847282 -0.025882 0.139215 -0.0492126 0.0832153 -0.0281849 -0.0227442 0.159082 0.0228036 0.124301 -0.0736366 -0.0804549 0.0800293 0.0624154 -0.0583405 -0.0609488 -0.0556457 0.0762758 0.0391196 0.103908 -0.0837154 0.224688 -0.0982641 -0.0789496 -0.129398 0.024566 -0.0360153 " }
- { nodeId:2, parentId:0, weight:0.,
...

And the DBoW yml file in superpoint_slam looks like this:
vocabulary:
k: 10
L: 5
scoringType: 0
weightingType: 0
nodes:
- { nodeId:1, parentId:0, weight:0.,
descriptor:"dbw3 5 256 0.0282166 -0.0464277 -0.00155563 0.0195318 0.0127344 0.0175524 0.0249182 0.0505561 0.00974953 0.0191589 -0.126822 -0.0269695 0.0156602 -0.00148103 0.0266696 -0.0278376 0.0183598 0.0225065 0.0102972 0.0135393 -0.0298388 -0.015842 0.0644389 0.0026523 0.019828 -0.00464698 -0.0318678 0.00451956 0.0167672 0.00993656 0.0064205 0.0114803 0.0052258 -0.0204265 -0.00298808 0.0393944 -0.0195278 -0.013887 -0.0227335 -0.0162304 -0.0071446 -0.00422807 0.0428517 -0.0317671 0.0283621 0.0111701 -0.00566452 -0.0400856 -0.0196577 0.0216044 0.0196101 -0.0285099 -0.0151599 0.0212448 -0.0288515 -0.0600829 -0.00861907 0.0214418 0.0375813 0.0166094 -0.00850956 0.010408 -0.0336234 0.0310988 0.0168146 0.0161289 0.0427318 -0.0044199 -0.0122952 -0.0365116 -0.0145487 -0.0397346 -0.0221509 -0.00777042 -0.0464547 -0.00426373 -0.00957853 0.00308297 0.0132791 -0.0196195 -0.0377574 -0.029988 0.0265767 0.0262487 0.0140505 0.00488274 0.00680912 0.0107932 -0.070846 -0.0113015 0.0128653 -0.0112844 0.0181235 -0.00195907 0.0191648 0.0254948 -0.0359513 -0.0123985 -0.0175331 0.0196461 0.000914373 -0.0114064 -0.00685857 -0.00109303 -0.0183508 -0.0132181 0.0116307 -0.0248163 0.0327876 -0.0247617 -0.0454079 0.0262968 -0.0764144 0.0390536 0.0297468 -0.0273197 0.0202212 -0.00681736 0.0196127 0.00834888 -0.0134622 0.012962 0.0112936 0.0178503 0.0262749 -0.0485005 -0.0285796 -0.000385143 0.0147169 -0.0462185 0.00297543 0.0275997 0.0373025 0.0245792 -0.00278855 0.0522282 -0.0288043 0.00905006 -0.00756075 -0.013557 -0.014353 0.000465126 -0.0129177 -0.0350644 0.0297775 0.0208471 0.0449397 -0.0053479 0.00241352 -0.00604664 0.006685 0.0144987 0.0226759 0.00105402 -0.020786 0.0129925 0.0320702 0.0445253 -0.0233705 0.027613 -0.00479526 -0.0295112 -0.0437024 0.0131428 0.0180217 0.0324005 0.00466164 -0.0128045 0.00932035 0.043001 -0.00728973 -0.0377903 0.0018688 0.00254708 -0.0367907 -0.0331428 -0.0475313 -0.0203362 -0.00430156 0.00493915 -0.0095337 0.0088231 -0.0356051 0.0450481 -0.034091 
0.00540604 0.00357041 -0.0229231 -0.0263163 0.00353934 0.0208586 -0.0530529 0.0175665 -0.0260063 -0.013499 0.0175644 0.00625461 0.0114385 0.00166789 0.03291 -0.0270635 -0.00973511 -0.0434037 0.0292823 0.00742841 0.000494358 -0.0306714 -0.0159139 -0.0414082 -0.0288355 -0.0115343 -0.00388788 -0.0139166 0.021995 0.00987418 -0.00274973 -0.0514535 0.0107591 -0.00696206 -0.0293223 0.00747669 -0.0166595 -0.00588887 0.00667826 0.0119736 -0.00979452 -0.00674867 1.18017e-06 0.0537927 -0.0359476 -0.0194478 -0.0138035 0.0166307 0.024314 0.00792531 0.015141 -0.0113583 -0.00644338 0.0120866 -0.046237 0.00113854 0.0419657 -0.0702566 -0.0104919 0.0462897 -0.00122678 -0.0525829 -0.000213172 -0.0186052 -0.0135538 -0.0245714 -0.0288605 0.00682914 -0.0338058 0.018147 0.0170514 " }

I notice that the descriptor string in the superpoint_slam yml file starts with "dbw3 5 256" but mine doesn't. Also, the superpoint_slam yml file was generated by DBoW3, while mine was generated by DBoW2.

Could you help me?

@ChanWoo25 (Owner)

I think your problem belongs to the SuperPoint-SLAM repository. If so, you should open the issue in that repo.

To answer your question: I have not yet released a DBoW yml file for SuperPoint-SLAM. If you want to use your own, use a DBoW2 yml file.
In the first place, DBoW2 yml files and DBoW3 yml files are not very different. This function code may help you understand.

@liubamboo (Author)

@ChanWoo25 Thanks a lot. In main_CreateVoca.cpp you use voc.saveToTextFile() to save the vocabulary in txt format. Can I use voc.save() to save it in yml format?

@Y-pandaman

> @ChanWoo25 Thank you! I solved it by using torch.jit.trace to convert the "superpoint_v1.pth" provided by Magicleap into a new model, and it works. But now I have another question. I ran CreateVoca to generate a DBoW vocabulary (yml format) and put it into superpoint_slam, but it doesn't work: it stops in ComputeBoW(). When I use the vocabulary shipped with superpoint_slam, it works, so I think my DBoW vocabulary has some bug. Have you tried replacing the DBoW vocabulary in SuperPoint_SLAM?

Can you teach me how to convert the .pth files generated by PyTorch into .pt files that LibTorch can use? I tried torch.jit.trace, but it did not work.

@weiningwei

> @ChanWoo25 Thank you! I solved it by using torch.jit.trace to convert the "superpoint_v1.pth" provided by Magicleap into a new model, and it works. But now I have another question. I ran CreateVoca to generate a DBoW vocabulary (yml format) and put it into superpoint_slam, but it doesn't work: it stops in ComputeBoW(). When I use the vocabulary shipped with superpoint_slam, it works, so I think my DBoW vocabulary has some bug. Have you tried replacing the DBoW vocabulary in SuperPoint_SLAM?
>
> Can you teach me how to convert the .pth files generated by PyTorch into .pt files that LibTorch can use? I tried torch.jit.trace, but it did not work.

Hello, were you able to convert the .pth files generated by PyTorch into .pt files that LibTorch can use? If so, could you please tell me how? I got a .pt file, but when I use it, the same input image gives different outputs. This problem bothers me very much!
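(Editor's note, a hedged guess at the cause: if the model was traced with torch.jit.trace while still in training mode, dropout stays active inside the traced graph, which makes repeated runs on the same input non-deterministic. A minimal demonstration that tracing in eval mode gives stable outputs:)

```python
# Sketch: "same input, different outputs" often means the model was traced
# in training mode, so dropout was baked into the traced graph as active.
# Calling .eval() before torch.jit.trace makes the traced module deterministic.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
x = torch.ones(1, 4)

net.eval()  # without this line, dropout would stay active in the trace
traced = torch.jit.trace(net, x)

# In eval mode, repeated calls on the same input give identical results:
print(torch.equal(traced(x), traced(x)))  # True
```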
