
Download URL, Model Weights #154

Open

zhanghang1989 opened this issue May 13, 2021 · 12 comments

Comments

@zhanghang1989
Owner

Due to the high download volume violating the Wasabi cloud policy, please manually download the models for now.

https://drive.google.com/file/d/10GXXl9ekgD3-npa7LiXZpQg6FI1bwpky/view?usp=sharing
https://drive.google.com/file/d/1e-Z88a1c14Cwwn02CiBzQ2kFwO6OozHd/view?usp=sharing
https://drive.google.com/file/d/1uSSvy4V7ALjousc7Tqy3tlGH7nLUijhG/view?usp=sharing
https://drive.google.com/file/d/1dwMhRiuz-E7je-gK0mEFrIXp33MU2_FP/view?usp=sharing
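A minimal loading sketch (not part of the original comment), assuming one of the files above was saved as ~/.encoding/models/resnest50-528c19ca.pth (the filename mentioned later in this thread) and that the checkpoint is a plain state_dict:

```python
# Sketch only: load a manually downloaded ResNeSt checkpoint.
# Assumes the file was saved to ~/.encoding/models/resnest50-528c19ca.pth
# and contains a plain state_dict; adjust the path and hub entry point
# (e.g. resnest101) for the other downloads.
import os
import torch

weights_path = os.path.expanduser("~/.encoding/models/resnest50-528c19ca.pth")

# Build the architecture without downloading weights, then load the local file.
model = torch.hub.load("zhanghang1989/ResNeSt", "resnest50", pretrained=False)
model.load_state_dict(torch.load(weights_path, map_location="cpu"))
model.eval()
```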

zhanghang1989 pinned this issue May 13, 2021
@skarfie123

skarfie123 commented May 14, 2021

Are these for object detection or instance segmentation?
I'm looking for Mask Cascade R-CNN

@skarfie123

thank you!

@manningchuor

Hello, would it be possible to share a folder with all of the models we could download that were used in your encoding for semantic segmentation? Thanks!

@zhanghang1989
Owner Author

> Hello, would it be possible to share a folder with all of the models we could download that were used in your encoding for semantic segmentation? Thanks!

Thanks! There are quite a lot of models for semantic segmentation. I probably need to do it when I figure out a new way of releasing models. Do you have a specific model that you're interested in testing now?

@graceehuu

> Hello, would it be possible to share a folder with all of the models we could download that were used in your encoding for semantic segmentation? Thanks!
>
> Thanks! There are quite a lot of models for semantic segmentation. I probably need to do it when I figure out a new way of releasing models. Do you have a specific model that you're interested in testing now?

Hi! @manningchuor and I are interested in testing EncNet_ResNet50s_ADE - thanks!

@ailzhang

ailzhang commented May 17, 2021

Hi @zhanghang1989, I had to disable the ResNeSt models in pytorch/hub for now (pytorch/hub#198), since the model weights are no longer available.
A possible workaround: if the weights are less than 2 GB, it's recommended to attach them to a project release and use the URL from the release. Let me know when you are ready to re-enable the test in pytorch hub. Thanks!
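A hedged sketch of that workaround, assuming the weights are attached as a GitHub release asset (the release URL below is a placeholder for illustration, not a real asset):

```python
# Sketch: fetch weights from a GitHub release asset instead of the Wasabi bucket.
# The URL below is a placeholder; substitute the actual release tag and file name.
import torch

url = "https://github.com/zhanghang1989/ResNeSt/releases/download/<tag>/resnest50-528c19ca.pth"
state_dict = torch.hub.load_state_dict_from_url(url, map_location="cpu", progress=True)

model = torch.hub.load("zhanghang1989/ResNeSt", "resnest50", pretrained=False)
model.load_state_dict(state_dict)
```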

@zhanghang1989
Owner Author

@ailzhang Thanks a lot for the suggestion! The GitHub release is a great solution.

I just made the changes at:
#156

The torch hub CI can be re-enabled now. Thanks again.
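With the weights hosted on a GitHub release, the usual hub call should work again; a minimal usage sketch (entry point name assumed from the ResNeSt hubconf):

```python
import torch

# Once the release-hosted weights are wired up, pretrained=True should download them automatically.
model = torch.hub.load("zhanghang1989/ResNeSt", "resnest50", pretrained=True)
model.eval()
```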

@LbinB

LbinB commented Jul 6, 2021

@zhanghang1989 Could you please give me the exact model URL used for ResNeSt-50-DCNv2 (ours) in detectron2?

@zhanghang1989
Owner Author

zhanghang1989 commented Jul 6, 2021

@ghost

ghost commented Aug 31, 2022

Hi, I decided to follow this and try the semantic segmentation task.
I executed the following commands.

python scripts/prepare_ade20k.py
python test.py --dataset ADE20K --model-zoo EncNet_ResNet50s_ADE --eval

However, I got an error when downloading the pre-trained model.

The error message says that "resnet50s-a75c83cf.pth" is needed but is not locally available, so it is pulled from Wasabi, and the URL is inaccessible due to restrictions.

I saved "resnest50-528c19ca.pth" from the above Google Drive URL to "/root/.encoding/models/" and ran test.py again, but got the same error.

So, could you please share "resnet50s-a75c83cf.pth" with us?
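For anyone hitting the same error, a hedged sketch of checking which checkpoints are actually present before running test.py. Note that resnet50s-a75c83cf.pth (the ResNet-50s backbone) and resnest50-528c19ca.pth (ResNeSt-50) are different files, so saving the latter does not satisfy the former; the cache directory is taken from the comment above and is an assumption about where the script looks.

```python
# Sketch: verify which checkpoint files are present in the PyTorch-Encoding cache.
# Assumes the cache directory is ~/.encoding/models (the path mentioned above);
# resnet50s-a75c83cf.pth still has to be obtained separately -- it is not the
# ResNeSt-50 checkpoint from the Google Drive links.
import os

model_dir = os.path.expanduser("~/.encoding/models")
for fname in ("resnet50s-a75c83cf.pth", "resnest50-528c19ca.pth"):
    path = os.path.join(model_dir, fname)
    print(f"{fname}: {'found' if os.path.exists(path) else 'missing'} at {path}")
```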
