
batchsize #15

Open
cchenghao opened this issue Dec 19, 2020 · 2 comments

Comments

cchenghao commented Dec 19, 2020

How much will performance decrease if batchsize=1 is used in the first stage? Thanks.

layumi (Owner) commented Dec 21, 2020

Hi @cchenghao
It will make the result unstable. In my experience the performance will be around 44~45%.
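The maintainer does not say why batchsize=1 is unstable, but a common suspect (an assumption on my part, not stated in the thread) is BatchNorm: at train time it normalizes with statistics computed over the batch axis, and with a single sample those statistics are degenerate. A minimal sketch, using a toy `batch_norm` helper rather than the repo's actual model:

```python
import numpy as np

# Toy train-time batch normalization: normalize over the batch axis.
# (Hypothetical illustration; the repo's network is not shown here.)
def batch_norm(x, eps=1e-5):
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 4))   # batch of 8: informative statistics
single = feats[:1]                # batch of 1: mean == sample, var == 0

print(batch_norm(feats).std())    # close to 1: normal behaviour
print(batch_norm(single))         # all zeros: the activation collapses
```

With one sample the batch mean equals the sample itself and the variance is zero, so every normalized activation collapses to zero and the gradients become noisy, which is consistent with the unstable results reported above.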

cchenghao (Author) commented Dec 21, 2020

> Hi @cchenghao
> It will make the result unstable. The performance will be around 44~45% from my experience.

Isn't this difference too big? I optimized your code a bit: many operations can be distributed to reduce memory, and I can now train the model with batchsize=2 under 12GB. To be clear, I am asking about the first stage, not the second. Don't other methods train at 1280×720 with batchsize=1? Thanks for your reply.
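The commenter does not describe the optimization, but a related, widely used way to get a larger effective batch from batchsize=1 steps is gradient accumulation (a different technique, offered here only as a sketch on a toy least-squares loss, not the repo's code):

```python
# Gradient accumulation sketch: two batchsize=1 backward passes produce
# the same update as one batchsize=2 step, at half the activation memory.
# (Toy model: scalar weight w, loss 0.5*(w*x - y)^2 per sample.)

def grad(w, x, y):
    # d/dw of 0.5*(w*x - y)^2
    return (w * x - y) * x

lr = 0.1
data = [(1.0, 2.0), (2.0, 2.0)]  # two (x, y) samples

# One batchsize=2 step: average the per-sample gradients.
w = 0.0
w_batch = w - lr * sum(grad(w, x, y) for x, y in data) / len(data)

# Two batchsize=1 passes, accumulating gradients before the update.
acc = 0.0
for x, y in data:
    acc += grad(w, x, y)
w_accum = w - lr * acc / len(data)

print(w_batch, w_accum)  # identical updates
```

This only matches the true batchsize=2 step for losses that average over samples, and it does not fix batch-statistics issues (e.g. BatchNorm still sees one sample per pass), so it complements rather than replaces memory optimizations like the one described above.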
