Hi @cchenghao
It will make the results unstable. In my experience, performance will be around 44–45%.
Is this difference too big? By the way, I optimized your code a bit: many operations can be distributed to reduce memory, and now I can train the model with batchsize=2 under 12GB. Also, to clarify: I am asking about the first stage, not the second stage. Don't other methods train at 1280×720 with batchsize=1? Thanks for your reply.
How much will performance decrease if batchsize=1 is used in the first stage? Thanks.
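As a side note on the batchsize=1 question: gradient accumulation is a common way to simulate a larger effective batch size under a tight memory budget, which may help recover some of the lost stability. A minimal sketch of the idea (the toy linear model and numbers here are hypothetical, not from this repo — it only demonstrates that averaging per-sample gradients matches the full-batch gradient):

```python
# Gradient accumulation: averaging the gradients of several micro-batches
# before taking an optimizer step reproduces the full-batch gradient,
# so an effective batchsize=N can be simulated at batchsize=1 memory cost.
# Toy model: scalar linear regression y = w * x with MSE loss.

def grad(w, xs, ys):
    """d/dw of mean((w*x - y)^2) over a batch."""
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

w = 0.5
xs, ys = [1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]

# Full batch (batchsize=4): one gradient over all samples.
g_full = grad(w, xs, ys)

# Accumulated: 4 micro-batches of size 1, gradients averaged before the step.
micro_grads = [grad(w, [x], [y]) for x, y in zip(xs, ys)]
g_accum = sum(micro_grads) / len(micro_grads)

assert abs(g_full - g_accum) < 1e-12  # identical up to float rounding
```

Note this only equalizes the gradient per step; batch-dependent layers like BatchNorm still see the micro-batch statistics, so it is not always a perfect substitute for a true larger batch.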