Inconsistency when batch size varies #11

Closed · ShengyuH opened this issue Nov 6, 2019 · 2 comments

@ShengyuH commented Nov 6, 2019

Hi Chris,

I'm performing a classification task using MinkowskiEngine and train the network with batch_size=8.

At test time, with batch_size=1, the results are quite bad: there is a big gap between validation and test accuracy. When I increase the test batch size, the results get better.

Can you help explain this? I suspect the batch normalization layers might be the reason, but I've never encountered this phenomenon with other frameworks.

@chrischoy (Owner)

Maybe model.eval()?
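
(For context: in training mode, BatchNorm normalizes each feature with the statistics of the *current* mini-batch, so a sample's output depends on whatever batch it happens to be in; `model.eval()` switches to the running statistics accumulated during training. A minimal sketch in plain PyTorch; the assumption here is that MinkowskiEngine's `MinkowskiBatchNorm`, which wraps `torch.nn.BatchNorm1d` on the feature tensor, behaves the same way:)

```python
import torch
import torch.nn as nn

# Sketch: plain nn.BatchNorm1d stands in for MinkowskiBatchNorm here.
torch.manual_seed(0)
bn = nn.BatchNorm1d(4)

# "Train" on a few batches so the running statistics get populated.
for _ in range(100):
    bn(torch.randn(8, 4) * 2 + 5)

x = torch.randn(2, 4) * 2 + 5   # a small test batch

bn.train()
y_train = bn(x)   # normalized with this batch's own mean/var

bn.eval()
y_eval = bn(x)    # normalized with the running statistics from training

print((y_train - y_eval).abs().max())  # clearly nonzero: the modes disagree
```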

@ShengyuH (Author) commented Nov 6, 2019

Yes!! I must have missed model.eval(). Thanks!
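
(For anyone landing here later, a sketch of where the call belongs in a test loop; `evaluate`, `model`, and `loader` are hypothetical names, and a tiny dense classifier stands in for the actual sparse network:)

```python
import torch
import torch.nn as nn

# Hypothetical evaluation helper; `model` and `loader` stand in for the
# user's MinkowskiEngine network and test-set DataLoader.
def evaluate(model: nn.Module, loader) -> float:
    model.eval()                  # BatchNorm now uses running statistics
    correct = total = 0
    with torch.no_grad():         # no gradients needed at test time
        for feats, labels in loader:
            preds = model(feats).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.numel()
    return correct / total

# Smoke test: batch_size=1 is fine in eval mode (it would raise an error
# in train mode, since batch statistics need more than one sample).
model = nn.Sequential(nn.Linear(4, 8), nn.BatchNorm1d(8), nn.Linear(8, 3))
loader = [(torch.randn(1, 4), torch.randint(0, 3, (1,))) for _ in range(5)]
print(evaluate(model, loader))
```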

@ShengyuH closed this as completed Nov 6, 2019