SNN with BatchNorm layers doesn't give the correct validation accuracy when net.eval() is set (there is some conflicting issue between net.eval() and BatchNorm2d in SNNs) #561
gwgknudayanga asked this question in Q&A · Unanswered · 0 replies
Hi Experts,
I trained an SNN that also has BatchNorm2d layers.
At validation it always gives very low accuracy, even though the training accuracy is good.
When I remove net.eval() at validation time (while still keeping torch.no_grad()), the validation accuracy is good.
When I remove the BatchNorm layers, the issue with net.eval() does not occur. It seems there is some conflict between net.eval() and BatchNorm2d in SNNs. A minimal sketch of the behaviour I mean is below.
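A minimal sketch, assuming plain PyTorch and a toy Conv + BatchNorm2d network standing in for the actual SNN (the real model, spiking layers, and data loaders are not shown in this question); it only illustrates how BatchNorm2d switches between batch statistics and running statistics when net.eval() is set:

```python
# Minimal sketch (hypothetical stand-in model; the real SNN is not shown here).
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),          # train(): batch mean/var; eval(): running_mean/running_var
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)

x = torch.randn(32, 1, 28, 28)  # dummy input batch

with torch.no_grad():
    net.train()                 # BatchNorm normalizes with the current batch's statistics
    out_train_mode = net(x)

    net.eval()                  # BatchNorm normalizes with the accumulated running statistics
    out_eval_mode = net(x)

# If the running statistics are a poor fit for the activations seen at validation time,
# the two outputs can differ substantially, which matches the reported accuracy drop
# under net.eval().
print((out_train_mode - out_eval_mode).abs().max())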
Thank you.
Rgds.
Udayanga