
validation loss always lower than train loss #225

Open
ycc66104116 opened this issue May 30, 2022 · 0 comments
Hi, I recently used this code to train on my own dataset and noticed something strange: on TensorBoard, my validation loss is always lower than my training loss, even though training accuracy is higher than validation accuracy. I don't understand why the validation loss stays lower. The gap between the two curves remains constant, as in the image below, and I can't get them to converge.
[image: TensorBoard train/val loss curves with a constant gap]

I know model.train() and model.eval() behave differently, but I expected the two curves to converge as the epochs increase.
Has anyone encountered this before? Could it happen because the training dataset is too small?
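One common cause of this pattern (not specific to this repository's code) is regularization that is active only in train mode, such as dropout: the loss logged during training is computed with dropout on, while the validation loss is computed with dropout off, so the training curve sits higher by a roughly constant offset. A minimal NumPy sketch, using a toy linear model and hand-rolled inverted dropout (all names here are illustrative, not from this repo), shows the effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y_hat = x @ w, with targets that the model fits exactly,
# so any loss in train mode comes purely from dropout noise.
x = rng.normal(size=(256, 10))
w = rng.normal(size=(10,))
y = x @ w

def forward(x, w, p_drop, training):
    """Apply inverted dropout to the inputs, then a linear layer."""
    if training:
        mask = rng.random(x.shape) > p_drop
        # Scale survivors so the expected activation matches eval mode.
        x = x * mask / (1.0 - p_drop)
    return x @ w

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

train_loss = mse(forward(x, w, p_drop=0.5, training=True), y)
eval_loss = mse(forward(x, w, p_drop=0.5, training=False), y)

# Train-mode loss is strictly higher: the dropout mask perturbs the
# activations, while eval mode uses the unperturbed expectation.
print(train_loss, eval_loss)
```

If this is the cause, the gap will not close with more epochs, because it reflects a difference in how the two losses are measured, not a difference in model quality. Comparing both losses with the model in eval mode would remove the offset.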
