Why should we double the number of outputs when using dropout? #5
It doesn't say to double the number of outputs in the last layer. It says to double the number of units in that layer -- the layer where you add dropout. Regularizing a net with dropout will usually allow you to make it larger than a network that doesn't use dropout.
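A minimal sketch of the idea, in plain numpy (the layer sizes here are hypothetical, chosen only for illustration): with dropout probability p = 0.5, roughly half of a layer's units are zeroed on each training pass, which is why a dropout layer is often given about twice as many units as its un-regularized counterpart.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, p=0.5, train=True):
    """Standard (non-inverted) dropout: at train time each
    activation is zeroed with probability p; at test time all
    activations are kept but scaled by (1 - p) instead."""
    if train:
        mask = rng.random(x.shape) >= p  # True = keep the unit
        return x * mask
    return x * (1.0 - p)

# Hypothetical hidden layer, doubled from 1024 to 2048 units to
# compensate for p = 0.5 dropout.
hidden = np.ones((1, 2048))
out = dropout_forward(hidden, p=0.5)
active = int((out != 0).sum())
print(active)  # roughly 1024 units survive any given pass
```

Note that only the hidden layer being regularized is widened; the output layer (e.g. the two units of a binary softmax) is unchanged.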
Hi Daniel,
Thank you for modifying Alex's code to enable Hinton's dropout.
Is it possible for you to please explain in the README why you suggest doubling the number of outputs in the last layer when using dropout?
RHH
Does that mean that if we are making a simple binary classifier, the number of outputs should be four when using dropout? How do we interpret four outputs from a binary classifier?