Why should we double the number of outputs when using dropout? #5

Closed
invisibleroads opened this issue May 18, 2014 · 1 comment · May be fixed by #7

invisibleroads commented May 18, 2014

Hi Daniel,

Thank you for modifying Alex's code to enable Hinton's dropout.

Is it possible for you to please explain in the README why you suggest doubling the number of outputs in the last layer when using dropout?

RHH

In practice, you'll probably also want to double the number of outputs in that layer.

Does that mean that if we are making a simple binary classifier, the number of outputs should be four when using dropout? How do we interpret four outputs from a binary classifier?

dnouri (Owner) commented May 18, 2014

The README doesn't say to double the number of outputs in the last layer. It says to double the number in that layer -- that is, the layer where you add dropout.

Regularizing a net with dropout will usually allow you to make it larger than an equivalent network that doesn't use dropout.
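
For concreteness, here is a minimal NumPy sketch of the idea (illustrative only; the `dropout_forward` helper, the layer sizes, the dropout rate, and the inverted-dropout scaling are assumptions, not code from this repository). With p = 0.5, only about half of the dropped layer's units are active on any given training pass, so doubling that layer's width roughly preserves its expected capacity:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, p=0.5, train=True):
    """Inverted dropout: zero each unit with probability p at train
    time and rescale the survivors, so no change is needed at test time."""
    if not train:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

# Hypothetical sizes: a hidden layer that might have 512 units without
# dropout is widened to 1024 when dropout (p=0.5) is applied to it,
# since on average half its units are dropped on each training pass.
# The output layer of a binary classifier stays at 2 units.
n_in, n_hidden, n_out = 100, 1024, 2
W1 = rng.standard_normal((n_in, n_hidden)) * 0.01
W2 = rng.standard_normal((n_hidden, n_out)) * 0.01

x = rng.standard_normal((1, n_in))
h = np.maximum(0, x @ W1)          # hidden layer (ReLU)
h = dropout_forward(h, p=0.5)      # dropout on the hidden layer only
scores = h @ W2                    # still 2 outputs, unaffected by dropout
```

Note that dropout here touches only the hidden layer, so the binary classifier still produces two outputs; four outputs would never be needed.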
