
About julia files in examples - overfitting #144

Open · jskDr opened this issue Oct 6, 2015 · 2 comments
jskDr commented Oct 6, 2015

I found that 'test-wasserstein.jl' does not consider overfitting yet, although it includes an AccuracyLayer. If my understanding is right, I suggest including a new, extended example. In neural networks, early stopping is one of the essential techniques for training a network.

The following are my results from running 'test-wasserstein.jl' with max_iter increased to 20000:
```
Accuracy (avg over 1000) = 90.9000%
...
Accuracy (avg over 1000) = 94.3000%
...
Accuracy (avg over 1000) = 94.0000%
...
Accuracy (avg over 1000) = 93.7000%
```

pluskid (Owner) commented Oct 11, 2015

@jskDr Thanks! We have a DecayOnValidation learning rate policy that allows one to halve the learning rate when the performance on the validation set drops.
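Roughly, the policy behaves like this (a minimal sketch in plain Julia for illustration only, not Mocha's actual implementation; the type and function names here are made up):

```julia
# Minimal sketch of a decay-on-validation rule (illustrative only,
# not Mocha's actual implementation).
mutable struct DecayOnValidationSketch
    lr::Float64      # current learning rate
    ratio::Float64   # e.g. 0.5 to halve the rate on a drop
    best::Float64    # best validation accuracy seen so far
end
DecayOnValidationSketch(lr; ratio=0.5) = DecayOnValidationSketch(lr, ratio, -Inf)

# Call after each validation check; returns the (possibly decayed) rate.
function update!(p::DecayOnValidationSketch, val_accuracy::Float64)
    if val_accuracy > p.best
        p.best = val_accuracy
    else
        p.lr *= p.ratio  # validation performance dropped: decay
    end
    return p.lr
end
```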

jskDr (Author) commented Oct 11, 2015

@pluskid I don't mean improving the performance on the test set, but rather stopping training once the performance on the test set starts to drop, before the usual stopping criterion is reached: so-called early stopping. If early stopping is not yet implemented in add_coffee_break(), I might want to implement it.
http://www.mathworks.com/help/nnet/ug/improve-neural-network-generalization-and-avoid-overfitting.html
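For concreteness, the patience-based variant I have in mind looks roughly like this (a standalone sketch; `EarlyStopper` and `should_stop!` are hypothetical names, not an existing Mocha API):

```julia
# Hypothetical patience-based early stopping (not an existing Mocha API).
mutable struct EarlyStopper
    best::Float64    # best validation accuracy seen so far
    patience::Int    # validation checks to tolerate without improvement
    bad::Int         # consecutive checks without improvement
end
EarlyStopper(patience::Int) = EarlyStopper(-Inf, patience, 0)

# Returns true when training should stop.
function should_stop!(es::EarlyStopper, val_accuracy::Float64)
    if val_accuracy > es.best
        es.best = val_accuracy
        es.bad = 0
    else
        es.bad += 1
    end
    return es.bad >= es.patience
end

# With the accuracies reported above and patience = 2, training would
# stop after two consecutive checks without a new best
# (the 0.940 and 0.937 readings):
stopper = EarlyStopper(2)
for acc in (0.909, 0.943, 0.940, 0.937)
    should_stop!(stopper, acc) && break
end
```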
