
add discriminative loss for mlp and y-hat #5

Open
dribnet opened this issue Jun 25, 2016 · 1 comment · May be fixed by #6

Comments

@dribnet
Contributor

dribnet commented Jun 25, 2016

This code has been working great for me. Currently, the discriminative regularization in this codebase takes its loss from the batch norm layers of the ConvolutionalSequence, but the paper covers the more general case of taking loss from elsewhere in the classifier, as shown below:

(attached image: screenshot of the relevant equation from the paper)

I'd like to add losses at the higher layers - MLP and y-hat. I'm curious if anyone else is interested in working on this with me - the discriminative regularization parts are some of the trickier parts of building the training computation graph, and y-hat currently doesn't have a named node in the classifier graph.
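To illustrate the idea (not the repo's actual Blocks graph-manipulation code): discriminative regularization compares classifier activations on the real input against activations on the reconstruction, summing one penalty per monitored layer. Extending it to the MLP and y-hat just means adding those activations to the monitored list. A minimal NumPy sketch, with all names hypothetical:

```python
import numpy as np

def discriminative_loss(acts_real, acts_recon, weights=None):
    """Sum of per-layer mean squared differences between classifier
    activations on the real input and on the reconstruction.

    acts_real / acts_recon: lists of activation arrays, one entry per
    monitored layer (e.g. conv features, MLP hidden layer, y-hat logits).
    weights: optional per-layer scaling coefficients.
    """
    if weights is None:
        weights = [1.0] * len(acts_real)
    total = 0.0
    for w, a, b in zip(weights, acts_real, acts_recon):
        total += w * np.mean((a - b) ** 2)
    return total

# Example: two monitored layers; the second (y-hat) matches exactly,
# so only the first layer contributes to the penalty.
conv_real, conv_recon = np.zeros((2, 3)), np.ones((2, 3))
yhat_real = yhat_recon = np.ones((2,))
loss = discriminative_loss([conv_real, yhat_real], [conv_recon, yhat_recon])
```

In the actual codebase this would mean tagging the MLP output and y-hat with named nodes so they can be pulled out of the computation graph alongside the existing batch norm activations.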

@vdumoulin
Owner

I think that would be a good idea. I won't have time to work on it, but if you have questions or would like guidance I can sporadically chime in (and of course review the PR).

@dribnet dribnet linked a pull request Aug 12, 2016 that will close this issue