
Not resetting gradParameters #71

Open
ibrahim5253 opened this issue Apr 1, 2017 · 1 comment

Comments

@ibrahim5253

Hello,

In main.lua, while training the discriminator in the function fDx, why aren't gradParameters reset to zero after the pass on real images and before the pass on fake images? I think it should matter. I'd like to know your thoughts.

@fonfonx

fonfonx commented Apr 21, 2017

If gradParameters were reset to 0 after the first pass, that pass would become useless, because you would lose the gradients you had just computed. The backward method calls accGradParameters, which accumulates the gradients (it adds them into the gradParameters tensor). Since you want to train on both real and fake examples, you need to call backward twice without resetting the gradParameters tensor before the second call.

In the end, the gradParameters tensor that is returned contains the sum of the gradients with respect to the real and the fake data.
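To make the accumulation pattern concrete, here is a minimal sketch of what such an fDx closure typically looks like in Torch. The names (netD, netG, criterion, gradParametersD, realBatch, noise, realLabel, fakeLabel) follow the common DCGAN main.lua layout and are assumptions for illustration, not quotes from this repository:

```lua
-- Hypothetical sketch of the fDx closure discussed above.
-- gradParametersD is the flattened gradient tensor from netD:getParameters().
local fDx = function(x)
   gradParametersD:zero()           -- reset ONCE, at the start of the closure

   -- Real batch: backward calls accGradParameters, which ADDS
   -- the real-data gradients into gradParametersD.
   local output = netD:forward(realBatch)
   local errD_real = criterion:forward(output, realLabel)
   local df_do = criterion:backward(output, realLabel)
   netD:backward(realBatch, df_do)

   -- Fake batch: no :zero() here, so the real-batch gradients are kept
   -- and the fake-batch gradients are accumulated on top of them.
   local fakeBatch = netG:forward(noise)
   output = netD:forward(fakeBatch)
   local errD_fake = criterion:forward(output, fakeLabel)
   df_do = criterion:backward(output, fakeLabel)
   netD:backward(fakeBatch, df_do)

   -- gradParametersD now holds the sum of real and fake gradients,
   -- which is what the optimizer (e.g. optim.adam) consumes.
   return errD_real + errD_fake, gradParametersD
end
```

Zeroing between the two backward calls would discard the real-batch term of that sum, which is exactly the point made above.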
