
updateOutput called multiple times #18

Open
geert56 opened this issue Aug 9, 2016 · 2 comments

geert56 commented Aug 9, 2016

Creating a graph did not work for me; I got errors. It turns out that apply_func in graphgen.lua runs the original updateOutput again (after the first forward() call) via base_func. This goes wrong for any layer that creates a new tensor on that second run, because the new tensor pointer cannot then be found in the nodes table. A simple fix is to just return self.output in those cases.
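A minimal sketch of the workaround described above (the wrapper and flag names here are hypothetical, not the actual graphgen.lua code): wrap a module's updateOutput so that a second call returns the cached self.output instead of re-running the original function, which might allocate a fresh tensor that the nodes table has never seen.

```lua
-- Hypothetical sketch: guard against updateOutput running twice.
-- 'base_func' holds the module's original updateOutput.
local function guardUpdateOutput(module)
  local base_func = module.updateOutput
  module.updateOutput = function(self, input)
    if self.__outputDone then
      -- Second call: return the cached tensor so its pointer still
      -- matches the entry recorded in the nodes table.
      return self.output
    end
    self.__outputDone = true
    return base_func(self, input)
  end
end
```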

fmassa (owner) commented Aug 9, 2016

Hi, thanks for opening this issue.

The fix you propose is included in the pending PR #10, but that alone is not enough to handle all cases (like nn.Parallel). I haven't merged that PR yet because I'm afraid that the other changes I made there (like overwriting __index) could have unexpected effects.

I can factor out the part that removes the need to run forward twice and merge it into master if that solves your problem.

That said, it's usually good practice to keep a module's output tensors unchanged across different forward calls, changing only their data.
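For instance, a module following this practice allocates self.output once and only resizes and refills it on later calls (a generic illustration, not code from this repository):

```lua
-- Illustrative Torch layer: reuse self.output across forward calls.
function MyLayer:updateOutput(input)
  -- Allocate the output tensor once; keep the same tensor object on
  -- every subsequent call and only update its shape and contents.
  self.output = self.output or input.new()
  self.output:resizeAs(input):copy(input)
  return self.output
end
```

Because the tensor object (and thus its pointer) stays the same, tools that key on output tensors, such as the nodes table in graphgen.lua, keep working across repeated forward calls.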

geert56 (author) commented Aug 11, 2016

I understand that it is not easy to piggyback on existing Torch routines and still handle the diverse suite of "layers" correctly. I am glad to see my problem confirmed. I have my own workaround and just hope that others can benefit as well. Keep up the good work.
