Error: In include/poptorch_err/ExceptionHandling.hpp:76: 'poplar_stream_memory_allocation_error' #9
Hi, the exception you're hitting is thrown when the allocation of a stream buffer fails: in this case, too much memory is being used by a data stream. It would be great if you could share instructions to reproduce the error and the SDK version you've been using: this would enable us to share specific remedies for the issue. In the meantime, here are some more generic suggestions that might help:
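To make the failure mode concrete: for each data stream, a host-side buffer must hold all the data transferred in one step, so its size scales with batch size, device iterations, and replication. The sketch below is illustrative arithmetic only (the function name and factors are my own assumptions, not the PopTorch API):

```python
# Rough, illustrative estimate of host stream-buffer memory (NOT the PopTorch API).
# Assumption: one buffer holds every batch transferred in a single host/IPU step:
#   device_iterations * replication_factor * gradient_accumulation * bytes-per-batch.

def stream_buffer_bytes(batch_shape, elem_bytes, device_iterations,
                        replication_factor=1, gradient_accumulation=1):
    """Bytes needed by the host buffer backing one data stream."""
    batch_elems = 1
    for dim in batch_shape:
        batch_elems *= dim
    steps = device_iterations * replication_factor * gradient_accumulation
    return steps * batch_elems * elem_bytes

# Example: float32 images, batch 16, 3x224x224, 100 device iterations.
mib = stream_buffer_bytes((16, 3, 224, 224), 4, 100) / 2**20
print(f"~{mib:.0f} MiB per input stream")
```

Lowering any of these factors (batch size, device iterations, replication, accumulation) shrinks the buffer proportionally, which is why such reductions are the usual first remedies.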
Hope this helps for now. I encourage you to share a reproducer so we can investigate this further. Best,
This is for training, so I left outputMode at the default, which should already be `Final`. What values would be streamed from the IPU to the host during training? The loss calculation is done on the IPU, since it's part of the model. Could the stream error be related to data streaming between IPUs?
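For reference, this is how the output mode and related stream-affecting knobs are set on `poptorch.Options` (a sketch only; the model and optimizer names are placeholders, and `Final` is the training default under discussion):

```python
import poptorch

opts = poptorch.Options()
opts.deviceIterations(16)                   # batches processed per host/IPU step
opts.outputMode(poptorch.OutputMode.Final)  # stream only the last output (e.g. loss) to host
# opts.replicationFactor(2)                 # more replicas => larger host stream buffers

# Placeholders: `model` and `optimizer` are assumed to be defined elsewhere.
# training_model = poptorch.trainingModel(model, options=opts, optimizer=optimizer)
```

With `OutputMode.Final`, only the final batch's outputs per step are copied back, so the host-bound output stream stays small regardless of the device-iteration count; `OutputMode.All` would stream every batch's outputs.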
No, those streams are for IPU/host communication. Have you had a chance to try any of the other suggestions @ariannas-graphcore provided?
Yeah. Those either don't work, or can't be used (can't lower further).
By any chance, are you trying to profile the Poplar executable by generating a PopVision profile (by setting the environment variable)? We can try to provide more specific support, but we would need additional information on the system, software, and model in which you are encountering the error:
Ideally, if you can send us a code sample which reproduces the error, we can provide more specific advice to help fix the problem.
Is there an explanation of this error?