Is InferenceSession.Run thread-safe? #114
It's safe to invoke Run() on the same session object from multiple threads. No external synchronization is needed. This aspect is documented in the design doc: https://github.com/Microsoft/onnxruntime/blob/master/docs/HighLevelDesign.md
@pranavsharma Thanks! I forgot to mention that the scope of my question was the .NET NuGet package; I just want to double-confirm it's still true (assuming it's just a slim wrapper of the C API).
Yes, it is also true for C#.
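To make that concrete, here is a minimal C# sketch of the pattern being confirmed: one InferenceSession shared across threads, each calling Run with no external lock. The model path "model.onnx", the input name "input", and the tensor shape are placeholders for illustration, not taken from this thread.

```csharp
using System.Threading.Tasks;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class Program
{
    static void Main()
    {
        // One session per model, created once and shared by all threads.
        // "model.onnx", "input", and the shape below are placeholders.
        using var session = new InferenceSession("model.onnx");

        Parallel.For(0, 16, i =>
        {
            var tensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
            var inputs = new[] { NamedOnnxValue.CreateFromTensor("input", tensor) };

            // Per the maintainers' answer above, Run may be called
            // concurrently on the same session without locking.
            using var results = session.Run(inputs);
        });
    }
}
```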
Hi @pranavsharma, sorry for jumping into the thread, but I have a concern about the thread safety of the MKL-DNN execution provider. As this comment mentioned, MKL-DNN has to define …
I'm trying to do inferencing with Spark and I'm getting the error:
Running it on a single thread seems to work... Not sure if this has something to do with thread safety; it only fails about once every 180 images.
This is no longer true. We experienced concurrency bugs with the TensorRT execution provider if there is no mutex lock around the session's Run call.
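For reference, the workaround described in that comment amounts to serializing Run calls behind an ordinary lock. A hedged C# sketch of that defensive pattern follows; the helper class and method names are made up for illustration, and this is only needed if a particular execution provider misbehaves under concurrency.

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

static class SerializedRunner
{
    // Workaround sketch only: serialize Run calls when a specific execution
    // provider shows concurrency issues. Not needed where Run is thread-safe.
    private static readonly object RunLock = new object();

    public static IDisposableReadOnlyCollection<DisposableNamedOnnxValue> Run(
        InferenceSession session, IReadOnlyCollection<NamedOnnxValue> inputs)
    {
        lock (RunLock)
        {
            return session.Run(inputs);
        }
    }
}
```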
cc @jywu-msft
@r0l1 Which version of ONNX Runtime / the TensorRT EP did you encounter this on? (Did you build from source or use a prebuilt package?) There was a concurrency bug/regression that was fixed a few months ago. I wanted to confirm you are no longer encountering the issue with the latest versions.
@jywu-msft Thank you for the fast response. I opened a new issue here: #19275
Is it true that I can keep a single instance for one model and call the Run method concurrently with no problem? Or should I lock around Run, or make a pool of InferenceSession objects?
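Since the question mentions a session pool as an alternative, here is a rough sketch of what that would look like in C#. The SessionPool class, its size, and the exhaustion behavior are all hypothetical; per the answers above, a single shared session with concurrent Run calls is normally sufficient, so this is shown only for comparison.

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

// Hypothetical pool of sessions for the same model; usually unnecessary,
// since one shared session can serve concurrent Run calls.
sealed class SessionPool : System.IDisposable
{
    private readonly ConcurrentBag<InferenceSession> _pool = new();

    public SessionPool(string modelPath, int poolSize)
    {
        for (int i = 0; i < poolSize; i++)
            _pool.Add(new InferenceSession(modelPath));
    }

    // Borrow a session, run the inputs, then return the session to the pool.
    public IDisposableReadOnlyCollection<DisposableNamedOnnxValue> Run(
        IReadOnlyCollection<NamedOnnxValue> inputs)
    {
        if (!_pool.TryTake(out var session))
            throw new System.InvalidOperationException("Pool exhausted.");
        try
        {
            return session.Run(inputs);
        }
        finally
        {
            _pool.Add(session);
        }
    }

    public void Dispose()
    {
        foreach (var s in _pool) s.Dispose();
    }
}
```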