C#: I need to run the program on the NPU (OnnxRuntime + DirectML + NPU), but it failed #19846
Comments
Cc: @fdwr. As a side note, please read the following article: https://onnxruntime.ai/docs/tutorials/csharp/basic_csharp.html
Cc: @wchao1115
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
I also have the same problem running DirectML on an Intel NPU. Has anyone solved this?
It looks like the basic problem here is that there's no way to set provider options when using the DirectML provider from C#. What you want to be able to write is something like this:

```csharp
// Sadly, this does not exist
options.AppendExecutionProvider_DML(
    3,
    new Dictionary<string, string> { { "device_filter", "any" } });
```

The DML provider accepts a couple of provider options, but the C# API offers no way to pass them. I did try this:

```csharp
// Sadly, although this expresses what we want, the library rejects it
options.AppendExecutionProvider(
    "DML",
    new Dictionary<string, string> { { "device_filter", "any" }, { "device_id", "3" } });
```

but that also doesn't work, because this general-looking `AppendExecutionProvider` overload rejects "DML" as a provider name. So there doesn't appear to be a way to get this working from C# at all.

There's an additional question of whether the DirectML provider will actually be able to use the NPU. I have a Surface Laptop Studio 2, which I think has the 3700VC NPU, and although there is a DirectML implementation, even with the latest Intel drivers for it, none of the examples I've found that are supposed to show how to use an Intel NPU through DirectML actually work. When it comes to creating the device object, there's a half-second delay and then it fails with an error.

However, I know people have got the DirectML examples working for newer Intel NPUs, so it would still be good to get this working. Intel seems to push people to use OpenVINO in preference to DirectML on their NPUs (presumably because that ties you to Intel), so possibly the OpenVINO ONNX Runtime provider is the way to go, but as far as I can see, there isn't a pre-built OpenVINO ONNX Runtime package for C# up on NuGet. It looks like you have to build your own!
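Until the C# binding grows a provider-options overload, the only knob available is the plain device-index overload. A minimal fallback sketch under those constraints (the device indices, their order, and the assumption that `AppendExecutionProvider_DML` throws immediately for an invalid adapter are all assumptions on my part, not documented behavior):

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

static class DmlFallbackDemo
{
    // Try each candidate DML device index in turn; return the first
    // SessionOptions the runtime accepts, or plain CPU options otherwise.
    static SessionOptions CreateOptions()
    {
        // Assumed indices: 1 = NPU, 0 = GPU on this particular machine.
        foreach (var deviceId in new[] { 1, 0 })
        {
            var options = new SessionOptions();
            try
            {
                options.AppendExecutionProvider_DML(deviceId);
                Console.WriteLine($"Using DML device {deviceId}");
                return options;
            }
            catch (OnnxRuntimeException ex)
            {
                // e.g. "Specified display adapter handle is invalid"
                Console.WriteLine($"DML device {deviceId} rejected: {ex.Message}");
                options.Dispose();
            }
        }
        // The CPU execution provider is always present as the default.
        Console.WriteLine("Falling back to the CPU execution provider");
        return new SessionOptions();
    }
}
```

This only sidesteps the missing `device_filter` option by brute force; it cannot express "any device type", which is what the provider-options API would allow.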
Describe the issue
Hello, I checked the Microsoft blog and ran the following test:
https://blogs.windows.com/windowsdeveloper/2024/02/01/introducing-neural-processor-unit-npu-support-in-directml-developer-preview/
I am using a Windows 11 device with an Intel® Core™ Ultra processor with Intel® AI Boost.
I have the latest NPU driver installed on the device, and Intel OpenVINO can run inference successfully on the NPU.
In C#, I need to run the program on the NPU (OnnxRuntime + DirectML + NPU), but it failed.
So I installed Microsoft.AI.DirectML 1.13.1 and Microsoft.ML.OnnxRuntime.DirectML 1.17.1 from NuGet in VS2022 and wrote this C# test code:
```csharp
SessionOptions options = new SessionOptions();
options.AppendExecutionProvider_DML(0); // GPU
```

Calling `AppendExecutionProvider_DML` with device index 0 (the GPU), inference succeeds on the GPU.
```csharp
SessionOptions options = new SessionOptions();
options.AppendExecutionProvider_DML(1); // NPU
```

But with device index 1 (the NPU), inference fails with the following error:
Microsoft.ML.OnnxRuntime.OnnxRuntimeException: '[ErrorCode:RuntimeException] D:\a_work\1\s\onnxruntime\core\providers\dml\dml_provider_factory.cc(442)\onnxruntime.DLL!00007FF8DEFC3304: (caller: 00007FF8DEFC35B8) Exception(1) tid(5da8) C0262002 Specified display adapter handle is invalid.
I checked the ONNX Runtime source on GitHub and built a C++ unit-test environment to exercise some of the functions in the dml_provider_factory.cc file. That test code, run on the same device, can enumerate the NPU adapter, and line 442 does not throw an exception there.
Question 1: Why does the C# code fail when using the NuGet packages (onnxruntime + directml) on the NPU?
Question 2: How do I run onnxruntime + directml on the NPU?
Thanks.
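Since the mapping from device index to adapter isn't documented, one way to investigate which indices this machine exposes is to probe them and log what the runtime says. A diagnostic sketch (the index range of 0–3 is an arbitrary assumption):

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

static class DmlDeviceProbe
{
    static void Main()
    {
        // First confirm the DML execution provider is compiled into this build.
        foreach (var provider in OrtEnv.Instance().GetAvailableProviders())
            Console.WriteLine($"Available provider: {provider}");

        // Probe a handful of adapter indices; an invalid one throws an
        // OnnxRuntimeException (e.g. "Specified display adapter handle is invalid").
        for (int deviceId = 0; deviceId < 4; deviceId++)
        {
            using var options = new SessionOptions();
            try
            {
                options.AppendExecutionProvider_DML(deviceId);
                Console.WriteLine($"Device {deviceId}: accepted");
            }
            catch (OnnxRuntimeException ex)
            {
                Console.WriteLine($"Device {deviceId}: {ex.Message}");
            }
        }
    }
}
```

If index 1 is rejected here but the C++ enumeration in dml_provider_factory.cc sees the NPU, that would point at the adapter selection in the packaged binary rather than at the driver.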
To reproduce
C# code (completed here so it compiles; the original snippet was truncated):

```csharp
static void Main(string[] args)
{
    var availableProviders = OrtEnv.Instance().GetAvailableProviders();
    foreach (var provider in availableProviders)
        Console.WriteLine(provider);
}
```

C++ code: see the attached dml_provider_test.zip
Urgency
No response
Platform
Windows
OS Version
Windows 11 Pro 22H2 22621.3155
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.17.1
ONNX Runtime API
C#
Architecture
X64
Execution Provider
DirectML
Execution Provider Library Version
Microsoft.AI.DirectML 1.13.1