How to use OpenVINO as EP for inference modeling on NPUs #22989

Closed
Gusha-nye opened this issue Dec 3, 2024 · 0 comments
Labels
ep:OpenVINO issues related to OpenVINO execution provider

Comments

@Gusha-nye

Hello, I have downloaded a project from GitHub (https://github.com/microsoft/ai-powered-notes-winui3-sample?tab=readme-ov-file) and I now want to switch the project's execution provider (EP) to OpenVINO, as shown below.

[screenshot attached]

Currently, however, I have no way to enable the NPU, and inference still runs on the GPU. I followed the instructions in the ONNX Runtime documentation (https://onnxruntime.ai/docs/build/eps.html#openvino) to build ONNX Runtime with different EPs, but both builds failed.
When I list the available execution providers in code, the output is:
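For reference, the documentation linked above describes building ONNX Runtime with the OpenVINO EP roughly as follows. This is only a sketch: the exact flags depend on the ONNX Runtime version, and the `NPU` device argument is an assumption based on recent OpenVINO EP releases (older versions used values like `CPU_FP32` or `GPU_FP32`).

```shell
REM Windows, from the onnxruntime source root (sketch; flags vary by version).
REM An OpenVINO toolkit must be installed and its setupvars.bat sourced first.
.\build.bat --config Release --use_openvino NPU --build_shared_lib --build_wheel
```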

DmlExecutionProvider
CPUExecutionProvider

I would like to know how I can configure this so the project runs inference on the NPU.
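For comparison, here is a minimal Python sketch of how the OpenVINO EP is normally selected with an NPU device once an OpenVINO-enabled build (e.g. the `onnxruntime-openvino` wheel) is installed; the sample project uses the C# API, which accepts the same provider name and options. The model path `"model.onnx"` and the `"NPU"` value for `device_type` are assumptions based on the OpenVINO EP documentation, not taken from the sample project.

```python
# Sketch: requesting the OpenVINO EP with an NPU device in onnxruntime.
# Assumes an OpenVINO-enabled build of ONNX Runtime; "NPU" as device_type
# is only available in recent OpenVINO EP releases.

def openvino_npu_providers():
    """Provider list plus per-provider options, with CPU as a fallback."""
    providers = ["OpenVINOExecutionProvider", "CPUExecutionProvider"]
    provider_options = [{"device_type": "NPU"}, {}]
    return providers, provider_options

def make_session(model_path):
    """Create an InferenceSession on the NPU, or return None if unavailable."""
    try:
        import onnxruntime as ort
    except ImportError:
        return None  # onnxruntime is not installed in this environment
    if "OpenVINOExecutionProvider" not in ort.get_available_providers():
        return None  # this build/wheel does not include the OpenVINO EP
    providers, provider_options = openvino_npu_providers()
    return ort.InferenceSession(
        model_path, providers=providers, provider_options=provider_options
    )
```

If the build succeeded, `onnxruntime.get_available_providers()` should list `OpenVINOExecutionProvider` instead of only the `DmlExecutionProvider` and `CPUExecutionProvider` entries shown in the output above.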

@github-actions github-actions bot added the ep:OpenVINO issues related to OpenVINO execution provider label Dec 3, 2024