
Onnx Runtime EntryPointNotFoundException: OrtGetApiBase in Unity Application. #20048

Open
ledbetterj1atwit opened this issue Mar 23, 2024 · 3 comments
Labels
platform:windows issues related to the Windows platform

Comments


ledbetterj1atwit commented Mar 23, 2024

Describe the issue

I'm using ML.NET and the ONNX Runtime (3.0.1 and 1.17.1, respectively) to load an ONNX model for inference from within a Unity (2021.3.23f1) application. The packages were installed from NuGet using NuGetForUnity, which places them in <project dir>/Assets/Packages.

Whenever I try to apply the model (mlContext.Transforms.ApplyOnnxModel(outputColumnNames, inputColumnNames, modelPathS);), I get:

EntryPointNotFoundException: OrtGetApiBase assembly:<unknown assembly> type:<unknown type> member:(null)
Microsoft.ML.OnnxRuntime.NativeMethods..cctor () (at <db8a5557c9b54ae7905ea2adba430b5c>:0)
Rethrow as TypeInitializationException: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.
Microsoft.ML.OnnxRuntime.SessionOptions..ctor () (at <db8a5557c9b54ae7905ea2adba430b5c>:0)
Microsoft.ML.Transforms.Onnx.OnnxModel..ctor (System.String modelFile, System.Nullable`1[T] gpuDeviceId, System.Boolean fallbackToCpu, System.Boolean ownModelFile, System.Collections.Generic.IDictionary`2[TKey,TValue] shapeDictionary, System.Int32 recursionLimit, System.Nullable`1[T] interOpNumThreads, System.Nullable`1[T] intraOpNumThreads) (at <23755ae4f3b243f8bebd1b959e7493a8>:0)
Microsoft.ML.Transforms.Onnx.OnnxTransformer..ctor (Microsoft.ML.Runtime.IHostEnvironment env, Microsoft.ML.Transforms.Onnx.OnnxTransformer+Options options, System.Byte[] modelBytes) (at <23755ae4f3b243f8bebd1b959e7493a8>:0)
Microsoft.ML.Transforms.Onnx.OnnxTransformer..ctor (Microsoft.ML.Runtime.IHostEnvironment env, System.String[] outputColumnNames, System.String[] inputColumnNames, System.String modelFile, System.Nullable`1[T] gpuDeviceId, System.Boolean fallbackToCpu, System.Collections.Generic.IDictionary`2[TKey,TValue] shapeDictionary, System.Int32 recursionLimit, System.Nullable`1[T] interOpNumThreads, System.Nullable`1[T] intraOpNumThreads) (at <23755ae4f3b243f8bebd1b959e7493a8>:0)
Microsoft.ML.Transforms.Onnx.OnnxScoringEstimator..ctor (Microsoft.ML.Runtime.IHostEnvironment env, System.String[] outputColumnNames, System.String[] inputColumnNames, System.String modelFile, System.Nullable`1[T] gpuDeviceId, System.Boolean fallbackToCpu, System.Collections.Generic.IDictionary`2[TKey,TValue] shapeDictionary, System.Int32 recursionLimit, System.Nullable`1[T] interOpNumThreads, System.Nullable`1[T] intraOpNumThreads) (at <23755ae4f3b243f8bebd1b959e7493a8>:0)
Microsoft.ML.OnnxCatalog.ApplyOnnxModel (Microsoft.ML.TransformsCatalog catalog, System.String[] outputColumnNames, System.String[] inputColumnNames, System.String modelFile, System.Nullable`1[T] gpuDeviceId, System.Boolean fallbackToCpu) (at <23755ae4f3b243f8bebd1b959e7493a8>:0)
SicknessPredictor+PredictJob.Execute () (at Assets/SicknessPredictor.cs:353)
Unity.Jobs.IJobExtensions+JobStruct`1[T].Execute (T& data, System.IntPtr additionalPtr, System.IntPtr bufferRangePatchData, Unity.Jobs.LowLevel.Unsafe.JobRanges& ranges, System.Int32 jobIndex) (at <f712b1dc50b4468388b9c5f95d0d0eaf>:0)

I should add that I'm not super familiar with C# development.

To reproduce

  1. Create a Unity Project
  2. Download the NuGet packages (Microsoft.ML 3.0.1, Microsoft.ML.OnnxRuntime 1.17.1, Microsoft.ML.OnnxTransformer 3.0.1) and place them and their dependencies in <project dir>/Assets/Packages
  3. Create an MLContext and call .Transforms.ApplyOnnxModel() or another method that uses the ONNX Runtime (a minimal sketch of this step follows the list).
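
A minimal sketch of step 3, assuming a placeholder model path and column names that are not from the original report; in the setup described above, the ApplyOnnxModel call below is where the EntryPointNotFoundException is thrown.

using Microsoft.ML;

public static class OnnxRepro
{
    public static void Run()
    {
        var mlContext = new MLContext();

        // Placeholder path and column names; substitute those of the actual model.
        string modelPath = "Assets/model.onnx";
        string[] inputColumnNames = { "input" };
        string[] outputColumnNames = { "output" };

        // In the reporter's environment this call fails with
        // EntryPointNotFoundException: OrtGetApiBase from the
        // Microsoft.ML.OnnxRuntime.NativeMethods type initializer.
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            outputColumnNames, inputColumnNames, modelPath);
    }
}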

Urgency

This is for a thesis project I was hoping to defend on April 15th, but I have a backup solution in mind if I cannot use the ONNX Runtime.

Platform

Windows

OS Version

10 Pro 22H2 19045.4170

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.17.1

ONNX Runtime API

C#

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

@github-actions github-actions bot added the platform:windows issues related to the Windows platform label Mar 23, 2024
github-actions bot commented

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Apr 23, 2024

AIceDog commented Apr 24, 2024

Same problem. Did you solve it?

@github-actions github-actions bot removed the stale issues that have not been addressed in a while; categorized by a bot label Apr 24, 2024
ledbetterj1atwit (Author) commented

No, I've made no progress.
I ended up running my model in Python using TensorFlow and sending data back and forth with sockets (slow, bad).
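
For illustration only, a hedged sketch of the Unity-side half of that workaround: a C# TCP client that ships serialized input bytes to a local Python/TensorFlow process and reads the prediction back. The host, port, and wire format are assumptions; the comment above gives no details.

using System;
using System.Net.Sockets;

public static class SocketInferenceClient
{
    // Sends serialized input features to the Python server and returns the raw reply.
    public static byte[] Predict(byte[] inputBytes, string host = "127.0.0.1", int port = 5005)
    {
        using (var client = new TcpClient(host, port))
        using (NetworkStream stream = client.GetStream())
        {
            stream.Write(inputBytes, 0, inputBytes.Length);

            // Read the serialized prediction back (single fixed-size read for simplicity).
            var buffer = new byte[4096];
            int read = stream.Read(buffer, 0, buffer.Length);
            var result = new byte[read];
            Array.Copy(buffer, result, read);
            return result;
        }
    }
}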
