Describe the issue
I have an ML.NET model that is 47 MB zipped, loads in about 1 second, and uses roughly 100–150 MB of memory. After converting it to ONNX format with the ConvertToOnnx function (passing an empty list as the input data), the model grows to 112 MB. It loads in 6 seconds with onnxruntime 1.15.1 or earlier, but with any newer version of onnxruntime it takes about 9 minutes to load. RAM usage peaks at about 1,500 MB and then drops to 700 MB.
Notes: I tried the Python optimizer and simplifier libraries; the model size increased from 112 MB to 150 MB, while load time and memory usage stayed the same. I also removed some unused inputs and outputs.
To reproduce
conversion code
var ctx = new MLContext();
var transformer = ctx.Model.Load(@"C:\models\mlModel.zip", out var schema);
var dummy = new List<ModelInput>(); // ModelInput stands for the model's input row type
var dummyData = ctx.Data.LoadFromEnumerable(dummy);
using (var onnx = File.Open(@"C:\models\onnxModel.onnx", FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    ctx.Model.ConvertToOnnx(transformer, dummyData, onnx);
}
load code
var ctx = new MLContext();
var model = ctx.Transforms.ApplyOnnxModel(@"C:\models\onnxModel.onnx");
Urgency
Yes, it is urgent.
Platform
Windows
OS Version
11 Pro
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
Tried versions 1.10 through 1.16.
ONNX Runtime API
C#
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response
Model File
Sorry, but I cannot share it.
Is this a quantized model?
No
ML.NET is not part of onnxruntime. For ML.NET problems, please open an issue at https://github.com/dotnet/machinelearning. If you just want to run an ONNX model with C#, please refer to onnxruntime's C# documentation: https://onnxruntime.ai/docs/get-started/with-csharp.html. I will close this issue since it is unrelated to this repo. Please feel free to reopen it if you see the same problem with onnxruntime itself.