[Web] Inconsistent results between running onnx model through python and with onnxruntime-web #21275
Labels
platform:web
issues related to ONNX Runtime web; typically submitted using template
stale
issues that have not been addressed in a while; categorized by a bot
Describe the issue
I'm trying to run this model in the browser using onnxruntime-web, but the results are not consistent with running the same model in Python.
I'm observing the following results:
Please note that Transformers.js also uses ONNX Runtime under the hood to run the model passes. This inconsistency is severely affecting model quality and delaying a project.
To reproduce
I'm using the latest versions of onnxruntime-web and onnx. I converted the model from PyTorch to ONNX using the Hugging Face Optimum library. Since the ONNX and PyTorch outputs agree, the computational graph was exported correctly. I have manually verified that the inputs are practically the same in all cases, using the same preprocessing steps everywhere.
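One way to rule out preprocessing differences when checking that "the inputs in all cases are practically the same" is to serialize a fixed input tensor once and feed the identical bytes to both the Python session and the browser session. A minimal sketch, with illustrative names and shapes (the seed, vocab size, and file name are assumptions):

```python
import numpy as np

# Fixed seed -> the same input tensor on every run.
rng = np.random.default_rng(0)
input_ids = rng.integers(0, 30522, size=(1, 16), dtype=np.int64)

# Save to disk; the same file can then be loaded in Python and fetched
# (as raw bytes) by the browser page, so both runtimes see identical data.
np.save("input_ids.npy", input_ids)
```

With a shared on-disk input, any remaining output divergence is attributable to the runtime rather than to the preprocessing pipeline.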
Urgency
The issue is urgent, as a project deadline is quickly approaching and this inconsistency must be resolved first. I've already spent time digging deeper into this from the Transformers.js side, but it appears to be an issue with ONNX Runtime itself (Transformers.js also uses ONNX Runtime).
Please help me get this resolved as quickly as possible; I'm really hopeful about this.
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.18.0
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)