[Web] InferenceSession.create returns a number as exception with model generated by torch.multinomial #19961
Labels: platform:web, stale
Describe the issue
When trying to load a model that uses torch.multinomial, the Web runtime returns an error that is just a number. I would expect either this to be supported or for the runtime to return a more helpful error.

Console output:
To reproduce
Not all of this code may be necessary, but this is a dummy case I created that reproduces the issue:

PyTorch code:
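(A minimal sketch of this kind of repro, assuming a toy module that simply calls torch.multinomial and standard torch.onnx.export settings; the module name, input shape, file name, and opset version below are placeholders, not the original snippet.)

```python
# Minimal sketch: a toy module whose forward pass calls torch.multinomial,
# exported to ONNX. Names, shapes, and opset are placeholder assumptions.
import torch


class Sampler(torch.nn.Module):
    def forward(self, probs):
        # Draw one sample index per row from the given (unnormalized) weights.
        return torch.multinomial(probs, num_samples=1)


model = Sampler()
dummy_probs = torch.rand(1, 10)

torch.onnx.export(
    model,
    (dummy_probs,),
    "multinomial.onnx",
    input_names=["probs"],
    output_names=["samples"],
    opset_version=17,
)
```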
JS code:
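(A minimal sketch of the loading side, assuming onnxruntime-web's InferenceSession.create with the 'wasm' execution provider and the placeholder file name from the export sketch above; not the original snippet.)

```javascript
// Minimal sketch: load the exported model with onnxruntime-web on the
// WebAssembly (CPU) execution provider. The file name is a placeholder.
import * as ort from "onnxruntime-web";

async function main() {
  try {
    const session = await ort.InferenceSession.create("multinomial.onnx", {
      executionProviders: ["wasm"],
    });
    console.log("loaded:", session.inputNames, session.outputNames);
  } catch (e) {
    // With the multinomial model, the value caught here is a bare number
    // rather than an Error object with a message.
    console.error("InferenceSession.create failed:", e);
  }
}

main();
```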
Urgency
This isn't urgent, as I suspect there might be reasons not to support torch.multinomial, but it did eat up a few hours of my time, so I would argue it should be fixed eventually for the next person.

ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.17.1
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)