
[Mobile] Creating InferenceSession from Uint8Array gives an assertion error on the native side #17732

Closed
l3utterfly opened this issue Sep 28, 2023 · 12 comments
Labels
api:Javascript (issues related to the Javascript API), platform:mobile (issues related to ONNX Runtime mobile; typically submitted using template), stale (issues that have not been addressed in a while; categorized by a bot)

Comments


l3utterfly commented Sep 28, 2023

Describe the issue

I have my model included as an asset in my React Native project. I am reading it into a base64 string, then decoding it into a byte array like so:

```ts
// Imports assumed from the surrounding module:
import { Platform } from 'react-native';
import RNFS from 'react-native-fs';
import * as base64 from 'base64-js'; // assumed: provides base64.toByteArray
import { InferenceSession } from 'onnxruntime-react-native';

// TFIDF_MODEL_FILENAME is a string constant defined elsewhere in the project.
let assetPath: string;
let base64Str: string;

if (Platform.OS === 'android') {
    // Android: read the model out of the APK assets as base64.
    assetPath = `models/${TFIDF_MODEL_FILENAME}`;
    base64Str = await RNFS.readFileAssets(assetPath, 'base64');
} else {
    // iOS: the bundled model is addressable via the main bundle path.
    assetPath = `${RNFS.MainBundlePath}/${TFIDF_MODEL_FILENAME}`;
    base64Str = await RNFS.readFile(assetPath, 'base64');
}

// Decode the base64 string into a Uint8Array.
const bytes = base64.toByteArray(base64Str);

// Load a model from the in-memory bytes.
this.session = await InferenceSession.create(bytes);
```

The error on the native side is:

```
Error: node_modules/react-native/ReactCommon/jsi\jsi/jsi-inl.h:154: facebook::jsi::ArrayBuffer facebook::jsi::Object::getArrayBuffer(facebook::jsi::Runtime &) &&: assertion "runtime.isArrayBuffer(*this)" failed
```

Note: this code was working with onnxruntime-react-native 1.15.

To reproduce

  1. Bundle an ONNX model as part of the React Native assets
  2. Read it with RNFS readFileAssets/readFile as base64
  3. Convert it to a Uint8Array
  4. Load it with InferenceSession.create

Urgency

Show-stopping issue: it prevents ONNX Runtime from being used at all in our project.

Platform

React Native

OS Version

0.72.4

ONNX Runtime Installation

Released Package

Compiler Version (if 'Built from Source')

No response

Package Name (if 'Released Package')

onnxruntime-react-native

ONNX Runtime Version or Commit ID

1.16.0

ONNX Runtime API

JavaScript

Architecture

ARM64

Execution Provider

Default CPU

Execution Provider Library Version

No response

l3utterfly added the platform:mobile label on Sep 28, 2023
github-actions bot added the api:Javascript label on Sep 28, 2023
baijumeswani added the platform:web label and removed the platform:mobile label on Sep 28, 2023
YUNQIUGUO added the platform:mobile label and removed the platform:web label on Sep 28, 2023

YUNQIUGUO commented Sep 28, 2023

Thanks for the information and report.

Since this works with onnxruntime-react-native 1.15.0, and also inferring from your error log, it might be related to the new JSI blob support we introduced to replace the slow base64 encode/decode path (#16094).

@jhen0409 with the new blob support, do users now need to provide a blob type instead of a base64-decoded Uint8Array when creating an inference session? I thought the changes were internal to the React Native backend.

jhen0409 (Contributor) commented

> Thanks for the information and report.
>
> Since this works with onnxruntime-react-native 1.15.0, and also inferring from your error log, it might be related to the new JSI blob support we introduced to replace the slow base64 encode/decode path (#16094).
>
> @jhen0409 with the new blob support, do users now need to provide a blob type instead of a base64-decoded Uint8Array when creating an inference session? I thought the changes were internal to the React Native backend.

It may be broken. I think using `.buffer` of the Uint8Array to get an ArrayBuffer will work, around this line in the React Native backend:

```ts
const modelBlob = jsiHelper.storeArrayBuffer(this.#pathOrBuffer);
```
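
A minimal sketch of the type distinction behind this suggestion, independent of onnxruntime internals; the variable names are illustrative:

```ts
// A Uint8Array is a typed-array *view*; JSI's getArrayBuffer() asserts that it
// received the underlying ArrayBuffer, so passing the view itself trips the
// "runtime.isArrayBuffer(*this)" assertion quoted above.
const bytes = new Uint8Array([0x08, 0x01]); // placeholder bytes, not a real model

// Passing the view's backing buffer satisfies the check. If the view might not
// cover the whole buffer, slice it down to the viewed range first:
const arrayBuffer = bytes.buffer.slice(
  bytes.byteOffset,
  bytes.byteOffset + bytes.byteLength,
);
```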

YUNQIUGUO (Contributor) commented

Thanks for the suggestion. Tested locally in my Expo React Native project; loading the model works now. We will add an e2e test to cover this and avoid future breakage.

@l3utterfly please see if this fixes your crash. FYI, we have a patch release 1.16.1 coming; hopefully this fix will be included in it.

skottmckay (Contributor) commented

Slightly confused as to how this wasn't uncovered by the current unit tests or the perf testing associated with the original PR, given that it looks like it should always fail.

What is different about the broken usage vs. how that testing was done?

jhen0409 (Contributor) commented

> Slightly confused as to how this wasn't uncovered by the current unit tests or the perf testing associated with the original PR, given that it looks like it should always fail.
>
> What is different about the broken usage vs. how that testing was done?

It looks like the current E2E tests only use a file path rather than a byte array; that matches my personal use case too.

skottmckay (Contributor) commented

@jhen0409 is a potential workaround here to simply pass in a path, instead of the user code doing a readFileAssets/readFile + base64.toByteArray, or would that approach not handle platform-specific differences?

l3utterfly (Author) commented

On Android, if you want to read a model bundled as part of the app assets (i.e. not downloaded separately or copied into your tmp folder before reading), you can't get a valid file path into the asset bundle.

So the only workaround is to read the file contents and initialise an InferenceSession from the model bytes.
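
(For context, the "copied into your tmp folder" alternative ruled out above would look roughly like the following, assuming react-native-fs; the asset path and filename are illustrative. It trades an extra copy on first launch for being able to hand InferenceSession.create a plain file path.)

```ts
import { Platform } from 'react-native';
import RNFS from 'react-native-fs';
import { InferenceSession } from 'onnxruntime-react-native';

// Illustrative filename; substitute the model bundled with your app.
const MODEL_FILENAME = 'model.onnx';

async function createSessionFromPath(): Promise<InferenceSession> {
  let modelPath: string;
  if (Platform.OS === 'android') {
    // Android assets are not regular files, so copy the asset into the app's
    // cache directory once and reuse that path afterwards.
    modelPath = `${RNFS.CachesDirectoryPath}/${MODEL_FILENAME}`;
    if (!(await RNFS.exists(modelPath))) {
      await RNFS.copyFileAssets(`models/${MODEL_FILENAME}`, modelPath);
    }
  } else {
    // iOS bundle resources can be addressed by path directly.
    modelPath = `${RNFS.MainBundlePath}/${MODEL_FILENAME}`;
  }
  return InferenceSession.create(modelPath);
}
```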

YUNQIUGUO added a commit that referenced this issue Sep 30, 2023

### Description

Use `.buffer` of Uint8Array to get ArrayBuffer.

TODO: Add E2E React Native test case to cover JS level testing to avoid future breakage.

### Motivation and Context

#17732

Co-authored-by: rachguo <[email protected]>
snnn pushed a commit that referenced this issue Sep 30, 2023, with the same description as above.

hungtooc commented Oct 2, 2023

> Thanks for the suggestion. Tested locally in my Expo React Native project; loading the model works now. We will add an e2e test to cover this and avoid future breakage.
>
> @l3utterfly please see if this fixes your crash. FYI, we have a patch release 1.16.1 coming; hopefully this fix will be included in it.

Hi @YUNQIUGUO, could you please add e2e code for a React Native CLI project, for users who don't use Expo?


YUNQIUGUO commented Oct 2, 2023

@hungtooc Would this PR's change, #17749, cover the example e2e usage for your React Native project? It contains testing code for loading a model from bytes in a traditional React Native CLI test app.

l3utterfly (Author) commented

> @l3utterfly please see if this fixes your crash. FYI, we have a patch release 1.16.1 coming; hopefully this fix will be included in it.

Yes, this fixes my issue. However, note that I had to go into the onnxruntime source code to add the `.buffer` in `const modelBlob = jsiHelper.storeArrayBuffer(this.#pathOrBuffer);`.

Simply adding `.buffer` in my own code does not work: `this.session = await InferenceSession.create(bytes.buffer);`.


YUNQIUGUO commented Oct 3, 2023

> > @l3utterfly please see if this fixes your crash. FYI, we have a patch release 1.16.1 coming; hopefully this fix will be included in it.
>
> Yes, this fixes my issue. However, note that I had to go into the onnxruntime source code to add the `.buffer` in `const modelBlob = jsiHelper.storeArrayBuffer(this.#pathOrBuffer);`.
>
> Simply adding `.buffer` in my own code does not work: `this.session = await InferenceSession.create(bytes.buffer);`.

Yes. Once the onnxruntime-react-native 1.16.1 patch release package is out, you should be able to run your code as-is, without any changes to the source code. We've added the `.buffer` fix in the latest patch release.


github-actions bot commented Nov 2, 2023

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions bot added the stale label on Nov 2, 2023
kleiti pushed a commit to kleiti/onnxruntime that referenced this issue Mar 22, 2024, with the same description as above.
siweic0 pushed a commit to siweic0/onnxruntime-web that referenced this issue May 9, 2024, with the same description as above.