Make MLTensor strongly typed? #39
Alternatively, should we simplify to drop dimensions and simply use `bytes[]`? https://www.w3.org/TR/webnn/#typedefdef-mlnamedarraybufferviews At the moment, I don't think we support arbitrarily sized inputs (i.e. a CNN-style network still has a fixed input/output size). Validation can simply check that the byte length matches the model's input/output description. If the user provides an array with the correct byte length but a different type (e.g. uint16 instead of float16), they just get wrong results.
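A minimal sketch of the byte-length-only validation described above (the descriptor shape and the `validateInput` helper are hypothetical, not part of any spec):

```javascript
// Hypothetical descriptor for a fixed-size model input, e.g. a CNN
// expecting a 1x3x224x224 float32 tensor.
const inputDesc = { dimensions: [1, 3, 224, 224], bytesPerElement: 4 };

// Expected byte length = product of dimensions * element size.
function expectedByteLength(desc) {
  return desc.dimensions.reduce((a, b) => a * b, 1) * desc.bytesPerElement;
}

// Validation only compares byte lengths; it cannot catch a caller that
// passes a Uint16Array where float16 data was intended, since both
// views can have the same byteLength.
function validateInput(desc, view) {
  return view.byteLength === expectedByteLength(desc);
}

const ok = validateInput(inputDesc, new Float32Array(1 * 3 * 224 * 224));
const wrongSize = validateInput(inputDesc, new Float32Array(10));
// A Uint16Array with twice the element count has the same byteLength,
// so it passes validation despite carrying the wrong element type.
const wrongType = validateInput(inputDesc, new Uint16Array(2 * 1 * 3 * 224 * 224));
```

This illustrates the trade-off in the comment: the check is cheap and shape-agnostic, but type mismatches slip through silently.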
Do you mean using `ArrayBufferView`?
I mean aligning with WebNN and using `ArrayBufferView`. We only care that `byteLength` matches the expected input/output type + shape (both are known at graph-build / model-loading time).
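A sketch of validating a WebNN-style record of named `ArrayBufferView`s against byte lengths known at load time (the `expectedByteLengths` map and `validateNamedViews` helper are hypothetical):

```javascript
// Hypothetical per-node byte lengths, derived from the model's
// input/output descriptions at load time.
const expectedByteLengths = { input: 4 * 10, output: 4 * 2 };

// Only the presence of each name and its byteLength are checked,
// mirroring the byte-length-only validation discussed above.
function validateNamedViews(expected, namedViews) {
  return Object.entries(expected).every(
    ([name, bytes]) => namedViews[name]?.byteLength === bytes
  );
}

const views = { input: new Float32Array(10), output: new Float32Array(2) };
```

Because both type and shape are fixed when the graph is built, this one comparison per named view is all the runtime validation the approach needs.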
Sounds good to me. My open question is whether it should support a union of … Although the WebNN prototyping CL 4006509 uses this typedef as a workaround for a Blink generator issue, it reminds me that we should consider that.
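One way to read "a union of" view types is a restricted set of typed arrays accepted at the binding layer. A runtime sketch of such an allow-list (the chosen set here is illustrative, not from any spec):

```javascript
// Illustrative allow-list standing in for a WebIDL union of typed
// array types; the real set would come from the spec.
const allowedViewTypes = [Float32Array, Int32Array, Uint8Array];

// Accept a view only if it is an instance of one of the listed types.
function isAllowedView(view) {
  return allowedViewTypes.some((T) => view instanceof T);
}
```

A WebIDL union would enforce this at the bindings rather than in script, but the observable behavior is the same: views outside the union are rejected up front instead of producing silently wrong results.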
Some issues with using a dict:

- `dimensions` isn't validated against `TypedArray.byteLength`. I can set arbitrary dimensions (like `[4,4,4]`) on a Float32Array that only has 1 element.
- `dict<NodeName, dict<data, dimensions>>` and `dict<NodeName "data", NodeName "dimensions">` are ambiguous, thus failing to bind the JavaScript object to the C++ implementation.

Discovered this during WebML prototyping for https://crbug.com/1338067
See related issue in WebNN webmachinelearning/webnn#275