We have seen ongoing WebNN implementation on Windows and ChromeOS, with WebNN prototype development underway for Android and macOS. In addition to computation on CPU and GPU devices, we have also seen leading hardware manufacturers increase their support for NPUs.
Ideally, all APIs and operations defined by the WebNN specification would be fully supported on WebNN-enabled operating systems and devices. However, due to differences in software stacks and hardware, it is inevitable that some operations cannot be fully supported and must fall back to JavaScript machine learning frameworks or web applications.
Given this, web developers want to know which operations are supported on the current device and operating system, such as WebNN on GPU on Windows or WebNN on CPU on Android. It would be nice to have a query mechanism in the WebNN specification that allows JavaScript code to query which operations are supported for a given operating system, device, and power preference after obtaining the MLContext object.
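A rough sketch of what such a query mechanism could look like (the `querySupportedOps()` method is hypothetical and not part of the current specification; `navigator.ml.createContext()` is the existing WebNN entry point, and the context options shown are illustrative):

```js
// Create a WebNN context for a particular device type and power preference.
// Hypothetical example options; actual option names depend on the spec version.
const context = await navigator.ml.createContext({
  deviceType: 'gpu',          // 'cpu' | 'gpu' | 'npu'
  powerPreference: 'default', // 'default' | 'high-performance' | 'low-power'
});

// Hypothetical query: ask the context which operations the underlying
// platform can execute natively for this device/power combination.
const supportedOps = await context.querySupportedOps();

if (!supportedOps.has('lstm')) {
  // Fall back to a JavaScript ML framework (e.g. a Wasm or WebGPU path)
  // for the unsupported operation instead of building it into the WebNN graph.
}
```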
This seems to be a duplicate of #463 (including both ops and data types) so I'm going to close it, but if I missed any distinction please comment and we can re-open. Or add to the other issue.