diff --git a/docs/build/web.md b/docs/build/web.md
index f475b9b44f24..bedd92565f3d 100644
--- a/docs/build/web.md
+++ b/docs/build/web.md
@@ -70,7 +70,7 @@ This support is added/removed by appending the following flags to the build comm
 | `--enable_wasm_threads` | build with multi-thread support |
 | `--enable_wasm_simd` | build with SIMD support |
 
-ONNX Runtime Web can be built with WebGPU support via JavaScript Execution Provider (JSEP). To build with JSEP support, use flag `--use_jsep`.
+ONNX Runtime Web can be built with WebGPU and WebNN support via the JavaScript Execution Provider (JSEP). To build with JSEP support, use the flag `--use_jsep`. Building with WebNN support requires the additional flag `--use_webnn`.
 
 ONNX Runtime Web can also be built to support the training APIs. To build with training APIs included, use the flag `--enable-training-apis`.
 
@@ -95,22 +95,23 @@ in `/`, run one of the following commands to build WebAssembly:
 
 A full list of required build artifacts:
 
-| file name                   | file name (renamed)              | build flag used                                           |
-| --------------------------- | -------------------------------- | --------------------------------------------------------- |
-| ort-wasm.js                 |                                  |                                                           |
-| ort-wasm.wasm               |                                  |                                                           |
-| ort-wasm-threaded.js        |                                  | `--enable_wasm_threads`                                   |
-| ort-wasm-threaded.wasm      |                                  | `--enable_wasm_threads`                                   |
-| ort-wasm-threaded.worker.js |                                  | `--enable_wasm_threads`                                   |
-| ort-wasm-simd.wasm          |                                  | `--enable_wasm_simd`                                      |
-| ort-wasm-simd-threaded.wasm |                                  | `--enable_wasm_simd` `--enable_wasm_threads`              |
-| ort-wasm-simd.js            | ort-wasm-simd.jsep.js            | `--use_jsep` `--enable_wasm_simd`                         |
-| ort-wasm-simd.wasm          | ort-wasm-simd.jsep.wasm          | `--use_jsep` `--enable_wasm_simd`                         |
-| ort-wasm-simd-threaded.js   | ort-wasm-simd-threaded.jsep.js   | `--use_jsep` `--enable_wasm_simd` `--enable_wasm_threads` |
-| ort-wasm-simd-threaded.wasm | ort-wasm-simd-threaded.jsep.wasm | `--use_jsep` `--enable_wasm_simd` `--enable_wasm_threads` |
-| ort-training-wasm-simd.wasm |                                  | `--enable_wasm_simd` `--enable_training_apis`             |
-
-NOTE: WebGPU is currently supported as experimental feature for ONNX Runtime Web. The build instructions may change. Please make sure to refer to latest documents from [this gist](https://gist.github.com/fs-eire/a55b2c7e10a6864b9602c279b8b75dce) for a detailed build/consume instruction for ORT Web WebGpu.
+| file name                   | file name (renamed)              | build flag used                                                         |
+| --------------------------- | -------------------------------- | ----------------------------------------------------------------------- |
+| ort-wasm.js                 |                                  |                                                                         |
+| ort-wasm.wasm               |                                  |                                                                         |
+| ort-wasm-threaded.js        |                                  | `--enable_wasm_threads`                                                 |
+| ort-wasm-threaded.wasm      |                                  | `--enable_wasm_threads`                                                 |
+| ort-wasm-threaded.worker.js |                                  | `--enable_wasm_threads`                                                 |
+| ort-wasm-simd.wasm          |                                  | `--enable_wasm_simd`                                                    |
+| ort-wasm-simd-threaded.wasm |                                  | `--enable_wasm_simd` `--enable_wasm_threads`                            |
+| ort-wasm-simd.js            | ort-wasm-simd.jsep.js            | `--use_jsep` `--use_webnn` `--enable_wasm_simd`                         |
+| ort-wasm-simd.wasm          | ort-wasm-simd.jsep.wasm          | `--use_jsep` `--use_webnn` `--enable_wasm_simd`                         |
+| ort-wasm-simd-threaded.js   | ort-wasm-simd-threaded.jsep.js   | `--use_jsep` `--use_webnn` `--enable_wasm_simd` `--enable_wasm_threads` |
+| ort-wasm-simd-threaded.wasm | ort-wasm-simd-threaded.jsep.wasm | `--use_jsep` `--use_webnn` `--enable_wasm_simd` `--enable_wasm_threads` |
+| ort-training-wasm-simd.wasm |                                  | `--enable_wasm_simd` `--enable_training_apis`                           |
+
+NOTE: WebGPU and WebNN are currently supported as experimental features in ONNX Runtime Web. The build instructions may change. Please refer to the latest documents in the [webgpu gist](https://gist.github.com/fs-eire/a55b2c7e10a6864b9602c279b8b75dce) and the [webnn gist](https://gist.github.com/Honry/88b87c43b3f51a6c38c10454f3599405) for detailed build and consumption instructions for ORT Web WebGPU and WebNN.
+
 
 ### Minimal Build Support
 
diff --git a/docs/get-started/with-javascript/web.md b/docs/get-started/with-javascript/web.md
index 6a8d38da3535..b4991ddc0a75 100644
--- a/docs/get-started/with-javascript/web.md
+++ b/docs/get-started/with-javascript/web.md
@@ -50,6 +50,17 @@ import * as ort from 'onnxruntime-web/webgpu';
 const ort = require('onnxruntime-web/webgpu');
 ```
 
+If you want to use ONNX Runtime Web with WebNN support (experimental feature), you need to import it as below:
+
+```js
+// use ES6 style import syntax (recommended)
+import * as ort from 'onnxruntime-web/experimental';
+```
+```js
+// or use CommonJS style import syntax
+const ort = require('onnxruntime-web/experimental');
+```
+
 For a complete table for importing, see [Conditional Importing](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/importing_onnxruntime-web#conditional-importing).
 
 ## Documentation
@@ -98,4 +109,4 @@ The following are video tutorials that use ONNX Runtime Web in web applications:
 - \[2]: WebGPU requires Chromium v113 or later on Windows. Float16 support requires Chrome v121 or later, and Edge v122 or later.
 - \[3]: WebGPU requires Chromium v121 or later on Windows.
 - \[4]: WebGL support is in maintenance mode. It is recommended to use WebGPU for better performance.
-- \[5]: Requires to launch browser with commandline flag `--enable-experimental-web-platform-features`.
\ No newline at end of file
+- \[5]: Requires launching the browser with the command-line flag `--enable-features=WebMachineLearningNeuralNetwork`.
\ No newline at end of file
diff --git a/docs/tutorials/web/index.md b/docs/tutorials/web/index.md
index 32b19c8dbf75..a1dcfded5a16 100644
--- a/docs/tutorials/web/index.md
+++ b/docs/tutorials/web/index.md
@@ -32,7 +32,7 @@ For more detail on the steps below, see the [build a web application](./build-we
 
 You can also use the onnxruntime-web package in the frontend of an electron app.
 
- With onnxruntime-web, you have the option to use `webgl` or `webgpu` for GPU processing, and WebAssembly (`wasm`, alias to `cpu`) for CPU processing. All ONNX operators are supported by WASM but only a subset are currently supported by WebGL and WebGPU.
+ With onnxruntime-web, you have the option to use `webgl`, `webgpu`, or `webnn` (with `deviceType` set to `gpu`) for GPU processing, and WebAssembly (`wasm`, alias to `cpu`) or `webnn` (with `deviceType` set to `cpu`) for CPU processing. All ONNX operators are supported by WASM, but only a subset is currently supported by WebGL, WebGPU, and WebNN.
 
 * Inference on server in JavaScript. Use the `onnxruntime-node` package.
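
Reviewer note (not part of the patch): the `webnn` execution provider and `deviceType` option introduced above can be exercised roughly as sketched below. The session-options shape follows the ORT Web execution-provider option style; `model.onnx` is a placeholder path, and the actual `InferenceSession.create` call is left commented out because it requires a browser with WebNN enabled and a real model file.

```javascript
// Sketch: selecting the WebNN execution provider in ONNX Runtime Web.
// Assumes `ort` was imported from 'onnxruntime-web/experimental' as shown in the patch.

// GPU processing via WebNN (deviceType: 'gpu')
const gpuOptions = {
  executionProviders: [
    { name: 'webnn', deviceType: 'gpu' },
  ],
};

// CPU processing via WebNN (deviceType: 'cpu')
const cpuOptions = {
  executionProviders: [
    { name: 'webnn', deviceType: 'cpu' },
  ],
};

// In a WebNN-enabled browser ('model.onnx' is a placeholder):
// const session = await ort.InferenceSession.create('model.onnx', gpuOptions);
// const results = await session.run(feeds);
```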