This repository has been archived by the owner on Apr 18, 2023. It is now read-only.

How to Run Chromium builds with WebNN API

Christywl edited this page Sep 18, 2020 · 31 revisions

Supported Platform

| Platform | Version Note             | Platform | Version Note |
|----------|--------------------------|----------|--------------|
| Windows  | 10.xxxx/Insider Preview  | Linux    | Ubuntu 16.04 |
| macOS    | 10.13/10.14              | Android  | 8.1.0+       |

Note

Chromium command-line flags are required to enable specific WebNN backends when running the WebNN-API-enabled Chromium binaries.
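The per-platform tables below share a small set of flags; as a quick reference, the mapping can be sketched as a shell helper (the function name and the `DNNL-macOS` label are illustrative, not real Chromium switches):

```shell
#!/bin/sh
# backend_flag: print the Chromium switch needed for a given WebNN backend,
# per the backend-mapping tables on this page. Backends not listed here
# (the DNNL/clDNN defaults, BNNS, MPS, NNAPI) need no extra switch.
backend_flag() {
  case "$1" in
    DirectML)   printf '%s\n' "--use-dml" ;;
    IE-*)       printf '%s\n' "--use-inference-engine" ;;  # OpenVINO Inference Engine
    DNNL-macOS) printf '%s\n' "--use-dnnl" ;;              # DNNL is opt-in only on macOS
    *)          printf '%s\n' "" ;;
  esac
}

backend_flag DirectML   # prints --use-dml
backend_flag IE-MYRIAD  # prints --use-inference-engine
```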

How to Run

Windows

Backend Mapping

| Chromium Command Line | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|-----------------------|--------------------|-----------------|-----------|
| w/o                   | DNNL               | clDNN           |           |

Prerequisites

Install and update the graphics driver (Intel Graphics Driver) for clDNN

Steps

  1. Unzip the Chromium binary chrome-win32.zip for Windows

  2. Open cmd.exe and go to Chrome-bin folder

  3. Run Chromium:

    chrome.exe --no-sandbox

  4. Visit webml-polyfill/examples and select examples (e.g. Image Classification)

  5. Select a backend (e.g. SUSTAINED_SPEED, which maps to the clDNN backend), and then select a model (e.g. MobileNet v1)
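Steps 1–3 above can be sketched as one cmd session; the extraction path and folder layout are assumptions about the zip's contents:

```shell
:: cmd sketch: extract the build (tar can read zip archives on Windows 10),
:: then launch the WebNN-enabled Chromium without the sandbox.
tar -xf chrome-win32.zip
cd chrome-win32\Chrome-bin
chrome.exe --no-sandbox
```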

Windows with DirectML backend

Backend Mapping

| Chromium Command Line | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|-----------------------|--------------------|-----------------|-----------|
| --use-dml             |                    | DirectML        |           |

Prerequisites

Install Windows Insider Preview

Steps

Run Chromium:

chrome.exe --no-sandbox --use-dml

and select the SUSTAINED_SPEED backend

Windows with Inference Engine backends

Backend Mapping

| Chromium Command Line  | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | ULTRA_LOW_POWER |
|------------------------|--------------------|-----------------|-----------------|
| --use-inference-engine | IE-MKLDNN          | IE-clDNN        | IE-GNA          |

Prerequisites

Install the OpenVINO 2020.3 LTS version following the steps and set the environment variables.
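The environment variables are typically set by calling OpenVINO's `setupvars.bat` in the same cmd session used to launch Chromium; the path below is the default OpenVINO 2020.x install location and is an assumption, so adjust it if yours differs:

```shell
:: cmd sketch (default OpenVINO 2020.x install path is an assumption)
call "C:\Program Files (x86)\IntelSWTools\openvino\bin\setupvars.bat"
chrome.exe --no-sandbox --use-inference-engine
```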

Steps

Run Chromium:

chrome.exe --no-sandbox --use-inference-engine

and select FAST_SINGLE_ANSWER, SUSTAINED_SPEED, or ULTRA_LOW_POWER

Linux

Backend Mapping

| Chromium Command Line      | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|----------------------------|--------------------|-----------------|-----------|
| w/o --use-inference-engine | DNNL               | clDNN           |           |

Prerequisites

Please follow the How-to-Install-the-Graphics-Driver-for-OpenCL-on-Linux-to-Run-WebNN-clDNN-Backend guide to install the dependencies and the driver.

Steps

  1. Install the Chromium binary chromium-browser-unstable_xx.x.xxxx.x-x_amd64.deb for Linux Ubuntu:

    sudo dpkg -i chromium-browser-unstable_xx.x.xxxx.x-x_amd64.deb

  2. Run Chromium:

    /usr/bin/chromium-browser-unstable --no-sandbox

  3. Visit webml-polyfill/examples and select examples (e.g. Image Classification)

  4. Select a backend (e.g. SUSTAINED_SPEED, which maps to the clDNN backend), and then select a model (e.g. MobileNet v1)

Linux with Inference Engine backends

Backend Mapping

| Chromium Command Line  | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER | ULTRA_LOW_POWER |
|------------------------|--------------------|-----------------|-----------|-----------------|
| --use-inference-engine | IE-MKLDNN          | IE-clDNN        | IE-MYRIAD | IE-GNA          |

Prerequisites

Install the OpenVINO 2020.3 LTS version following the steps and set the environment variables.
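On Linux the equivalent is sourcing `setupvars.sh` before launching Chromium from the same shell; `/opt/intel/openvino` is the default 2020.x install prefix and may differ on your system:

```shell
# Source OpenVINO's environment script, then launch Chromium (paths assumed).
source /opt/intel/openvino/bin/setupvars.sh
/usr/bin/chromium-browser-unstable --no-sandbox --use-inference-engine
```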

Steps

Run Chromium:

/usr/bin/chromium-browser-unstable --no-sandbox --use-inference-engine

and select FAST_SINGLE_ANSWER, SUSTAINED_SPEED, LOW_POWER, or ULTRA_LOW_POWER

Note: IE-MYRIAD requires an Intel® Neural Compute Stick 2
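To check that the stick is visible to the system before launching Chromium, you can filter `lsusb` output for Intel Movidius's USB vendor ID (`03e7`); the exact product string varies with firmware state:

```shell
# List USB devices and keep only Movidius (vendor ID 03e7) entries.
lsusb | grep -i 03e7
```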

macOS

Backend Mapping

| Chromium Command Line | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|-----------------------|--------------------|-----------------|-----------|
| w/o --use-dnnl        | BNNS               | MPS             |           |

Steps

  1. Unzip the Chromium binary chromium-mac.zip for macOS

  2. Run Chromium:

    ./Chromium.app/Contents/MacOS/Chromium

  3. Visit webml-polyfill/examples and select examples (e.g. Image Classification)

  4. Select a backend (e.g. SUSTAINED_SPEED, which maps to the MPS backend), and then select a model (e.g. MobileNet v1)
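Steps 1–2 above can be sketched in Terminal; the folder name inside the archive is an assumption:

```shell
# Unzip the build and launch the bundled Chromium app binary directly.
unzip chromium-mac.zip
cd chromium-mac
./Chromium.app/Contents/MacOS/Chromium
```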

macOS with DNNL backend

Backend Mapping

| Chromium Command Line | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|-----------------------|--------------------|-----------------|-----------|
| --use-dnnl            | DNNL               |                 |           |

Steps

Run Chromium:

./Chromium.app/Contents/MacOS/Chromium --no-sandbox --use-dnnl

and select the FAST_SINGLE_ANSWER backend

Android

Backend Mapping

| Chromium Command Line | FAST_SINGLE_ANSWER | SUSTAINED_SPEED | LOW_POWER |
|-----------------------|--------------------|-----------------|-----------|
| w/o command line      | NNAPI              | NNAPI           | NNAPI     |

Steps

  1. Install the Chromium binary ChromePublic.apk for Android:

    adb install ChromePublic.apk

  2. Open Chromium app

  3. Visit webml-polyfill/examples and select examples (e.g. Image Classification)

  4. Select a backend (e.g. SUSTAINED_SPEED, which maps to the NNAPI backend), and then select a model (e.g. MobileNet v1)
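Steps 1–2 above can be sketched with adb; the package and activity names below are the usual ones for a Chromium `ChromePublic` build and are an assumption for this particular binary:

```shell
# Install the APK, then launch the Chromium app from the shell.
adb install ChromePublic.apk
adb shell am start -n org.chromium.chrome/com.google.android.apps.chrome.Main
```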