
API should be exposed to ServiceWorker #4197

Closed
MiguelsPizza opened this issue Jun 26, 2023 · 16 comments
Labels
feature request A request for a new GPU feature exposed in the API proposal
Comments

@MiguelsPizza commented Jun 26, 2023

This would be nice to have, and I can't think of any reason not to expose WebGPU to a service worker, considering WebGL is accessible there.

It's only exposed to the Window and DedicatedWorkers at the moment. Reference: https://gpuweb.github.io/gpuweb/#navigator-gpu
[Exposed=(Window, DedicatedWorker), SecureContext]
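
As context, a script can check at runtime whether WebGPU is exposed in its global scope. A minimal sketch (the `hasWebGPU` helper name is mine, not part of the spec):

```javascript
// Sketch: report whether WebGPU is exposed in the given global scope.
// In a page this would be `window`, in a worker `self`; the spec's
// [Exposed] list above decides where `navigator.gpu` actually exists.
function hasWebGPU(scope = globalThis) {
  return typeof scope.navigator !== 'undefined' && 'gpu' in scope.navigator;
}
```

At the time this issue was filed, this would return true in a window or dedicated worker and false in a service worker.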

A similar issue covered exposing WebGPU to a dedicated worker:

@kainino0x kainino0x added this to the Milestone 2 milestone Jun 26, 2023
@kainino0x kainino0x added feature request A request for a new GPU feature exposed in the API proposal labels Jun 26, 2023
@kainino0x
Contributor

Thanks for filing this, I didn't realize we didn't have an issue for it :)
I think we should do it. It seems WebGL just uses Worker. I don't know where to find an official listing of the Worker types, but Chromium documents them as DedicatedWorker, SharedWorker, and ServiceWorker:
https://chromium.googlesource.com/chromium/src/+/main/third_party/blink/renderer/bindings/IDLExtendedAttributes.md#exposed

So along with this, we should likely expose it in SharedWorker too.

@MiguelsPizza
Author

No problem, thanks for the prompt response!

Let me know if there is anything I can do to help get this to the finish line

@libofei2004

@MiguelsPizza Is there no way to use WebGPU in a WebView on Android?

tqchen pushed a commit to mlc-ai/web-llm that referenced this issue Aug 31, 2023
This PR updates the Chrome extension to run in two modes:
- Using the `ChatRestModule`, which uses a local/hosted server to run
the model.
- This requires the user to launch the server before using the Chrome
extension, which interacts with the server using the REST API.
- The `ChatRestModule` will be created in the background service worker,
so it should persist between interactions.
- Using the `ChatModule`, which runs the model using WebGPU.
- This does not require any additional user actions before using the
extension.
- The `ChatModule` cannot be created in the background service worker,
so it needs to be loaded from cache every time the user interacts with
the extension. This takes several seconds.

Some additional observations / limitations:
- As far as I can tell, Chrome extensions can only
[support](https://developer.chrome.com/docs/extensions/mv3/service_workers/)
background service workers, which don't yet support WebGPU, as mentioned in
gpuweb/gpuweb#4197. This means that the WebGPU
mode will only work from the main script, not the background worker.
- This approach does not work with Manifest v3 because of the following
error:
```
Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive: "script-src 'self'".
```
I suspect this is because we are evaluating some third-party code that is
not part of the extension. There seems to be a dependency on 'unsafe-eval' in
the Content Security Policy, but according to the Chrome extension
[documentation](https://developer.chrome.com/docs/extensions/mv3/manifest/content_security_policy/),
this is not allowed starting with Manifest v3. I managed to get it to work in
Manifest v2, but Manifest v2 will be deprecated this year.
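
For reference, a Manifest v3 manifest cannot re-enable 'unsafe-eval' for extension pages, but it can opt into 'wasm-unsafe-eval' for WebAssembly-based code paths. A sketch of the relevant manifest fragment (whether this resolves the error above depends on whether the eval is WebAssembly compilation or a JavaScript string):

```json
{
  "manifest_version": 3,
  "content_security_policy": {
    "extension_pages": "script-src 'self' 'wasm-unsafe-eval'; object-src 'self'"
  }
}
```
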
@Jasonsey

Jasonsey commented Sep 8, 2023

Hi there. As far as I know, TensorFlow.js recommends running AI models in background.js (i.e., in a service worker) for Chrome extensions, so I am really looking forward to this new feature.

@tqchen

tqchen commented Nov 29, 2023

I'd also love to see WebGPU exposed to service workers. This would open up quite a few opportunities for building AI extensions with WebGPU.

@jimblandy
Contributor

This seems appropriate.

The only reason not to spec this would be if, somehow, no browser ever planned to actually get around to implementing it, but that's not the case. Once Firefox has its resource consumption controls firmed up, this makes sense for us to implement.

@beaufortfrancois
Contributor

FYI I've started #4464 that adds ServiceWorker support to the WebGPU spec.

@kainino0x
Contributor

> I don't know how to find an official listing of the Worker types but Chromium documents it as DedicatedWorker, SharedWorker, and ServiceWorker:

MDN also documents it as those 3.
https://developer.mozilla.org/en-US/docs/Web/API/WorkerGlobalScope

@tqchen

tqchen commented Jan 27, 2024

Thank you @beaufortfrancois @kainino0x. It would be great to know the current support status; I'd love to follow up and try it out as soon as it lands.

@beaufortfrancois
Contributor

We'll let you know when it's available behind a flag in Chrome so that you can play around.

@beaufortfrancois
Contributor

@tqchen @MiguelsPizza You can now play with WebGPU in a service worker in Chrome Canary 123.0.6271.0 by enabling the "Experimental Web Platform features" flag at chrome://flags/#enable-experimental-web-platform-features.

Here's an extension sample I've created to demonstrate its usage: https://gist.github.com/beaufortfrancois/4795c20bc4d147e0400303d0b8ec02d6

Let me know how it goes ;)
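
For anyone trying the flag, here is a minimal sketch of adapter/device acquisition that works the same from a service worker as from a page (the `getGPUDevice` name is mine; the sample linked above is the authoritative reference):

```javascript
// Sketch: acquire a GPUDevice, degrading gracefully where navigator.gpu
// is not exposed (e.g. a service worker without the flag enabled).
async function getGPUDevice() {
  const nav = globalThis.navigator;
  if (!nav || !('gpu' in nav)) return null; // WebGPU not exposed here
  const adapter = await nav.gpu.requestAdapter();
  if (!adapter) return null; // no suitable GPU adapter
  return adapter.requestDevice();
}
```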

@beaufortfrancois
Contributor

@tqchen @MiguelsPizza Did you have a chance to experiment with Service Worker support in Chrome? Did you find some limitations or bugs we should be aware of?

@tqchen

tqchen commented Feb 13, 2024

See mlc-ai/web-llm#296; @rickzx should be able to comment more :) Really excited about what this can enable.

@rickzx

rickzx commented Feb 14, 2024

@beaufortfrancois @tqchen Thanks a lot for putting in the effort to support WebGPU in service workers. I was able to put up a sample Chrome extension that runs an LLM in the background service worker, with a popup window serving as a chat assistant. It's much simpler and more elegant to implement this now with service worker support. I haven't encountered any bugs so far.
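
The popup-to-service-worker wiring described above can be sketched with `chrome.runtime` messaging; the names below (`wireChatHandler`, `generateReply`) are mine, not from the actual extension:

```javascript
// Sketch: in the background service worker, route "chat" messages from
// the popup to an async LLM handler. Returning true from the listener
// keeps the message channel open until sendResponse is called.
function wireChatHandler(runtime, generateReply) {
  runtime.onMessage.addListener((msg, sender, sendResponse) => {
    if (!msg || msg.type !== 'chat') return false; // not for us
    generateReply(msg.prompt).then(sendResponse);
    return true; // async response pending
  });
}
// In the extension this would be called as, e.g.:
// wireChatHandler(chrome.runtime, prompt => engine.generate(prompt));
```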

@beaufortfrancois
Contributor

@tqchen @MiguelsPizza @rickzx I'm happy to report that Chrome now supports WebGPU in service workers by default as of Chrome 124.0.6359.0 (Canary at this moment) 🎉

CharlieFRuan pushed a commit to mlc-ai/web-llm that referenced this issue Mar 15, 2024
This PR updates the "Chrome Extension using WebGPU Service Worker" README.md
following gpuweb/gpuweb#4197 (comment)
@beaufortfrancois
Contributor

And to close the loop, here's the official announcement: https://developer.chrome.com/blog/new-in-webgpu-124?hl=en#service_workers_and_shared_workers_support


8 participants