Audio Worklet support #81
Hi, just found this, really cool! What architecture do you prefer for this? What I'd recommend is to run all the synths in a Web Worker, and to communicate via a wait-free ring buffer with a very simple `AudioWorkletProcessor`. This way, it works like your regular media file player (one thread decodes the audio, one thread plays it back), almost immune to glitches (except if the device is completely overloaded, of course). If this is a design that would work for you, I've written some material to help:
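A minimal sketch of such a wait-free single-producer/single-consumer ring buffer over a `SharedArrayBuffer` (the names, layout, and capacity here are illustrative, not from any particular library):

```javascript
// Wait-free SPSC ring buffer for Float32 samples over a SharedArrayBuffer.
// One thread (the worker) only writes; the other (the worklet) only reads.
// The read/write indices live in an Int32Array so both sides can use Atomics.
const INDEX_BYTES = 2 * Int32Array.BYTES_PER_ELEMENT; // [readIdx, writeIdx]

function createRingBuffer(capacity) {
  return new SharedArrayBuffer(
    INDEX_BYTES + capacity * Float32Array.BYTES_PER_ELEMENT
  );
}

function views(sab) {
  return {
    idx: new Int32Array(sab, 0, 2),
    data: new Float32Array(sab, INDEX_BYTES),
  };
}

// Producer side: copies as many samples as fit, returns the number written.
function push(sab, samples) {
  const { idx, data } = views(sab);
  const cap = data.length;
  const r = Atomics.load(idx, 0);
  const w = Atomics.load(idx, 1);
  // Keep one slot empty so "full" is distinguishable from "empty".
  const free = (r - w - 1 + cap) % cap;
  const n = Math.min(free, samples.length);
  for (let i = 0; i < n; i++) data[(w + i) % cap] = samples[i];
  Atomics.store(idx, 1, (w + n) % cap);
  return n;
}

// Consumer side: fills `out` with available samples, returns the number read.
function pop(sab, out) {
  const { idx, data } = views(sab);
  const cap = data.length;
  const r = Atomics.load(idx, 0);
  const w = Atomics.load(idx, 1);
  const avail = (w - r + cap) % cap;
  const n = Math.min(avail, out.length);
  for (let i = 0; i < n; i++) out[i] = data[(r + i) % cap];
  Atomics.store(idx, 0, (r + n) % cap);
  return n;
}
```

Neither side ever blocks or allocates, which is what makes it safe to call `pop` from the real-time audio callback.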
The only requirements for this to work are to serve the website with two headers:
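Presumably these are the two cross-origin isolation headers that `SharedArrayBuffer` requires:

```
Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp
```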
so that the page is put in an isolated process in the web browser. This MDN link explains why this is unfortunately necessary. It would also certainly be possible to keep the existing code and to use an `AudioWorkletNode`. Finally, I'd like to provide some context about the meaning of "deprecation" in the context of the Web Platform: https://lists.w3.org/Archives/Public/public-audio/2023JanMar/0003.html (tl;dr: it's not going to be removed, no rush).
Hi @padenot, thanks for sharing all of this! I appreciate your insights. More than a year ago, I attempted to use Audio Worklets and I think my approach was wrong. As I recall, it felt like I was stacking up weird hacks, and I wrote at the time:
Ah okay; ring buffer and shared array buffers are needed, but I still have many questions. To pick one example: the Chip Player wasm binary relies on the Emscripten virtual file system, backed by IndexedDB. For example, some MDX music files use PDX audio sample files in the same folder. The MDX C library uses file I/O to read the PDX file. (I preload the PDX into the virtual file system with a network fetch.) How do we do Audio Worklets (or Web Workers) in this case, where the code writing audio samples also needs the IndexedDB API? In my worklet branch, I stubbed all the Emscripten filesystem code to use MEMFS instead of IDBFS. But MEMFS is no good because it does not persist across sessions. If these questions reveal a misunderstanding on my part, please do share.
Web Workers can use IndexedDB and make network requests normally, so this shouldn't be a problem. I think it's not a misunderstanding on your part; it's probably a lack of good documentation of the various moving parts here. In this model, the sound generation happens in the worker; the UI thread is only concerned with rendering the UI, the visualization, user interaction of course, and orchestrating all of this (e.g., start loading a tune and start playback when an entry in the browser is clicked). If we describe a standard scenario of opening the web app and playing a tune, it would go like this (I tried to explain as many details as possible, maybe there are trivial things in there):
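A rough sketch of the wiring for that scenario (everything here is illustrative: the file names, the `'player-processor'` name, and the `createRingBuffer`/`push` helpers are hypothetical, not an existing API):

```javascript
// --- main thread (sketch; names are illustrative) ---
const ctx = new AudioContext();
await ctx.audioWorklet.addModule('player-processor.js');
const sab = createRingBuffer(16384);           // shared ring buffer (hypothetical helper)
const node = new AudioWorkletNode(ctx, 'player-processor', {
  processorOptions: { sab },                   // hand the shared buffer to the worklet
});
node.connect(ctx.destination);

const worker = new Worker('synth-worker.js');
worker.postMessage({ type: 'play', url: '/music/song.mdx', sab });

// --- synth-worker.js (sketch) ---
self.onmessage = async ({ data }) => {
  if (data.type !== 'play') return;
  // Workers can fetch and use IndexedDB (or Emscripten's IDBFS) normally:
  const bytes = new Uint8Array(await (await fetch(data.url)).arrayBuffer());
  // ...hand `bytes` to the wasm decoder, then repeatedly render a block of
  // samples and push it into the ring buffer, staying ahead of the audio clock:
  // push(data.sab, renderNextBlock());
};
```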
An alternative approach, which looks like what you've tried, is to do the sound generation within the `AudioWorkletProcessor` directly. Here, because we're using a piece of code that already does everything (I/O, sound synthesis, etc., intermixed), we need to resort to running the code normally in a worker, and then playing the audio out -- but we can move everything off the main thread to make the app very robust against load. The same architecture is used when running e.g. emulators on the web, or other pieces of code where the separation between real-time digital signal processing and everything else is not clear, maybe because back in the day it was all single-threaded in one big run loop. In short, three pieces: the UI and orchestration on the main thread, the synthesis code in a Web Worker filling the ring buffer, and a tiny `AudioWorkletProcessor` that only drains it.
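On the consumer side, the worklet really can stay tiny. A sketch, with the drain step factored out as a plain function (the ring-buffer layout here is an assumption for illustration: a two-slot `Int32Array` index header followed by `Float32Array` sample data):

```javascript
const INDEX_BYTES = 2 * Int32Array.BYTES_PER_ELEMENT; // [readIdx, writeIdx]

// Copy up to out.length samples from the shared ring buffer into `out`.
// Missing samples are left as silence (zeros). Returns the samples copied.
function drainInto(out, sab) {
  const idx = new Int32Array(sab, 0, 2);
  const data = new Float32Array(sab, INDEX_BYTES);
  const cap = data.length;
  const r = Atomics.load(idx, 0);
  const w = Atomics.load(idx, 1);
  const n = Math.min((w - r + cap) % cap, out.length);
  for (let i = 0; i < n; i++) out[i] = data[(r + i) % cap];
  for (let i = n; i < out.length; i++) out[i] = 0; // underrun -> silence
  Atomics.store(idx, 0, (r + n) % cap);
  return n;
}

// Browser-only wiring (runs in the AudioWorkletGlobalScope):
// class PlayerProcessor extends AudioWorkletProcessor {
//   constructor(options) { super(); this.sab = options.processorOptions.sab; }
//   process(inputs, outputs) { drainInto(outputs[0][0], this.sab); return true; }
// }
// registerProcessor('player-processor', PlayerProcessor);
```

Because `process()` only copies floats and touches two atomic indices, a stall in the worker produces a moment of silence rather than a crash or a blocked audio thread.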
@padenot I just wanted to say thanks again for the writeup, and I haven't forgotten about this.
It would be really nice to use Audio Worklets.
ScriptProcessorNode renders audio on the UI thread and glitches during scrolling, window resize, etc.
This is really not acceptable for a music player, and the ScriptProcessorNode deprecation warning has shown up in the Chrome console for a long time now.
Might solve some of the glitch reports too.
It's widely supported: https://caniuse.com/mdn-api_audioworklet
https://developers.google.com/web/updates/2018/06/audio-worklet-design-pattern