Embedder API for gstreamer audio setup #24068
Can you be more specific about the needs of magicleap here? What sort of callbacks does the embedder require from the audio playback?
@jdm: When gstreamer creates an audio node, it needs the embedding app to create it and anchor it in 3D space. @xclasse's code does this by having an event that the embedder is meant to handle, but we could also do it by passing in a callback that creates the audio node.
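For concreteness, the callback-based alternative might look something like the sketch below. Everything here (`AudioNodeHandler`, `create_audio_node`, the position type, the `usize` handle) is invented for illustration and is not an existing Servo or magicleap API:

```rust
// Hypothetical sketch: the embedder hands Servo an object implementing
// this trait, and the media backend calls it whenever gstreamer needs
// a platform audio node. All names here are invented for illustration.
trait AudioNodeHandler: Send {
    /// Create a platform audio node anchored at the given position in
    /// 3D space, returning an opaque handle to it.
    fn create_audio_node(&self, position: [f32; 3]) -> usize;
}

struct LoggingHandler;

impl AudioNodeHandler for LoggingHandler {
    fn create_audio_node(&self, position: [f32; 3]) -> usize {
        println!("embedder asked to create a node at {:?}", position);
        42 // placeholder handle
    }
}

fn main() {
    // Servo would store the boxed handler and invoke it from the
    // media backend when gstreamer creates an audio node.
    let handler: Box<dyn AudioNodeHandler> = Box::new(LoggingHandler);
    let handle = handler.create_audio_node([0.0, 1.0, -2.0]);
    println!("handle = {}", handle);
}
```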
(The magicleap case is particularly complex, as the node apparently has to be created on the audio thread, which the application doesn't know about at startup.)
cc @xclaesse, whose github handle I got wrong.
So this would be the first time we need embedder-specified callbacks that execute in the media backend. Currently we smuggle embedder data to the gstreamer backend as usize values, since they're all pointers; that might work for function pointers as well, but it's also kind of terrifying.
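As background on why the usize smuggling happens to work: a pointer round-trips losslessly through `usize`, but the cast erases all type and lifetime information, which is what makes it scary. A minimal illustration (not Servo's actual code):

```rust
// Minimal illustration of "smuggling" a pointer through usize and back.
// This works for data pointers, but the usize carries no type or
// ownership information, so every cast back is an unchecked promise.
fn main() {
    let data = Box::new(1234u32);
    // Turn the Box into a raw pointer, then into a plain usize.
    let smuggled: usize = Box::into_raw(data) as usize;

    // ...the usize travels through an API that only accepts integers...

    // On the other side, cast back and reconstruct the Box. Nothing
    // verifies that `smuggled` really points at a live u32.
    let recovered = unsafe { Box::from_raw(smuggled as *mut u32) };
    assert_eq!(*recovered, 1234);
    println!("recovered {}", *recovered);
}
```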
Actually, a bunch of that smuggling might not be necessary any more - now that we initialize the GStreamer backend in particular in this code, we should be able to pass embedder callbacks there as well.
Yes, it looks like we might be able to get configuration data from the embedder to gstreamer without casting to/from usize.
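A rough sketch of what typed configuration passed at backend initialization could look like, instead of usize casts; the names (`AudioConfig`, `GStreamerBackend::init`) are made up for illustration and don't correspond to the actual Servo code:

```rust
// Sketch: pass typed configuration into backend initialization instead
// of casting pointers to and from usize. All names here are invented.
struct AudioConfig {
    sample_rate: u32,
    spatial_audio: bool,
}

struct GStreamerBackend {
    config: AudioConfig,
}

impl GStreamerBackend {
    // Because the embedder initializes the backend directly, it can
    // hand over a typed struct (or callbacks) with no casting.
    fn init(config: AudioConfig) -> Self {
        GStreamerBackend { config }
    }
}

fn main() {
    let backend = GStreamerBackend::init(AudioConfig {
        sample_rate: 48_000,
        spatial_audio: true,
    });
    println!("spatial audio enabled: {}", backend.config.spatial_audio);
}
```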
I think it's worth highlighting that the "audio rendering thread" and the "backend" are currently part of the same process, and that they will likely be separated in the future, in light of the requirements of implementing AudioWorklet. The ideal setup is for the backend to live in its own process, and for a rendering thread to run in the content process (with an audio worklet running there as well). Also, from what I understand it's the "audio rendering thread" that creates audio nodes (although I think the terminology differs between the spec and the implementation).

So I think this will require some coordination between script (the rendering thread), the backend (in its own process), and the embedder. And it might be a good idea to do this in a way that takes the potential future changes into account, meaning not assuming the backend and the rendering thread are in the same process. This seems mostly relevant when calling directly into lower-level gstreamer functions and using callbacks, since I assume that cannot be done across processes.

I think it would also be interesting to know whether the platform audio node can be created on one thread and then used on the audio thread. If not, I guess you'd have to run the code that creates it on the audio thread itself.
As the one who wrote the GStreamer audiosink that uses lumin::AudioNode, I can provide some details here.
@xclaesse Thanks! So that means we cannot pass a callback from the main thread to the audio backend upon initialization, since that callback would not be executing on the main thread. Instead, it sounds like we need to send a message from the audio backend (dealing internally with the gstreamer message bus) to the main thread (the "embedder", really), asking it to create the audio node.

For me the only question is whether the message sending the audio node back can cross a process boundary. If not, that means we know we have to run the media backend in the "main process", alongside the embedder...
On second thoughts, it probably doesn't make sense to send the audio node back to the backend, so the embedder can just store it. One thing we need to figure out is how to do the equivalent of hooking the node up to the gstreamer pipeline. Perhaps the backend could send a closure in the message requesting the creation of an audio node; the node would then be passed to the closure, which internally would hook it up to gstreamer. This does imply the audio backend runs in the same process as the main thread, I think.
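The closure-in-a-message idea could be sketched roughly as below, with `std::sync::mpsc` standing in for whatever channel the backend actually uses and all the type names invented. Since the boxed closure is `Send` but not serializable, this only works while the backend and the main thread share a process:

```rust
use std::sync::mpsc;
use std::thread;

// Invented stand-in for the embedder's platform audio node.
struct AudioNode {
    id: u32,
}

// The backend's request: "create a node, then hand it to this closure,
// which will hook the node up to gstreamer on the backend's behalf."
type HookUp = Box<dyn FnOnce(AudioNode) + Send>;

fn main() {
    let (tx, rx) = mpsc::channel::<HookUp>();

    // Backend thread: sends the creation request with the closure inside.
    let backend = thread::spawn(move || {
        tx.send(Box::new(|node: AudioNode| {
            // In the real design this would attach the node to the
            // gstreamer pipeline; here we just observe it.
            println!("hooking up node {}", node.id);
        }))
        .unwrap();
    });

    // Main thread (the embedder): creates the node and runs the closure.
    let hook_up = rx.recv().unwrap();
    hook_up(AudioNode { id: 7 });

    backend.join().unwrap();
}
```

A boxed `FnOnce` can cross threads freely, but not processes, which is why this pattern pins the media backend to the embedder's process.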
This came up in the context of magicleap, but it applies to any device which needs to be able to customize the audio back end.
Currently we pass in the (E)GLContext from the embedder to Servo, and then to gstreamer for use as the video target, but we don't have a similar setup for audio. This makes it tricky to support devices like magicleap's 3D audio, even though gstreamer can handle it, because Servo is sitting between gstreamer and the embedder.