(HTMLMediaElementSync): HTMLMediaElement synchronisation #78
Comments
I think that addressing the context-related questions in #257 would help a great deal with this. If media element output belonged to an AudioContext whose time could be consulted and correlated with other AudioContexts, perhaps this problem would go away.
I'm confused about what synchronisation is being looked for here. Once you've obtained a MediaElementAudioSourceNode, you can filter it however you like. The Web Audio output is "synchronised" in the sense that it's running in realtime, so whatever the media element is outputting will be filtered. The media element implementation would be responsible for synchronisation. If one presupposes the layering in #257, this seems like a feature request for the implementation of /
TPAC (cwilso): Closing since it's the equivalent of asking MediaElement to expose something other than what it does AND resolve #257
I had a new issue mostly written when I found this one. This seems necessary to me, and I don't see how #257 totally covers it.

To respond to @cwilso: filtering isn't the issue. You should be able to have audio events in a Web Audio context that are precisely synchronized with known times in a media element's playback. To do this, you'd need to know how the media element's clock corresponds to the audio context's. Presumably there's a constant offset between the two, assuming the media element is playing and its playbackRate is 1.0. Assuming the media element is playing through a MediaElementAudioSourceNode, this could be accomplished by an additional read-only attribute on the node. Though maybe it would be nice to allow synchronization when that's not the case. Maybe media elements should have an equivalent of the currentPlaybackTime attribute added in #12 (which would fall outside the scope of the Web Audio spec, I guess).

So I'm requesting that this issue be reopened, or else an explanation of exactly how this functionality would be accomplished under #257 or other issues that are still open.
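The clock correlation described above can be sketched as plain arithmetic. This is a sketch only: the helper names are hypothetical (not part of any spec), and the constant-offset assumption holds only while the element plays uninterrupted at playbackRate 1.0.

```javascript
// Sketch: assumes the media element and AudioContext clocks differ by a
// constant offset while the element plays uninterrupted at playbackRate 1.0.
// Helper names are hypothetical, not part of any spec.

// Sample both clocks at (nearly) the same instant, while playing:
function estimateOffset(contextTime, mediaTime) {
  // offset such that contextTime == mediaTime + offset
  return contextTime - mediaTime;
}

// Map a media-timeline position to an AudioContext time for scheduling:
function mediaTimeToContextTime(mediaTime, offset) {
  return mediaTime + offset;
}

// Browser usage (illustrative): re-sample after any seek or stall, e.g. in
// 'seeked'/'playing'/'ratechange' handlers, since the offset shifts there:
//   const offset = estimateOffset(audioContext.currentTime,
//                                 mediaElement.currentTime);
//   oscillator.start(mediaTimeToContextTime(95.0, offset)); // 95 s into the media
```

The pure helpers keep the arithmetic explicit; in a real page the sampling step is the fragile part, since the two `currentTime` reads cannot happen at exactly the same instant.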
@adelespinasse The original report said "there is no way to filter that audio". In short, no, this isn't always that simple; as I said previously, sure, you can naively assume that the media element's clock will correspond with the audio context's. (If they're outputting to the same hardware device, that will likely be true.) However, synchronization is hard when media elements may be playing progressively; if a media element pauses to buffer, for example, they will no longer be synced. This simply is NOT a trivial problem.
I didn't mean to imply that it was trivial! It does seem like, for some limited cases that would be very useful, it is simpler than answering the architectural issues raised in #257. For example, if a media element is connected to the audio graph through a MediaElementAudioSourceNode, then their clocks would have to be synchronized, unless something really weird is going on. Yes, starting/stopping/seeking (including from a buffer underflow) would cause a discontinuity. That just means that the application would have to cancel and reschedule any events it had scheduled. It's already possible to register an event handler for those occasions.

Here's one way to do it: MediaElementAudioSourceNode would have one additional read-only numeric attribute, timeOffset. When the media element is playing, the following would hold:

mediaTime + timeOffset == audioContext.currentTime
where mediaTime is not quite the same as mediaElement.currentTime; it's the time within the media element's stream that will play back at audioContext.currentTime. (Actually this might be what mediaElement.currentTime is; I'm not sure I've seen a clear definition. I doubt it, though.) So the value of timeOffset changes when the media starts playing, or seeks to a different currentTime, or when its playbackRate changes. During uninterrupted playback, it would not change. While not playing, it could be undefined, or unchanged from its last value while playing; I don't think it matters. There could probably be a similar mechanism for MediaStreamAudioSourceNode, except that I think MediaStreams don't have a concept of a "current time".

So, to me, there seem to be two possible objections to something like this: (1) it's additional API surface and implementation work for one feature, and (2) a unified API covering media element and audio context timing (per #257) would make it unnecessary.
(1) seems like a pretty weak argument unless this is seen as a very unimportant feature. I think (2) could be a strong argument, but it depends on when this alleged unified API will occur. My general impression is that it will probably be quite a while. Is that wrong? Is progress being made on other fronts that will make this feature unnecessary?
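For concreteness, here is how an application might use the proposed attribute. Everything here is hypothetical: `timeOffset` does not exist in the Web Audio API, and the sketch assumes the sign convention that mediaTime + timeOffset equals audioContext.currentTime while the element is playing.

```javascript
// Hypothetical API: `timeOffset` as proposed above does NOT exist in the
// Web Audio API. This sketch assumes the convention
//   mediaTime + timeOffset == audioContext.currentTime   (while playing)

// Convert a media-timeline position into an AudioContext scheduling time:
function contextTimeFor(mediaTime, timeOffset) {
  return mediaTime + timeOffset;
}

// Browser usage (illustrative): cancel and reschedule on discontinuities,
// as the comment above suggests, using the element's existing events:
//   const source = audioContext.createMediaElementSource(mediaElement);
//   function reschedule() {
//     // ...cancel previously scheduled events here, then:
//     const when = contextTimeFor(95.0, source.timeOffset); // hypothetical attr
//     duckGain.gain.setValueAtTime(0.25, when);
//   }
//   mediaElement.addEventListener('seeked', reschedule);
//   mediaElement.addEventListener('playing', reschedule);
```

Rescheduling from 'seeked'/'playing' handlers matches the point above that discontinuities just force the application to cancel and re-issue its scheduled events.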
Audio-ISSUE-56 (HTMLMediaElementSync): HTMLMediaElement synchronisation [Web Audio API]
http://www.w3.org/2011/audio/track/issues/56
Raised by: Philip Jägenstedt
On product: Web Audio API
It appears as though once audio data has left HTMLMediaElement (via MediaElementAudioSourceNode) there is no way to filter that audio and play it back in sync with other audio or video streams.
Since the timestamps are not propagated, it does not appear possible to add effects at a particular point in the media resource timeline. For example, audio description (voice synthesis of extra text cues for the visually impaired) requires ducking the main audio and mixing in additional audio at a specific time.
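The ducking use case can be sketched as a gain envelope scheduled at AudioContext times. This is an illustrative sketch, not spec text: the helper name, levels, and ramp lengths are made up, and mapping a cue's media-timeline time to a context time is assumed solved separately (which is exactly the synchronisation gap this issue describes).

```javascript
// Sketch of the audio-description ducking described above. `gainParam` is
// anything with the AudioParam automation methods (in a browser, a
// GainNode's `gain`). `startTime`/`duration` are in AudioContext seconds;
// levels and ramp lengths are illustrative defaults.
function scheduleDucking(gainParam, startTime, duration,
                         duckLevel = 0.25, ramp = 0.05) {
  gainParam.setValueAtTime(1.0, startTime - ramp);           // full level until ramp begins
  gainParam.linearRampToValueAtTime(duckLevel, startTime);   // ducked by cue start
  gainParam.setValueAtTime(duckLevel, startTime + duration); // hold while description plays
  gainParam.linearRampToValueAtTime(1.0, startTime + duration + ramp); // restore
}

// Browser usage (illustrative):
//   const ctx = new AudioContext();
//   const src = ctx.createMediaElementSource(video);
//   const mainGain = ctx.createGain();
//   src.connect(mainGain).connect(ctx.destination);
//   scheduleDucking(mainGain.gain, cueContextTime, cueDuration);
```

Passing the AudioParam-like object in, rather than a whole graph, keeps the envelope logic testable outside a browser; the hard part remains computing `cueContextTime` without propagated timestamps.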