This repository has been archived by the owner on Oct 7, 2021. It is now read-only.

interaction between audio elements and the Web Audio API #104

Closed
chrisguttandin opened this issue Nov 5, 2020 · 2 comments
@chrisguttandin

Describe the issue
There seems to be no consensus so far on what an audio element should do when its sound is piped into the Web Audio API. Firefox stops playing the sound from the audio element as soon as it is connected to the Web Audio API; Chrome, on the other hand, keeps playing the sound from the audio element. Until recently it was possible to manually mute the audio element in Chrome, which essentially reproduces what Firefox does automatically. The problem is that both Firefox and Chrome (since a few weeks ago) stop the audio element internally once it is muted, which means the workaround no longer works in Chrome.

Where Is It
I'm not sure if the behavior is specified anywhere yet. It could also belong in the spec which defines the media element, or the one for MediaStreams ...

Additional Information
I initially filed a bug for Chrome for the same issue.

@chrisguttandin chrisguttandin added the Needs WG Review New issues filed that needs reviewing label Nov 5, 2020

hoch commented Nov 5, 2020

Thanks for filing this, @chrisguttandin!

@rtoy rtoy removed the Needs WG Review New issues filed that needs reviewing label Nov 5, 2020
@chrisguttandin
Author

I learned today that the behavior should depend on how the audio element is connected to the Web Audio API.

context.createMediaElementSource(audioElement)

This should re-route the audio from the audio element into the Web Audio API, which means the audio element no longer produces any sound on its own. This is defined here: https://webaudio.github.io/web-audio-api/#mediaelementaudiosourcenode

The HTMLMediaElement MUST behave in an identical fashion after the MediaElementAudioSourceNode has been created, except that the rendered audio will no longer be heard directly, but instead will be heard as a consequence of the MediaElementAudioSourceNode being connected through the routing graph. Thus pausing, seeking, volume, src attribute changes, and other aspects of the HTMLMediaElement MUST behave as they normally would if not used with a MediaElementAudioSourceNode.
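In practice that routing looks roughly like the following (a minimal sketch; the `<audio>` element lookup and the gain value are placeholders, not part of the spec text above):

```javascript
// Minimal sketch: route an <audio> element through the Web Audio API.
// Once the MediaElementAudioSourceNode exists, the element's audio is
// heard only via the graph, no longer directly from the element.
const audioElement = document.querySelector('audio');
const context = new AudioContext();

const source = context.createMediaElementSource(audioElement);
const gain = context.createGain();
gain.gain.value = 0.5; // volume is now controlled inside the graph

source.connect(gain).connect(context.destination);
audioElement.play(); // pausing/seeking the element still works as usual
```

Per the quoted spec text, the element's own controls (pause, seek, `src` changes) keep working; only the audible output is taken over by the graph.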

context.createMediaStreamSource(audioElement.captureStream())

This should not affect the output of the audio element; it should be controllable independently. This is defined here: https://w3c.github.io/mediacapture-fromelement/#dom-htmlmediaelement-capturestream

Whether a media element is actively rendering content (e.g., to a screen or audio device) has no effect on the content of captured streams. Muting the audio on a media element does not cause the capture to produce silence, nor does hiding a media element cause captured video to stop. Similarly, the audio level or volume of the media element does not affect the volume of captured audio.
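A minimal sketch of the capture path, for contrast (the element lookup is a placeholder; note that Firefox still exposes this method as `mozCaptureStream()`):

```javascript
// Minimal sketch: capture the element's output as a MediaStream.
// The element keeps playing out on its own; muting it or changing its
// volume does not affect the captured stream.
const audioElement = document.querySelector('audio');
const context = new AudioContext();

const stream = audioElement.captureStream(); // Firefox: mozCaptureStream()
const source = context.createMediaStreamSource(stream);

source.connect(context.destination);
audioElement.muted = true; // the captured stream still carries audio
```

Here the element and the graph are two independent outputs, which is exactly the behavior the quoted text above guarantees.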
