Link event handlers IDL attributes to event types
Changes in WebAudio#2498 introduce a weird phrasing for "fire an event", e.g.
"Fire an event to onstatechange EventHandler". The correct phrasing is rather
"Fire an event named statechange".

This made me realize that the spec never associates the event handler IDL
attributes that it defines with the actual event handler event type that gets
fired. The spec also seems to assume that there will be at most one event
handler (e.g. when it says that "This AudioBuffer is only valid while in the
scope of the onaudioprocess function"), whereas `onxxx` is just one way to
listen to `xxx` events; `addEventListener` may also be used.
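The point about `onxxx` being only one of two ways to listen can be sketched in a few lines. This uses a hypothetical `FakeContext` built on a plain `EventTarget` (since `AudioContext` only exists in browsers); the getter/setter pair merely emulates the spirit of an event handler IDL attribute, it is not the HTML-defined machinery:

```javascript
// Hypothetical stand-in for AudioContext: an "onstatechange" IDL-style
// attribute and addEventListener both observe the same "statechange"
// event type.
class FakeContext extends EventTarget {
  #onstatechange = null;
  get onstatechange() { return this.#onstatechange; }
  set onstatechange(handler) {
    // The IDL attribute holds at most one handler at a time.
    if (this.#onstatechange) {
      this.removeEventListener("statechange", this.#onstatechange);
    }
    this.#onstatechange = handler;
    if (handler) this.addEventListener("statechange", handler);
  }
}

const ctx = new FakeContext();
const seen = [];
ctx.onstatechange = () => seen.push("via onstatechange");
ctx.addEventListener("statechange", () => seen.push("via addEventListener"));
ctx.dispatchEvent(new Event("statechange"));
// Both listeners observe the single dispatched "statechange" event.
```

This is why the spec should define the event type (`statechange`) rather than talk only about the `onstatechange` attribute: listeners registered via `addEventListener` never touch the attribute at all.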

Specs typically make the association between event handler IDL attributes and
event handler event types explicit, e.g. as done in HTML through tables such as:
https://html.spec.whatwg.org/multipage/webappapis.html#event-handlers-on-elements,-document-objects,-and-window-objects

... or in WebRTC through text in the definition of the IDL attribute:
https://w3c.github.io/webrtc-pc/#ref-for-event-datachannel-bufferedamountlow-1

This pull request adds definitions of event types next to the definitions of
the `onxxx` IDL attributes with which they are associated and uses references
to these event types whenever the spec fires an event or mentions event
handlers.

I tried to keep the changes minimal. I was going to argue that these changes are
editorial in nature as they merely clarify something that did not create
ambiguities for implementations. Now, you seem to track all changes with
"proposed corrections", regardless of whether they're editorial or substantive,
so I suspect you cannot accept these changes as-is...

At a minimum, I think that the occurrences of "fire an event to onxxx" should be
fixed (they only appear in proposed corrections so that seems doable without
creating additional proposed corrections). I can prepare a separate PR to that
effect. I can also look into creating appropriate "proposed corrections"
structures for the other changes if they seem useful. That may not be worth the
hassle.
tidoust committed Oct 5, 2022
1 parent cc718a3 commit d8dfc30
1 changed file: index.bs (52 additions, 56 deletions)
@@ -930,17 +930,18 @@ Attributes</h4>

: <dfn>onstatechange</dfn>
::
A property used to set the <code>EventHandler</code> for an
A property used to set an [=event handler=] for an
event that is dispatched to
{{BaseAudioContext}} when the state of the
AudioContext has changed (i.e. when the corresponding promise
would have resolved). An event of type
{{Event}} will be dispatched to the event
would have resolved). The event type of this event handler is
<dfn event>statechange</dfn>. An event that uses the
{{Event}} interface will be dispatched to the event
handler, which can query the AudioContext's state directly. A
newly-created AudioContext will always begin in the
<code>suspended</code> state, and a state change event will be
fired whenever the state changes to a different state. This
event is fired before the {{oncomplete}} event
event is fired before the {{complete}} event
is fired.

: <dfn>sampleRate</dfn>
@@ -1185,7 +1186,7 @@ Methods</h4>
<span class="synchronous">In this case an {{IndexSizeError}} MUST be thrown.</span>

<pre class=argumentdef for="BaseAudioContext/createScriptProcessor(bufferSize, numberOfInputChannels, numberOfOutputChannels)">
bufferSize: The {{ScriptProcessorNode/bufferSize}} parameter determines the buffer size in units of sample-frames. If it's not passed in, or if the value is 0, then the implementation will choose the best buffer size for the given environment, which will be constant power of 2 throughout the lifetime of the node. Otherwise if the author explicitly specifies the bufferSize, it <em class="rfc2119" title="MUST">MUST</em> be one of the following values: 256, 512, 1024, 2048, 4096, 8192, 16384. This value controls how frequently the {{ScriptProcessorNode/onaudioprocess}} event is dispatched and how many sample-frames need to be processed each call. Lower values for {{ScriptProcessorNode/bufferSize}} will result in a lower (better) <a href="#latency">latency</a>. Higher values will be necessary to avoid audio breakup and <a href="#audio-glitching">glitches</a>. It is recommended for authors to not specify this buffer size and allow the implementation to pick a good buffer size to balance between <a href="#latency">latency</a> and audio quality. If the value of this parameter is not one of the allowed power-of-2 values listed above, <span class="synchronous">an {{IndexSizeError}} <em class="rfc2119" title="MUST">MUST</em> be thrown</span>.
bufferSize: The {{ScriptProcessorNode/bufferSize}} parameter determines the buffer size in units of sample-frames. If it's not passed in, or if the value is 0, then the implementation will choose the best buffer size for the given environment, which will be constant power of 2 throughout the lifetime of the node. Otherwise if the author explicitly specifies the bufferSize, it <em class="rfc2119" title="MUST">MUST</em> be one of the following values: 256, 512, 1024, 2048, 4096, 8192, 16384. This value controls how frequently the {{ScriptProcessorNode/audioprocess}} event is dispatched and how many sample-frames need to be processed each call. Lower values for {{ScriptProcessorNode/bufferSize}} will result in a lower (better) <a href="#latency">latency</a>. Higher values will be necessary to avoid audio breakup and <a href="#audio-glitching">glitches</a>. It is recommended for authors to not specify this buffer size and allow the implementation to pick a good buffer size to balance between <a href="#latency">latency</a> and audio quality. If the value of this parameter is not one of the allowed power-of-2 values listed above, <span class="synchronous">an {{IndexSizeError}} <em class="rfc2119" title="MUST">MUST</em> be thrown</span>.
numberOfInputChannels: This parameter determines the number of channels for this node's input. The default value is 2. Values of up to 32 must be supported. <span class="synchronous">A {{NotSupportedError}} must be thrown if the number of channels is not supported.</span>
numberOfOutputChannels: This parameter determines the number of channels for this node's output. The default value is 2. Values of up to 32 must be supported. <span class="synchronous">A {{NotSupportedError}} must be thrown if the number of channels is not supported.</span>
</pre>
@@ -1787,7 +1788,7 @@ Constructors</h4>

1. <a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to <a spec="dom" lt="fire an event">fire an event
</a> named `statechange` at the {{AudioContext}}.
</a> named {{BaseAudioContext/statechange}} at the {{AudioContext}}.
</del>
<ins cite=#2400>
1. Attempt to <a href="#acquiring">acquire system resources</a> to use a
@@ -1808,9 +1809,8 @@ Constructors</h4>
1. Set the {{BaseAudioContext/state}} attribute of the {{AudioContext}}
to "{{AudioContextState/running}}".

1. <a spec="dom" lt="fire an event">Fire an event</a> to
{{BaseAudioContext/onstatechange}} {{EventHandler}} at the
{{AudioContext}}.
1. <a spec="dom" lt="fire an event">Fire an event</a> named
{{BaseAudioContext/statechange}} at the {{AudioContext}}.
</ins>
</div>
Note: It is unfortunately not possible to programatically notify authors
@@ -1919,11 +1919,12 @@ Attributes</h4>

: <dfn>onsinkchange</dfn>
::
An {{EventHandler}} for {{AudioContext/setSinkId()}}. This event will
be dispatched when changing the output device is completed.
An [=event handler=] for {{AudioContext/setSinkId()}}. The event type of
this event handler is <dfn event>sinkchange</dfn>. This event will be
dispatched when changing the output device is completed.

NOTE: This is not dispatched for the initial device selection in the
construction of {{AudioContext}}. The {{BaseAudioContext/onstatechange}}
construction of {{AudioContext}}. The {{BaseAudioContext/statechange}} event
is available to check the readiness of the initial output device.

</ins>
@@ -1989,7 +1990,7 @@ Methods</h4>

1. <a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to <a spec="dom" lt="fire an event">fire
an event</a> named `statechange` at the {{AudioContext}}.
an event</a> named {{BaseAudioContext/statechange}} at the {{AudioContext}}.
</div>

When an {{AudioContext}} is closed, any
@@ -2196,7 +2197,7 @@ Methods</h4>

1. <a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to <a spec="dom" lt="fire an event">fire
an event </a> named `statechange` at the {{AudioContext}}.
an event </a> named {{BaseAudioContext/statechange}} at the {{AudioContext}}.
</div>

<div>
@@ -2267,7 +2268,7 @@ Methods</h4>

1. <a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to <a spec="dom" lt="fire an event">fire
an event </a> named `statechange` at the {{AudioContext}}.
an event </a> named {{BaseAudioContext/statechange}} at the {{AudioContext}}.
</div>

While an {{AudioContext}} is suspended,
@@ -2360,9 +2361,8 @@ Methods</h4>
1. Set the {{BaseAudioContext/state}} attribute of the
{{AudioContext}} to "{{AudioContextState/suspended}}".

1. <a spec="dom" lt="fire an event">Fire an event</a> to
{{BaseAudioContext/onstatechange}} {{EventHandler}} at the
associated {{AudioContext}}.
1. <a spec="dom" lt="fire an event">Fire an event</a> named
{{BaseAudioContext/statechange}} at the associated {{AudioContext}}.

1. Attempt to <a href="#acquiring">acquire system resources</a> to use
a following audio output device based on {{AudioContext/[[sink ID]]}}
@@ -2381,9 +2381,8 @@ Methods</h4>

1. Resolve |p|.

1. <a spec="dom" lt="fire an event">Fire an event</a> to
{{AudioContext/onsinkchange}} {{EventHandler}} at the associated
{{AudioContext}}.
1. <a spec="dom" lt="fire an event">Fire an event</a> named
{{sinkchange}} at the associated {{AudioContext}}.

1. If |wasRunning| is true:

@@ -2399,9 +2398,8 @@ Methods</h4>
1. Set the {{BaseAudioContext/state}} attribute of the
{{AudioContext}} to "{{AudioContextState/running}}".

1. <a spec="dom" lt="fire an event">Fire an event</a> to
{{BaseAudioContext/onstatechange}} {{EventHandler}} at the
associated {{AudioContext}}.
1. <a spec="dom" lt="fire an event">Fire an event</a> named
{{BaseAudioContext/statechange}} at the associated {{AudioContext}}.
</div>

</ins>
@@ -2605,7 +2603,9 @@ Dictionary {{AudioTimestamp}} Members</h5>
<dl dfn-type=attribute dfn-for="AudioRenderCapacity">
: <dfn>onupdate</dfn>
::
An EventHandler for AudioRenderCapacityEvent.
The event type of this event handler is <dfn event>update</dfn>. Events
dispatched to the event handler will use the
{{AudioRenderCapacityEvent}} interface.
</dl>

<h5 id="AudioRenderCapacity-methods">
@@ -2615,14 +2615,15 @@ Dictionary {{AudioTimestamp}} Members</h5>
: <dfn>start(options)</dfn>
::
Starts metric collection and analysis. This will repeatedly
dispatch an {{AudioRenderCapacityEvent}} to
{{AudioRenderCapacity/onupdate}} EventHandler with the given
update interval in {{AudioRenderCapacityOptions}}.
<a spec="dom" lt="fire an event">fire an event</a> named
{{AudioRenderCapacity/update}} at {{AudioRenderCapacity}}, using
{{AudioRenderCapacityEvent}}, with the given update interval in
{{AudioRenderCapacityOptions}}.

: <dfn>stop()</dfn>
::
Stops metric collection and analysis. It also stops dispatching
{{AudioRenderCapacityEvent}}.
{{AudioRenderCapacity/update}} events.
</dl>

<h4 dictionary lt="audiorenderCapacityoptions" id="AudioRenderCapacityOptions">
@@ -2861,8 +2862,9 @@ Attributes</h4>

: <dfn>oncomplete</dfn>
::
An EventHandler of type <a href="#OfflineAudioCompletionEvent">OfflineAudioCompletionEvent</a>.
It is the last event fired on an {{OfflineAudioContext}}.
The event type of this event handler is <dfn event>complete</dfn>. The event
dispatched to the event handler will use the {{OfflineAudioCompletionEvent}}
interface. It is the last event fired on an {{OfflineAudioContext}}.
</dl>

<h4 id="OfflineAudioContext-methods">
@@ -2946,7 +2948,7 @@ Methods</h4>
<li><a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to
<a spec="dom" lt="fire an event">fire an event</a> named
`complete` using an instance of {{OfflineAudioCompletionEvent}}
{{OfflineAudioContext/complete}} using an instance of {{OfflineAudioCompletionEvent}}
whose `renderedBuffer` property is set to
{{[[rendered buffer]]}}.

@@ -3019,7 +3021,7 @@ Methods</h4>

1. <a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task">
queue a media element task</a> to <a spec="dom" lt="fire an event">fire
an event</a> named `statechange` at the {{OfflineAudioContext}}.
an event</a> named {{BaseAudioContext/statechange}} at the {{OfflineAudioContext}}.

</div>

@@ -4960,18 +4962,14 @@ Attributes</h4>
<dl dfn-type=attribute dfn-for="AudioScheduledSourceNode">
: <dfn>onended</dfn>
::
A property used to set the <code>EventHandler</code> (described
in <cite><a href="https://html.spec.whatwg.org/multipage/webappapis.html#eventhandler">
HTML</a></cite>[[!HTML]]) for the ended event that is
dispatched for {{AudioScheduledSourceNode}} node
A property used to set an [=event handler=] for the <dfn event>ended</dfn>
event type that is dispatched to {{AudioScheduledSourceNode}} node
types. When the source node has stopped playing (as determined
by the concrete node), an event of type {{Event}}
(described in <cite><a href="https://html.spec.whatwg.org/multipage/infrastructure.html#event">
HTML</a></cite> [[!HTML]]) will be dispatched to the event
handler.
by the concrete node), an event that uses the {{Event}} interface will be
dispatched to the event handler.

For all {{AudioScheduledSourceNode}}s, the
<code>onended</code> event is dispatched when the stop time
{{AudioScheduledSourceNode/ended}} event is dispatched when the stop time
determined by {{AudioScheduledSourceNode/stop()}} is reached.
For an {{AudioBufferSourceNode}}, the event is
also dispatched because the {{AudioBufferSourceNode/start(when, offset, duration)/duration}} has been
@@ -6622,7 +6620,7 @@ Attributes</h4>
number of channels equal to the
<code>numberOfInputChannels</code> parameter of the
createScriptProcessor() method. This AudioBuffer is only valid
while in the scope of the {{ScriptProcessorNode/onaudioprocess}} function.
while in the scope of the {{ScriptProcessorNode/audioprocess}} event handler functions.
Its values will be meaningless outside of this scope.

: <dfn>outputBuffer</dfn>
@@ -6631,7 +6629,7 @@ Attributes</h4>
will have a number of channels equal to the
<code>numberOfOutputChannels</code> parameter of the
createScriptProcessor() method. Script code within the scope of
the {{ScriptProcessorNode/onaudioprocess}} function is
the {{ScriptProcessorNode/audioprocess}} event handler functions are
expected to modify the {{Float32Array}} arrays
representing channel data in this AudioBuffer. Any script
modifications to this AudioBuffer outside of this scope will not
@@ -10257,8 +10255,8 @@ macros:
The {{ScriptProcessorNode}} is constructed with a
{{BaseAudioContext/createScriptProcessor(bufferSize, numberOfInputChannels, numberOfOutputChannels)/bufferSize}} which MUST be one of the following values: 256,
512, 1024, 2048, 4096, 8192, 16384. This value controls how
frequently the {{ScriptProcessorNode/onaudioprocess}} event is dispatched and how
many sample-frames need to be processed each call. {{ScriptProcessorNode/onaudioprocess}} events are only
frequently the {{ScriptProcessorNode/audioprocess}} event is dispatched and how
many sample-frames need to be processed each call. {{ScriptProcessorNode/audioprocess}} events are only
dispatched if the {{ScriptProcessorNode}} has at
least one input or one output connected. Lower numbers for
{{ScriptProcessorNode/bufferSize}} will result in
@@ -10287,17 +10285,15 @@ Attributes</h4>
: <dfn>bufferSize</dfn>
::
The size of the buffer (in sample-frames) which needs to be
processed each time {{ScriptProcessorNode/onaudioprocess}} is called.
processed each time {{ScriptProcessorNode/audioprocess}} is fired.
Legal values are (256, 512, 1024, 2048, 4096, 8192, 16384).

: <dfn>onaudioprocess</dfn>
::
A property used to set the <code>EventHandler</code> (described
in <cite><a href="https://html.spec.whatwg.org/multipage/webappapis.html#eventhandler">
HTML</a></cite>[[!HTML]]) for the {{ScriptProcessorNode/onaudioprocess}} event that
is dispatched to {{ScriptProcessorNode}} node
types. An event of type {{AudioProcessingEvent}}
will be dispatched to the event handler.
A property used to set an [=event handler=] for the
<dfn event>audioprocess</dfn> event type that is dispatched to
{{ScriptProcessorNode}} node types. The event dispatched to the event
handler uses the {{AudioProcessingEvent}} interface.
</dl>


@@ -11096,7 +11092,7 @@ the <a>rendering thread</a> will invoke the algorithm below:
1. <a spec=webidl lt=construct>Construct a callback function</a> from |processorCtor| with the argument
of |deserializedOptions|. If any exceptions are thrown in the callback, <a>queue a task</a> to
the <a>control thread</a> to <a spec="dom" lt="fire an event">fire an event</a> named
`processorerror` using {{ErrorEvent}} at |nodeReference|.
{{AudioWorkletNode/processorerror}} using {{ErrorEvent}} at |nodeReference|.

1. Empty the [=pending processor construction data=] slot.
</div>
@@ -11281,7 +11277,7 @@ Attributes</h5>
<code>constructor</code>, <code>process</code> method,
or any user-defined class method, the processor will
<a href="https://html.spec.whatwg.org/multipage/media.html#queue-a-media-element-task"> queue a media
element task</a> to <a spec="dom" lt="fire an event">fire an event</a> named `processorerror` using
element task</a> to <a spec="dom" lt="fire an event">fire an event</a> named <dfn event>processorerror</dfn> using
<a href="https://html.spec.whatwg.org/multipage/webappapis.html#the-errorevent-interface">
ErrorEvent</a> at the associated {{AudioWorkletNode}}.

@@ -12430,7 +12426,7 @@ task queue=] of its associated {{BaseAudioContext}}.

1. <a>Queue a task</a> to the <a>control thread</a>
<a spec="dom" lt="fire an event">fire</a> an
{{ErrorEvent}} named <code>processorerror</code> at the
{{ErrorEvent}} named {{AudioWorkletNode/processorerror}} at the
associated {{AudioWorkletNode}}.

5. If this {{AudioNode}} is a <a>destination node</a>,
