<!DOCTYPE html>
<!--
To publish this document, see instructions in README
-->
<html lang="en-us" xmlns="http://www.w3.org/1999/xhtml" xml:lang="en-us">
<head>
<link href="getusermedia.css" rel="stylesheet" type="text/css" />
<title>Media Capture and Streams</title>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type" />
<script class="remove" src="http://www.w3.org/Tools/respec/respec-w3c-common"
type="text/javascript">
//<![CDATA[
<!-- keep this comment -->
//]]>
</script>
<script class="remove" src="getusermedia.js" type="text/javascript">
//<![CDATA[
<!-- keep this comment -->
//]]>
</script>
</head>
<body>
<section id="abstract">
<p>This document defines a set of JavaScript APIs that allow local media,
including audio and video, to be requested from a platform.</p>
</section>
<section id="sotd">
<p>This document is not complete. It is subject to major changes and, while
          early experimentation is encouraged, it is therefore not intended for
implementation. The API is based on preliminary work done in the
WHATWG.</p>
</section>
<section class="informative" id="intro">
<h2>Introduction</h2>
<p>Access to multimedia streams (video, audio, or both) from local devices
(video cameras, microphones, Web cams) can have a number of uses, such as
real-time communication, recording, and surveillance.</p>
<p>This document defines the APIs used to get access to local devices that
can generate multimedia stream data. This document also defines the
MediaStream API by which JavaScript is able to manipulate the stream data
or otherwise process it.</p>
</section>
<section id="conformance">
<p>This specification defines conformance criteria that apply to a single
product: the <dfn>user agent</dfn> that implements the interfaces that it
contains.</p>
<p>Implementations that use ECMAScript to implement the APIs defined in
this specification must implement them in a manner consistent with the
ECMAScript Bindings defined in the Web IDL specification [[!WEBIDL]], as
this specification uses that specification and terminology.</p>
</section>
<section>
<h2>Terminology</h2>
<dl>
<dt><i>HTML Terms:</i>
</dt>
<dd>
<p>The <code><a href=
"http://dev.w3.org/html5/spec/webappapis.html#eventhandler">EventHandler</a></code>
interface represents a callback used for event handlers as defined in
[[!HTML5]].</p>
<p>The concepts <dfn><a href=
"http://dev.w3.org/html5/spec/webappapis.html#queue-a-task">queue a
task</a></dfn> and <dfn><a href=
"http://dev.w3.org/html5/spec/webappapis.html#fire-a-simple-event">fires
a simple event</a></dfn> are defined in [[!HTML5]].</p>
<p>The terms <dfn><a href=
"http://dev.w3.org/html5/spec/webappapis.html#event-handlers">event
handlers</a></dfn> and <dfn><a href=
"http://dev.w3.org/html5/spec/webappapis.html#event-handler-event-type">
event handler event types</a></dfn> are defined in [[!HTML5]].</p>
</dd>
<dt><dfn>source</dfn>
</dt>
<dd>
<p>A source is the "thing" providing the source of a media stream
track. The source is the broadcaster of the media itself. A source can
be a physical webcam, microphone, local video or audio file from the
user's hard drive, network resource, or static image.</p>
<p>Some sources have an identifier which <em title="must" class=
"rfc2119">must</em> be unique to the application (un-guessable by
another application) and persistent between application sessions (e.g.,
the identifier for a given source device/application must stay the
same, but not be guessable by another application). Sources that must
have an identifier are camera and microphone sources; local file
sources are not required to have an identifier. Source identifiers let
the application save, identify the availability of, and directly
request specific sources.</p>
<p>Other than the identifier, other bits of source identity are
<strong>never</strong> directly available to the application until the
user agent connects a source to a track. Once a source has been
"released" to the application (either via a permissions UI,
pre-configured allow-list, or some other release mechanism) the
          application will be able to discover additional source-specific
capabilities.</p>
<p>Sources <strong>do not</strong> have constraints -- tracks
have constraints. When a source is connected to a track, it
must, possibly in combination with UA processing (e.g.,
downsampling), conform to the constraints present on that
track (or set of tracks).</p>
<p>Sources will be released (un-attached) from a track when the track
is ended for any reason.</p>
<p>On the <code><a>MediaStreamTrack</a></code> object, sources are
represented by a <code><a>sourceType</a></code> attribute. The behavior
of APIs associated with the source's capabilities and settings change
depending on the source type.</p>
<p>Sources have <code><a>capabilities</a></code> and
<code><a>settings</a></code>. The capabilities and settings are "owned"
by the source and are common to any (multiple) tracks that happen to be
using the same source (e.g., if two different track objects bound to
the same source ask for the same capability or setting information,
they will get back the same answer).</p>
</dd>
<dt>
<a>Setting</a> (Source Setting)
</dt>
<dd>
<p>A setting refers to the immediate, current value of the source's
(optionally constrained) capabilities. Settings are always
read-only.</p>
<p>A source's settings can change dynamically over time due to
environmental conditions, sink configurations, or constraint changes. A
source's settings must always conform to the current set of mandatory
constraints that all of the tracks it is bound to have defined, and
should do its best to conform to the set of optional constraints
specified.</p>
<p>Although settings are a property of the source, they are
only exposed to the application through the tracks attached to
the source. The <a>Constrainable</a> interface provides this
exposure.</p>
          <p>A conforming user agent <em title="must" class="rfc2119">must</em>
support all the setting names defined in this spec.</p>
<p>As represented in this specification, a source is the
realization of a device as presented by the User Agent. Thus,
it is possible that the actual settings of the device may
differ from those presented by the User Agent. As an example,
there are some operating systems and native device APIs that
will treat a camera with a single native capture resolution as
if it can produce any resolution less than that value,
downsampling as necessary. Even though the camera technically
has only one specific width and one specific height it can
support, it is likely that the User Agent will represent this
camera as a source with a range of supported widths and
heights. To enable the application to determine when this has
occurred, tracks provide both
a <code><a>getSettings()</a></code> method (which always
returns a setting that satisfies the constraints applied to
the track) and a <code><a>getNativeSettings()</a></code>
method (which always returns, to the best of the User Agent's
determination, the actual setting of the native device). Note
that both the track settings and the native settings are
snapshots and can change without application involvement. In
particular, changes in the native settings could cause changes
in the track settings that would result in the latter values
being outside of the constraints and thus causing
overconstrained events for all affected tracks.</p>
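The camera example above can be sketched as a small model. This is an illustrative sketch only, not the real <code>MediaStreamTrack</code> interface; the <code>makeTrackModel</code> helper and its field names are invented for the example.

```javascript
// Illustrative model: a camera with one native capture resolution that the
// User Agent exposes as a range of supported widths via downsampling.
// Not the real MediaStreamTrack API; all names here are invented.
function makeTrackModel(nativeSettings, appliedConstraints) {
  return {
    // getSettings(): a setting that satisfies the constraints on the track.
    getSettings() {
      const width = Math.min(
        nativeSettings.width,
        appliedConstraints.maxWidth ?? nativeSettings.width);
      return { width };
    },
    // getNativeSettings(): the UA's best determination of the actual
    // setting of the native device.
    getNativeSettings() {
      return { ...nativeSettings };
    }
  };
}

const track = makeTrackModel({ width: 1920 }, { maxWidth: 640 });
// The track setting conforms to the constraint, while the native setting
// still reports the device's actual capture width.
```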
</dd>
<dt>
<a>Capabilities</a>
</dt>
<dd>
<p>Source capabilities are the intrinsic "features" of a
source object. For each source setting, there is a
corresponding capability that describes whether it is
supported by the source and if so, what the range of supported
values are. As with settings, capabilities are exposed to the
application via the <a>Constrainable</a> interface.</p>
<p>The values of the supported capabilities must be normalized to the
ranges and enumerated types defined in this specification.</p>
<p>A <a>getCapabilities()</a> call on a track returns the same
underlying per-source capabilities for all tracks connected to
the source.</p>
<p>Source capabilities are effectively constant. Applications should be
able to depend on a specific source having the same capabilities for
any session.</p>
</dd>
<dt>
<a>Constraints</a>
</dt>
<dd>
<p>Constraints are an optional track feature for restricting
the range of allowed variability on a source. Without provided
track constraints, implementations are free to select a
source's settings from the full ranges of its supported
capabilities, and to adjust those settings at any time for any
reason.</p>
<p>Constraints are exposed on tracks via
the <a>Constrainable</a> interface, which includes an API for
dynamically changing constraints. Note
that <a>getUserMedia()</a> also permits an initial set of
constraints to be applied when the track is first
obtained.</p>
<p>It is possible for two tracks that share a unique source to
apply contradictory constraints. The <a>Constrainable</a>
interface supports the calling of an error handler when the
conflicting constraint is requested. After successful
application of constraints on a track (and its associated
source), if at any later time the track
becomes <a>overconstrained</a>, the user agent MUST change the
track to the <a>muted</a> state.</p>
<p>A correspondingly-named constraint exists for each
corresponding source setting name and capability name. In
general, user agents will have more flexibility to optimize
the media streaming experience the fewer constraints are
applied, so application authors are strongly encouraged to use
mandatory constraints sparingly.</p>
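An initial constraint set passed to <code>getUserMedia()</code> might look like the following sketch. The specific constraint names (<code>width</code>, <code>frameRate</code>) are illustrative placeholders, not names guaranteed by this specification; a given user agent supports whatever its constraint registry defines.

```javascript
// A sketch of an initial constraint set. The constraint names below are
// placeholders; only the mandatory/optional shape is the point. Per the
// advice above, mandatory constraints are kept to a minimum.
const constraints = {
  video: {
    // Mandatory: the UA must satisfy these or report an error.
    mandatory: { width: { min: 640 } },
    // Optional: the UA does its best, in order of preference.
    optional: [{ frameRate: 30 }]
  }
};

// Only attempt the call in a browser context that implements the API.
if (typeof navigator !== "undefined" && navigator.getUserMedia) {
  navigator.getUserMedia(constraints,
    stream => console.log("got stream", stream.id),
    err => console.error("overconstrained or denied", err));
}
```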
</dd>
<dt><code>RTCPeerConnection</code>
</dt>
<dd><dfn><code>RTCPeerConnection</code></dfn> is defined in
[[!WEBRTC10]].</dd>
</dl>
</section>
<section id="stream-api">
<h2>MediaStream API</h2>
<section>
<h2>Introduction</h2>
<p>The <code><a>MediaStream</a></code> interface is used to represent
streams of media data, typically (but not necessarily) of audio and/or
video content, e.g. from a local camera. The data from a
<code><a>MediaStream</a></code> object does not necessarily have a
canonical binary form; for example, it could just be "the video currently
coming from the user's video camera". This allows user agents to
manipulate media streams in whatever fashion is most suitable on the
user's platform.</p>
<p>Each <code><a>MediaStream</a></code> object can contain zero or more
tracks, in particular audio and video tracks. All tracks in a MediaStream
are intended to be synchronized when rendered. Different MediaStreams do
not need to be synchronized.</p>
<p>Each track in a MediaStream object has a corresponding
<code><a>MediaStreamTrack</a></code> object.</p>
<p>A <code><a>MediaStreamTrack</a></code> represents content comprising
one or more channels, where the channels have a defined well known
relationship to each other (such as a stereo or 5.1 audio signal).</p>
<p>A channel is the smallest unit considered in this API
specification.</p>
<p>A <code><a>MediaStream</a></code> object has an input and an output.
The input depends on how the object was created: a
<code><a>MediaStream</a></code> object generated by a <code><a href=
"#dom-navigator-getusermedia">getUserMedia()</a></code> call (which is
described later in this document), for instance, might take its input
from the user's local camera. The output of the object controls how the
object is used, e.g., what is saved if the object is written to a file or
what is displayed if the object is used in a <code>video</code>
element.</p>
<p>Each track in a <code><a>MediaStream</a></code> object can be
disabled, meaning that it is muted in the object's output. All tracks are
initially enabled.</p>
<p>A <code><a>MediaStream</a></code> can be <dfn><a>finished</a></dfn>,
indicating that its inputs have forever stopped providing data.</p>
<p>The output of a <code><a>MediaStream</a></code> object MUST correspond
to the tracks in its input. Muted audio tracks MUST be replaced with
silence. Muted video tracks MUST be replaced with blackness.</p>
<p>A new <code><a>MediaStream</a></code> object can be created from
      accessible media sources (that do not require any additional
permissions) using the <code><a href=
"#dom-mediastream">MediaStream()</a></code> constructor. The constructor
argument can either be an existing <code><a>MediaStream</a></code>
object, in which case all the tracks of the given stream are added to the
new <code><a>MediaStream</a></code> object, or an array of
<code><a>MediaStreamTrack</a></code> objects. The latter form makes it
possible to compose a stream from different source streams.</p>
<p><img alt="A MediaStream" src="images/media-stream.png" width="418" />
</p>
<p>Both <code><a>MediaStream</a></code> and
<code><a>MediaStreamTrack</a></code> objects can be cloned. This allows
for greater control since the separate instances can be manipulated and
<a title="consumer">consumed</a> individually. A cloned
<code><a>MediaStream</a></code> contains clones of all member tracks from
the original stream.</p>
<p>When a <code><a>MediaStream</a></code> object is being generated from
a local file (as opposed to a live audio/video source), the user agent
SHOULD stream the data from the file in real time, not all at once. The
<code>MediaStream</code> object is also used in contexts outside
<code>getUserMedia</code>, such as [[!WEBRTC10]]. In both cases, ensuring
a realtime stream reduces the ease with which pages can distinguish live
video from pre-recorded video, which can help protect the user's
privacy.</p>
</section>
<section>
<h2>MediaStream</h2>
<p>The <dfn id="dom-mediastream"><code>MediaStream()</code></dfn>
constructor composes a new stream out of existing tracks. It takes an
optional argument of type <code><a>MediaStream</a></code> or an array of
<code><a>MediaStreamTrack</a></code> objects. <dfn id=
'mediastream-constructor'>When the constructor is invoked</dfn>, the UA
must run the following steps:</p>
<ol>
<li>
<p>Let <var>stream</var> be a newly constructed
<code><a>MediaStream</a></code> object.</p>
</li>
<li>
<p>Initialize <var>stream's</var> <code><a href=
"#dom-mediastream-id">id</a></code> attribute to a newly generated
value.</p>
</li>
<li>
<p>If the constructor's argument is present, run the sub steps that
          correspond to the argument type.</p>
<ul>
<li>
<p><code>Array</code> of <code><a>MediaStreamTrack</a></code>
objects:</p>
<p>Run the following sub steps for each
<code><a>MediaStreamTrack</a></code> in the array:</p>
<ol>
<li>
<p><em>Add track</em>: Let <var>track</var> be the
<code><a>MediaStreamTrack</a></code> about to be
processed.</p>
</li>
<li>
<p>If <var>track</var> has <a href="#track-ended">ended</a>,
then abort these steps and continue with the next track (if
any).</p>
</li>
<li>
<p>Add <var>track</var> to <var>stream</var>'s <a href=
"#track-set">track set</a>.</p>
</li>
</ol>
</li>
<li>
<p><code><a>MediaStream</a></code>:</p>
<p>Run the sub steps labeled <em>Add track</em> (above) for every
<code><a>MediaStreamTrack</a></code> in the argument stream's
<a href="#track-set">track set</a>.</p>
</li>
</ul>
</li>
<li>
<p>If <var>stream</var>'s <a href="#track-set">track set</a> is
empty, set <var>stream</var>'s <code><a href=
"#dom-mediastream-active">active</a></code> attribute to
<code>false</code>, otherwise set it to <code>true</code>.</p>
</li>
<li>
<p>Return <var>stream</var>.</p>
</li>
</ol>
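The constructor steps above can be sketched as a plain function. This is a model of the algorithm, not the real interface; <code>makeStream</code> and <code>genId</code> are invented names, and tracks are plain objects with an <code>ended</code> flag standing in for <code>MediaStreamTrack</code> state.

```javascript
// Model of the MediaStream() constructor algorithm above.
let nextId = 0;
function genId() { return "stream-" + (nextId++); }

function makeStream(tracks = []) {
  const stream = { id: genId(), trackSet: [] };     // steps 1-2
  for (const track of tracks) {                     // step 3 (array form)
    if (track.ended) continue;                      // skip ended tracks
    stream.trackSet.push(track);                    // add to the track set
  }
  // step 4: active iff the track set is non-empty
  stream.active = stream.trackSet.length > 0;
  return stream;                                    // step 5
}
```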
<p>A <code><a>MediaStream</a></code> can have multiple audio and video
sources (e.g. because the user has multiple microphones, or because the
real source of the stream is a media resource with many media tracks).
The stream represented by a <code><a>MediaStream</a></code> thus has zero
or more tracks.</p>
<p>The tracks of a <code><a>MediaStream</a></code> are stored in a
<dfn id="track-set">track set</dfn>. The track set MUST contain the
<code><a>MediaStreamTrack</a></code> objects that correspond to the
tracks of the stream. The relative order of the tracks in the set is user
agent defined and the API will never put any requirements on the order.
The proper way to find a specific <code><a>MediaStreamTrack</a></code>
object in the set is to look it up by its <code><a href=
"#dom-mediastreamtrack-id">id</a></code>.</p>
<p>An object that reads data from the output of a
<code><a>MediaStream</a></code> is referred to as a
<code><a>MediaStream</a></code> <dfn>consumer</dfn>. The list of
      <code><a>MediaStream</a></code> consumers currently includes the media
elements [[HTML5]], <code>RTCPeerConnection</code> [[WEBRTC10]],
<code>MediaRecorder</code> [[mediastream-rec]] and
<code>ImageCapture</code> [[mediastream-imagecap]].</p>
<p class="note"><code><a>MediaStream</a></code> consumers must be able to
handle tracks being added and removed. This behavior is specified per
consumer.</p>
<p>A <code><a>MediaStream</a></code> object is said to be <dfn id=
"stream-inactive">MediaStream.inactive</dfn> when it does not have any
tracks or all tracks belonging to the stream have <a href=
"#track-ended">ended</a>. Otherwise the stream is active. A
<code><a>MediaStream</a></code> can start its life as inactive if it is
constructed without any tracks.</p>
<p>When a <code><a>MediaStream</a></code> goes from being active to
inactive, the user agent MUST queue a task that sets the object's
<code><a href="#dom-mediastream-active">active</a></code> attribute to
<code>false</code> and fire a simple event named <code><a href=
"#event-mediastream-inactive">inactive</a></code> at the object. When a
<code><a>MediaStream</a></code> goes from being inactive to active, the
user agent MUST queue a task that sets the object's <code><a href=
"#dom-mediastream-active">active</a></code> attribute to
<code>true</code> and fire a simple event named <code><a href=
"#event-mediastream-active">active</a></code> at the object.</p>
<p>If the stream's activity status changed due to a user request, the
task source for this <span title="concept-task">task</span> is the user
interaction task source. Otherwise the task source for this <span title=
"concept-task">task</span> is the networking task source.</p>
<dl class="idl" title="interface MediaStream : EventTarget">
<dt>Constructor()</dt>
<dd>
See the <a href="#mediastream-constructor">MediaStream constructor
algorithm</a>
</dd>
<dt>Constructor(MediaStream stream)</dt>
<dd>
See the <a href="#mediastream-constructor">MediaStream constructor
algorithm</a>
</dd>
        <dt>Constructor(sequence&lt;MediaStreamTrack&gt; tracks)</dt>
<dd>
See the <a href="#mediastream-constructor">MediaStream constructor
algorithm</a>
</dd>
<dt>readonly attribute DOMString id</dt>
<dd>
<p>When a <code><a>MediaStream</a></code> object is created, the user
agent MUST generate a globally unique identifier string, and MUST
initialize the object's <code><a href=
"#dom-mediastream-id">id</a></code> attribute to that string. Such
strings MUST only use characters in the ranges U+0021, U+0023 to
U+0027, U+002A to U+002B, U+002D to U+002E, U+0030 to U+0039, U+0041
to U+005A, U+005E to U+007E, and MUST be 36 characters long.</p>
<!-- UUIDs have 36 characters
      including hyphens; the ranges above come from RFC4574 (the a=label:
thing in SDP) -->
<!-- described below -->
<p>The <dfn id="dom-mediastream-id"><code>id</code></dfn> attribute
MUST return the value to which it was initialized when the object was
created.</p>
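The character and length rules above can be checked with a regular expression; the <code>isValidStreamId</code> helper is an assumption for this sketch, not part of the API. A 36-character lowercase UUID is one conforming choice: hex digits and hyphens all fall within the permitted ranges.

```javascript
// Check a MediaStream id against the spec's rules: exactly 36 characters,
// each drawn from the permitted Unicode ranges listed above.
// "isValidStreamId" is a helper invented for this sketch.
const ID_PATTERN =
  /^[\x21\x23-\x27\x2A\x2B\x2D\x2E\x30-\x39\x41-\x5A\x5E-\x7E]{36}$/;

function isValidStreamId(id) {
  return ID_PATTERN.test(id);
}

// Hex digits (U+0030-0039, U+0061-0066) and hyphens (U+002D) are all in
// the permitted ranges, and a UUID string is 36 characters long.
```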
</dd>
        <dt>sequence&lt;MediaStreamTrack&gt; getAudioTracks()</dt>
<dd>
<p>Returns a sequence of <code><a>MediaStreamTrack</a></code> objects
representing the audio tracks in this stream.</p>
<p>The <dfn id=
"dom-mediastream-getaudiotracks"><code>getAudioTracks()</code></dfn>
method MUST return a sequence that represents a snapshot of all the
<code><a>MediaStreamTrack</a></code> objects in this stream's
<a href="#track-set">track set</a> whose <code><a href=
"#dom-mediastreamtrack-kind">kind</a></code> is equal to
"<code>audio</code>". The conversion from the <a href=
"#track-set">track set</a> to the sequence is user agent defined and
          the order does not have to be stable between calls.</p>
</dd>
        <dt>sequence&lt;MediaStreamTrack&gt; getVideoTracks()</dt>
<dd>
<p>Returns a sequence of <code><a>MediaStreamTrack</a></code> objects
representing the video tracks in this stream.</p>
<p>The <dfn id=
"dom-mediastream-getvideotracks"><code>getVideoTracks()</code></dfn>
method MUST return a sequence that represents a snapshot of all the
<code><a>MediaStreamTrack</a></code> objects in this stream's
<a href="#track-set">track set</a> whose <code><a href=
"#dom-mediastreamtrack-kind">kind</a></code> is equal to
"<code>video</code>". The conversion from the <a href=
"#track-set">track set</a> to the sequence is user agent defined and
          the order does not have to be stable between calls.</p>
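Both kind-based getters are snapshots of the track set filtered by <code>kind</code>. This can be modeled with plain arrays; the sketch below is illustrative, with tracks as plain objects rather than real <code>MediaStreamTrack</code> instances.

```javascript
// Model of the kind-based getters: each call returns a fresh snapshot of
// the track set filtered by the track's "kind". The spec leaves the order
// user-agent-defined and not necessarily stable between calls.
function getTracksByKind(trackSet, kind) {
  return trackSet.filter(track => track.kind === kind);
}

const trackSet = [
  { kind: "audio", id: "a1" },
  { kind: "video", id: "v1" },
  { kind: "audio", id: "a2" }
];
```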
</dd>
<dt>MediaStreamTrack? getTrackById(DOMString trackId)</dt>
<dd>
<p>The <dfn id=
"dom-mediastream-gettrackbyid"><code>getTrackById()</code></dfn>
method MUST return the first <code><a>MediaStreamTrack</a></code>
object in this stream's <a href="#track-set">track set</a> whose
<code><a href="#dom-mediastreamtrack-id">id</a></code> is equal to
<var>trackId</var>. The method MUST return null if no track matches
the <var>trackId</var> argument.</p>
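The lookup behavior can be sketched as follows; as above, tracks are modeled as plain objects for illustration.

```javascript
// Model of getTrackById(): the first track in the set whose id matches
// the argument, or null when nothing matches.
function getTrackById(trackSet, trackId) {
  for (const track of trackSet) {
    if (track.id === trackId) return track;
  }
  return null;  // no track matched trackId
}
```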
</dd>
<dt>void addTrack(MediaStreamTrack track)</dt>
<dd>
<p>Adds the given <code><a>MediaStreamTrack</a></code> to this
<code><a>MediaStream</a></code>.</p>
<p>When the <dfn id=
"dom-mediastream-addtrack"><code>addTrack()</code></dfn> method is
invoked, the user agent MUST run the following steps:</p>
<ol>
<li>
<p>Let <var>track</var> be the
<code><a>MediaStreamTrack</a></code> argument and
<var>stream</var> this <code><a>MediaStream</a></code>
object.</p>
</li>
<li>
<p>If <var>track</var> is already in <var>stream's</var> <a href=
"#track-set">track set</a>, then abort these steps.</p>
</li>
<li>
<p>Add <var>track</var> to <var>stream</var>'s <a href=
"#track-set">track set</a>.</p>
</li>
</ol>
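The steps above make <code>addTrack()</code> idempotent, which can be modeled directly; the sketch uses plain objects, not the real interfaces.

```javascript
// Model of the addTrack() steps: adding a track that is already in the
// track set is a no-op.
function addTrack(stream, track) {
  if (stream.trackSet.includes(track)) return;  // step 2: already present
  stream.trackSet.push(track);                  // step 3: add to track set
}
```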
</dd>
<dt>void removeTrack(MediaStreamTrack track)</dt>
<dd>
<p>Removes the given <code><a>MediaStreamTrack</a></code> from this
<code><a>MediaStream</a></code>.</p>
<p>When the <dfn id=
"dom-mediastream-removetrack"><code>removeTrack()</code></dfn> method
is invoked, the user agent MUST remove the track, indicated by the
method's argument, from the stream's <a href="#track-set">track
set</a>, if present.</p>
</dd>
<dt>MediaStream clone()</dt>
<dd>
<p>Clones the given <code><a>MediaStream</a></code> and all its
tracks.</p>
<p>When the <dfn id=
"dom-mediastream-clone"><code>MediaStream.clone()</code></dfn> method
is invoked, the user agent MUST run the following steps:</p>
<ol>
<li>
<p>Let <var>streamClone</var> be a newly constructed
<code><a>MediaStream</a></code> object.</p>
</li>
<li>
<p>Initialize <var>streamClone</var>'s <code><a href=
"#dom-mediastream-id">id</a></code> attribute to a newly
generated value.</p>
</li>
<li>
<p>Let <var>trackSetClone</var> be a list that contains the
result of running <code><a href=
"#dom-mediastreamtrack-clone">MediaStreamTrack.clone()</a></code>
on all the tracks in this stream.</p>
</li>
            <li>
              <p>Set <var>streamClone</var>'s <a href="#track-set">track
              set</a> to <var>trackSetClone</var>.</p>
            </li>
</ol>
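The clone steps above can be modeled with plain objects; <code>cloneTrack</code> stands in for <code>MediaStreamTrack.clone()</code>, and all names are invented for the sketch.

```javascript
// Model of MediaStream.clone(): a fresh id, and a track set containing
// clones of the original stream's tracks.
let cloneCounter = 0;
function cloneTrack(track) {
  return { ...track, id: track.id + "-clone" };
}
function cloneStream(stream) {
  const streamClone = { id: "clone-" + (cloneCounter++) };  // steps 1-2
  streamClone.trackSet = stream.trackSet.map(cloneTrack);   // steps 3-4
  return streamClone;
}
```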
</dd>
<dt>readonly attribute boolean active</dt>
<dd>
<p>The <dfn id=
"dom-mediastream-active"><code>MediaStream.active</code></dfn>
attribute returns true if the <code><a>MediaStream</a></code> is
active (see <a href="#stream-inactive">inactive</a>), and false
otherwise.</p>
<p>When a <code><a>MediaStream</a></code> object is created, its
<code><a href="#dom-mediastream-active">active</a></code> attribute
MUST be set to true, unless stated otherwise (for example by the
<code><a href="#dom-mediastream">MediaStream()</a></code> constructor
algorithm).</p>
</dd>
<dt>attribute EventHandler onactive</dt>
<dd>This event handler, of type <code><a href=
"#event-mediastream-active">active</a></code>, MUST be supported by all
objects implementing the <code><a>MediaStream</a></code>
interface.</dd>
<dt>attribute EventHandler oninactive</dt>
<dd>This event handler, of type <code><a href=
"#event-mediastream-inactive">inactive</a></code>, MUST be supported by
all objects implementing the <code><a>MediaStream</a></code>
interface.</dd>
<dt>attribute EventHandler onaddtrack</dt>
<dd>This event handler, of type <code><a href=
"#event-mediastream-addtrack">addtrack</a></code>, MUST be supported by
all objects implementing the <code><a>MediaStream</a></code>
interface.</dd>
<dt>attribute EventHandler onremovetrack</dt>
<dd>This event handler, of type <code><a href=
"#event-mediastream-removetrack">removetrack</a></code>, MUST be
supported by all objects implementing the
<code><a>MediaStream</a></code> interface.</dd>
</dl>
</section>
<section>
<h2>MediaStreamTrack</h2>
<p>A <code><a>MediaStreamTrack</a></code> object represents a media
source in the user agent. Several <code><a>MediaStreamTrack</a></code>
objects can represent the same media source, e.g., when the user chooses
the same camera in the UI shown by two consecutive calls to
<code><a href="#dom-navigator-getusermedia">getUserMedia()</a></code>
.</p>
<p>A script can indicate that a track no longer needs its source with the
<code><a href=
"#dom-mediastreamtrack-stop">MediaStreamTrack.stop()</a></code> method.
When all tracks using a source have been stopped, the given permission
for that source is revoked and the source is <dfn id=
"source-stopped">stopped</dfn>. If the data is being generated from a
live source (e.g., a microphone or camera), then the user agent SHOULD
remove any active "on-air" indicator for that source. If the data is
being generated from a prerecorded source (e.g. a video file), any
      remaining content in the file is ignored. An implementation may use a
      per-source reference count to keep track of source usage, but the specifics
are out of scope for this specification.</p>
<section>
<h3>Life-cycle and Media Flow</h3>
<p>A <code><a>MediaStreamTrack</a></code> has three stages in its
        lifecycle: <code>new</code>, <code>live</code>, and <code>ended</code>.
A track begins as <code>new</code> prior to being connected to an
active source.</p>
<p>Once connected, the <code><a href=
"#event-mediastreamtrack-started">started</a></code> event fires and
the track becomes <code>live</code>. In the <code>live</code> state,
the track is active and media is available for rendering at a
<code><a>MediaStream</a></code> <a>consumer</a>.</p>
<p>A muted or disabled <code><a>MediaStreamTrack</a></code> renders
either silence (audio), black frames (video), or a
zero-information-content equivalent. For example, a video element
sourced by a muted or disabled <code><a>MediaStreamTrack</a></code>
        (contained within a <code><a>MediaStream</a></code>) is playing, but
the rendered content is the muted output.</p>
        <p>The muted/unmuted state of a track reflects whether the source
        provides any media at this moment. The enabled/disabled state is under
        application control and determines whether the track outputs media (to its
consumers). Hence, media from the source only flows when a
<code><a>MediaStreamTrack</a></code> object is both unmuted and
enabled.</p>
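The flow rule above reduces to a single condition; this one-line model is illustrative, with <code>mediaFlows</code> an invented name and plain objects standing in for tracks.

```javascript
// The rule above in one line: media flows from the source only when the
// track is both unmuted (source/UA-side state) and enabled
// (application-side state).
function mediaFlows(track) {
  return !track.muted && track.enabled;
}
```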
<p>A <code><a>MediaStreamTrack</a></code> is <dfn id=
"track-muted">muted</dfn> when the source is temporarily unable to
provide the track with data. A track can be muted by a user. Often this
action is outside the control of the application. This could be as a
result of the user hitting a hardware switch, or toggling a control in
the operating system or browser chrome. A track can also be muted by
the user agent. For example, a track that is a member of a
<code><a>MediaStream</a></code>, received via a
<code><a>RTCPeerConnection</a></code> [[!WEBRTC10]], is muted if the
application on the other side disables the corresponding track in the
<code>MediaStream</code> being sent.</p>
<p>Applications are able to <dfn id="track-enabled">enable</dfn> or
disable a <code><a>MediaStreamTrack</a></code> to prevent it from
        rendering media from the source. A muted track, however, renders
        silence and blackness regardless of the enabled state. A disabled track is
logically equivalent to a muted track, from a consumer point of
view.</p>
<p>For a newly created <code><a>MediaStreamTrack</a></code> object, the
following applies: the track is enabled unless stated otherwise (for
example, when cloned), and the muted state reflects the state of the
source at the time the track is created.</p>
<p>A <code><a>MediaStreamTrack</a></code> object is said to
<em>end</em> when the source of the track is disconnected or
exhausted.</p>
<p>When a <code><a>MediaStreamTrack</a></code> object ends for any
reason (e.g., because the user rescinds the permission for the page to
use the local camera, because the data comes from a finite file and
the file's end has been reached and the user has not requested that it
be looped, or because the UA has instructed the track to end for any
reason), it is said to be ended. When a track instance <var>track</var>
ends for any reason other than the <code><a href=
"#dom-mediastreamtrack-stop">stop()</a></code> method being invoked on
the <code><a>MediaStreamTrack</a></code> object that represents
<var>track</var>, the user agent MUST queue a task that runs the
following steps:</p>
<ol>
<li>
<p>If the track's <code><a href=
"#dom-mediastreamtrack-readystate">readyState</a></code> attribute
has the value <code>ended</code> already, then abort these steps.
(The <code><a href="#dom-mediastreamtrack-stop">stop()</a></code>
method was probably called just before the track stopped for other
reasons.)</p>
</li>
<li>
<p>Set <var>track's</var> <code><a href=
"#dom-mediastreamtrack-readystate">readyState</a></code> attribute
to <code>ended</code>.</p>
</li>
<li>
<p>Detach <var>track's</var> source.</p>
<p>If no other <code><a>MediaStreamTrack</a></code> is using
the same source, the source will be <a href=
"#source-stopped">stopped</a>.</p>
</li>
<li>
<p>Fire a simple event named <code><a href=
"#event-mediastreamtrack-ended">ended</a></code> at the object.</p>
</li>
</ol>
<p>If the end of the stream was reached due to a user request, the
event source for this event is the user interaction event source.</p>
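The ending steps above can be sketched as a small state machine. The <code>SketchTrack</code> class and <code>endTrack()</code> helper are illustrative only; real <code>MediaStreamTrack</code> objects are created by the user agent, not by script.

```javascript
// Minimal sketch of the "track ended" steps above.
class SketchTrack {
  constructor() {
    this.readyState = "live"; // a connected track is live
    this.onended = null;
  }
}

function endTrack(track) {
  // Step 1: if the track has already ended (e.g. stop() was called
  // just before it stopped for other reasons), abort these steps.
  if (track.readyState === "ended") return;
  // Step 2: set the track's readyState attribute to "ended".
  track.readyState = "ended";
  // Step 3: detach the track's source (not modeled here); the source
  // is stopped if no other track is using it.
  // Step 4: fire a simple event named "ended" at the object.
  if (typeof track.onended === "function") track.onended({ type: "ended" });
}
```

Running <code>endTrack()</code> twice fires <code>ended</code> only once, mirroring the abort in step 1.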
</section>
<section>
<h3>Tracks and Constraints</h3>
<p>Constraints are set on tracks and may affect sources.</p>
<p>Whether <code><a>Constraints</a></code> were provided at track
initialization time or need to be established later at runtime, the
APIs defined in the <a>Constrainable</a> Interface allow the retrieval
and manipulation of the constraints currently established on a
track.</p>
<p>Each track maintains an internal version of the
<code><a>Constraints</a></code> structure, namely a mandatory set of
constraints (no duplicates), and an optional ordered list of individual
constraint objects (may contain duplicates). The internal stored
constraint structure is exposed to the application by the
<code><a>constraints</a></code> attribute, and may be modified by the
<code><a>applyConstraints()</a></code> method.</p>
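The stored structure mirrors the <code>Constraints</code> dictionary described above: a mandatory set with no duplicate constraint names, and an optional ordered list in which names may repeat. A sketch of that shape follows; the specific constraint names (<code>width</code>, <code>height</code>, <code>frameRate</code>) are illustrative, not an exhaustive list.

```javascript
// Illustrative Constraints structure as described above.
const constraints = {
  mandatory: {
    // A set: each constraint name appears at most once.
    width: { min: 640 },
    height: { min: 480 }
  },
  optional: [
    // An ordered list, considered in order; names may repeat.
    { frameRate: 60 },
    { frameRate: 30 }
  ]
};
```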
<p>When <code><a>applyConstraints()</a></code> is called, a user agent
MUST queue a task to evaluate
those changes when the task queue is next serviced. Similarly, if the
<a href=
"#widl-MediaSourceStates-sourceType"><code>sourceType</code></a>
changes, then the user agent MUST perform the same actions to
re-evaluate the constraints of each track affected by that source
change.</p>
<p>If the <code><a>MediaError</a></code> event named
'overconstrained' is fired, the track MUST be muted until
either new satisfiable constraints are applied or the existing
constraints become satisfiable.</p>
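The queued re-evaluation and the overconstrained muting rule above can be simulated with plain objects. Here <code>queueTask</code>, <code>flushTasks</code>, and <code>canSatisfy</code> are stand-ins for user-agent internals, and <code>applyConstraintsSketch</code> is an illustrative helper, not the specified <code>applyConstraints()</code> method.

```javascript
// Minimal task queue standing in for the user agent's: queued tasks
// run when the queue is next serviced (flushTasks below).
const taskQueue = [];
const queueTask = (fn) => taskQueue.push(fn);
const flushTasks = () => { while (taskQueue.length) taskQueue.shift()(); };

function applyConstraintsSketch(track, constraints, canSatisfy) {
  // The evaluation itself is deferred to a queued task.
  queueTask(() => {
    if (canSatisfy(constraints)) {
      track.constraints = constraints;
      track.muted = false;  // satisfiable again: unmute
    } else {
      track.muted = true;   // 'overconstrained': mute the track
    }
  });
}

// Usage: an unsatisfiable set mutes the track once the task runs.
const track = { muted: false, constraints: {} };
applyConstraintsSketch(track, { mandatory: { width: { min: 1e9 } } }, () => false);
flushTasks(); // track.muted is now true
```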
</section>
<section>
<h3>Interface Definition</h3>
<div class="idl" title="MediaStreamTrack implements Constrainable">
</div>
<dl class="idl" title="interface MediaStreamTrack : EventTarget">
<dt>readonly attribute DOMString kind</dt>
<dd>
<p>The <dfn id=
"dom-mediastreamtrack-kind"><code>MediaStreamTrack.kind</code></dfn>
attribute MUST return the string "<code>audio</code>" if the object
represents an audio track or "<code>video</code>" if the object
represents a video track.</p>
</dd>
<dt>readonly attribute DOMString id</dt>
<dd>
<p>Unless a <code><a>MediaStreamTrack</a></code> object is created
as part of a special-purpose algorithm that specifies how the