Make shared worker creation deterministic
To do this we introduced a new concept called a parallel queue, which is also expected to be used by other specifications, such as Background Fetch.

Fixes #1843.
jakearchibald authored and annevk committed Sep 21, 2017
1 parent e40797b commit 9fda90f
Showing 1 changed file with 139 additions and 15 deletions.
source: 154 changes (139 additions & 15 deletions)
@@ -1876,23 +1876,136 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
rendered to the user. These terms are not meant to imply a visual medium; they must be considered
to apply to other media in equivalent ways.</p>

<!-- should find somewhere more appropriate to put this -->
<p>The term "transparent black" refers to the color with red, green, blue, and alpha channels all
set to zero.</p>


<div w-nodev>

<p>When an algorithm B says to return to another algorithm A, it implies that A called B. Upon
returning to A, the implementation must continue from where it left off in calling B. Some
algorithms run <dfn data-export="">in parallel</dfn>; this means that the algorithm's subsequent
steps are to be run, one after another, at the same time as other logic in the specification (e.g.
at the same time as the <span>event loop</span>). This specification does not define the precise
mechanism by which this is achieved, be it time-sharing cooperative multitasking, fibers, threads,
processes, using different hyperthreads, cores, CPUs, machines, etc. By contrast, an operation
that is to run <dfn>immediately</dfn> must interrupt the currently running task, run itself, and
then resume the previously running task.</p>
<h4>Parallelism</h4>

<p>To run steps <dfn data-export="">in parallel</dfn> means those steps are to be run, one after
another, at the same time as other logic in the standard (e.g., at the same time as the
<span>event loop</span>). This standard does not define the precise mechanism by which this is
achieved, be it time-sharing cooperative multitasking, fibers, threads, processes, using different
hyperthreads, cores, CPUs, machines, etc. By contrast, an operation that is to run
<dfn>immediately</dfn> must interrupt the currently running task, run itself, and then resume the
previously running task.</p>
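
(Editorial aside, not part of the standard: one way to picture in-parallel steps is as work the caller does not wait for. The TypeScript sketch below models them as a detached async task; that is only one of the many strategies the paragraph above permits, and the name runStepsInParallel is purely illustrative.)

```ts
// Informal model only: "in parallel" steps as a detached task the caller does
// not wait for. The standard deliberately leaves the mechanism open
// (threads, processes, cooperative time-sharing, ...).
function runStepsInParallel(steps: () => Promise<void>): void {
  void steps(); // intentionally not awaited
}

runStepsInParallel(async () => {
  await new Promise(resolve => setTimeout(resolve, 50)); // some lengthy work
  console.log("in-parallel steps finished");
});
console.log("the caller (think: the event loop) did not wait");
```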

<p>To avoid race conditions between different <span>in parallel</span> algorithms that operate on
the same data, a <span>parallel queue</span> can be used.</p>

<p>A <dfn data-export="">parallel queue</dfn> represents a queue of algorithm steps that must be
run in series.</p>

<p>A <span>parallel queue</span> has an <dfn>algorithm queue</dfn> (a <span>queue</span>),
initially empty.</p>

<p>To <dfn data-x="enqueue the following steps" data-lt="enqueue steps|enqueue the following
steps" data-export="" data-dfn-for="parallel queue">enqueue steps</dfn> to a <span>parallel
queue</span>, <span>enqueue</span> the algorithm steps to the <span>parallel queue</span>'s
<span>algorithm queue</span>.</p>

<p>To <dfn data-x="starting a new parallel queue" data-lt="start a new parallel queue|starting a
new parallel queue" data-export="">start a new parallel queue</dfn>, run the following steps:</p>

<ol>
<li><p>Let <var>parallelQueue</var> be a new <span>parallel queue</span>.</p></li>

<li>
<p>Run the following steps <span>in parallel</span>:</p>

<ol>
<li>
<p>While true:</p>

<ol>
<li><p>Let <var>steps</var> be the result of <span data-x="dequeue">dequeueing</span> from
<var>parallelQueue</var>'s <span>algorithm queue</span>.</p></li>

<li><p>If <var>steps</var> is not nothing, then run <var>steps</var>.</p></li>

<li><p>Assert: running <var>steps</var> did not throw an exception, as steps running <span>in
parallel</span> are not allowed to throw.</p></li>
</ol>

<p class="note">Implementations are not expected to implement this as a continuously running
loop. Algorithms in standards are to be easy to understand and are not necessarily great for
battery life or performance.</p>
</li>
</ol>
</li>

<li><p>Return <var>parallelQueue</var>.</p></li>
</ol>

<p class="note">Steps running <span>in parallel</span> can themselves run other steps <span>in
parallel</span>. E.g., inside a <span>parallel queue</span> it can be useful to run a series of
steps in parallel with the queue.</p>
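
(Editorial aside, not part of the standard: the TypeScript sketch below models the parallel-queue mechanism defined above, assuming enqueued steps are possibly-asynchronous functions. The names ParallelQueue, enqueueSteps, and startANewParallelQueue are illustrative, mirroring the terms in the prose; the drain method stands in for the "while true" dequeue loop without busy-waiting, as the note suggests.)

```ts
// Rough model of a parallel queue: enqueued steps run strictly one after
// another, even though the queue as a whole runs "in parallel" with callers.
type Steps = () => void | Promise<void>;

class ParallelQueue {
  private algorithmQueue: Steps[] = []; // the "algorithm queue"
  private running = false;

  // "Enqueue steps" to this parallel queue.
  enqueueSteps(steps: Steps): void {
    this.algorithmQueue.push(steps);
    void this.drain();
  }

  // Stands in for the "while true: dequeue and run" loop; implementations
  // need not busy-wait, so this only runs while steps are queued.
  private async drain(): Promise<void> {
    if (this.running) return;
    this.running = true;
    while (this.algorithmQueue.length > 0) {
      const steps = this.algorithmQueue.shift()!;
      await steps(); // steps must not throw (see the assert above)
    }
    this.running = false;
  }
}

// "Starting a new parallel queue" then amounts to creating a fresh instance.
function startANewParallelQueue(): ParallelQueue {
  return new ParallelQueue();
}
```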

<div class="example">
<p>Imagine a standard that defines <var>nameList</var> (a <span>list</span>), along with a method
to add a <var>name</var> to <var>nameList</var>, unless <var>nameList</var> already <span
data-x="list contains">contains</span> <var>name</var>, in which case it rejects.</p>

<p>The following solution suffers from race conditions:</p>

<ol>
<li><p>Let <var>p</var> be a new promise.</p></li>

<li>
<p>Run the following steps <span>in parallel</span>:</p>

<ol>
<li><p>If <var>nameList</var> <span data-x="list contains">contains</span> <var>name</var>,
reject <var>p</var> with a <code>TypeError</code> and abort these steps.</p></li>

<li><p>Do some potentially lengthy work.</p></li>

<li><p><span data-x="list append">Append</span> <var>name</var> to
<var>nameList</var>.</p></li>

<li><p>Resolve <var>p</var> with undefined.</p></li>
</ol>
</li>

<li><p>Return <var>p</var>.</p></li>
</ol>

<p>Two invocations of the above could run simultaneously, meaning <var>name</var> isn't in
<var>nameList</var> during step 2.1, but it <em>might be added</em> before step 2.3 runs,
meaning <var>name</var> ends up in <var>nameList</var> twice.</p>

<p>Parallel queues solve this. The standard would let <var>nameListQueue</var> be the result of
<span>starting a new parallel queue</span>, then:</p>

<ol>
<li><p>Let <var>p</var> be a new promise.</p></li>

<li>
<p><mark><span>Enqueue the following steps</span> to <var>nameListQueue</var>:</mark></p>

<ol>
<li><p>If <var>nameList</var> <span data-x="list contains">contains</span> <var>name</var>,
reject <var>p</var> with a <code>TypeError</code> and abort these steps.</p></li>

<li><p>Do some potentially lengthy work.</p></li>

<li><p><span data-x="list append">Append</span> <var>name</var> to
<var>nameList</var>.</p></li>

<li><p>Resolve <var>p</var> with undefined.</p></li>
</ol>
</li>

<li><p>Return <var>p</var>.</p></li>
</ol>

<p>The steps would now queue and the race is avoided.</p>
</div>
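
(Continuing the editorial sketch above, and again not part of the standard: a hypothetical addName operation written both ways. addNameRacy mirrors the first, racy algorithm; addNameQueued routes its steps through a single parallel queue, so the contains-check, the lengthy work, and the append of one invocation all finish before the next invocation's steps start.)

```ts
// Hypothetical nameList example from the prose, written both ways.
const nameList: string[] = [];
const nameListQueue = startANewParallelQueue();

async function doLengthyWork(): Promise<void> {
  await new Promise(resolve => setTimeout(resolve, 100)); // simulated work
}

// Racy version: two concurrent calls can both pass the contains-check before
// either appends, so the same name can end up in nameList twice.
async function addNameRacy(name: string): Promise<void> {
  if (nameList.includes(name)) throw new TypeError("name already present");
  await doLengthyWork();
  nameList.push(name);
}

// Queued version: all of the steps run as one unit on the parallel queue, so
// invocations cannot interleave.
function addNameQueued(name: string): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    nameListQueue.enqueueSteps(async () => {
      if (nameList.includes(name)) {
        reject(new TypeError("name already present"));
        return; // "abort these steps" — reject rather than throw
      }
      await doLengthyWork();
      nameList.push(name);
      resolve(); // "resolve p with undefined"
    });
  });
}
```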

<!-- should find somewhere more appropriate to put this -->
<p>The term "transparent black" refers to the color with red, green, blue, and alpha channels all
set to zero.</p>
</div>


<h4>Resources</h4>
@@ -2404,6 +2517,9 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
<li>The <dfn data-x-href="https://infra.spec.whatwg.org/#stack">stack</dfn> data structure and the associated definitions for
<dfn data-x="stack push" data-x-href="https://infra.spec.whatwg.org/#stack-push">push</dfn> and
<dfn data-x="stack pop" data-x-href="https://infra.spec.whatwg.org/#stack-pop">pop</dfn></li>
<li>The <dfn data-x-href="https://infra.spec.whatwg.org/#queue">queue</dfn> data structure and the associated definitions for
<dfn data-x-href="https://infra.spec.whatwg.org/#queue-enqueue">enqueue</dfn> and
<dfn data-x-href="https://infra.spec.whatwg.org/#queue-dequeue">dequeue</dfn></li>
<li>The <dfn data-x="set" data-x-href="https://infra.spec.whatwg.org/#ordered-set">ordered set</dfn> data structure and the associated definition for
<dfn data-x="set append" data-x-href="https://infra.spec.whatwg.org/#set-append">append</dfn></li>
<li>The <dfn data-x-href="https://infra.spec.whatwg.org/#struct">struct</dfn> specification type and the associated definition for
@@ -97250,6 +97366,13 @@ interface <dfn>SharedWorker</dfn> : <span>EventTarget</span> {
it was assigned by the object's constructor. It represents the <code>MessagePort</code> for
communicating with the shared worker.</p>

<p>A user agent has an associated <dfn>shared worker manager</dfn> which is the result of
<span>starting a new parallel queue</span>.</p>

<p class="note">Each user agent has a single <span>shared worker manager</span> for simplicity.
Implementations could use one per <span>origin</span>; that would not be observably different and
enables more concurrency.</p>
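
(Editorial aside, not the specification's algorithm: because the shared worker manager is a parallel queue, the find-or-create portion of every SharedWorker() construction runs serialized. The sketch below is a heavily simplified model reusing the earlier ParallelQueue sketch; obtainSharedWorker and its internals are hypothetical stand-ins for the constructor steps that follow.)

```ts
// Simplified model: the shared worker manager is just a parallel queue owned
// by the user agent, so shared worker lookup/creation never races.
const sharedWorkerManager = startANewParallelQueue();

interface SharedWorkerRecord {
  scriptURL: string;
  name: string;
}
const existingSharedWorkers: SharedWorkerRecord[] = [];

// Hypothetical stand-in for the in-parallel part of the SharedWorker()
// constructor; the real steps (origin checks, obtaining a worker global
// scope, connect events, etc.) are in the specification text below.
function obtainSharedWorker(scriptURL: string, name: string): void {
  sharedWorkerManager.enqueueSteps(() => {
    const existing = existingSharedWorkers.find(
      w => w.scriptURL === scriptURL && w.name === name);
    if (existing) {
      // Connect the new port to the existing shared worker.
    } else {
      existingSharedWorkers.push({ scriptURL, name }); // "run a worker"
    }
    // Because these steps are serialized on the manager's queue, two
    // simultaneous constructions can never both decide to create a worker.
  });
}
```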

<p>When the <dfn><code data-x="dom-SharedWorker">SharedWorker(<var>scriptURL</var>,
<var>options</var>)</code></dfn> constructor is invoked:</p>

@@ -97293,7 +97416,7 @@ interface <dfn>SharedWorker</dfn> : <span>EventTarget</span> {
object a secure context?</span> on <var>outside settings</var>.</p></li>

<li>
<p>Run these substeps <span>in parallel</span>:</p>
<p><span>Enqueue the following steps</span> to the <span>shared worker manager</span>:</p>

<ol>
<li><p>Let <var>worker global scope</var> be null.</p></li>
@@ -97375,8 +97498,9 @@ interface <dfn>SharedWorker</dfn> : <span>EventTarget</span> {
</ol>
</li>

<li><p>Otherwise, <span>run a worker</span> given <var>worker</var>, <var>urlRecord</var>,
<var>outside settings</var>, <var>outside port</var>, and <var>options</var>.</p></li>
<li><p>Otherwise, <span>in parallel</span>, <span>run a worker</span> given <var>worker</var>,
<var>urlRecord</var>, <var>outside settings</var>, <var>outside port</var>, and
<var>options</var>.</p></li>
</ol>
</li>

