diff --git a/source b/source
index a3a39db468c..e3ddfdfb477 100644
--- a/source
+++ b/source
@@ -1876,23 +1876,136 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
rendered to the user. These terms are not meant to imply a visual medium; they must be considered to apply to other media in equivalent ways.

The term "transparent black" refers to the color with red, green, blue, and alpha channels all set to zero.

Parallelism

To run steps in parallel means those steps are to be run, one after another, at the same time as other logic in the standard (e.g., at the same time as the event loop). This standard does not define the precise mechanism by which this is achieved, be it time-sharing cooperative multitasking, fibers, threads, processes, using different hyperthreads, cores, CPUs, machines, etc. By contrast, an operation that is to run immediately must interrupt the currently running task, run itself, and then resume the previously running task.

To avoid race conditions between different in parallel algorithms that operate on the same data, a parallel queue can be used.

A parallel queue represents a queue of algorithm steps that must be run in series.

A parallel queue has an algorithm queue (a queue), initially empty.

To enqueue steps to a parallel queue, enqueue the algorithm steps to the parallel queue's algorithm queue.

To start a new parallel queue, run the following steps:

  1. Let parallelQueue be a new parallel queue.

  2. Run the following steps in parallel:

     1. While true:

        1. Let steps be the result of dequeueing from parallelQueue's algorithm queue.

        2. If steps is not nothing, then run steps.

        3. Assert: running steps did not throw an exception, as steps running in parallel are not allowed to throw.

        Implementations are not expected to implement this as a continuously running loop. Algorithms in standards are to be easy to understand and are not necessarily great for battery life or performance.

  3. Return parallelQueue.
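
The steps above can be sketched as a single worker thread draining a FIFO. This is only an illustration, not an implementation requirement; the class and method names are invented, and a blocking `get()` stands in for the spec's "dequeue, and run the result if it is not nothing" loop so the sketch avoids busy-waiting.

```python
import queue
import threading

class ParallelQueue:
    """Sketch of the spec's parallel queue: enqueued algorithm steps
    (zero-argument callables here) run in series on one background thread."""

    def __init__(self):
        self._algorithm_queue = queue.Queue()
        # "Run the following steps in parallel": one worker, looping forever.
        threading.Thread(target=self._loop, daemon=True).start()

    def enqueue_steps(self, steps):
        """Enqueue the algorithm steps to the algorithm queue."""
        self._algorithm_queue.put(steps)

    def _loop(self):
        while True:
            # Blocking get() replaces the spec's dequeue-and-check loop.
            steps = self._algorithm_queue.get()
            steps()  # steps running in parallel are not allowed to throw
```

Because a single thread runs every enqueued callable, steps from different callers can never interleave, which is exactly the guarantee the parallel queue exists to provide.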

Steps running in parallel can themselves run other steps in parallel. E.g., inside a parallel queue it can be useful to run a series of steps in parallel with the queue.

Imagine a standard that defines nameList (a list), along with a method to add a name to nameList, unless nameList already contains name, in which case it rejects.

The following solution suffers from race conditions:

  1. Let p be a new promise.

  2. Run the following steps in parallel:

     1. If nameList contains name, reject p with a TypeError and abort these steps.

     2. Do some potentially lengthy work.

     3. Append name to nameList.

     4. Resolve p with undefined.

  3. Return p.

Two invocations of the above could run simultaneously, meaning name isn't in nameList during step 2.1, but it might be added before step 2.3 runs, meaning name ends up in nameList twice.
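
The race can be made concrete with a small Python sketch. Threads stand in for the two in-parallel invocations; `add_name` and the sleep that models the "lengthy work" are invented for illustration. Both threads pass the membership check before either appends, so the name is added twice.

```python
import threading
import time

name_list = []  # shared list; nothing serializes access to it

def add_name(name):
    # Step 2.1: the check...
    if name in name_list:
        return  # would reject here
    # Step 2.2: some potentially lengthy work.
    time.sleep(0.1)
    # Step 2.3: ...and the append. Another invocation may have passed the
    # check in the meantime, so the same name can be appended twice.
    name_list.append(name)

t1 = threading.Thread(target=add_name, args=("alice",))
t2 = threading.Thread(target=add_name, args=("alice",))
t1.start()
t2.start()
t1.join()
t2.join()
```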

Parallel queues solve this. The standard would let nameListQueue be the result of starting a new parallel queue, then:

  1. Let p be a new promise.

  2. Enqueue the following steps to nameListQueue:

     1. If nameList contains name, reject p with a TypeError and abort these steps.

     2. Do some potentially lengthy work.

     3. Append name to nameList.

     4. Resolve p with undefined.

  3. Return p.

The steps would now queue and the race is avoided.
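
A self-contained sketch of the fixed version, assuming the same invented names as before: a plain FIFO drained by one worker thread plays the role of nameListQueue, so the check-then-append of each invocation runs to completion before the next invocation's steps begin.

```python
import queue
import threading
import time

name_list = []
results = []

# Stand-in for nameListQueue: a FIFO drained by a single worker thread,
# so enqueued steps run in series.
name_list_queue = queue.Queue()

def drain():
    while True:
        steps = name_list_queue.get()
        steps()

threading.Thread(target=drain, daemon=True).start()

def add_name(name):
    def steps():
        # Steps from different invocations never interleave, so the
        # check and the append are effectively atomic.
        if name in name_list:
            results.append("rejected")
            return
        time.sleep(0.1)  # some potentially lengthy work
        name_list.append(name)
        results.append("resolved")
    name_list_queue.put(steps)

add_name("alice")
add_name("alice")  # runs only after the first invocation's steps finish
time.sleep(0.5)
```

The second invocation is queued behind the first, observes its append, and rejects, which is the behavior the standard intends.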


Resources

@@ -2404,6 +2517,9 @@ a.setAttribute('href', 'https://example.com/'); // change the content attribute
  • The stack data structure and the associated definitions for push and pop
  • The queue data structure and the associated definitions for enqueue and dequeue
  • The ordered set data structure and the associated definition for append
  • The struct specification type and the associated definition for

@@ -97250,6 +97366,13 @@ interface SharedWorker : EventTarget {

    it was assigned by the object's constructor. It represents the MessagePort for communicating with the shared worker.

    A user agent has an associated shared worker manager which is the result of starting a new parallel queue.

    Each user agent has a single shared worker manager for simplicity. Implementations could use one per origin; that would not be observably different and enables more concurrency.
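
The manager's purpose can be sketched as a single serial queue through which all SharedWorker constructions pass, so two concurrent constructions cannot race when looking for an existing worker with the same name. All names here (`construct_shared_worker`, the dictionary of workers) are invented for illustration and are not spec terms.

```python
import queue
import threading
import time

# Sketch: one shared worker manager (a serial queue) per user agent.
shared_worker_manager = queue.Queue()

def manager_loop():
    while True:
        shared_worker_manager.get()()  # run enqueued steps in series

threading.Thread(target=manager_loop, daemon=True).start()

shared_workers = {}   # name -> worker; touched only from manager steps
events = []

def construct_shared_worker(name):
    def steps():
        if name in shared_workers:
            events.append(("reused", name))
        else:
            shared_workers[name] = object()  # stand-in for running a worker
            events.append(("created", name))
    shared_worker_manager.put(steps)

construct_shared_worker("chat")
construct_shared_worker("chat")  # serialized behind the first construction
time.sleep(0.3)
```

Serializing through one queue is why a second construction with the same name reliably finds (and reuses) the first worker instead of creating a duplicate.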

    When the SharedWorker(scriptURL, options) constructor is invoked:

@@ -97293,7 +97416,7 @@ interface SharedWorker : EventTarget {

    object a secure context? on outside settings.

  • Enqueue the following steps to the shared worker manager:

    1. Let worker global scope be null.

@@ -97375,8 +97498,9 @@ interface SharedWorker : EventTarget {
  • Otherwise, in parallel, run a worker given worker, urlRecord, outside settings, outside port, and options.