From 9fda90f44c7c9223de6e3755a043f67ea7c92bf5 Mon Sep 17 00:00:00 2001
From: Jake Archibald
The term "transparent black" refers to the color with red, green, blue, and alpha channels all set to zero.
When an algorithm B says to return to another algorithm A, it implies that A called B. Upon returning to A, the implementation must continue from where it left off in calling B.
To run steps in parallel means those steps are to be run, one after another, at the same time as other logic in the standard (e.g., at the same time as the event loop). This standard does not define the precise mechanism by which this is achieved, be it time-sharing cooperative multitasking, fibers, threads, processes, using different hyperthreads, cores, CPUs, machines, etc. By contrast, an operation that is to run immediately must interrupt the currently running task, run itself, and then resume the previously running task.
To avoid race conditions between different in parallel algorithms that operate on the same data, a parallel queue can be used.
A parallel queue represents a queue of algorithm steps that must be run in series.
A parallel queue has an algorithm queue (a queue), initially empty.
To enqueue steps to a parallel queue, enqueue the algorithm steps to the parallel queue's algorithm queue.
To start a new parallel queue, run the following steps:
1. Let parallelQueue be a new parallel queue.
2. Run the following steps in parallel:
   1. While true:
      1. Let steps be the result of dequeueing from parallelQueue's algorithm queue.
      2. If steps is not nothing, then run steps.
      3. Assert: running steps did not throw an exception, as steps running in parallel are not allowed to throw.

   Implementations are not expected to implement this as a continuously running loop. Algorithms in standards are to be easy to understand and are not necessarily great for battery life or performance.
3. Return parallelQueue.
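The steps above can be sketched in Python, with a daemon thread standing in for the "in parallel" loop and `queue.Queue` standing in for the algorithm queue. The `None` shutdown sentinel and the returned `enqueue` function are conveniences of this sketch, not part of the standard's model.

```python
import queue
import threading

def start_parallel_queue():
    """A sketch of "start a new parallel queue": a queue of algorithm
    steps drained one after another, in series, by a single thread."""
    algorithm_queue = queue.Queue()  # the parallel queue's algorithm queue

    def loop():
        # The "while true" loop from the standard. A real implementation
        # is not expected to spin continuously; blocking get() stands in
        # for "if steps is not nothing, then run steps".
        while True:
            steps = algorithm_queue.get()
            if steps is None:  # shutdown sentinel (an assumption of this sketch)
                return
            steps()  # steps running in parallel must not throw

    threading.Thread(target=loop, daemon=True).start()

    def enqueue(steps):
        """Enqueue steps to this parallel queue."""
        algorithm_queue.put(steps)

    return enqueue
```

Because a single thread drains the queue, any two sets of enqueued steps run in series, never interleaved, which is the property the race-condition example below relies on.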
Steps running in parallel can themselves run other steps in parallel. E.g., inside a parallel queue it can be useful to run a series of steps in parallel with the queue.
Imagine a standard defined nameList (a list), along with a method to add a name to nameList, unless nameList already contains name, in which case it rejects.
The following solution suffers from race conditions:
1. Let p be a new promise.
2. Run the following steps in parallel:
   1. If nameList contains name, reject p with a TypeError and abort these steps.
   2. Do some potentially lengthy work.
   3. Append name to nameList.
   4. Resolve p with undefined.
3. Return p.
Two invocations of the above could run simultaneously, meaning name isn't in nameList during step 2.1, but it might be added before step 2.3 runs, meaning name ends up in nameList twice.
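The race can be shown deterministically by forcing the interleaving a scheduler could legitimately produce: both invocations run their "contains" check (step 2.1) before either runs its append (step 2.3). The helper names here are illustrative, not from the standard.

```python
# A deterministic illustration of the race, assuming the scheduler
# interleaves two invocations as: check 1, check 2, append 1, append 2.
name_list = []

def contains(name):
    return name in name_list  # step 2.1: the membership check

def append(name):
    name_list.append(name)    # step 2.3: the append

# Two invocations for the same name, interleaved at the worst moment:
first_saw_duplicate = contains("Alice")   # invocation 1: name not present
second_saw_duplicate = contains("Alice")  # invocation 2: still not present

if not first_saw_duplicate:
    append("Alice")
if not second_saw_duplicate:
    append("Alice")  # neither invocation rejected, so both append

assert name_list == ["Alice", "Alice"]  # name ends up in the list twice
```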
Parallel queues solve this. The standard would let nameListQueue be the result of starting a new parallel queue, then:
1. Let p be a new promise.
2. Enqueue the following steps to nameListQueue:
   1. If nameList contains name, reject p with a TypeError and abort these steps.
   2. Do some potentially lengthy work.
   3. Append name to nameList.
   4. Resolve p with undefined.
3. Return p.
The steps would now queue and the race is avoided.
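A sketch of the fixed method, under the same assumptions as before: a single worker thread stands in for nameListQueue, and `concurrent.futures.Future` stands in for the promise p. The `add_name` function and the queue setup are hypothetical names for illustration.

```python
import queue
import threading
from concurrent.futures import Future

name_list = []              # stands in for nameList
name_list_queue = queue.Queue()  # stands in for nameListQueue

def worker():
    # Drains the queue in series, like a parallel queue's single loop.
    while True:
        steps = name_list_queue.get()
        if steps is None:  # shutdown sentinel (sketch-only)
            return
        steps()

threading.Thread(target=worker, daemon=True).start()

def add_name(name):
    p = Future()  # stands in for "let p be a new promise"

    def steps():
        if name in name_list:
            p.set_exception(TypeError(name))  # reject p with a TypeError
            return                            # and abort these steps
        # ... some potentially lengthy work would happen here ...
        name_list.append(name)                # append name to nameList
        p.set_result(None)                    # resolve p with undefined

    name_list_queue.put(steps)  # enqueue the steps to nameListQueue
    return p
```

Even if two `add_name("Alice")` calls are made back to back, their steps are dequeued and run one after the other, so the second always sees the first's append and rejects.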
MessagePort for communicating with the shared worker.
A user agent has an associated shared worker manager which is the result of starting a new parallel queue.
Each user agent has a single shared worker manager for simplicity. Implementations could use one per origin; that would not be observably different and enables more concurrency.
When the SharedWorker(scriptURL, options) constructor is invoked:
Run these substeps in parallel:
Enqueue the following steps to the shared worker manager:
Let worker global scope be null.
Otherwise, in parallel, run a worker given worker, urlRecord, outside settings, outside port, and options.