@@ -15,9 +15,10 @@ The :mod:`concurrent.futures` module provides a high-level interface for
 asynchronously executing callables.
 
 The asynchronous execution can be performed with threads, using
-:class:`ThreadPoolExecutor`, or separate processes, using
-:class:`ProcessPoolExecutor`. Both implement the same interface, which is
-defined by the abstract :class:`Executor` class.
+:class:`ThreadPoolExecutor` or :class:`InterpreterPoolExecutor`,
+or separate processes, using :class:`ProcessPoolExecutor`.
+Each implements the same interface, which is defined
+by the abstract :class:`Executor` class.
 
 .. include:: ../includes/wasm-notavail.rst
 
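Because every pool implements the same abstract :class:`Executor` interface, code written against that interface runs unchanged whichever executor you pick. A minimal sketch using :class:`ThreadPoolExecutor` (the ``square`` helper is purely illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

# submit() schedules a single call and returns a Future;
# map() applies the callable across an iterable, preserving input order.
with ThreadPoolExecutor(max_workers=2) as executor:
    future = executor.submit(square, 7)
    results = list(executor.map(square, range(5)))

print(future.result())  # 49
print(results)          # [0, 1, 4, 9, 16]
```

Swapping in a different executor class is the only change needed to move the same workload to processes (or, on Python versions that provide it, interpreters).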
@@ -63,7 +64,8 @@ Executor Objects
       setting *chunksize* to a positive integer. For very long iterables,
       using a large value for *chunksize* can significantly improve
       performance compared to the default size of 1. With
-      :class:`ThreadPoolExecutor`, *chunksize* has no effect.
+      :class:`ThreadPoolExecutor` and :class:`InterpreterPoolExecutor`,
+      *chunksize* has no effect.
 
       .. versionchanged:: 3.5
          Added the *chunksize* argument.
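To illustrate the interface point above: *chunksize* is accepted by every executor's :meth:`~Executor.map` for compatibility, but only a process pool uses it to batch items; with a thread-based pool it is simply ignored. A small runnable sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def double(x):
    return 2 * x

with ThreadPoolExecutor(max_workers=4) as executor:
    # chunksize is accepted here for interface compatibility,
    # but has no effect on a thread-based pool.
    doubled = list(executor.map(double, range(6), chunksize=3))

print(doubled)  # [0, 2, 4, 6, 8, 10]
```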
@@ -227,6 +229,111 @@ ThreadPoolExecutor Example
                print('%r page is %d bytes' % (url, len(data)))
 
 
+InterpreterPoolExecutor
+-----------------------
+
+The :class:`InterpreterPoolExecutor` class uses a pool of interpreters
+to execute calls asynchronously.  It is a :class:`ThreadPoolExecutor`
+subclass, which means each worker is running in its own thread.
+The difference here is that each worker has its own interpreter,
+and runs each task using that interpreter.
+
+The biggest benefit to using interpreters instead of only threads
+is true multi-core parallelism.  Each interpreter has its own
+:term:`Global Interpreter Lock <global interpreter lock>`, so code
+running in one interpreter can run on one CPU core, while code in
+another interpreter runs unblocked on a different core.
246+
+The tradeoff is that writing concurrent code for use with multiple
+interpreters can take extra effort.  However, this is because it
+forces you to be deliberate about how and when interpreters interact,
+and to be explicit about what data is shared between interpreters.
+This results in several benefits that help balance the extra effort.
+For example, code written this way can make it easier to reason
+about concurrency.  Another major benefit is that you don't have
+to deal with several of the big pain points of using threads,
+like race conditions.
256+
+Each worker's interpreter is isolated from all the other interpreters.
+"Isolated" means each interpreter has its own runtime state and
+operates completely independently.  For example, if you redirect
+:data:`sys.stdout` in one interpreter, it will not be automatically
+redirected in any other interpreter.  If you import a module in one
+interpreter, it is not automatically imported in any other.  You
+would need to import the module separately in each interpreter
+where you need it.  In fact, each module imported in an interpreter
+is a completely separate object from the same module in a different
+interpreter, including :mod:`sys`, :mod:`builtins`,
+and even ``__main__``.
268+
+Isolation means a mutable object, or other data, cannot be used
+by more than one interpreter at the same time.  That effectively means
+interpreters cannot actually share such objects or data.  Instead,
+each interpreter must have its own copy, and you will have to
+synchronize any changes between the copies manually.  Immutable
+objects and data, like the builtin singletons, strings, and tuples
+of immutable objects, don't have these limitations.
276+
+Communicating and synchronizing between interpreters is most effectively
+done using dedicated tools, like those proposed in :pep:`734`.  One less
+efficient alternative is to serialize with :mod:`pickle` and then send
+the bytes over a shared :mod:`socket <socket>` or
+:func:`pipe <os.pipe>`.
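The pickle-plus-pipe alternative mentioned above can be sketched in a single process. In a real setup each end of the pipe would live in a different interpreter, but the serialization mechanics are identical; the payload shape here is purely illustrative:

```python
import os
import pickle

r, w = os.pipe()

# Sender side: serialize a task description to bytes.
payload = {"func": "sum", "args": [1, 2, 3]}
os.write(w, pickle.dumps(payload))
os.close(w)

# Receiver side: read the bytes back and deserialize.
data = os.read(r, 65536)
os.close(r)
received = pickle.loads(data)

print(received)  # {'func': 'sum', 'args': [1, 2, 3]}
```

Note the cost: every value crosses the boundary as a full copy, which is why dedicated sharing tools such as those in :pep:`734` can be more efficient.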
282+
+.. class:: InterpreterPoolExecutor(max_workers=None, thread_name_prefix='', initializer=None, initargs=(), shared=None)
+
+   A :class:`ThreadPoolExecutor` subclass that executes calls asynchronously
+   using a pool of at most *max_workers* threads.  Each thread runs
+   tasks in its own interpreter.  The worker interpreters are isolated
+   from each other, which means each has its own runtime state and that
+   they can't share any mutable objects or other data.  Each interpreter
+   has its own :term:`Global Interpreter Lock <global interpreter lock>`,
+   which means code run with this executor has true multi-core parallelism.
+
+   The optional *initializer* and *initargs* arguments have the same
+   meaning as for :class:`!ThreadPoolExecutor`: the initializer is run
+   when each worker is created, though in this case it is run in
+   the worker's interpreter.  The executor serializes the *initializer*
+   and *initargs* using :mod:`pickle` when sending them to the worker's
+   interpreter.
299+
+   .. note::
+      Functions defined in the ``__main__`` module cannot be pickled
+      and thus cannot be used.
+
+   .. note::
+      The executor may replace uncaught exceptions from *initializer*
+      with :class:`~concurrent.futures.interpreter.ExecutionFailed`.
307+
+   The optional *shared* argument is a :class:`dict` of objects that all
+   interpreters in the pool share.  The *shared* items are added to each
+   interpreter's ``__main__`` module.  Not all objects are shareable.
+   Shareable objects include the builtin singletons, :class:`str`
+   and :class:`bytes`, and :class:`memoryview`.  See :pep:`734`
+   for more info.
+
+   Other caveats from the parent :class:`ThreadPoolExecutor` apply here.
316+
+   :meth:`~Executor.submit` and :meth:`~Executor.map` work like normal,
+   except the worker serializes the callable and arguments using
+   :mod:`pickle` when sending them to its interpreter.  The worker
+   likewise serializes the return value when sending it back.
+
+   .. note::
+      Functions defined in the ``__main__`` module cannot be pickled
+      and thus cannot be used.
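The pickling constraint behind that note can be checked without an interpreter pool at all: a function reachable by a module-qualified name pickles by reference, while an anonymous ``lambda`` has no importable name and fails. This is an analogy for why functions living only in one interpreter's ``__main__`` can't be shipped to a worker whose ``__main__`` is a different module:

```python
import pickle
from operator import mul  # importable by module + name, so picklable

blob = pickle.dumps(mul)       # stored as a reference, not as code
restored = pickle.loads(blob)
print(restored(6, 7))          # 42

try:
    pickle.dumps(lambda x: x)  # no importable name -> not picklable
    failed = False
except (pickle.PicklingError, AttributeError):
    failed = True
print(failed)                  # True
```

The practical consequence: define task callables in a real, importable module rather than inline in a script.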
325+
+   When a worker's current task raises an uncaught exception, the worker
+   always tries to preserve the exception as-is.  If that is successful,
+   it also sets the exception's ``__cause__`` to a corresponding
+   :class:`~concurrent.futures.interpreter.ExecutionFailed`
+   instance, which contains a summary of the original exception.
+   In the uncommon case that the worker is not able to preserve the
+   original as-is, it directly preserves the corresponding
+   :class:`~concurrent.futures.interpreter.ExecutionFailed`
+   instance instead.
+
+
 ProcessPoolExecutor
 -------------------
 
@@ -574,6 +681,26 @@ Exception classes
 
    .. versionadded:: 3.7
 
+.. currentmodule:: concurrent.futures.interpreter
+
+.. exception:: BrokenInterpreterPool
+
+   Derived from :exc:`~concurrent.futures.thread.BrokenThreadPool`,
+   this exception class is raised when one of the workers
+   of a :class:`~concurrent.futures.InterpreterPoolExecutor`
+   has failed initializing.
+
+   .. versionadded:: next
+
+.. exception:: ExecutionFailed
+
+   Raised from :class:`~concurrent.futures.InterpreterPoolExecutor` when
+   the given initializer fails or from
+   :meth:`~concurrent.futures.Executor.submit` when there's an uncaught
+   exception from the submitted task.
+
+   .. versionadded:: next
+
 .. currentmodule:: concurrent.futures.process
 
 .. exception:: BrokenProcessPool