Parsl - Parallel Scripting Library


Parsl is a parallel programming library for Python. Parsl augments Python with simple, scalable, and flexible constructs for encoding parallelism. Developers annotate Python functions to specify opportunities for concurrent execution. These annotated functions, called apps, may represent pure Python functions or calls to external applications, whether sequential, multicore (e.g., CPU, GPU, accelerator), or multi-node MPI. Invocations of these apps, called tasks, can be connected by shared input/output data (e.g., Python objects or files), from which Parsl constructs a dynamic dependency graph of tasks.

Parsl includes a flexible and scalable runtime that allows it to efficiently execute Python programs in parallel. Parsl scripts are portable and can be easily moved between different execution resources: from laptops to supercomputers to clouds. When executing a Parsl program, developers first define a simple Python-based configuration that outlines where and how to execute tasks. Parsl supports various target resources including clouds (e.g., Amazon Web Services and Google Cloud), clusters (e.g., using Slurm, Torque/PBS, HTCondor, Cobalt), and container orchestration systems (e.g., Kubernetes). Parsl scripts can scale from a single core on a single computer through to hundreds of thousands of cores across many thousands of nodes on a supercomputer.

Parsl can be used to implement various parallel computing paradigms:

  • Concurrent execution of a set of tasks in a bag-of-tasks program
  • Procedural workflows in which tasks are executed following control logic
  • Parallel dataflow in which tasks are executed when their data dependencies are met
  • Heterogeneous many-task applications in which many different computing resources are used together to execute different types of computational tasks
  • Dynamic workflows in which the workflow is determined during execution
  • Interactive parallel programming through notebooks or another interactive mechanism

The latest Parsl version available on PyPI is v1.1.0.


Parsl is available on PyPI; first make sure you have Python 3.7+:

$ python3 --version

Install Parsl using pip:

$ pip3 install parsl

To run the Parsl tutorial notebooks you will need to install Jupyter:

$ pip3 install jupyter

Detailed information about setting up Jupyter with Python is available here.

Note: Parsl uses an opt-in model to collect anonymous usage statistics for reporting and improvement purposes. To understand what statistics are collected and to enable collection, please refer to the usage tracking guide.


The complete Parsl documentation is hosted here.

The Parsl tutorial is hosted on live Jupyter notebooks here.

For Developers

  1. Download Parsl:

    $ git clone
  2. Build and Test:

    $ make   # show all available makefile targets
    $ make virtualenv # create a virtual environment
    $ source .venv/bin/activate # activate the virtual environment
    $ make deps # install python dependencies from test-requirements.txt
    $ make test # make (all) tests. Run "make config_local_test" for a faster, smaller test set.
    $ make clean # remove virtualenv and all test and build artifacts
  3. Install:

    $ cd parsl
    $ python3 setup.py install
  4. Use Parsl!


Parsl supports Python 3.7+. Requirements can be found here. Requirements for running tests can be found here.

Code of Conduct

Parsl seeks to foster an open and welcoming environment. Please see the Parsl Code of Conduct for more details.


We welcome contributions from the community. Please see our contributing guide.

Research notice

Please note that this repository is participating in a study into sustainability of open source projects. Data will be gathered about this repository for approximately the next 12 months, starting from June 2021.

Data collected will include number of contributors, number of PRs, time taken to close/merge these PRs, and issues closed.

For more information, please visit the informational page or download the participant information sheet.