

Common Workflow Language tool description reference implementation


This is the reference implementation of the Common Workflow Language. It is intended to be feature complete, to provide comprehensive validation of CWL files, and to offer other tools related to working with CWL.

cwltool is written and tested for Python 2.7 and Python 3.4, 3.5, and 3.6.

The reference implementation consists of two packages. The cwltool package is the primary Python module; it contains the reference implementation in the cwltool module and a console executable of the same name.

The cwlref-runner package is optional and provides an additional entry point under the alias cwl-runner, which is the implementation-agnostic name for the default CWL interpreter installed on a host.


It is highly recommended to set up a virtual environment before installing cwltool:

virtualenv -p python2 venv   # Create a virtual environment, can use `python3` as well
source venv/bin/activate     # Activate environment before installing `cwltool`

Installing the official package from PyPI (this will install the "cwltool" package as well):

pip install cwlref-runner

If installing alongside another CWL implementation, then:

pip install cwltool

Or you can install from source:

git clone # clone cwltool repo
cd cwltool         # Switch to source directory
pip install .      # Install `cwltool` from source
cwltool --version  # Check if the installation works correctly

Remember, if you co-install multiple CWL implementations, then you will need to manage which implementation the cwl-runner alias points to, via a symbolic file system link or another facility.

Running tests locally

  • Running basic tests (/tests):

To run the basic tests after installing cwltool, execute the following:

pip install -r test-requirements.txt
py.test --ignore cwltool/schemas/ --pyargs cwltool

To run various tests in all supported Python environments we use tox. First download the complete code repository (see the git clone instructions above), then run the following in a terminal:

pip install tox
tox

A list of all environments can be seen with tox --listenvs. A specific test environment can be run with tox -e <env name>, and a specific test within that environment with, for example: tox -e py36-unit -- tests/

  • Running the entire suite of CWL conformance tests:

The GitHub repository for the CWL specifications contains a script that tests a CWL implementation against a wide array of valid CWL files using the cwltest program.

Instructions for running these tests can be found in the Common Workflow Language Specification repository.

Run on the command line

Simple command:

cwl-runner [tool-or-workflow-description] [input-job-settings]

Or, if you have multiple CWL implementations installed and you want to override the default cwl-runner, use:

cwltool [tool-or-workflow-description] [input-job-settings]

Use with boot2docker

boot2docker runs Docker inside a virtual machine, and it only mounts /Users within it. The default behavior of CWL is to create temporary directories under e.g. /var, which is not accessible to Docker containers in this configuration.

To run CWL successfully with boot2docker you need to set the --tmpdir-prefix and --tmp-outdir-prefix to somewhere under /Users:

$ cwl-runner --tmp-outdir-prefix=/Users/username/project --tmpdir-prefix=/Users/username/project wc-tool.cwl wc-job.json

Using user-space replacements for Docker

Some shared computing environments don't support Docker software containers for technical or policy reasons. As a workaround, the CWL reference runner supports using alternative Docker implementations on Linux via the --user-space-docker-cmd option.

One such "user space" friendly Docker replacement is udocker; another is dx-docker.

udocker installation:

dx-docker installation: start with the DNAnexus toolkit (see for instructions).

Run cwltool just as you normally would, but with the new option, e.g. from the conformance tests:

cwltool --user-space-docker-cmd=udocker


cwltool --user-space-docker-cmd=dx-docker

As an experimental feature, cwltool can use Singularity as its container runtime. Singularity will run the software containers specified in DockerRequirement; it therefore works with Docker images only, as native Singularity images are not supported. To use Singularity as the container runtime, pass the --singularity command line option to cwltool.

cwltool --singularity

Tool or workflow loading from remote or local locations

cwltool can run tool and workflow descriptions on both local and remote systems via its support for HTTP[S] URLs.

Input job files and Workflow steps (via the run directive) can reference CWL documents using absolute or relative local filesystem paths. If a relative path is referenced and that document isn't found in the current directory, then the following locations will be searched:

Use with GA4GH Tool Registry API

Cwltool can launch tools directly from GA4GH Tool Registry API endpoints.

By default, cwltool searches . Use --add-tool-registry to add other registries to the search path.

For example

cwltool --non-strict test.json

and (defaults to latest when a version is not specified)

cwltool --non-strict test.json

For this example, grab the test.json (and input file) from

Import as a module


Add

import cwltool

to your script.

The easiest way to use cwltool to run a tool or workflow from Python is to use a Factory:

import cwltool.factory
fac = cwltool.factory.Factory()

echo = fac.make("echo.cwl")
result = echo(inp="foo")

# result["out"] == "foo"

Leveraging SoftwareRequirements (Beta)

CWL tools may be decorated with SoftwareRequirement hints, which cwltool may in turn use to resolve to packages in various package managers or dependency management systems such as Environment Modules.

Utilizing SoftwareRequirement hints with cwltool requires an optional dependency; for this reason, be sure to specify the deps modifier when installing cwltool. For instance:

$ pip install 'cwltool[deps]'

Installing cwltool in this fashion enables several new command line options. The most general of these is --beta-dependency-resolvers-configuration. This option allows one to specify a dependency resolver configuration file. This file may be specified as either XML or YAML, and it simply describes the various plugins to enable in order to "resolve" SoftwareRequirement dependencies.

To discuss some of these plugins and how to configure them, first consider the following hint definition for an example CWL tool.

hints:
  SoftwareRequirement:
    packages:
    - package: seqtk
      version:
      - r93

Now imagine deploying cwltool on a cluster with Software Modules installed, where a seqtk module is available at version r93. This means cluster users likely won't have the seqtk binary on their PATH by default, but after sourcing this module with the command modulecmd sh load seqtk/r93, seqtk is available on the PATH. A simple dependency resolvers configuration file, called dependency-resolvers-conf.yml for instance, that would enable cwltool to source the correct module environment before executing the above tool would simply be:

- type: modules

The outer list indicates that one plugin is being enabled; the plugin's parameters are defined as a dictionary for this one list item. Only one parameter is required for the plugin above: type, which defines the plugin type. This parameter is required for all plugins. The available plugins and the parameters available for each are documented (incompletely) here. Unfortunately, this documentation is written in the context of Galaxy tool requirements instead of CWL SoftwareRequirements, but the concepts map fairly directly.
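The list-of-plugin-dictionaries shape described above can be illustrated with a few lines of Python. This is a sketch only: the plugin-type registry and the validation helper below are invented for illustration and are not cwltool's real internals.

```python
# Illustrative sketch: validate a parsed dependency-resolvers configuration
# (a YAML list of plugin dictionaries). The plugin type names mirror the
# ones discussed in this document; everything else is invented.

KNOWN_PLUGIN_TYPES = {"modules", "galaxy_packages", "conda", "homebrew"}

def check_resolver_config(config):
    """Return (type, options) pairs, requiring a 'type' key on every entry."""
    plugins = []
    for entry in config:
        if "type" not in entry:
            raise ValueError("every resolver plugin entry needs a 'type' key")
        plugin_type = entry["type"]
        if plugin_type not in KNOWN_PLUGIN_TYPES:
            raise ValueError("unknown resolver plugin type: %s" % plugin_type)
        options = {k: v for k, v in entry.items() if k != "type"}
        plugins.append((plugin_type, options))
    return plugins

# The one-entry configuration from the text, as parsed YAML:
print(check_resolver_config([{"type": "modules"}]))  # [('modules', {})]
```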

cwltool is distributed with an example of such a seqtk tool and a corresponding sample job. It can be executed from the cwltool root, using a dependency resolvers configuration file such as the one above, with the command:

cwltool --beta-dependency-resolvers-configuration /path/to/dependency-resolvers-conf.yml \
    tests/seqtk_seq.cwl \

This example demonstrates both that cwltool can leverage existing software installations and that it can handle workflows with dependencies on different versions of the same software and libraries. However, the above example does require an existing module setup, so it is impossible to test this example "out of the box" with cwltool. For a more isolated test that demonstrates all the same concepts, the resolver plugin type galaxy_packages can be used.

"Galaxy packages" are a lighter weight alternative to Environment Modules that are really just defined by a way to lay out directories into packages and versions to find little scripts that are sourced to modify the environment. They have been used for years in Galaxy community to adapt Galaxy tools to cluster environments but require neither knowledge of Galaxy nor any special tools to setup. These should work just fine for CWL tools.

The cwltool source code repository's test directory is set up with a very simple directory that defines a set of "Galaxy packages" (but really just defines one package, named random-lines). The directory layout is simply:


If the galaxy_packages plugin is enabled and pointed at the tests/test_deps_env directory in cwltool's root, and a SoftwareRequirement such as the following is encountered:

hints:
  SoftwareRequirement:
    packages:
    - package: 'random-lines'
      version:
      - '1.0'

Then cwltool will simply find the environment script for that package and version and source it before executing the corresponding tool. That script is only responsible for modifying the job's PATH to add the required binaries.
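The convention described above (packages and versions laid out as directories, each holding a small script that is sourced) can be sketched as follows. The env.sh file name and the helper are assumptions for illustration, not cwltool's actual code:

```python
import os
import tempfile

def find_env_script(base_path, package, version):
    """Sketch of the galaxy_packages convention: <base>/<package>/<version>/env.sh."""
    candidate = os.path.join(base_path, package, version, "env.sh")
    return candidate if os.path.exists(candidate) else None

# Build a throwaway directory matching the convention and resolve against it.
base = tempfile.mkdtemp()
script_dir = os.path.join(base, "random-lines", "1.0")
os.makedirs(script_dir)
with open(os.path.join(script_dir, "env.sh"), "w") as fh:
    fh.write('export PATH="/opt/random-lines/1.0/bin:$PATH"\n')

print(find_env_script(base, "random-lines", "1.0"))  # path ending in env.sh
print(find_env_script(base, "random-lines", "2.0"))  # None: no such version
```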

This is a full example that works since resolving "Galaxy packages" has no external requirements. Try it out by executing the following command from cwltool's root directory:

cwltool --beta-dependency-resolvers-configuration tests/test_deps_env_resolvers_conf.yml \
    tests/random_lines.cwl \

The resolvers configuration file in the above example was simply:

- type: galaxy_packages
  base_path: ./tests/test_deps_env

It is possible that the SoftwareRequirements in a given CWL tool will not match the module names for a given cluster. Such requirements can be re-mapped to specific deployed packages and/or versions using another file, specified via the resolver plugin parameter mapping_files. We will demonstrate this using galaxy_packages, but the concepts apply equally well to Environment Modules or Conda packages (described below), for instance.

So consider the resolvers configuration file (tests/test_deps_env_resolvers_conf_rewrite.yml):

- type: galaxy_packages
  base_path: ./tests/test_deps_env
  mapping_files: ./tests/test_deps_mapping.yml

And the corresponding mapping configuration file (tests/test_deps_mapping.yml):

- from:
    name: randomLines
    version: 1.0.0-rc1
  to:
    name: random-lines
    version: '1.0'

This says that if cwltool encounters a requirement for randomLines at version 1.0.0-rc1 in a tool, it should rewrite it to the specific deployed package random-lines at version 1.0. cwltool has such a test tool, called random_lines_mapping.cwl, that contains exactly this source SoftwareRequirement. To try out this example with mapping, execute the following command from the cwltool root directory:
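The rewrite that the mapping file describes can be pictured in a few lines of Python; this is a sketch of the concept only, not cwltool's implementation:

```python
def remap_requirement(name, version, mapping_entries):
    """Rewrite (name, version) using from/to mapping entries; pass through if no match."""
    for entry in mapping_entries:
        if entry["from"]["name"] == name and entry["from"]["version"] == version:
            return entry["to"]["name"], entry["to"]["version"]
    return name, version

# The mapping from tests/test_deps_mapping.yml, as parsed YAML:
mapping = [
    {"from": {"name": "randomLines", "version": "1.0.0-rc1"},
     "to": {"name": "random-lines", "version": "1.0"}},
]

print(remap_requirement("randomLines", "1.0.0-rc1", mapping))  # ('random-lines', '1.0')
print(remap_requirement("seqtk", "r93", mapping))              # ('seqtk', 'r93') unchanged
```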

cwltool --beta-dependency-resolvers-configuration tests/test_deps_env_resolvers_conf_rewrite.yml \
    tests/random_lines_mapping.cwl \

The previous examples demonstrated leveraging existing infrastructure to provide requirements for CWL tools. If instead a real package manager is used, cwltool has the opportunity to install requirements as needed. While initial support for Homebrew/Linuxbrew plugins is available, the most developed such plugin is for the Conda package manager. Conda has the nice properties of allowing multiple versions of a package to be installed simultaneously, not requiring elevated permissions to install either Conda itself or packages via Conda, and being cross-platform. For these reasons, cwltool may run as a normal user, install its own Conda environment, and manage multiple versions of Conda packages on both Linux and Mac OS X.

The Conda plugin can be endlessly configured, but a sensible set of defaults that has proven a powerful stack for dependency management within the Galaxy tool development ecosystem can be enabled by simply passing cwltool the --beta-conda-dependencies flag.

With this, we can use the seqtk example above without Docker and without any externally managed services; cwltool should install everything it needs and create an environment for the tool. Try it out with the following command:

cwltool --beta-conda-dependencies tests/seqtk_seq.cwl tests/seqtk_seq_job.json

The CWL specification allows URIs to be attached to SoftwareRequirements to disambiguate package names. If the mapping files described above allow deployers to adapt tools to their infrastructure, this mechanism allows tools to adapt their requirements to multiple package managers. To demonstrate this within the context of the seqtk example, we can simply break the package name we use and then point at a specific Conda package as follows:

hints:
  SoftwareRequirement:
    packages:
    - package: seqtk_seq
      version:
      - '1.2'
      specs:
      - https://anaconda.org/bioconda/seqtk

The example can be executed using the command:

cwltool --beta-conda-dependencies tests/seqtk_seq_wrong_name.cwl tests/seqtk_seq_job.json

The plugin framework for managing the resolution of these software requirements is maintained as part of galaxy-lib, a small, portable subset of the Galaxy project. More information on configuration and implementation can be found at the following links:

Overriding workflow requirements at load time

Sometimes a workflow needs additional requirements to run in a particular environment or with a particular dataset. To avoid the need to modify the underlying workflow, cwltool supports requirement "overrides".

The format of the "overrides" object is a mapping of item identifier (workflow, workflow step, or command line tool) to the process requirements that should be applied.

cwltool:overrides:
  my-tool.cwl:
    requirements:
      EnvVarRequirement:
        envDef:
          MESSAGE: override_value

Overrides can be specified either on the command line, or as part of the job input document. Workflow steps are identified using the name of the workflow file followed by the step name as a document fragment identifier "#id". Override identifiers are relative to the top-level workflow document.

Overrides specified on the command line:

cwltool --overrides overrides.yml my-tool.cwl my-job.yml

Overrides embedded in the job input document (my-job-with-overrides.yml):

input_parameter1: value1
input_parameter2: value2
cwltool:overrides:
  my-tool.cwl:
    requirements:
      EnvVarRequirement:
        envDef:
          MESSAGE: override_value

cwltool my-tool.cwl my-job-with-overrides.yml

CWL Tool Control Flow

Technical outline of how cwltool works internally, for maintainers.

  1. Use CWL load_tool() to load the document.
    1. Fetches the document from file or URL
    2. Applies preprocessing (syntax/identifier expansion and normalization)
    3. Validates the document based on cwlVersion
    4. If necessary, updates the document to latest spec
    5. Constructs a Process object using the make_tool() callback. This yields a CommandLineTool, Workflow, or ExpressionTool. For workflows, this recursively constructs each workflow step.
    6. To construct custom types for CommandLineTool, Workflow, or ExpressionTool, provide a custom make_tool()
  2. Iterate on the job() method of the Process object to get back runnable jobs.
    1. job() is a generator method (uses the Python iterator protocol)
    2. Each time the job() method is invoked in an iteration, it returns one of: a runnable item (an object with a run() method), None (indicating there is currently no work ready to run), or end of iteration (indicating the process is complete).
    3. Invoke the runnable item by calling run(). This runs the tool and gets output.
    4. Output of a process is reported by an output callback.
    5. job() may be iterated over multiple times. It will yield all the work that is currently ready to run and then yield None.
  3. Workflow objects create corresponding WorkflowJob and WorkflowJobStep objects to hold the workflow state for the duration of the job invocation.
    1. The WorkflowJob iterates over each WorkflowJobStep and determines if the inputs for the step are ready.
    2. When a step is ready, it constructs an input object for that step and iterates on the job() method of the workflow job step.
    3. Each runnable item is yielded back up to the top-level run loop.
    4. When a step job completes and receives an output callback, the job outputs are assigned to the output of the workflow step.
    5. When all steps are complete, the intermediate files are moved to a final workflow output, intermediate directories are deleted, and the output callback for the workflow is called.
  4. CommandLineTool job() objects yield a single runnable object.
    1. The CommandLineTool job() method calls make_job_runner() to create a CommandLineJob object.
    2. The job method configures the CommandLineJob object by setting public attributes.
    3. The job method iterates over the file and directory inputs to the CommandLineTool and creates a "path map".
    4. Files are mapped from their "resolved" location to a "target" path where they will appear at tool invocation (for example, a location inside a Docker container). The target paths are used on the command line.
    5. Files are staged to target paths using either Docker volume binds (when using containers) or symlinks (if not). This staging step enables files to be logically rearranged or renamed independent of their source layout.
    6. The run() method of CommandLineJob executes the command line tool or Docker container, waits for it to complete, collects output, and makes the output callback.
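The control flow above can be condensed into a toy run loop. Everything below is a stand-in written for illustration; only the job()-generator shape mirrors the outline, and none of it is cwltool's real code:

```python
class ToyRunnable:
    """Stand-in for a runnable item: anything with a run() method."""
    def __init__(self, name, outputs, callback):
        self.name, self.outputs, self.callback = name, outputs, callback

    def run(self):
        # A real CommandLineJob would execute a tool here; this toy just
        # reports canned outputs through the output callback.
        self.callback(self.outputs)

class ToyProcess:
    """Stand-in Process whose job() yields runnable items, or None when waiting."""
    def __init__(self):
        self.results = {}

    def job(self):
        yield ToyRunnable("step1", {"out": "foo"}, self.results.update)
        yield None  # nothing ready yet (e.g. waiting on upstream steps)
        yield ToyRunnable("step2", {"out2": "bar"}, self.results.update)

def run_loop(process):
    for runnable in process.job():
        if runnable is not None:
            runnable.run()
        # a real loop would otherwise wait for outstanding jobs to finish
    return process.results

print(run_loop(ToyProcess()))  # {'out': 'foo', 'out2': 'bar'}
```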

Extension points

The following functions can be provided to main(), to load_tool(), or to the executor to override or augment the listed behaviors.

executor(tool, job_order_object, **kwargs)
  (Process, Dict[Text, Any], **Any) -> Tuple[Dict[Text, Any], Text]

A top-level workflow execution loop; it should synchronously execute a process object and return an output object.

construct_tool_object(toolpath_object, **kwargs)
  (Dict[Text, Any], **Any) -> Process

Hook to construct a Process object (e.g. CommandLineTool) from a document.

  (Dict[Text, int]) -> Dict[Text, int]

Take a resource request and turn it into a concrete resource assignment.

  () -> Text

Return version string.

  (Text) -> StdFsAccess

Return a file system access object.

fetcher_constructor(cache, session)
  (Dict[unicode, unicode], requests.sessions.Session) -> Fetcher

Construct a Fetcher object with the supplied cache and HTTP session.

resolver(document_loader, document)
  (Loader, Union[Text, dict[Text, Any]]) -> Text

Resolve a relative document identifier to an absolute one which can be fetched.


Handler object for logging.
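Any standard library logging.Handler can serve here. The sketch below builds one that captures log output in memory; the logger name "cwltool" is an assumption for illustration:

```python
import io
import logging

# Build a handler that captures log output in memory; any logging.Handler
# works here (file, syslog, stream, ...).
buffer = io.StringIO()
handler = logging.StreamHandler(buffer)
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))

log = logging.getLogger("cwltool")  # logger name is an assumption
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("workflow done")

print(buffer.getvalue().strip())  # INFO workflow done
```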