
Google Genomics Pipelines Tools

This repository contains various tools that are useful when running pipelines with the Google Genomics API.

Quick Start Using Cloud Shell

  1. Enable the Genomics API and the Compute Engine API in a new or existing Google Cloud project.

  2. Start a Cloud Shell inside your project.

  3. Inside the Cloud Shell, run the command

     go get

     This command downloads and installs the pipelines tools. Note that to build these tools outside Cloud Shell, you will need the Go toolchain.

  4. Make a bucket on GCS to store the output from the pipeline:

     export BUCKET=gs://${GOOGLE_CLOUD_PROJECT}-pipelines
     gsutil mb ${BUCKET}
  5. Put some test data into the bucket:

     echo "Hello World" | gsutil cp - ${BUCKET}/input
  6. Make a pipeline script that computes the SHA1 sum of a file:

     echo 'sha1sum ${INPUT0} > ${OUTPUT0}' > sha1.script
  7. Run the script using the pipelines API:

     pipelines run --inputs=${BUCKET}/input --outputs=${BUCKET}/output sha1.script
  8. Check the generated output file:

     gsutil cat ${BUCKET}/output
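
     Since sha1.script simply runs sha1sum on the input file, you can check the result by computing the checksum of the same test data locally; the digest should match the contents of the pipeline's output file:

     ```shell
     # Compute the SHA1 sum of the same "Hello World" test data locally.
     # The digest printed here should match what the pipeline wrote to
     # ${BUCKET}/output.
     echo "Hello World" | sha1sum
     ```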

That's it: you've run your first pipeline. For more information about the input formats supported by the pipelines tool, check out the source code. To learn more about the Pipelines API, consult the reference documentation.


The pipelines tool

This tool provides support for running, cancelling and inspecting pipelines.

As a simple example, to run a pipeline that prints 'hello world':

$ cat <<EOF > hello.script
echo "hello world"
EOF
$ pipelines --project=my-project run hello.script --output=gs://my-bucket/logs

After the pipeline finishes, you can inspect the output using gsutil:

$ gsutil cat gs://my-bucket/logs/output

The script file format is described in the source code for the command.
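
Because this particular script is plain shell, one way to sanity-check it before submitting is to run it locally with bash. This runs entirely outside the Pipelines environment, so it only validates the shell commands themselves, not input/output localization:

```shell
# Create the script exactly as above and preview what it does locally.
cat <<EOF > hello.script
echo "hello world"
EOF
bash hello.script
```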

Using gcsfuse with the pipelines tool

Use the --fuse flag to allow the pipelines tool to use gcsfuse to localize input files, instead of copying them one by one with gsutil.

Note: files other than those directly mentioned by the --inputs flag will also be available to the container, since the entire bucket is mounted.
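
As a sketch, the SHA1 pipeline from the Quick Start could be rerun with gcsfuse localization by adding the flag (this assumes the same BUCKET and sha1.script from the Quick Start; exact flag placement is illustrative):

```shell
# Same pipeline as before, but inputs are made available via a gcsfuse
# mount of the bucket rather than being copied in with gsutil.
pipelines run --fuse --inputs=${BUCKET}/input --outputs=${BUCKET}/output sha1.script
```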

SSH into the worker machine

The --ssh flag supported by the pipelines tool starts an SSH server container in the background, allowing you to log in to the worker machine and view logs in real time.

The migrate-pipeline tool

This tool takes a JSON encoded v1alpha2 run pipeline request and attempts to emit a v2alpha1 request that replicates the same behaviour.

For example, given a file v1.jsonpb that has a request containing a v1alpha2 ephemeral pipeline and arguments, running:

$ migrate-pipeline < v1.jsonpb

will produce a v2alpha1 request that performs the same action on standard output.
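
In practice you would typically capture that standard output in a file; for example (the filenames here are placeholders):

```shell
# Convert a v1alpha2 request and save the equivalent v2alpha1 request.
migrate-pipeline < v1.jsonpb > v2.jsonpb
```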


Please report problems using the issue tracker.
