HyperFlow: a scientific workflow management system

Description

HyperFlow is a Workflow Management System (WMS) dedicated to scientific workflows.

Browse the wiki pages to learn more about the HyperFlow workflow model.

Getting started

Installation

  • Install Node.js (http://nodejs.org)
  • Install Redis (http://redis.io)
  • Install HyperFlow:
    • From the npm package: npm install -g @hyperflow/hyperflow
    • From the GitHub repository:
      npm install https://github.com/hyperflow-wms/hyperflow/archive/{version}.tar.gz
      (where {version} is a release tag, for example v1.5.0)
    • From the master branch:
      npm install https://github.com/hyperflow-wms/hyperflow/archive/master.tar.gz
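
For instance, a complete installation on a Debian/Ubuntu system (an assumption; the package manager and package names differ across systems) could look like this:

# install prerequisites: Node.js (with npm) and Redis
sudo apt-get install -y nodejs npm redis-server
# install the HyperFlow CLI globally from the npm package
npm install -g @hyperflow/hyperflow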

Running locally

  • Start the redis server: redis-server
  • Run example workflows using the command hflow run <wf_directory>, for example:
    hflow run ./examples/Sqrsum
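
A minimal end-to-end local session (assuming HyperFlow and Redis are installed as above and Redis listens on its default port) looks like this:

# start Redis in the background
redis-server --daemonize yes
# run the bundled example workflow from the repository checkout
hflow run ./examples/Sqrsum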

Running locally using Docker images

  • Use the latest Docker image for the HyperFlow engine, published on Docker Hub as hyperflowwms/hyperflow
  • Alternatively, you can build the image yourself: make container
  • Start redis container:
    docker run -d --name redis redis --bind 127.0.0.1
  • Run a workflow via the HyperFlow container, for example:
docker run -a stdout -a stderr --rm --network container:redis \
       -e HF_VAR_WORKER_CONTAINER="hyperflowwms/soykb-workflow-worker" \
       -e HF_VAR_WORK_DIR="$PWD/input" \
       -e HF_VAR_HFLOW_IN_CONTAINER="true" \
       -e HF_VAR_function="redisCommand" \
       -e REDIS_URL="redis://127.0.0.1:6379" \
       --name hyperflow \
       -v /var/run/docker.sock:/var/run/docker.sock \
       -v $PWD:/wfdir \
       --entrypoint "/bin/sh" hyperflowwms/hyperflow -c "apk add docker && hflow run /wfdir"

Where

  • hyperflowwms/soykb-workflow-worker is the name of the workflow worker container (SoyKB in this case)
  • the current directory contains workflow.json
  • subdirectory input contains workflow input data

Outputs:

  • Directory input will contain files generated by the workflow run
  • Directory input/logs-hf will contain logs of all workflow jobs
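
Putting the Docker example together, the working directory is expected to look roughly like this (a sketch reconstructed from the description above, not an exhaustive listing):

.
├── workflow.json   # workflow description, read via the /wfdir mount
└── input/          # workflow input data (HF_VAR_WORK_DIR); generated files land here too
    └── logs-hf/    # created by the run: logs of all workflow jobs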

Running in a Kubernetes cluster

See the HyperFlow Kubernetes deployment project for more information.

Running in a distributed infrastructure using the RabbitMQ executor (not maintained)

  • Start the RabbitMQ container: docker run -d --name rabbitmq rabbitmq:3
  • Add the option -e AMQP_URL=amqp://rabbitmq to the HyperFlow engine invocation (a sketch follows this list)
  • More information in the hyperflow-amqp-executor project
  • Warning: currently not maintained and not tested with latest HyperFlow versions
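
A rough sketch of the wiring (an assumption: a user-defined Docker network, here hypothetically named hfnet, so that the container name rabbitmq resolves as a hostname; untested, since the executor is unmaintained):

# create a shared network and start the broker on it (hfnet is a hypothetical name)
docker network create hfnet
docker run -d --name rabbitmq --network hfnet rabbitmq:3
# then start the HyperFlow container with the additional options:
#   --network hfnet -e AMQP_URL=amqp://rabbitmq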

Local configuration files

You can provide workflow configuration through local configuration files:

  • workflow.config.json -- main configuration file
  • workflow.config.{name}.json -- any number of secondary configuration files

The content of all configuration files is merged and passed to workflow functions via context.appConfig; the content of each secondary file is nested under its {name} key. For example, given the files:

workflow.config.json:
{
  "main": "mainValue"
}

workflow.config.foo.json:
{
   "secondary": "secondaryValue"
}

The following will be passed in context.appConfig:

{
  "main": "mainValue",
  "foo": {
     "secondary": "secondaryValue"
  }
}
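
As a quick sketch (assuming the configuration files live in the workflow directory, next to workflow.json), the example above can be reproduced as follows:

cd <wf_directory>
cat > workflow.config.json <<'EOF'
{
  "main": "mainValue"
}
EOF
cat > workflow.config.foo.json <<'EOF'
{
  "secondary": "secondaryValue"
}
EOF
# workflow functions now receive the merged object in context.appConfig
hflow run .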

HyperFlow server

The HyperFlow engine can be started in server mode using the command: hflow start-server

If successfully started, the server prints its URL:

HyperFlow server started at http://localhost:38775

Workflows can be run through the HyperFlow server as follows:

hflow run --submit=<hyperflow_server_url> <workflow_dir>

Currently <workflow_dir> must be a local directory accessible to the server. Server mode allows multiple workflows to be run (concurrently) on the same instance of the HyperFlow engine.
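
For example (the port in the URL is whatever the server actually prints):

# terminal 1: start the server and note the printed URL
hflow start-server
# terminal 2: submit a workflow to the running server
hflow run --submit=http://localhost:38775 ./examples/Sqrsum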