Merge pull request #9 from Multiscale-Genomics/mm-travis-integration
Travis integration
markmcdowall committed Jun 22, 2018
2 parents a674465 + b4fa954 commit da1b99a
Showing 24 changed files with 973 additions and 330 deletions.
3 changes: 3 additions & 0 deletions .flake8
@@ -0,0 +1,3 @@
[flake8]
max-line-length = 100
exclude = docs/*,.git/*,env/*,tmp/*,shims/*,.cache,.eggs,TADbit-master/*
58 changes: 58 additions & 0 deletions .travis.yml
@@ -0,0 +1,58 @@
# See the NOTICE file distributed with this work for additional information
# regarding copyright ownership.

# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at

# http://www.apache.org/licenses/LICENSE-2.0

# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

language: python

os: linux
dist: trusty

python:
- "2.7"

env:
matrix:
- TESTENV=docs
- TESTENV=code
- TESTENV=pylint

addons:
apt:
packages:
# system environment
# - make
# - wget
# - curl

# command to install dependencies
install:
- pip install .

# Fixed version due to errors in 1.7.3
# TODO test when 1.7.4 is available
- if [[ "$TESTENV" == "docs" ]]; then pip install sphinx==1.7.2;fi
- if [[ "$TESTENV" == "pylint" ]]; then pip install pylint;fi

before_script :
- cd ${HOME}/build/Multiscale-Genomics/mg-tool-api
# - chmod +x scripts/travis/test_runner.sh
- chmod +x scripts/travis/docs_runner.sh
- chmod +x scripts/travis/pylint_runner.sh

# command to run tests
script:
# - if [[ "$TESTENV" == "code" ]]; then ./scripts/travis/test_runner.sh; fi
- if [[ "$TESTENV" == "code" ]]; then pytest; fi
- if [[ "$TESTENV" == "docs" ]]; then ./scripts/travis/docs_runner.sh; fi
- if [[ "$TESTENV" == "pylint" ]]; then ./scripts/travis/pylint_runner.sh; fi
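The `env` matrix above fans out into one Travis job per `TESTENV` value, and the `script` conditionals select the matching command. As an illustrative sketch (not part of the commit), the mapping can be expressed as:

```python
# Sketch only: mirrors the TESTENV -> command selection done by the
# `script` conditionals in the .travis.yml above.
def job_command(testenv):
    return {
        "code": "pytest",
        "docs": "./scripts/travis/docs_runner.sh",
        "pylint": "./scripts/travis/pylint_runner.sh",
    }[testenv]

for env_name in ("docs", "code", "pylint"):
    print(env_name, "->", job_command(env_name))
```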
14 changes: 7 additions & 7 deletions README.md
@@ -1,6 +1,6 @@
# mg-tool-api

[![Documentation Status](https://readthedocs.org/projects/mg-tool-api/badge/?version=latest)](http://mg-tool-api.readthedocs.io/en/latest/?badge=latest)
[![Documentation Status](https://readthedocs.org/projects/mg-tool-api/badge/?version=latest)](http://mg-tool-api.readthedocs.io/en/latest/?badge=latest) [![Build Status](https://travis-ci.org/Multiscale-Genomics/mg-tool-api.svg?branch=master)](https://travis-ci.org/Multiscale-Genomics/mg-tool-api)

## Introduction
This library implements the specifications detailed in the
@@ -19,12 +19,12 @@ VRE.

2. Achieve vertical interoperability by using COMPSs, and allowing
developers to specify the execution environment requirements for each tool by
using COMPSs "constraints" decorator. Although written with task-based
programming in mind, this library allows execution of Tools outside of the
using COMPSs "constraints" decorator. Although written with task-based
programming in mind, this library allows execution of Tools outside of the
COMPSs runtime.

3. Simplify the construction of workflows, by conceiving tools such that it is
straightforward to combine them in Workflows; in particular by using COMPSs
straightforward to combine them in Workflows; in particular by using COMPSs
"task" decorator and the COMPSs runtime as the workflow scheduler.
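A minimal sketch of how Tools can stay runnable outside the COMPSs runtime: import the real `task` decorator when PyCOMPSs is available, and fall back to a no-op decorator otherwise. The fallback shown here is hypothetical and only illustrates the idea; `pycompss.api.task.task` is the real PyCOMPSs API.

```python
# Hedged sketch: the fallback decorator is hypothetical; only the
# `pycompss.api.task.task` import refers to the real PyCOMPSs API.
try:
    from pycompss.api.task import task
except ImportError:
    def task(**kwargs):
        # No-op stand-in so decorated Tools run synchronously
        # without the COMPSs runtime.
        def decorator(fn):
            return fn
        return decorator

@task(returns=1)
def add(x, y):
    return x + y

print(add(2, 3))  # executes as a plain function outside COMPSs
```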

## Implementation overview
@@ -59,14 +59,14 @@ The 'basic_modules' contains the basic entities of mg-tool-api:
Class that contains extra information about files.

The 'utils' module contains useful functions for performing common tasks in Tool
execution. In particular it contains 'logger', the logging facility of mg-tool-api;
execution. In particular it contains 'logger', the logging facility of mg-tool-api;
it provides a unified way of sending messages to the VRE.

See the documentation for the classes for more information.

## Examples

The "summer_demo.py" and "summer_demo2.py" examples implement workflows using PyCOMPSs.
They showcase various functionalities of the library by using the mockup Tools
The "summer_demo.py" and "summer_demo2.py" examples implement workflows using PyCOMPSs.
They showcase various functionalities of the library by using the mockup Tools
implemented in the tools_demos module.

2 changes: 1 addition & 1 deletion apps/__init__.py
@@ -1,4 +1,3 @@
#!/usr/bin/env python
"""
.. See the NOTICE file distributed with this work for additional information
regarding copyright ownership.
@@ -15,6 +14,7 @@
See the License for the specific language governing permissions and
limitations under the License.
"""

from pycompssapp import PyCOMPSsApp
from localapp import LocalApp
from workflowapp import WorkflowApp
74 changes: 41 additions & 33 deletions apps/jsonapp.py
@@ -19,12 +19,13 @@
# -----------------------------------------------------------------------------
# JSON-configured App
# -----------------------------------------------------------------------------
import json

from apps.workflowapp import WorkflowApp
from basic_modules.metadata import Metadata
import json


class JSONApp(WorkflowApp):
class JSONApp(WorkflowApp): # pylint: disable=too-few-public-methods
"""
JSON-configured App.
@@ -34,7 +35,11 @@ class JSONApp(WorkflowApp):
"""

def launch(self, tool_class,
# The arguments differ between this function and the superclass in
# basic_modules.app to provide a common interface and so that the JSON
# configuration files can be provided to generate the parameters required
# by App.
def launch(self, tool_class, # pylint: disable=too-many-locals,arguments-differ
config_path, input_metadata_path, output_metadata_path):
"""
Run a Tool with the specified inputs and configuration.
@@ -66,24 +71,26 @@ def launch(self, tool_class,
>>> import App, Tool
>>> app = JSONApp()
>>> # expects to find valid config.json and input_metadata.json
>>> app.launch(Tool, "/path/to/config.json", "/path/to/input_metadata.json", "/path/to/results.json")
>>> app.launch(
... Tool, "/path/to/config.json",
... "/path/to/input_metadata.json", "/path/to/results.json")
>>> # writes /path/to/results.json
"""

print "0) Unpack information from JSON"
input_IDs, arguments, output_files = self._read_config(
input_ids, arguments, output_files = self._read_config(
config_path)

input_metadata_IDs = self._read_metadata(
input_metadata_ids = self._read_metadata(
input_metadata_path)

# arrange by role
input_metadata = {}
for role, ID in input_IDs.items():
if isinstance(ID, (list, tuple)): # check allow_multiple?
input_metadata[role] = [input_metadata_IDs[el] for el in ID]
for role, input_id in input_ids.items():
if isinstance(input_id, (list, tuple)): # check allow_multiple?
input_metadata[role] = [input_metadata_ids[el] for el in input_id]
else:
input_metadata[role] = input_metadata_IDs[ID]
input_metadata[role] = input_metadata_ids[input_id]

# get paths from IDs
input_files = {}
@@ -104,14 +111,14 @@ def launch(self, tool_class,
output_files, output_metadata,
output_metadata_path)

def _read_config(self, json_path):
def _read_config(self, json_path): # pylint: disable=no-self-use
"""
Read config.json to obtain:
input_IDs: dict containing IDs of tool input files
input_ids: dict containing IDs of tool input files
arguments: dict containing tool arguments
output_files: dict containing absolute paths of tool outputs
Note that values of input_IDs may be either str or list,
Note that values of input_ids may be either str or list,
according to whether "allow_multiple" is True for the role;
in which case, the VRE will have accepted multiple input files
for that role.
@@ -122,16 +129,16 @@ def _read_config(self, json_path):
For more information see the schema for config.json.
"""
configuration = json.load(file(json_path))
input_IDs = {}
for input_ID in configuration["input_files"]:
role = input_ID["name"]
ID = input_ID["value"]
if role in input_IDs:
if not isinstance(input_IDs[role], list):
input_IDs[role] = [input_IDs[role]]
input_IDs[role].append(ID)
input_ids = {}
for input_config_id in configuration["input_files"]:
role = input_config_id["name"]
input_id = input_config_id["value"]
if role in input_ids:
if not isinstance(input_ids[role], list):
input_ids[role] = [input_ids[role]]
input_ids[role].append(input_id)
else:
input_IDs[role] = ID
input_ids[role] = input_id

output_files = {}
for output_file in configuration["output_files"]:
@@ -141,22 +148,21 @@ def _read_config(self, json_path):
for argument in configuration["arguments"]:
arguments[argument["name"]] = argument["value"]

return input_IDs, arguments, output_files
return input_ids, arguments, output_files
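The role-accumulation logic in `_read_config` (a single ID stays a string; a repeated role is promoted to a list, matching `allow_multiple`) can be isolated as a small sketch, using hypothetical sample entries:

```python
def group_ids_by_role(entries):
    # Mirrors _read_config above: the first value for a role is kept
    # as-is; a second value promotes the entry to a list and appends.
    ids = {}
    for entry in entries:
        role, value = entry["name"], entry["value"]
        if role in ids:
            if not isinstance(ids[role], list):
                ids[role] = [ids[role]]
            ids[role].append(value)
        else:
            ids[role] = value
    return ids

sample = [  # hypothetical config.json "input_files" entries
    {"name": "bam", "value": "id_1"},
    {"name": "bam", "value": "id_2"},
    {"name": "reference", "value": "id_3"},
]
print(group_ids_by_role(sample))
```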

def _read_metadata(self, json_path):
def _read_metadata(self, json_path): # pylint: disable=no-self-use
"""
Read input_metadata.json to obtain input_metadata_IDs, a dict
Read input_metadata.json to obtain input_metadata_ids, a dict
containing metadata on each of the tool input files,
arranged by their ID.
For more information see the schema for input_metadata.json.
"""
metadata = json.load(file(json_path))
input_metadata = {}
input_source_ids = {}
for input_file in metadata:
ID = input_file["_id"]
input_metadata[ID] = Metadata(
input_id = input_file["_id"]
input_metadata[input_id] = Metadata(
data_type=input_file["data_type"],
file_type=input_file["file_type"],
file_path=input_file["file_path"],
@@ -166,8 +172,8 @@ def _read_metadata(self, json_path):
)
return input_metadata
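`_read_metadata` essentially indexes the input records by their `_id`. A stripped-down sketch, with plain dicts standing in for `Metadata` objects and hypothetical record contents:

```python
def index_by_id(records):
    # Stand-in for _read_metadata: key each record by its "_id" field,
    # using plain dicts instead of Metadata instances.
    return {record["_id"]: record for record in records}

files = [  # hypothetical input_metadata.json content
    {"_id": "f1", "data_type": "sequence", "file_type": "fasta"},
    {"_id": "f2", "data_type": "alignment", "file_type": "bam"},
]
print(index_by_id(files)["f1"]["file_type"])
```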

def _write_results(self,
input_files, input_metadata,
def _write_results(self, # pylint: disable=no-self-use,too-many-arguments
input_files, input_metadata, # pylint: disable=unused-argument
output_files, output_metadata, json_path):
"""
Write results.json using information from input_files and output_files:
@@ -190,6 +196,7 @@ def _write_results(self,
For more information see the schema for results.json.
"""
results = []

def _newresult(role, path, metadata):
return {
"name": role,
@@ -204,9 +211,10 @@ def _newresult(role, path, metadata):
for role, path in output_files.items():
metadata = output_metadata[role]
if isinstance(path, (list, tuple)): # check allow_multiple?
assert (isinstance(metadata, (list, tuple)) and \
len(metadata) == len(path)) or \
isinstance(metadata, Metadata), \
assert (
isinstance(metadata, (list, tuple)) and
len(metadata) == len(path)
) or isinstance(metadata, Metadata), \
"""Wrong number of metadata entries for role {role}:
either 1 or {np}, not {nm}""".format(role=role, np=len(path), nm=len(metadata))

4 changes: 2 additions & 2 deletions apps/localapp.py
@@ -20,10 +20,10 @@
# Local Filesystem App
# -----------------------------------------------------------------------------
from basic_modules.app import App
from basic_modules.metadata import Metadata
from basic_modules.metadata import Metadata # pylint: disable=unused-import


class LocalApp(App):
class LocalApp(App): # pylint: disable=too-few-public-methods
"""
Local Filesystem App.
"""
10 changes: 6 additions & 4 deletions apps/pycompssapp.py
@@ -16,6 +16,8 @@
limitations under the License.
"""

from __future__ import print_function

# -----------------------------------------------------------------------------
# PyCOMPSs App
# -----------------------------------------------------------------------------
@@ -34,7 +36,7 @@
from basic_modules.app import App


class PyCOMPSsApp(App):
class PyCOMPSsApp(App): # pylint: disable=too-few-public-methods
"""
PyCOMPSsApp: uses PyCOMPSs.
"""
@@ -51,7 +53,7 @@ def _post_run(self, tool_instance, output_files, output_metadata):
# content from output_files. Then it is possible to perform any
# post operation like storing the results somewhere.
output_files, output_metadata = super(PyCOMPSsApp, self)._post_run(
tool_instance,
output_files,
output_metadata)
tool_instance,
output_files,
output_metadata)
return output_files, output_metadata
4 changes: 2 additions & 2 deletions apps/workflowapp.py
@@ -21,10 +21,10 @@
# -----------------------------------------------------------------------------
from apps.localapp import LocalApp
from apps.pycompssapp import PyCOMPSsApp
from basic_modules.workflow import Workflow
from basic_modules.workflow import Workflow # pylint: disable=unused-import


class WorkflowApp(PyCOMPSsApp, LocalApp):
class WorkflowApp(PyCOMPSsApp, LocalApp): # pylint: disable=too-few-public-methods
"""
Workflow-aware App.
