[AIRFLOW-4030] second attempt to add singularity to airflow #7191

Merged
merged 30 commits into apache:master from vsoch:add/attempt-2-singularity-operator
Feb 23, 2020
Changes from all commits
Commits
30 commits
bacc3b5
adding singularity operator and tests
vsoch Jan 16, 2020
4acdf4d
Merge branch 'master' into add/attempt-2-singularity-operator
vsoch Feb 13, 2020
c2ded3e
removing encoding pragmas and fixing up dockerfile to pass linting
vsoch Feb 14, 2020
2bd0e41
make workdir in /tmp because AIRFLOW_SOURCES not defined yet
vsoch Feb 14, 2020
29e8c99
curl needs to follow redirects with -L
vsoch Feb 14, 2020
6fb5465
Merge branch 'master' into add/attempt-2-singularity-operator
vsoch Feb 15, 2020
878499f
moving files to where they are supposed to be, more changes to mock, …
vsoch Feb 15, 2020
e7863a1
removing trailing whitespace, moving example_dag for singularity, add…
vsoch Feb 15, 2020
32b94e6
ran isort on example dags file
vsoch Feb 15, 2020
37779d3
adding missing init in example_dags folder for singularity
vsoch Feb 17, 2020
fdaeb8e
removing code from __init__.py files for singularity operator to fix …
vsoch Feb 18, 2020
dde195b
forgot to update link to singularity in operators and hooks ref
vsoch Feb 18, 2020
60a68bb
command must have been provided on init of singularity operator instance
vsoch Feb 18, 2020
95cbe87
I guess I'm required to have a task_id?
vsoch Feb 18, 2020
1ed4986
try adding working_dir to singularity operator type definitions
vsoch Feb 18, 2020
cef6c07
disable too many arguments for pylint of singularity operator init
vsoch Feb 19, 2020
eb73f3d
move pylint disable up to line 64 - doesnt catch at end of statement …
vsoch Feb 19, 2020
637d5cb
two spaces before inline comment
vsoch Feb 19, 2020
afb000b
I dont see task_id as a param for other providers, removing for singu…
vsoch Feb 20, 2020
b6ebe47
adding debug print
vsoch Feb 20, 2020
a6392e5
allow for return of just image and/or lines
vsoch Feb 20, 2020
7b7708c
dont understand how mock works, but the image should exist after its …
vsoch Feb 21, 2020
73e67ff
try removing shutil, the client should handle pull folder instead
vsoch Feb 21, 2020
17b4fcd
Merge branch 'master' into add/attempt-2-singularity-operator
vsoch Feb 22, 2020
55c6e3b
try changing pull-file to same uri that is expected to be pulled
vsoch Feb 22, 2020
0c66a2e
Merge branch 'add/attempt-2-singularity-operator' of github.com:vsoch…
vsoch Feb 22, 2020
31fd1bb
import of AirflowException moved to exceptions
vsoch Feb 22, 2020
e8ba4dd
DAG module was moved to airflow.models
vsoch Feb 22, 2020
40fbadc
ensure pull is called with pull_folder
vsoch Feb 22, 2020
c089430
Merge branch 'master' of github.com:apache/airflow into add/attempt-2…
vsoch Feb 23, 2020
35 changes: 35 additions & 0 deletions Dockerfile
@@ -180,6 +180,41 @@ RUN HADOOP_DISTRO="cdh" \

ENV PATH "${PATH}:/opt/hive/bin"

# Install Singularity (for testing the Singularity operator)
RUN apt-get update \
&& apt-get install --no-install-recommends -y \
uuid-dev \
libgpgme11-dev \
squashfs-tools \
libseccomp-dev \
pkg-config \
cryptsetup \
&& apt-get autoremove -yqq --purge \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

ENV GOLANG_VERSION=1.13.8
ENV SINGULARITY_VERSION=3.5.2

RUN curl -L -o go${GOLANG_VERSION}.linux-amd64.tar.gz "https://dl.google.com/go/go${GOLANG_VERSION}.linux-amd64.tar.gz" \
&& tar -C /usr/local -xzvf "go${GOLANG_VERSION}.linux-amd64.tar.gz" \
&& rm "go${GOLANG_VERSION}.linux-amd64.tar.gz"

ENV PATH="${PATH}:/usr/local/go/bin"

WORKDIR /tmp

RUN curl -L -o singularity-${SINGULARITY_VERSION}.tar.gz https://github.com/sylabs/singularity/releases/download/v${SINGULARITY_VERSION}/singularity-${SINGULARITY_VERSION}.tar.gz \
&& tar -xzf singularity-${SINGULARITY_VERSION}.tar.gz

WORKDIR /tmp/singularity

RUN ./mconfig \
&& make -C builddir \
&& make -C builddir install

WORKDIR /

# Install Minicluster
ENV MINICLUSTER_HOME="/opt/minicluster"

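As a rough sanity check on the build above (not part of this change), the installed Singularity binary can be exercised from Python through spython, the client library the new operator uses. The image URI and pull folder below are illustrative assumptions.

# Hypothetical smoke test for the Singularity install, using the same spython
# calls that SingularityOperator relies on.
from spython.main import Client

# Pull a small image into /tmp; without streaming, pull returns the local path.
image = Client.pull('docker://busybox:1.30.1', pull_folder='/tmp')

# Run a trivial command in the pulled image and inspect the structured result.
result = Client.execute(image, ['echo', 'hello'], return_result=True)
print(result['return_code'], result['message'])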
17 changes: 17 additions & 0 deletions airflow/providers/singularity/__init__.py
@@ -0,0 +1,17 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
17 changes: 17 additions & 0 deletions airflow/providers/singularity/example_dags/__init__.py
@@ -0,0 +1,17 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
62 changes: 62 additions & 0 deletions airflow/providers/singularity/example_dags/… (example DAG)
@@ -0,0 +1,62 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.providers.singularity.operators.singularity import SingularityOperator

default_args = {
'owner': 'airflow',
'depends_on_past': False,
'start_date': datetime.utcnow(),
'email': ['airflow@example.com'],
'email_on_failure': False,
'email_on_retry': False,
'retries': 1,
'retry_delay': timedelta(minutes=5)
}

dag = DAG(
'singularity_sample', default_args=default_args, schedule_interval=timedelta(minutes=10))

t1 = BashOperator(
task_id='print_date',
bash_command='date',
dag=dag)

t2 = BashOperator(
task_id='sleep',
bash_command='sleep 5',
retries=3,
dag=dag)

t3 = SingularityOperator(command='/bin/sleep 30',
image='docker://busybox:1.30.1',
task_id='singularity_op_tester',
dag=dag)

t4 = BashOperator(
task_id='print_hello',
bash_command='echo "hello world!!!"',
dag=dag)

t1.set_downstream(t2)
t1.set_downstream(t3)
t3.set_downstream(t4)
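As a stylistic aside (not part of the change), the dependency wiring above could equivalently be written with Airflow's bit-shift operators:

# Equivalent dependency declaration using >>.
t1 >> [t2, t3]
t3 >> t4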
17 changes: 17 additions & 0 deletions airflow/providers/singularity/operators/__init__.py
@@ -0,0 +1,17 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
181 changes: 181 additions & 0 deletions airflow/providers/singularity/operators/singularity.py
@@ -0,0 +1,181 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import ast
import os
import shutil
from typing import Any, Dict, List, Optional, Union

from spython.main import Client

from airflow.exceptions import AirflowException
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults


class SingularityOperator(BaseOperator):
"""
Execute a command inside a Singularity container

Singularity has a more seamless connection to the host than Docker, so
no special binds are needed to make the user's $HOME and temporary
directories available inside the container. If custom binds are needed,
they can be supplied via ``volumes``.

:param image: Singularity image or URI from which to create the container.
:type image: str
:param auto_remove: Delete the pulled image after the task finishes.
The default is False.
:type auto_remove: bool
:param command: Command to be run in the container. (templated)
:type command: str or list
:param start_command: Start command to pass to the container instance.
:type start_command: str or list
:param environment: Environment variables to set in the container. (templated)
:type environment: dict
:param working_dir: Set a working directory for the instance.
:type working_dir: str
:param force_pull: Pull the image on every run. Default is False.
:type force_pull: bool
:param volumes: List of volumes to mount into the container, e.g.
``['/host/path:/container/path', '/host/path2:/container/path2']``.
:type volumes: list
:param options: Other flags (list) to pass to the instance start.
:type options: list
:param pull_folder: Folder into which the image is pulled when it needs to be downloaded.
:type pull_folder: str
"""
template_fields = ('command', 'environment',)
template_ext = ('.sh', '.bash',)

@apply_defaults
def __init__( # pylint: disable=too-many-arguments
self,
image: str,
command: Union[str, List[str]],
start_command: Optional[Union[str, List[str]]] = None,
environment: Optional[Dict[str, Any]] = None,
pull_folder: Optional[str] = None,
working_dir: Optional[str] = None,
force_pull: Optional[bool] = False,
volumes: Optional[List[str]] = None,
options: Optional[List[str]] = None,
auto_remove: Optional[bool] = False,
*args,
**kwargs) -> None:

super(SingularityOperator, self).__init__(*args, **kwargs)
self.auto_remove = auto_remove
self.command = command
self.start_command = start_command
self.environment = environment or {}
self.force_pull = force_pull
self.image = image
self.instance = None
self.options = options or []
self.pull_folder = pull_folder
self.volumes = volumes or []
self.working_dir = working_dir
self.cli = None
self.container = None

def execute(self, context):

self.log.info('Preparing Singularity container %s', self.image)
self.cli = Client

if not self.command:
raise AirflowException('You must define a command.')

# Pull the image if asked and it is not already a local file
if self.force_pull and not os.path.exists(self.image):
self.log.info('Pulling container %s', self.image)
image = self.cli.pull(self.image, stream=True, pull_folder=self.pull_folder)

# A streamed pull returns a list of [image, lines]; log the lines
if isinstance(image, list):
lines = image.pop()
image = image[0]
for line in lines:
self.log.info(line)

# Update the image to be a filepath on the system
self.image = image

# Prepare list of binds
for bind in self.volumes:
self.options = self.options + ['--bind', bind]

# Does the user want a custom working directory?
if self.working_dir is not None:
self.options = self.options + ['--workdir', self.working_dir]

# Export environment before instance is run
for enkey, envar in self.environment.items():
self.log.debug('Exporting %s=%s', enkey, envar)
os.putenv(enkey, envar)
os.environ[enkey] = envar

# Create a container instance
self.log.debug('Options include: %s', self.options)
self.instance = self.cli.instance(self.image,
options=self.options,
args=self.start_command,
start=False)

self.instance.start()
self.log.info(self.instance.cmd)
self.log.info('Created instance %s from %s', self.instance, self.image)

self.log.info('Running command %s', self._get_command())
self.cli.quiet = True
result = self.cli.execute(self.instance,
self._get_command(),
return_result=True)

# Stop the instance
self.log.info('Stopping instance %s', self.instance)
self.instance.stop()

if self.auto_remove and os.path.exists(self.image):
shutil.rmtree(self.image)

# If the container failed, raise the exception
if result['return_code'] != 0:
message = result['message']
raise AirflowException(f'Singularity failed: {message}')

self.log.info('Output from command %s', result['message'])

def _get_command(self):
if isinstance(self.command, str) and self.command.strip().find('[') == 0:
commands = ast.literal_eval(self.command)
else:
commands = self.command
return commands

def on_kill(self):
if self.instance is not None:
self.log.info('Stopping Singularity instance')
self.instance.stop()

# If an image exists, clean it up
if self.auto_remove and os.path.exists(self.image):
shutil.rmtree(self.image)
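To complement the example DAG earlier in this diff, here is a minimal, hypothetical sketch of the operator's optional parameters as defined by the constructor above; the image URI, paths, and the pre-existing dag object are assumptions for illustration only.

from airflow.providers.singularity.operators.singularity import SingularityOperator

run_in_container = SingularityOperator(
    task_id='singularity_with_binds',
    image='docker://busybox:1.30.1',   # image or URI to run
    command='cat /data/input.txt',     # a string, a list, or a templated string
    volumes=['/host/data:/data'],      # each entry becomes a --bind option
    environment={'MY_VAR': 'value'},   # exported before the instance starts
    working_dir='/data',               # passed through as --workdir
    force_pull=True,                   # pull when the image is not a local file
    auto_remove=True,                  # remove the pulled image afterwards
    dag=dag,                           # assumes an existing DAG object
)

If the rendered command is a string that looks like a Python list (for example, the output of a template), _get_command parses it with ast.literal_eval; otherwise it is passed through unchanged.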
2 changes: 2 additions & 0 deletions docs/autoapi_templates/index.rst
@@ -174,6 +174,8 @@ All operators are in the following packages:

airflow/providers/sftp/sensors/index

airflow/providers/singularity/operators/index

airflow/providers/slack/operators/index

airflow/providers/snowflake/operators/index
6 changes: 6 additions & 0 deletions docs/operators-and-hooks-ref.rst
@@ -1210,6 +1210,12 @@ These integrations allow you to perform various operations using various software
-
-

* - `Singularity <https://sylabs.io/guides/latest/user-guide/>`__
-
-
- :mod:`airflow.providers.singularity.operators.singularity`
-

* - `SQLite <https://www.sqlite.org/index.html>`__
-
- :mod:`airflow.providers.sqlite.hooks.sqlite`
1 change: 0 additions & 1 deletion scripts/ci/in_container/entrypoint_ci.sh
@@ -164,7 +164,6 @@ if [[ ${RUNTIME:=""} == "kubernetes" ]]; then
export AIRFLOW_KUBERNETES_IMAGE_TAG
fi


if [[ "${ENABLE_KIND_CLUSTER}" == "true" ]]; then
export CLUSTER_NAME="airflow-python-${PYTHON_VERSION}-${KUBERNETES_VERSION}"
"${MY_DIR}/kubernetes/setup_kind_cluster.sh"
6 changes: 4 additions & 2 deletions setup.py
@@ -336,6 +336,7 @@ def write_version(filename: str = os.path.join(*["airflow", "git_version"])):
'blinker>=1.1',
'sentry-sdk>=0.8.0',
]
singularity = ['spython>=0.0.56']
slack = [
'slackclient>=1.0.0,<2.0.0',
]
@@ -424,8 +425,8 @@ def write_version(filename: str = os.path.join(*["airflow", "git_version"])):
devel_all = (all_dbs + atlas + aws + azure + celery + cgroups + datadog + devel + doc + docker + druid +
elasticsearch + gcp + grpc + jdbc + jenkins + kerberos + kubernetes + ldap + odbc + oracle +
pagerduty + papermill + password + pinot + redis + salesforce + samba + segment + sendgrid +
sentry + slack + snowflake + ssh + statsd + tableau + virtualenv + webhdfs + yandexcloud +
zendesk)
sentry + singularity + slack + snowflake + ssh + statsd + tableau + virtualenv + webhdfs +
yandexcloud + zendesk)

# Snakebite are not Python 3 compatible :'(
if PY3:
@@ -566,6 +567,7 @@ def do_setup():
'segment': segment,
'sendgrid': sendgrid,
'sentry': sentry,
'singularity': singularity,
'slack': slack,
'snowflake': snowflake,
'ssh': ssh,