Cortex namespace python #230

Merged
merged 4 commits on Jul 11, 2019
Changes from all commits
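
This PR moves all Python workload code from pkg/workloads/ into a pkg/workloads/cortex/ package: the Docker images now copy code to /src/cortex/..., and every import becomes an absolute import rooted at the new top-level cortex package. A minimal sketch of the import pattern applied throughout the diffs below:

    # Before: modules sat directly on PYTHONPATH (/src)
    #   import consts
    #   from lib import util, Context

    # After: one top-level package namespaces everything
    from cortex import consts
    from cortex.lib import util, Context
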
16 changes: 8 additions & 8 deletions images/onnx-serve/Dockerfile
@@ -26,14 +26,14 @@ RUN apt-get update -qq && apt-get install -y -q \
 
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-COPY pkg/workloads/lib/requirements.txt /src/lib/requirements.txt
-COPY pkg/workloads/onnx_serve/requirements.txt /src/onnx_serve/requirements.txt
-RUN pip3 install -r /src/lib/requirements.txt && \
-    pip3 install -r /src/onnx_serve/requirements.txt && \
+COPY pkg/workloads/cortex/lib/requirements.txt /src/cortex/lib/requirements.txt
+COPY pkg/workloads/cortex/onnx_serve/requirements.txt /src/cortex/onnx_serve/requirements.txt
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
+    pip3 install -r /src/cortex/onnx_serve/requirements.txt && \
     rm -rf /root/.cache/pip*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
-COPY pkg/workloads/onnx_serve /src/onnx_serve
+COPY pkg/workloads/cortex/consts.py /src/cortex
+COPY pkg/workloads/cortex/lib /src/cortex/lib
+COPY pkg/workloads/cortex/onnx_serve /src/cortex/onnx_serve
 
-ENTRYPOINT ["/usr/bin/python3", "/src/onnx_serve/api.py"]
+ENTRYPOINT ["/usr/bin/python3", "/src/cortex/onnx_serve/api.py"]
12 changes: 6 additions & 6 deletions images/python-packager/Dockerfile
@@ -23,15 +23,15 @@ RUN apt-get update -qq && apt-get install -y -q \
     zlib1g-dev \
     && apt-get clean -qq && rm -rf /var/lib/apt/lists/*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
+COPY pkg/workloads/cortex/consts.py /src/cortex/
+COPY pkg/workloads/cortex/lib /src/cortex/lib
 
-COPY pkg/workloads/tf_api/requirements.txt /src/tf_api/requirements.txt
+COPY pkg/workloads/cortex/tf_api/requirements.txt /src/cortex/tf_api/requirements.txt
 
-RUN pip3 install -r /src/lib/requirements.txt && \
-    pip3 install -r /src/tf_api/requirements.txt && \
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
+    pip3 install -r /src/cortex/tf_api/requirements.txt && \
     rm -rf /root/.cache/pip*
 
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-ENTRYPOINT ["/usr/bin/python3", "/src/lib/package.py"]
+ENTRYPOINT ["/usr/bin/python3", "/src/cortex/lib/package.py"]
10 changes: 5 additions & 5 deletions images/spark/Dockerfile
@@ -35,13 +35,13 @@ RUN sed -i "/^set -ex$/c\set -e" /opt/entrypoint.sh
 # Our code
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-COPY pkg/workloads/lib/requirements.txt /src/lib/requirements.txt
-RUN pip3 install -r /src/lib/requirements.txt && \
+COPY pkg/workloads/cortex/lib/requirements.txt /src/cortex/lib/requirements.txt
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
     rm -rf /root/.cache/pip*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
-COPY pkg/workloads/spark_job /src/spark_job
+COPY pkg/workloads/cortex/consts.py /src/cortex/
+COPY pkg/workloads/cortex/lib /src/cortex/lib
+COPY pkg/workloads/cortex/spark_job /src/cortex/spark_job
 
 # $SPARK_HOME/conf gets clobbered by a volume that spark-on-k8s mounts (KubernetesClientApplication.scala)
 RUN cp -r $SPARK_HOME/conf $SPARK_HOME/conf-custom
2 changes: 1 addition & 1 deletion images/spark/run.sh
@@ -37,7 +37,7 @@ echo ""
 echo "Starting"
 echo ""
 
-/usr/bin/python3 /src/lib/package.py --workload-id=$CORTEX_WORKLOAD_ID --context=$CORTEX_CONTEXT_S3_PATH --cache-dir=$CORTEX_CACHE_DIR
+/usr/bin/python3 /src/cortex/lib/package.py --workload-id=$CORTEX_WORKLOAD_ID --context=$CORTEX_CONTEXT_S3_PATH --cache-dir=$CORTEX_CACHE_DIR
 
 # Run the intended command
 /opt/entrypoint.sh "$@"
2 changes: 1 addition & 1 deletion images/test/Dockerfile
@@ -9,7 +9,7 @@ COPY pkg/estimators /estimators
 
 COPY images/test/run.sh /src/run.sh
 
-WORKDIR /src
+WORKDIR /src/cortex
 
 ENTRYPOINT ["/bin/bash"]
 CMD ["/src/run.sh"]
10 changes: 3 additions & 7 deletions images/test/run.sh
@@ -18,13 +18,9 @@
 err=0
 trap 'err=1' ERR
 
-cd lib
-pytest
-cd ..
+pytest lib/test
 
-cd spark_job
-pytest test/unit
-pytest test/integration
-cd ..
+pytest spark_job/test/unit
+pytest spark_job/test/integration
 
 test $err = 0
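
With WORKDIR /src/cortex set in images/test/Dockerfile above, the script no longer needs to cd into each package; pytest is pointed at the test directories directly and imports resolve through the cortex package on PYTHONPATH. For illustration only, the same flow through pytest's Python API (a sketch, not part of the PR):

    import pytest

    err = 0
    for target in ("lib/test", "spark_job/test/unit", "spark_job/test/integration"):
        err |= int(pytest.main([target]))  # nonzero exit code on failure, like `trap 'err=1' ERR`

    raise SystemExit(1 if err else 0)
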
16 changes: 8 additions & 8 deletions images/tf-api/Dockerfile
@@ -2,14 +2,14 @@ FROM cortexlabs/tf-base
 
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-COPY pkg/workloads/lib/requirements.txt /src/lib/requirements.txt
-COPY pkg/workloads/tf_api/requirements.txt /src/tf_api/requirements.txt
-RUN pip3 install -r /src/lib/requirements.txt && \
-    pip3 install -r /src/tf_api/requirements.txt && \
+COPY pkg/workloads/cortex/lib/requirements.txt /src/cortex/lib/requirements.txt
+COPY pkg/workloads/cortex/tf_api/requirements.txt /src/cortex/tf_api/requirements.txt
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
+    pip3 install -r /src/cortex/tf_api/requirements.txt && \
     rm -rf /root/.cache/pip*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
-COPY pkg/workloads/tf_api /src/tf_api
+COPY pkg/workloads/cortex/consts.py /src/cortex/
+COPY pkg/workloads/cortex/lib /src/cortex/lib
+COPY pkg/workloads/cortex/tf_api /src/cortex/tf_api
 
-ENTRYPOINT ["/usr/bin/python3", "/src/tf_api/api.py"]
+ENTRYPOINT ["/usr/bin/python3", "/src/cortex/tf_api/api.py"]
12 changes: 6 additions & 6 deletions images/tf-train-gpu/Dockerfile
@@ -2,12 +2,12 @@ FROM cortexlabs/tf-base-gpu
 
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-COPY pkg/workloads/lib/requirements.txt /src/lib/requirements.txt
-RUN pip3 install -r /src/lib/requirements.txt && \
+COPY pkg/workloads/cortex/lib/requirements.txt /src/cortex/lib/requirements.txt
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
     rm -rf /root/.cache/pip*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
-COPY pkg/workloads/tf_train /src/tf_train
+COPY pkg/workloads/cortex/consts.py /src/cortex/
+COPY pkg/workloads/cortex/lib /src/cortex/lib
+COPY pkg/workloads/cortex/tf_train /src/cortex/tf_train
 
-ENTRYPOINT ["/usr/bin/python3", "/src/tf_train/train.py"]
+ENTRYPOINT ["/usr/bin/python3", "/src/cortex/tf_train/train.py"]
12 changes: 6 additions & 6 deletions images/tf-train/Dockerfile
@@ -2,12 +2,12 @@ FROM cortexlabs/tf-base
 
 ENV PYTHONPATH="/src:${PYTHONPATH}"
 
-COPY pkg/workloads/lib/requirements.txt /src/lib/requirements.txt
-RUN pip3 install -r /src/lib/requirements.txt && \
+COPY pkg/workloads/cortex/lib/requirements.txt /src/cortex/lib/requirements.txt
+RUN pip3 install -r /src/cortex/lib/requirements.txt && \
     rm -rf /root/.cache/pip*
 
-COPY pkg/workloads/consts.py /src/
-COPY pkg/workloads/lib /src/lib
-COPY pkg/workloads/tf_train /src/tf_train
+COPY pkg/workloads/cortex/consts.py /src/cortex/
+COPY pkg/workloads/cortex/lib /src/cortex/lib
+COPY pkg/workloads/cortex/tf_train /src/cortex/tf_train
 
-ENTRYPOINT ["/usr/bin/python3", "/src/tf_train/train.py"]
+ENTRYPOINT ["/usr/bin/python3", "/src/cortex/tf_train/train.py"]
4 changes: 2 additions & 2 deletions pkg/operator/workloads/data_job.go
@@ -95,7 +95,7 @@ func sparkSpec(workloadID string, ctx *context.Context, workloadType string, spa
 		Mode:                 sparkop.ClusterMode,
 		Image:                &config.Cortex.SparkImage,
 		ImagePullPolicy:      pointer.String("Always"),
-		MainApplicationFile:  pointer.String("local:///src/spark_job/spark_job.py"),
+		MainApplicationFile:  pointer.String("local:///src/cortex/spark_job/spark_job.py"),
 		RestartPolicy:        sparkop.RestartPolicy{Type: sparkop.Never},
 		MemoryOverheadFactor: memOverheadFactor,
 		Arguments: []string{
@@ -106,7 +106,7 @@
 			" " + strings.Join(args, " ")),
 		},
 		Deps: sparkop.Dependencies{
-			PyFiles: []string{"local:///src/spark_job/spark_util.py", "local:///src/lib/*.py"},
+			PyFiles: []string{"local:///src/cortex/spark_job/spark_util.py", "local:///src/cortex/lib/*.py"},
 		},
 		Driver: sparkop.DriverSpec{
 			SparkPodSpec: sparkop.SparkPodSpec{
File renamed without changes.
@@ -12,4 +12,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from .context import Context
+from cortex.lib.context import Context
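
This one-line change is the pattern repeated across the Python files below: package-relative imports are replaced with absolute imports rooted at cortex. For comparison (the comments are illustrative, not part of the diff):

    # old: relative, resolved against whatever package the file happens to live in
    # from .context import Context

    # new: absolute, pinned to the cortex namespace from anywhere on PYTHONPATH
    from cortex.lib.context import Context
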
@@ -19,14 +19,14 @@
 import importlib
 from datetime import datetime
 from copy import deepcopy
 
-import consts
-from lib import util
-from lib.storage import S3, LocalStorage
-from lib.exceptions import CortexException, UserException
 from botocore.exceptions import ClientError
-from lib.resources import ResourceMap
-from lib.log import get_logger
 
+from cortex import consts
+from cortex.lib import util
+from cortex.lib.storage import S3, LocalStorage
+from cortex.lib.exceptions import CortexException, UserException
+from cortex.lib.resources import ResourceMap
+from cortex.lib.log import get_logger
 
 logger = get_logger()
 
File renamed without changes.
File renamed without changes.
@@ -18,13 +18,13 @@
 import glob
 from subprocess import run
 
-from lib import util, Context
-from lib.log import get_logger
-from lib.exceptions import UserException, CortexException
-
 import requirements
 from packaging.requirements import Requirement
 
+from cortex.lib import util, Context
+from cortex.lib.log import get_logger
+from cortex.lib.exceptions import UserException, CortexException
+
 logger = get_logger()
 
 LOCAL_PACKAGE_PATH = "/packages"
@@ -13,10 +13,10 @@
 # limitations under the License.
 
 import collections
-
-from lib import util
 from copy import deepcopy
 
+from cortex.lib import util
+
 
 class ResourceMap(dict):
     def __init__(self, resource_name_map):
@@ -12,5 +12,5 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from .local import LocalStorage
-from .s3 import S3
+from cortex.lib.storage.local import LocalStorage
+from cortex.lib.storage.s3 import S3
@@ -22,8 +22,8 @@
 from pathlib import Path
 import shutil
 
-from lib import util
-from lib.exceptions import CortexException
+from cortex.lib import util
+from cortex.lib.exceptions import CortexException
 
 
 class LocalStorage(object):
@@ -20,8 +20,8 @@
 import json
 import msgpack
 
-from lib import util
-from lib.exceptions import CortexException
+from cortex.lib import util
+from cortex.lib.exceptions import CortexException
 
 
 class S3(object):
@@ -14,7 +14,7 @@
 
 import pytest
 
-from lib.resources import ResourceMap
+from cortex.lib.resources import ResourceMap
 
 
 def test_resource_map_empty_resource():
@@ -17,7 +17,7 @@
 from copy import deepcopy
 import pytest
 
-from lib import util
+from cortex.lib import util
 import logging
 
 
@@ -14,10 +14,11 @@
 import os
 import sys
 
 import tensorflow as tf
 
-from lib import util
-import consts
+from cortex.lib import util
+from cortex import consts
+
 
 CORTEX_TYPE_TO_TF_TYPE = {
@@ -29,9 +29,8 @@
 from copy import deepcopy
 from datetime import datetime
 
-import consts
-
-from lib.log import get_logger
+from cortex import consts
+from cortex.lib.log import get_logger
 
 logger = get_logger()
 
@@ -22,13 +22,13 @@
 from flask_api import status
 from waitress import serve
 import onnxruntime as rt
-from lib.storage import S3
 import numpy as np
 
-import consts
-from lib import util, package, Context
-from lib.log import get_logger
-from lib.exceptions import CortexException, UserRuntimeException, UserException
+from cortex.lib.storage import S3
+from cortex import consts
+from cortex.lib import util, package, Context
+from cortex.lib.log import get_logger
+from cortex.lib.exceptions import CortexException, UserRuntimeException, UserException
 
 logger = get_logger()
 logger.propagate = False  # prevent double logging (flask modifies root logger)
@@ -19,14 +19,13 @@
 import traceback
 
 from pyspark.sql import SparkSession
-
-from lib import util, Context
-from lib.log import get_logger
-from lib.exceptions import UserException, CortexException, UserRuntimeException
-import spark_util
 import pyspark.sql.functions as F
-import consts
 
+from cortex import consts
+from cortex.lib import util, Context
+from cortex.lib.log import get_logger
+from cortex.lib.exceptions import UserException, CortexException, UserRuntimeException
+from cortex.spark_job import spark_util
 
 logger = get_logger()
 
@@ -20,11 +20,14 @@
 from pyspark.sql.dataframe import DataFrame
 import pyspark.sql.functions as F
 
-from lib import util
-from lib.context import create_transformer_inputs_from_map, create_transformer_inputs_from_lists
-from lib.exceptions import CortexException, UserException, UserRuntimeException
-from lib.log import get_logger
-import consts
+from cortex.lib import util
+from cortex.lib.context import (
+    create_transformer_inputs_from_map,
+    create_transformer_inputs_from_lists,
+)
+from cortex.lib.exceptions import CortexException, UserException, UserRuntimeException
+from cortex.lib.log import get_logger
+from cortex import consts
 
 logger = get_logger()
 
@@ -19,12 +19,13 @@
 import pytest
 import uuid
 import os
+import shutil
 
 from pyspark import SparkConf
 from pyspark.sql import SparkSession
-from lib import Context
-import consts
-import shutil
 
+from cortex.lib import Context
+from cortex import consts
 
 
 def quiet_py4j():
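
The test setup above imports SparkConf and SparkSession and defines a quiet_py4j() helper, but the function bodies are collapsed in this diff. As a hypothetical sketch of what such a conftest typically provides (everything beyond the name quiet_py4j is an assumption, not taken from the PR):

    import logging

    import pytest
    from pyspark.sql import SparkSession


    def quiet_py4j():
        # common pattern: silence py4j's verbose logger during tests (actual body not shown in the diff)
        logging.getLogger("py4j").setLevel(logging.WARN)


    @pytest.fixture(scope="session")
    def spark(request):  # fixture name and body are assumptions, not from the diff
        quiet_py4j()
        session = SparkSession.builder.master("local[2]").appName("cortex-tests").getOrCreate()
        request.addfinalizer(session.stop)
        return session
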