Fix docker image versioning #264

Merged: 5 commits, Aug 2, 2017
1 change: 1 addition & 0 deletions MANIFEST.in
@@ -1,3 +1,4 @@
#include the license file
include LICENSE.txt
include README.md
+include VERSION.txt
3 changes: 2 additions & 1 deletion NoopBenchDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

COPY containers/python/noop_container.py /container/
COPY bench/setup_noop_bench_docker.sh /bench/
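Every container Dockerfile in this PR replaces the hard-coded clipper/py-rpc:latest base with a CODE_VERSION build argument; an ARG declared before FROM may be referenced in the FROM line. A minimal sketch of how this behaves (image names match the diff; the 0.3 override value is purely illustrative):

# Without --build-arg, the ARG default applies: base = clipper/py-rpc:0.2-rc1
$ docker build -f NoopBenchDockerfile -t clipper/noop-bench .
# With --build-arg, the base is overridden: base = clipper/py-rpc:0.3
$ docker build --build-arg CODE_VERSION=0.3 -f NoopBenchDockerfile -t clipper/noop-bench:0.3 .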
3 changes: 2 additions & 1 deletion NoopDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

MAINTAINER Dan Crankshaw <dscrankshaw@gmail.com>

4 changes: 3 additions & 1 deletion PySparkContainerDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

MAINTAINER Dan Crankshaw <dscrankshaw@gmail.com>

@@ -14,6 +15,7 @@ RUN curl -o /spark.tgz https://d3kbcqa49mib13.cloudfront.net/spark-2.1.1-bin-had

COPY containers/python/pyspark_container.py containers/python/pyspark_container_entry.sh /container/
COPY clipper_admin/ /lib/clipper_admin/
+COPY VERSION.txt /lib/

ENV SPARK_HOME="/spark"

4 changes: 3 additions & 1 deletion PythonContainerDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

MAINTAINER Dan Crankshaw <dscrankshaw@gmail.com>

@@ -9,6 +10,7 @@ RUN conda install -y --file /lib/python_container_conda_deps.txt

COPY containers/python/python_container.py containers/python/python_container_entry.sh /container/
COPY clipper_admin/ /lib/clipper_admin/
+COPY VERSION.txt /lib/


CMD ["/container/python_container_entry.sh"]
3 changes: 2 additions & 1 deletion RPythonDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

## Use Debian unstable via pinning -- new style via APT::Default-Release
RUN echo "deb http://http.debian.net/debian sid main" > /etc/apt/sources.list.d/debian-unstable.list \
3 changes: 2 additions & 1 deletion SklearnCifarDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

MAINTAINER Dan Crankshaw <dscrankshaw@gmail.com>

3 changes: 2 additions & 1 deletion SumBenchDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

COPY containers/python/sum_container.py /container/
COPY bench/setup_sum_bench_docker.sh /bench/
3 changes: 2 additions & 1 deletion SumDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

COPY containers/python/sum_container.py /container/

3 changes: 2 additions & 1 deletion TensorFlowCifarDockerfile
@@ -1,4 +1,5 @@
-FROM clipper/py-rpc:latest
+ARG CODE_VERSION=0.2-rc1
+FROM clipper/py-rpc:${CODE_VERSION}

MAINTAINER Dan Crankshaw <dscrankshaw@gmail.com>

1 change: 1 addition & 0 deletions VERSION.txt
@@ -0,0 +1 @@
+develop
13 changes: 9 additions & 4 deletions bench/build_bench_docker_images.sh
@@ -10,26 +10,31 @@ unset CDPATH
# the script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"


# Let the user start this script from anywhere in the filesystem.
cd $DIR/..

+tag=$(<VERSION.txt)

if [ $# -ne 0 ] && [ $# -ne 4 ]; then
echo "Usage: ./build_bench_docker_images.sh [<sum_model_name> <sum_model_version> <noop_model_name> <noop_model_version>]"
exit 1
fi

# Build the Clipper Docker images
+# Assume local clipper/py-rpc base image (if exists) or pulled image is correct

+docker build -t clipper/py-rpc:$tag -f ./RPCDockerfile ./
if [ $# -eq 0 ]; then
-time docker build -t clipper/sum-bench -f SumBenchDockerfile ./
-time docker build -t clipper/noop-bench -f NoopBenchDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/sum-bench:$tag -f SumBenchDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/noop-bench:$tag -f NoopBenchDockerfile ./
else
echo $1
echo $2
echo $3
echo $4
-time docker build -t clipper/sum-bench -f SumBenchDockerfile ./ --build-arg MODEL_NAME="$1" --build-arg MODEL_VERSION="$2"
-time docker build -t clipper/noop-bench -f NoopBenchDockerfile ./ --build-arg MODEL_NAME="$3" --build-arg MODEL_VERSION="$4"
+time docker build --build-arg CODE_VERSION=$tag -t clipper/sum-bench:$tag -f SumBenchDockerfile ./ --build-arg MODEL_NAME="$1" --build-arg MODEL_VERSION="$2"
+time docker build --build-arg CODE_VERSION=$tag -t clipper/noop-bench:$tag -f NoopBenchDockerfile ./ --build-arg MODEL_NAME="$3" --build-arg MODEL_VERSION="$4"
fi

cd -
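For reference, tag=$(<VERSION.txt) is bash's $(<file) expansion, which reads a file's contents without spawning cat. A hedged usage sketch of the updated script, run from the repository root; the model names and versions here are invented for illustration:

$ cat VERSION.txt
develop
# No arguments: builds clipper/sum-bench:develop and clipper/noop-bench:develop
$ ./bench/build_bench_docker_images.sh
# Four arguments: additionally forwards model name/version pairs to the Dockerfiles
$ ./bench/build_bench_docker_images.sh my_sum_model 1 my_noop_model 1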
25 changes: 14 additions & 11 deletions bin/build_docker_images.sh
@@ -10,29 +10,32 @@ unset CDPATH
# the script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"


# Let the user start this script from anywhere in the filesystem.
cd $DIR/..

+tag=$(<VERSION.txt)

# Build the Clipper Docker images
-time docker build -t clipper/query_frontend -f QueryFrontendDockerfile ./
-time docker build -t clipper/management_frontend -f ManagementFrontendDockerfile ./
+time docker build -t clipper/query_frontend:$tag -f QueryFrontendDockerfile ./
+time docker build -t clipper/management_frontend:$tag -f ManagementFrontendDockerfile ./
cd -

# Build Spark JVM Container
cd $DIR/../containers/jvm
-time docker build -t clipper/spark-scala-container -f SparkScalaContainerDockerfile ./
+time docker build -t clipper/spark-scala-container:$tag -f SparkScalaContainerDockerfile ./
cd -

# Build the Python model containers
cd $DIR/..

# first build base image
-docker build -t clipper/py-rpc -f ./RPCDockerfile ./
-time docker build -t clipper/sum-container -f ./SumDockerfile ./
-time docker build -t clipper/noop-container -f ./NoopDockerfile ./
-time docker build -t clipper/python-container -f ./PythonContainerDockerfile ./
-time docker build -t clipper/pyspark-container -f ./PySparkContainerDockerfile ./
-time docker build -t clipper/sklearn_cifar_container -f ./SklearnCifarDockerfile ./
-time docker build -t clipper/tf_cifar_container -f ./TensorFlowCifarDockerfile ./
-time docker build -t clipper/r_python_container -f ./RPythonDockerfile ./
+docker build -t clipper/py-rpc:$tag -f ./RPCDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/sum-container:$tag -f ./SumDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/noop-container:$tag -f ./NoopDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/python-container:$tag -f ./PythonContainerDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/pyspark-container:$tag -f ./PySparkContainerDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/sklearn_cifar_container:$tag -f ./SklearnCifarDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/tf_cifar_container:$tag -f ./TensorFlowCifarDockerfile ./
+time docker build --build-arg CODE_VERSION=$tag -t clipper/r_python_container:$tag -f ./RPythonDockerfile ./
cd -
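A quick way to sanity-check this script is to list the freshly built images and confirm that each carries the VERSION.txt tag rather than latest. A sketch assuming VERSION.txt contains develop; the exact image list will vary with your build:

$ docker images "clipper/*" --format "{{.Repository}}:{{.Tag}}"
clipper/query_frontend:develop
clipper/management_frontend:develop
clipper/py-rpc:develop
clipper/python-container:develop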
1 change: 1 addition & 0 deletions clipper_admin/__init__.py
@@ -6,3 +6,4 @@
sys.exit(1)

from clipper_manager import Clipper
+from version import version as __version__
34 changes: 18 additions & 16 deletions clipper_admin/clipper_manager.py
@@ -18,6 +18,7 @@
import time
import re
from .module_dependency import ModuleDependencyAnalyzer
+from .version import version as code_version

__all__ = ['Clipper']

@@ -99,7 +100,7 @@ class Clipper:
on `host` at the port specified by `redis_port`.
redis_persistence_path : string, optional
The directory path to which redis data should be persisted. The directory
should not already exist. If unspecified, redis will not persist data to disk.
restart_containers : bool, optional
If true, containers will restart on failure. If false, containers
will not restart automatically.
@@ -134,7 +135,7 @@ def __init__(self,
'--redis_port=%d' % self.redis_port
],
'image':
-'clipper/management_frontend:latest',
+'clipper/management_frontend:{}'.format(code_version),
'ports': [
'%d:%d' % (CLIPPER_MANAGEMENT_PORT,
CLIPPER_MANAGEMENT_PORT)
@@ -150,7 +151,7 @@
],
'depends_on': ['mgmt_frontend'],
'image':
-'clipper/query_frontend:latest',
+'clipper/query_frontend:{}'.format(code_version),
'ports': [
'%d:%d' % (CLIPPER_RPC_PORT, CLIPPER_RPC_PORT),
'%d:%d' % (CLIPPER_QUERY_PORT, CLIPPER_QUERY_PORT)
@@ -340,8 +341,8 @@ def register_application(self, name, input_type, default_output,
by the end of the latency objective.
slo_micros : int
The query latency objective for the application in microseconds.
This is the processing latency between Clipper receiving a request
and sending a response. It does not account for network latencies
before a request is received or after a response is sent.

If Clipper cannot process a query within the latency objective,
@@ -830,7 +831,7 @@ def deploy_pyspark_model(self,
print("Error saving spark model: %s" % e)
raise e

-pyspark_container = "clipper/pyspark-container"
+pyspark_container = "clipper/pyspark-container:{}".format(code_version)

# extract the pyspark class name. This will be something like
# pyspark.mllib.classification.LogisticRegressionModel
@@ -910,7 +911,8 @@ def centered_predict(inputs):
num_containers=1)
"""

-default_python_container = "clipper/python-container"
+default_python_container = "clipper/python-container:{}".format(
+    code_version)
serialization_dir = self._save_python_function(name, predict_function)

# Deploy function
@@ -970,8 +972,8 @@ def register_app_and_deploy_predict_function(
The version to assign the deployed model.
slo_micros : int
The query latency objective for the application in microseconds.
This is the processing latency between Clipper receiving a request
and sending a response. It does not account for network latencies
before a request is received or after a response is sent.
labels : list of str, optional
A list of strings annotating the model.
@@ -1037,8 +1039,8 @@ def register_app_and_deploy_pyspark_model(
The version to assign the deployed model.
slo_micros : int, optional
The query latency objective for the application in microseconds.
This is the processing latency between Clipper receiving a request
and sending a response. It does not account for network latencies
before a request is received or after a response is sent.
labels : list of str, optional
A list of strings annotating the model.
@@ -1242,7 +1244,7 @@ def _check_and_write_dependencies(self, environment_path, directory,
If packages listed in specified conda environment file have conflicting dependencies,
this function will warn the user and return False.

If there are no conflicting package dependencies, existence of the packages in the
container conda channel is tested. The user is warned about any missing packages.
All existing conda packages are written out to `conda_dep_fname` and pip packages
to `pip_dep_fname` in the given `directory`. This function then returns True.
@@ -1588,13 +1590,13 @@ def deploy_R_model(self,
The name to assign this model.
version : int
The version to assign this model.
model_data :
The trained model to add to Clipper. The type has to be rpy2.robjects.vectors.ListVector,
which is how python's rpy2 encapsulates any given R model. This model will be loaded
into the Clipper model container and provided as an argument to the
predict function each time it is called.
labels : list of str, optional
A set of strings annotating the model
num_containers : int, optional
The number of replicas of the model to create. More replicas can be
created later as well. Defaults to 1.
@@ -1606,7 +1608,7 @@
base = importr('base')

input_type = "strings"
-container_name = "clipper/r_python_container"
+container_name = "clipper/r_python_container:{}".format(code_version)

with hide("warnings", "output", "running"):
fname = name.replace("/", "_")
5 changes: 5 additions & 0 deletions clipper_admin/version.py
@@ -0,0 +1,5 @@
+import sys
+import os
+cur_dir = os.path.dirname(os.path.abspath(__file__))
+with open(os.path.abspath(os.path.join(cur_dir, "../VERSION.txt")), 'r') as f:
+    version = f.read().strip()
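Since clipper_admin/__init__.py re-exports this value as __version__, the Python package and the Docker image tags are now derived from the same VERSION.txt. A hedged check from a source checkout, assuming the package's other dependencies import cleanly:

$ cat VERSION.txt
develop
$ python -c "import clipper_admin; print(clipper_admin.__version__)"
develop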