tf-serving support model on NFS #688
Thanks for your pull request. It looks like this may be your first contribution to a Google open source project (if not, look below for help). Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). 📝 Please visit https://cla.developers.google.com/ to sign. Once you've signed (or fixed any issues), please reply here (e.g. `I signed it!`) and we'll verify it.

What to do if you already signed the CLA:
- Individual signers
- Corporate signers
/assign @jlewi

/unassign jlewi
```
$ NFS_PVC_NAME=nfs
$ ks generate tf-serving ${MODEL_COMPONENT} --name=${MODEL_NAME}
$ ks param set ${MODEL_COMPONENT} modelPath ${MODEL_PATH}
$ ks param set ${MODEL_COMPONENT} modelLocate ${MODEL_LOCATE}
```
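For the NFS case, `NFS_PVC_NAME` is expected to name a PersistentVolumeClaim that already exists in the cluster. A sketch of such a claim expressed in jsonnet (the access mode and storage size here are illustrative assumptions, not part of this PR):

```
{
  apiVersion: "v1",
  kind: "PersistentVolumeClaim",
  metadata: { name: "nfs" },  // must match NFS_PVC_NAME above
  spec: {
    accessModes: ["ReadWriteMany"],  // NFS typically allows shared mounts
    resources: { requests: { storage: "10Gi" } },
  },
}
```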
`modelLocate` may make more sense as `modelStorageType`.
Thanks! I've changed `modelLocate` to `modelStorageType`.
CLA not working now.
user_guide.md (Outdated)
```
MODEL_COMPONENT=serveInception
MODEL_NAME=inception
MODEL_PATH=gs://kubeflow-models/inception
MODEL_STORAGE_TYPE=cloud
```
This is not required, right?
Yes. I've now deleted this line and instead detect whether `modelStorageType` is an attribute of `$.params`, so that when the model is in `gs`, the user doesn't need to specify `MODEL_STORAGE_TYPE=cloud`.
I signed it!

CLAs look good, thanks!
```
@@ -147,6 +147,13 @@
      runAsUser: 1000,
      fsGroup: 1000,
    },
    volumeMounts+: if 'modelStorageType' om $.params then
```
I think it's the `om` here.
Yes! Thanks for pointing that out!
```
@@ -147,6 +147,13 @@
      runAsUser: 1000,
      fsGroup: 1000,
    },
    volumeMounts+: if 'modelStorageType' in $.params then
```
Do you need this `if 'modelStorageType' in $.params then`? It seems redundant. I think you can add `modelStorageType` into params and default it to `cloud` or an empty string.
```
@@ -217,6 +224,16 @@
    if $.util.toBool($.params.deployHttpProxy) then
      $.parts.httpProxyContainer,
  ],
  volumes+: if 'modelStorageType' in $.params then
```
same here
Reasonable. Yesterday I tried to make `modelStorageType` a param for `ks generate tf-serving` but failed (so far), so I added `storageType` into `params` instead. When I use `modelStorageType` as a param, it shows an error (maybe it hits some conflict).
I was thinking of just using a param, `modelStorageType = cloud` by default. What's the error?
If using `modelStorageType` like this:

```
params:: {
  ...
  ...(other params)
  modelName: $.params.name,
  modelPath: null,
  modelStorageType: if "modelStorageType" in $.params then
    $.params.modelStorageType
  else
    "cloud",
},
```
it shows this error:

```
ERROR generate objects for namespace : unable to read /home/ciscoai/my-kubeflow/environments/default/main.jsonnet: RUNTIME ERROR: Max stack frames exceeded.
-------------------------------------------------
/home/ciscoai/my-kubeflow/components/serveInception.jsonnet:(15:12)-(17:4) object <anonymous>
	params+: updatedParams {
	  name: name,
	},
-------------------------------------------------
<builtin> builtin function <operator+>
-------------------------------------------------
<std>:966:25-26 thunk from <function <anonymous>>
	std.objectHasEx(o, f, true),
-------------------------------------------------
<builtin> builtin function <objectHasEx>
-------------------------------------------------
/home/ciscoai/my-kubeflow/vendor/kubeflow/tf-serving/tf-serving.libsonnet:14:7-32 object <anonymous>
	$.params.modelStorageType
-------------------------------------------------
/home/ciscoai/my-kubeflow/vendor/kubeflow/tf-serving/tf-serving.libsonnet:14:7-32 object <anonymous>
	$.params.modelStorageType
-------------------------------------------------
... (skipped 481 frames)
-------------------------------------------------
<std>:971:33-35 thunk from <function <anonymous>>
	if !std.primitiveEquals(ta, tb) then
-------------------------------------------------
<builtin> builtin function <primitiveEquals>
-------------------------------------------------
<std>:971:12-40 function <anonymous>
	if !std.primitiveEquals(ta, tb) then
-------------------------------------------------
<std>:1025:45-71 function <anonymous>
	for x in std.objectFields(a) if isContent(std.prune(a[x]))
-------------------------------------------------
<builtin> builtin function <flatMap>
-------------------------------------------------
<builtin> builtin function <$objectFlatMerge>
-------------------------------------------------
/home/ciscoai/my-kubeflow/components/serveInception.jsonnet:20:1-52 $
	std.prune(k.core.v1.list.new(tfServing.components))
-------------------------------------------------
<extvar:__ksonnet/components>:2:19-86 object <anonymous>
	serveInception: import "/home/ciscoai/my-kubeflow/components/serveInception.jsonnet",
-------------------------------------------------
During manifestation
```
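The recursion appears to come from jsonnet's late binding: once `params` is merged into `$`, the default for `modelStorageType` looks itself up through `$.params`, so evaluating the field re-enters its own definition until the stack limit is hit. A minimal sketch of the self-reference (a hypothetical standalone file, not the actual libsonnet):

```
// Assumed minimal reproduction: the field's default consults $.params
// for the very field being defined.
{
  params:: {
    // `in` alone doesn't force evaluation, but the `then` branch
    // evaluates $.params.modelStorageType, which is this field again.
    modelStorageType: if "modelStorageType" in $.params then
      $.params.modelStorageType
    else
      "cloud",
  },
}
```

Using a different field name (like `storageType`) breaks the cycle because the field no longer reads its own value.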
When I replace `modelStorageType` with `storageType` (or another identifier):

```
storageType: if "modelStorageType" in $.params then
  $.params.modelStorageType
else
  "cloud",
```

It runs well.
I mean, just doing

```
params:: {
  modelStorageType: "cloud",
}
```

It will get overridden if you pass the param `modelStorageType`.
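The override behavior can be sketched as a plain jsonnet mixin (illustrative objects, not the actual ksonnet machinery):

```
// Component default:
local component = { params:: { modelStorageType: "cloud" } };

// `ks param set <component> modelStorageType nfs` effectively
// mixes an override into params:
local withOverride = component { params+:: { modelStorageType: "nfs" } };

// withOverride.params.modelStorageType is now "nfs"; without the
// mixin it stays at the default "cloud". No self-referential
// conditional is needed.
```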
Oh, OK! Thanks for pointing that out!
Thanks

/approve
[APPROVALNOTIFIER] This PR is APPROVED. This pull request has been approved by: pdmack. The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files. Approvers can indicate their approval by writing `/approve` in a comment.
* tf-serving support model on NFS
* modelLocate -> modelStorageType
* make default modelStorageType 'cloud'
* mistake om -> in
* remove redundant if clause
* make 'cloud' as default value of modelStorageType
As of now, tf-serving only supports models located on cloud storage. I changed the `tf-serving.libsonnet` file and added 2 more params: `modelStorageType` and `nfsPVC`. For now, `modelStorageType` only detects whether it is `nfs`; if so, the container will mount a PersistentVolumeClaim (PVC) specified by `nfsPVC`. We can support more choices in the future.

A guideline for running an NFS-based model is put in the `components/k8s-model-server` folder.

As a new param is added, I changed the `Serve a model using Tensorflow Serving` section of `user_guide.md` and added `ks param set ${MODEL_COMPONENT} modelLocate ${MODEL_STORAGE_TYPE}`. I also give a quick guideline, `Create a component for your model located on nfs`, following `Create a component for your model located on cloud`.