dsl PipelineParam does not work under Image or Command #521

Closed
nikhilsharma93 opened this issue Dec 12, 2018 · 6 comments
@nikhilsharma93

If I try to pass a dsl.PipelineParam to either the image or the command part of a container op, the compiler does not resolve it correctly. For instance, if I pass a param called deployimage, it ends up as '{{pipelineparam:op=;name=deployimage;value=}}' in the .yaml file, whereas it should show up as something like {{inputs.parameters.deployimage}}, just as it does when passed to the arguments section of the container op.

If I upload the file as-is to the Pipelines UI, I get a JSON decode error. However, if I manually edit the .yaml, change the placeholder to the expected syntax, and add an inputs section under the container's template, it works.
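
For reference, a minimal sketch of what the hand-edited Argo template looks like after that fix (abbreviated; surrounding fields omitted, template name assumed to be deploy):

    - name: deploy
      inputs:
        parameters:
        - name: deployimage            # added by hand
      container:
        image: '{{inputs.parameters.deployimage}}'   # was '{{pipelineparam:op=;name=deployimage;value=}}'
        command: [sh, -c, 'echo {{inputs.parameters.deployimage}}']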

Example code to reproduce:

import kfp.dsl as dsl


@dsl.pipeline(name='paramtest', description='Test passing params to image and command')
def deploy(deployimage=dsl.PipelineParam(name='deployimage', value='google/cloud-sdk:216.0.0')):
    op1 = dsl.ContainerOp(
            name='deploy',
            image='{}'.format(deployimage),
            command=['sh', '-c',
                     'WORK_DIR="/home/tmp/" && mkdir -p $WORK_DIR && '
                     'cd $WORK_DIR && echo {}'.format(deployimage)])


if __name__ == '__main__':
  import kfp.compiler as compiler
  compiler.Compiler().compile(deploy, __file__ + '.tar.gz')
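
For contrast, the same parameter passed through arguments compiles to the expected {{inputs.parameters.deployimage}} reference. A sketch, placed inside the same pipeline function (the op name and image are illustrative):

    op2 = dsl.ContainerOp(
            name='echo-image',
            image='alpine:3.8',
            command=['echo'],
            arguments=['{}'.format(deployimage)])  # resolves correctly in the compiled .yaml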
@gaoning777
Contributor

We didn't expose PipelineParam for the command or image for versioning and sharing reasons. If the command and image were configurable at run time, the behavior of the pipeline could change completely even though the DSL/YAML stayed the same. For example, users of a pipeline could pass in an image that has the same entrypoint but totally different logic; the DSL/YAML pipeline definition (description) would then no longer do its job.

@qimingj, @Ark-kun, @hongye-sun WDYT?

@Ark-kun
Contributor

Ark-kun commented Dec 20, 2018

We definitely need a templateable command.
I'm not so keen on replacing the image, though.

@nikhilsharma93 What use cases do you have for changeable container images? My vision is that a component corresponds to a container entry point: replacing the image => replacing the component.

@qimingj
Contributor

qimingj commented Dec 21, 2018

+1 on supporting params in command. I can also see params in the container image path being useful, e.g. for selecting a different tag ("dev", "prod"). Since Argo supports it, I think we should too.
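
For context, Argo already resolves input parameters inside the image field; a minimal template sketch (the tag parameter name and image path are illustrative):

    - name: deploy
      inputs:
        parameters:
        - name: tag                    # e.g. 'dev' or 'prod'
      container:
        image: 'gcr.io/my-project/deployer:{{inputs.parameters.tag}}'
        command: [sh, -c, 'echo deploying with tag {{inputs.parameters.tag}}']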

@nikhilsharma93
Author

@Ark-kun at our company we have an internal software package that we use to fetch images and models, and it is updated frequently enough that it needs an updated Docker image to run in.

As an example, the way I run a model deployment pipeline is to keep a run.sh in my Google bucket that knows which files to download into the Docker image at runtime and how to run them. The pipeline .yaml file on its own knows only the path to that .sh file. I want to keep that pipeline as it is even if our API calls to the package change, but be able to provide an updated Docker image path.

@jinchihe
Member

+1. I used V1VolumeMount to add a local volume, and I want to read a parameter the user inputs from the UI, but it seems the k8s workflow cannot resolve the PipelineParam. Thanks.
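
A sketch of the pattern described above, assuming the KFP v1 ContainerOp API (add_volume / add_volume_mount) and an illustrative host_path parameter; the formatted PipelineParam in the volume source is exactly where the raw placeholder appears:

    import kfp.dsl as dsl
    from kubernetes import client as k8s_client


    @dsl.pipeline(name='volumetest', description='Mount a host path taken from a pipeline param')
    def volume_pipeline(host_path=dsl.PipelineParam(name='host_path', value='/mnt/data')):
        op = dsl.ContainerOp(
                name='reader',
                image='alpine:3.8',
                command=['sh', '-c', 'ls /data'])
        # Plain Kubernetes volume config; the problem is that the formatted
        # PipelineParam below serializes as the raw pipelineparam placeholder.
        op.add_volume(k8s_client.V1Volume(
                name='data',
                host_path=k8s_client.V1HostPathVolumeSource(path='{}'.format(host_path))))
        op.add_volume_mount(k8s_client.V1VolumeMount(name='data', mount_path='/data'))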

@gaoning777
Contributor

Replacing the placeholder in the command: #637
