Fix special compilation cases for kfp samples #91
Conversation
If possible, can we generalize the helper functions to take the below sample pipelines as well?
https://github.com/kubeflow/pipelines/blob/master/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py
https://github.com/kubeflow/pipelines/blob/master/samples/tutorials/Data%20passing%20in%20python%20components/Data%20passing%20in%20python%20components%20-%20Files.py
Generalize for any nested pipeline or pipeline without a decorator, as long as there is a config file with the necessary information to compile the pipelines.

Also fix `test_kfp_samples_report.txt`.
@Tomcli I believe I have addressed your comments. I can't test on either of the linked pipelines because they have functionality that kfp-tekton can't currently compile (exit handler and input artifacts). I will make another pipeline and test on that shortly.
Thanks @drewbutlerbb4, this is great. For the other 2 pipelines in the samples folder, we can add them in when exitOp and input artifacts are ready.

/lgtm

/assign @animeshsingh
```shell
if dsl-compile-tekton --py "${f}" --output "${TEKTON_COMPILED_YAML_DIR}/${f##*/}.yaml" >> "${COMPILER_OUTPUTS_FILE}" 2>&1;
then
  echo "SUCCESS: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
  IS_SPECIAL=$(grep -E ${f##*/} <<< ${PIPELINES})
```
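For context on the snippet above: the `${f##*/}` expansion removes the longest prefix matching `*/`, i.e. the directory portion of the path, leaving just the file name. A minimal sketch of the behavior:

```shell
# ${f##*/} strips the longest match of */ from the front of $f,
# leaving only the base file name (the directory part is removed).
f="samples/core/condition/condition.py"
base="${f##*/}"
echo "${base}"        # condition.py
echo "${base}.yaml"   # condition.py.yaml (the compiled output file name)
```

This is why the per-sample log lines and the compiled YAML files carry no directory path.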
For the `test_kfp_samples.sh` script, why do we not just generate a `PipelineRun` for all `testdata` scripts and only change one line:

```shell
if dsl-compile-tekton --py "${f}" --output "${TEKTON_COMPILED_YAML_DIR}/${f##*/}.yaml" >> "${COMPILER_OUTPUTS_FILE}" 2>&1;
```

to

```shell
if dsl-compile-tekton --generate-pipelinerun --py "${f}" --output "${TEKTON_COMPILED_YAML_DIR}/${f##*/}.yaml" >> "${COMPILER_OUTPUTS_FILE}" 2>&1;
```

There is no need or benefit to have a mix here, since we will catch the `NotImplementedError`s either way.
Maybe I do not understand, but wouldn't this still fail to compile `basic_no_decorator.py` and `compose.py`?
Hi Andrew, yes, I wrote that comment before understanding the full purpose of your `test_util.py`. But let me explain my bigger point below.
```diff
@@ -0,0 +1,102 @@
#!/bin/bash
```
should we not move these tests into our standard compiler test suite?
Please move the two functions `test_workflow_without_decorator` and `test_nested_workflow` into the actual compiler unit test and have the `test_util.py` call them and capture the output YAML into a file, like `dsl-compile` does.
Hi Andrew, I think the code you wrote in

The

We are at a time in the project where we should add all KFP unit tests in our unit test suite and use that to measure our progress.

On another tangent, I have been using the
Thanks for the explanation Christian, I now understand your point of view better. I agree that this could be placed in the compiler tests along with some new example pipelines, and that could help improve our testing. However, I am not entirely sure it shouldn't be included in the

I would like to get @Tomcli's opinion on your comments because, admittedly, I'm not entirely sure this was the intention when I was assigned this work. That being said, I'll defer to @ckadner and @Tomcli on this one.
@ckadner I agree we can also use `test_util.py` as part of our unit test suite for the compiler, but we should also leverage it in `test_kfp_samples.sh` for the metrics, because nested pipelines and no-decorator pipelines need special ways to compile them. If you compile them using the regular

Therefore, I asked Andrew to take those special compile cases and generalize them. This way we can update the list of nested pipelines and no-decorator pipelines in the `config.yaml` and compile them with the required parameters.
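To illustrate the idea, a hypothetical sketch of what such a `config.yaml` could look like; the field names below are assumptions for illustration, not the PR's actual schema:

```yaml
# Illustrative only -- key names are assumed, not taken from the PR.
special_pipelines:
  nested:
    - compose.py            # inner and outer pipeline functions must both be compiled
  no_decorator:
    - basic_no_decorator.py # pipeline function lacks @dsl.pipeline, so name,
                            # description, and parameters must be supplied explicitly
```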
Okay, let's do both: integrate the special test handling in the report script and add 2 unit test cases :-)
```shell
else
  echo "FAILURE: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
  python3 -m test_util ${f} ${CONFIG_FILE} | grep -E 'SUCCESS:|FAILURE:'
```
This should move up so we can capture the entire output of the compile run in the `${COMPILER_OUTPUTS_FILE}`, not just the `SUCCESS` or `FAILURE` line. The `python3` command should return an exit value of 0 for success and 1 for failure.

Maybe we could use a ternary-style approach:
```shell
if ([ -z "${IS_SPECIAL}" ] \
    && dsl-compile-tekton --py "${f}" --output "${TEKTON_COMPILED_YAML_DIR}/${f##*/}.yaml" \
    || python3 -m test_util ${f} ${CONFIG_FILE} ) >> "${COMPILER_OUTPUTS_FILE}" 2>&1 ;
then
  echo "SUCCESS: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
else
  echo "FAILURE: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
fi
```
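A side note on the `A && B || C` pattern above (my observation, not stated in the PR): `C` runs not only when the `[ -z ... ]` guard fails, but also when the regular compile itself fails, so failed normal compiles would fall through to `test_util` as well. In a scratch shell:

```shell
# In `A && B || C`, C executes when A fails OR when B fails:
true  && echo "B ran" || echo "C ran"   # A ok, B ok -> prints "B ran"
false && echo "B ran" || echo "C ran"   # A fails    -> prints "C ran"
true  && false        || echo "C ran"   # B fails    -> prints "C ran"
```

Here that fall-through may even be desirable (a second compile attempt via `test_util`), but it is worth being deliberate about it.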
...or better: wrap the compile part into a bash function and inside that function check for `IS_SPECIAL`:
```shell
function compile_dsl {
  if [ -z "${IS_SPECIAL}" ]; then
    dsl-compile-tekton --py "$1" --output "$2"
  else
    python3 -m test_util $1 ${CONFIG_FILE}  # need to produce YAML file output
  fi
}

# ...

if compile_dsl "${f}" "${TEKTON_COMPILED_YAML_DIR}/${f##*/}.yaml" >> "${COMPILER_OUTPUTS_FILE}" 2>&1 ;
then
  echo "SUCCESS: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
else
  echo "FAILURE: ${f##*/}" | tee -a "${COMPILER_OUTPUTS_FILE}"
fi
```
@drewbutlerbb4 -- I realize I am requesting substantial changes to your PR, some of which are quite aspirational and could be scoped out into a separate PR. Since I am working on another PR in the

So, 3 stages:
Sure, that works for me
We decided to merge this PR and incorporate further changes in subsequent PRs
Thanks @drewbutlerbb4
/lgtm
/assign @animeshsingh
/approve
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: animeshsingh. The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files. Approvers can indicate their approval by writing
Fixes the testing of kfp samples for pipelines that require special compilation instructions. Specifically, these include:

- `compose.py`
- `basic_no_decorator.py`