Conversation
```python
targets = bakefile["group"][product]["targets"]
for t in targets:
    tags.extend(bakefile["target"][t]["tags"])
```
product can actually also be a direct target (for example: "I only want to build Hadoop 3.3.4") or a meta-group ("I want to build all versions of all products"). I think a more correct answer would be something like:
```diff
- targets = bakefile["group"][product]["targets"]
- for t in targets:
-     tags.extend(bakefile["target"][t]["tags"])
+ for target in bakefile["group"].get(product, {}).get("targets", []):
+     tags.extend(get_tags_for_product(bakefile, target))
+ tags.extend(bakefile["target"].get(product, {}).get("tags", []))
```
It might also be more correct to name the function something like `get_tags_for_target`.
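Putting the suggestion and the renaming together, the helper could look like the sketch below. The function name and the recursive call are the reviewer's suggestion; the exact bakefile shape (`"group"` entries with a `"targets"` list, `"target"` entries with a `"tags"` list) is assumed from the snippet above.

```python
def get_tags_for_target(bakefile: dict, product: str) -> list[str]:
    """Collect all image tags for `product`, which may be a direct target,
    a group, or a meta-group of groups (assumed bakefile layout)."""
    tags: list[str] = []
    # If `product` is a (meta-)group, recurse into each member, which may
    # itself be a group or a direct target.
    for target in bakefile["group"].get(product, {}).get("targets", []):
        tags.extend(get_tags_for_target(bakefile, target))
    # If `product` is a direct target, collect its own tags.
    tags.extend(bakefile["target"].get(product, {}).get("tags", []))
    return tags
```

Because both lookups use `.get(..., {})`, the same call handles the "direct target" and "meta-group" cases the reviewer mentions without special-casing.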
razvan left a comment:
After trying it locally, I think I would like the preflight check to be a first-class step that I can run locally the same way I can build images, by calling something like:

```shell
$ ./build_product_images.py -i 23.4.0-rc2 -p superset --preflight-check
```

and having the script actually run preflight the same way it runs `docker buildx bake`.
This would have the advantage that:
- (obviously) I can easily run the same check on multiple images locally.
- I can use it with `--dry` to see what it's doing.
- You don't need to rely on standard output for JSON parsing.
- You can get rid of the jq step and give meaningful success (or failure) messages directly from the script.
- You would take any "business logic" out of the GH workflow file.
You could even create a separate script just for this purpose, and/or turn the `build_product_images.py` script into a Python package with multiple entry points.
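The CLI shape the comment describes could be wired up roughly as below. This is a hypothetical sketch: the option names mirror the example invocation above, and the preflight logic itself is elided since it is not shown in this thread.

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Argument parser matching the suggested invocation (illustrative)."""
    parser = argparse.ArgumentParser(prog="build_product_images.py")
    parser.add_argument("-i", "--image-version", required=True)
    parser.add_argument("-p", "--product", required=True)
    parser.add_argument(
        "--dry", action="store_true",
        help="print the commands that would run instead of running them",
    )
    parser.add_argument(
        "--preflight-check", action="store_true",
        help="run the preflight check instead of docker buildx bake",
    )
    return parser


def main(argv=None):
    args = build_parser().parse_args(argv)
    if args.preflight_check:
        # Run preflight directly from Python and report success/failure
        # here, instead of piping JSON through jq in the GH workflow.
        ...
```

With this in place, the workflow file only invokes the script; the pass/fail decision and messaging live in Python, which is the "take the business logic out of the workflow" point above.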
Alternative PR: #339

Superseded by the alternative PR linked above.