[FEATURE REQUEST] Skip unneeded stages from multi-stages #2469
Comments
@TomSweeneyRedHat PTAL
@grantral please correct me if necessary, but this appears to be part of the Docker buildkit functionality. @rhatdan is this something we should try adding at this point to handle this particular scenario, or should we include it in whatever work would be necessary to provide the buildkit functionality? Note: I'm practically illiterate as far as buildkit goes; I don't know a lot about it.
Yep.
Sure, we could grab it if anyone had time to work on it. It would be best if the community could open PRs to add this feature.
A friendly reminder that this issue had no activity for 30 days. |
@flouthoc PTAL
Thanks, I'll take a look.
A friendly reminder that this issue had no activity for 30 days.
@flouthoc any progress?
Sorry, I was not able to take a look. I'll take a look in the coming days.
I think we would need a design change for storing and processing stages. AFAIK we don't have an easy way to identify indirect dependencies between stages in a multi-stage build, so we would need to change how we store and process stages. This is just my early proposal, and I am still thinking it through. @vrothberg @nalind @rhatdan @giuseppe @mtrmac any suggestions?
A friendly reminder that this issue had no activity for 30 days.
I'd describe this as a bug. Parallelization is obviously a nice-to-have feature, but that's (probably) missing the point of this issue. At the very least, this difference in behavior is currently a blocker for us in migrating from docker to buildah. I imagine it would be complex to add support for parallel stages, but surely it wouldn't be particularly problematic to pre-compute the dependency graph and omit unnecessary stages? @flouthoc any updates on this? I'm entirely unfamiliar with the codebase, but I might have a crack at implementing a (simple) fix, unless there's something in progress.
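The "pre-compute the dependency graph" idea above can be sketched in a few lines of Go. This is illustrative only, not buildah's actual implementation: the function names (`stageDeps`, `neededStages`) are made up, and it only handles `FROM ... AS name` bases and `COPY --from=<name>` references, ignoring index-based `--from` and `RUN --mount=from=...`.

```go
package main

import (
	"fmt"
	"strings"
)

// stageDeps parses a (simplified) Containerfile and records, for each
// stage, which earlier stages it references via "FROM <stage>" or
// "COPY --from=<stage>". Unnamed stages are keyed by their index.
func stageDeps(containerfile string) map[string][]string {
	deps := map[string][]string{}
	current := ""
	stageCount := 0
	for _, line := range strings.Split(containerfile, "\n") {
		fields := strings.Fields(strings.TrimSpace(line))
		if len(fields) < 2 {
			continue
		}
		switch strings.ToUpper(fields[0]) {
		case "FROM":
			name := fmt.Sprintf("%d", stageCount)
			if len(fields) >= 4 && strings.EqualFold(fields[2], "AS") {
				name = fields[3]
			}
			stageCount++
			current = name
			deps[name] = nil
			// The base image may itself be an earlier stage.
			if _, isStage := deps[fields[1]]; isStage && fields[1] != name {
				deps[name] = append(deps[name], fields[1])
			}
		case "COPY":
			for _, f := range fields[1:] {
				if strings.HasPrefix(f, "--from=") {
					deps[current] = append(deps[current], strings.TrimPrefix(f, "--from="))
				}
			}
		}
	}
	return deps
}

// neededStages returns the set of stages the target transitively
// depends on; any stage outside this set can be skipped.
func neededStages(target string, deps map[string][]string) map[string]bool {
	needed := map[string]bool{}
	var visit func(string)
	visit = func(s string) {
		if needed[s] {
			return
		}
		needed[s] = true
		for _, d := range deps[s] {
			visit(d)
		}
	}
	visit(target)
	return needed
}

func main() {
	cf := `FROM alpine AS base
RUN echo hello
FROM alpine AS backend
COPY --from=base /etc/os-release /tmp/
FROM alpine AS frontend
RUN echo world`
	deps := stageDeps(cf)
	needed := neededStages("backend", deps)
	fmt.Println(needed["base"], needed["backend"], needed["frontend"]) // true true false
}
```

With `--target backend`, only `base` and `backend` land in the needed set, so `frontend` would be omitted from the build.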
@joeycumines Sure! Could you please share your approach?
@grantral Thanks, this will be out in the next buildah release.
This is not in https://github.com/containers/buildah/releases/tag/v1.26.2, so I assume it will be in the next minor release? v1.27.0?
@lucacome Yes buildah
Yes, this will be released in the next couple of weeks, by August definitely. Podman rc1 went out this week. We will cut a release of Buildah as soon as we successfully do the vendor dance and merge Buildah into Podman.
I've installed
@lucacome Works fine for me; see that the first stage is skipped entirely in the build. Please confirm you are using the right version, and could you share your Containerfile and what you expect to see in the build output?

```console
[root@fedora bin]# cat Dockerfile
FROM alpine
RUN echo hello
FROM alpine
RUN echo world
[root@fedora bin]# ./podman build --no-cache -t test .
[2/2] STEP 1/2: FROM alpine
[2/2] STEP 2/2: RUN echo world
world
[2/2] COMMIT test
--> 771f01f08fa
Successfully tagged localhost/test:latest
771f01f08fa20cfd1359558121eafe598541f14264c5d5700866c8587e473fc0
[root@fedora bin]# ./podman version
Client:       Podman Engine
Version:      4.2.0
API Version:  4.2.0
Go Version:   go1.18.3
Git Commit:   7fe5a419cfd2880df2028ad3d7fd9378a88a04f4
Built:        Fri Aug 12 09:09:37 2022
OS/Arch:      linux/amd64
[root@fedora bin]#
```
Can we move this code behind a flag? We have a use case where we want to build images so that they are available for manifest/deployment, but buildah now skips the unused target images, breaking our builds.
@SaurabhAhuja1983 Sure, this was discussed somewhere before as well. would
Description

docker/cli#1134

Steps to reproduce the issue:

`buildah bud --target backend`

Describe the results you received:

Describe the results you expected:

docker/cli#1134 (comment)

Output of `rpm -q buildah` or `apt list buildah`:

Output of `buildah version`:

Output of `podman version` if reporting a `podman build` issue:

Output of `cat /etc/*release`:

Output of `uname -a`:

Output of `cat /etc/containers/storage.conf`:

Output of `cat Dockerfile`: