
podman farm support for registries.conf settings #21352

Closed · afbjorklund opened this issue Jan 24, 2024 · 4 comments · Fixed by #21414

Labels: kind/feature (Categorizes issue or PR as related to a new feature.)

Comments

@afbjorklund (Contributor)

Feature request description

When building with podman farm, registries.conf is not used (for instance, for insecure registries).

Suggest potential solution

The build and push work fine with podman build, but fail with podman farm build (due to HTTPS being used against the insecure registry).

Have you considered any alternatives?

The workaround (for insecure registries) is to use --tls-verify=false, but other registries.conf settings might be harder to replicate on the command line.
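For reference, this is a sketch of the setting the workaround is standing in for: on a host where podman build succeeds, the registry is typically marked insecure in registries.conf (commonly /etc/containers/registries.conf), with the localhost:5000 location taken from the reproduction in this report:

```toml
# containers-registries.conf v2 format: mark the local registry as
# insecure, so pushes/pulls use plain HTTP instead of HTTPS.
[[registry]]
location = "localhost:5000"
insecure = true
```

Since the discussion below suggests registries.conf is consulted server-side, each farm node would presumably need its own copy of such an entry, though the reporter notes that changing both client and server configuration did not resolve the failure.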

Additional context

Broken out from issue:

afbjorklund added the kind/feature label on Jan 24, 2024
@afbjorklund (Contributor, Author)

/assign umohnani8

@vrothberg (Member)

registries.conf is a server-side configuration. I guess that's what's happening here.

@umohnani8 (Member)

Oh okay, I thought it was on the client side, similar to how the auth file is on the client side and we send that info over to the server. I hadn't yet looked into what we do for podman push when remote, with respect to registries.conf. If it is a server-side configuration, then nothing needs to be done here for farm apart from probably updating the docs.

@afbjorklund (Contributor, Author) commented Jan 25, 2024

I had changed both client and server configuration, though, and the local build, tag, and push worked (with localhost:5000).

EDIT: "local" as in running it on the worker with ssh, or connecting to the socket over ssh with podman-remote; not as in local to the host (I disabled the local farm builder).


More detailed output:

```
$ podman-remote-static --connection=lima-podman-amd64 build testbuild --tag localhost:5000/testbuild
STEP 1/2: FROM busybox
STEP 2/2: RUN true
COMMIT localhost:5000/testbuild
--> 292111b532be
Successfully tagged localhost:5000/testbuild:latest
292111b532be0e5dff9ebf2fd16551eaa1f370e77538076381cc7203b4746131
$ podman-remote-static push localhost:5000/testbuild
Getting image source signatures
Copying blob sha256:2e112031b4b923a873c8b3d685d48037e4d5ccd967b658743d93a6e56c3064b9
Copying blob sha256:182375824a4b8de5e200d6187b9e9ed8e173f0b829f3532c15cf239b4bbac247
Copying config sha256:292111b532be0e5dff9ebf2fd16551eaa1f370e77538076381cc7203b4746131
Writing manifest to image destination
```
```
$ podman-remote-static farm build --format=docker --local=false testbuild --tag localhost:5000/testbuild2
Connecting to "lima-podman-amd64"
Builder "lima-podman-amd64" ready
Farm "lima" ready
Starting build for [{linux amd64 }] at "lima-podman-amd64"
[linux/amd64@lima-podman-amd64] STEP 1/2: FROM busybox
[linux/amd64@lima-podman-amd64] STEP 2/2: RUN true
[linux/amd64@lima-podman-amd64] --> Using cache 39a4c75616b2b790fc823ee1f7f88d2d0194099c97be12a29cca6b69ce49d999
[linux/amd64@lima-podman-amd64] --> 39a4c75616b2
[linux/amd64@lima-podman-amd64] 39a4c75616b2b790fc823ee1f7f88d2d0194099c97be12a29cca6b69ce49d999
finished build for [{linux amd64 }] at "lima-podman-amd64": built 39a4c75616b2b790fc823ee1f7f88d2d0194099c97be12a29cca6b69ce49d999
Getting image source signatures
Copying blob sha256:28c83c9a0efd99200b55456e7e7e0919583996fec8240fbed0c974cfee609ca2
Copying blob sha256:2e112031b4b923a873c8b3d685d48037e4d5ccd967b658743d93a6e56c3064b9
Error: build: building: 1 error occurred:
	* pushing image {"39a4c75616b2b790fc823ee1f7f88d2d0194099c97be12a29cca6b69ce49d999" "docker-archive"} to registry: trying to reuse blob sha256:2e112031b4b923a873c8b3d685d48037e4d5ccd967b658743d93a6e56c3064b9 at destination: pinging container registry localhost:5000: Get "https://localhost:5000/v2/": http: server gave HTTP response to HTTPS client
```


I am using a different name for the farm build, due to the bug with manifest lists versus images:

```
Error: build: creating manifest list "localhost:5000/testbuild": creating manifest: creating image to hold manifest list: image name "localhost:5000/testbuild:latest" is already associated with image "292111b532be0e5dff9ebf2fd16551eaa1f370e77538076381cc7203b4746131": that name is already in use
```
