
Fedora Silverblue + podman + DevPod does not work #970

Closed
twaananen opened this issue Mar 21, 2024 · 11 comments · Fixed by #982
@twaananen

What happened?
I downloaded the DevPod AppImage, set up the docker provider to use podman instead, and tried the example workspaces. None of them work.

What did you expect to happen instead?

I expected the example workspaces to work.

How can we reproduce the bug? (as minimally and precisely as possible)

  • Fedora Silverblue (Bazzite more specifically)
  • Devpod AppImage 0.5.4
  • podman
  • set up the docker provider:
      • docker Host: /run/user/1000/podman/podman.sock
      • docker Path: /usr/bin/podman

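For anyone reproducing this, the "docker Host" value above is the standard rootless podman API socket. A small sketch (assuming a systemd user session; the enable/verify steps are left commented so the snippet runs anywhere) to derive and check it:

```shell
# Derive the rootless podman API socket path for the current user; this is
# the value used as "docker Host" in the provider config above.
SOCKET="/run/user/$(id -u)/podman/podman.sock"
echo "DOCKER_HOST=unix://$SOCKET"

# Enable the socket if it is not already running (systemd user session):
#   systemctl --user enable --now podman.socket
# Verify the API answers (http://d/ is a dummy host for unix sockets):
#   curl -s --unix-socket "$SOCKET" http://d/v4.0.0/libpod/info >/dev/null && echo ok
```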
devcontainer.json: From example workspace https://github.com/microsoft/vscode-remote-try-java

Local Environment:

  • DevPod Version: v0.5.4
  • Operating System: linux (Fedora Silverblue 39 / Bazzite)
  • ARCH of the OS: AMD64

DevPod Provider:

  • Local/remote provider: docker (podman)

Output:

[22:12:55] info Workspace vscode-remote-try-java already exists [22:12:55] debug Acquire workspace lock... [22:12:55] debug Acquired workspace lock... [22:12:55] info Creating devcontainer... [22:12:55] debug Inject and run command: '/tmp/.mount_gearlegXxe21/usr/bin/devpod-cli' helper ssh-server --stdio --debug [22:12:55] debug Attempting to create SSH client [22:12:55] debug Execute command locally [22:12:55] debug SSH client created [22:12:55] debug SSH session created [22:12:55] info Execute SSH server command: bash -c '/tmp/.mount_gearlegXxe21/usr/bin/devpod-cli' agent workspace up --workspace-info 'H4sIAAAAAAAA/7yTzW7bOhCF32XWkkjJdmJzF+QGiHHvrYNUKdpuAooc2Uz4I5CU4iDwuxeMnNjID9pVVwLBGc45Zz49wYPz96HjAlderZUFBmTjDJLojFGkkDh0ThLhbMRtDERiy3sdyWtbIEMQTmLu0biIefSP+R0f+KGiuAvOQnaYBOwJlAQGH3dCBv2b67yaTmQJGXRKxN4jMNjE2AVGiOvQrj3vNsVaxU3f8BAwhkI4Q9qGlu2sauaLRUNLsZjLti0bMV/MymoxPSnbSkopphIppc1Ucnp6yk/mtBWymdH5RBKjhHfBtfETj0mQd4OS6JMny01SJp24Rw8ZuC4qZ0O6+md1/u/F9e3VWX2ZjgPXfSolffCkUZZ0ThqeQuoD+qvxTQks+h53GSy/nJ3Xy2/L+sdtvfz/YnVTp0c+KN3tMjBcbJRNIe8yUBKPlKWsRiewyyC43o/LWKt4jZ0LKjr/eJTtmOhzlr+NYpeB8MiT41oZDJGbDhhUtJrmdJKXi7qcs8mMUfoTMtA8xJuQdL9WVGVdUVZWbDZLFXviUp4jcmkCX6ONSbF2gmtgkFynLfC4SXFG05HCuN7G2zVyr3H9fYtV+RrzSHMutIIMpHuw2nF5c/3fx561a2MeNvsu4lEjDxjISyMZaDErppABblGMgY/L1xjC8TlJftH4buXKhsh1ctNyHZIdtMMRNJerrzUw6K3aMkKI7y1JqyclpXT/yv5TBCfuIUEg+hCdSRLSQavVAcVnJN6g8J7UceiB1D+e/gnCf+0H+AUAAP//AQAA//99s0bc0gQAAA==' --debug [22:12:55] info Use /home/tommi/.devpod/agent/contexts/default/workspaces/vscode-remote-try-java as workspace dir [22:12:55] debug Created logger [22:12:55] debug Received ping from agent [22:12:55] debug Workspace Folder already exists /home/tommi/.devpod/agent/contexts/default/workspaces/vscode-remote-try-java/content [22:12:55] debug Workspace exists, skip downloading [22:12:55] debug Using docker command '/usr/bin/podman' [22:12:55] debug Use docker environment variables: [DOCKER_HOST=unix:///run/user/1000/podman/podman.sock] [22:12:55] debug Process OCI 
feature [22:12:55] debug Parse dev container feature in /tmp/devpod/features/fd3c8dc6d7/extracted [22:12:55] debug Prebuild hash from: [22:12:55] debug Arch: amd64 [22:12:55] debug Config: {"name":"Java","features":{"ghcr.io/devcontainers/features/java:1":{"installGradle":"false","installMaven":"true","mavenVersion":"3.8.6","version":"none"}},"image":"mcr.microsoft.com/devcontainers/java:0-17"} [22:12:55] debug DockerfileContent: [22:12:55] debug ContextHash: 8b454669e28fb2b437c5d9fdbcee183608a4106b995f6f45c5127fe55865d4c3 [22:12:55] debug Try to find prebuild image devpod-e592f195ce14b74896a3a5fc51b3f2f0 in repositories [22:12:55] debug Error trying to find local image vsc-content-956d0:devpod-e592f195ce14b74896a3a5fc51b3f2f0: inspect container: [] /usr/bin/podman: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsubid.so.4) /usr/bin/podman: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsemanage.so.2) Error: vsc-content-956d0:devpod-e592f195ce14b74896a3a5fc51b3f2f0: image not known exit status 125 [22:12:55] info Build with docker buildx... 
[22:12:55] debug Running docker /usr/bin/podman: docker buildx build -f /home/tommi/.devpod/agent/contexts/default/workspaces/vscode-remote-try-java/content/.devcontainer/.devpod-internal/Dockerfile-with-features --load -t vsc-content-956d0:devpod-e592f195ce14b74896a3a5fc51b3f2f0 --build-arg _DEV_CONTAINERS_IMAGE_USER=root --build-arg _DEV_CONTAINERS_BASE_IMAGE=mcr.microsoft.com/devcontainers/java:0-17 --build-arg BUILDKIT_INLINE_CACHE=1 --target dev_containers_target_stage /home/tommi/.devpod/agent/contexts/default/workspaces/vscode-remote-try-java/content/.devcontainer [22:12:55] info /usr/bin/podman: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsubid.so.4) [22:12:55] info /usr/bin/podman: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsemanage.so.2) [22:12:55] info STEP 1/11: FROM mcr.microsoft.com/devcontainers/java:0-17 AS dev_containers_target_stage [22:12:55] info STEP 2/11: USER root [22:12:55] info --> e881c4bb8bb5 [22:12:55] info STEP 3/11: COPY ./.devpod-internal/ /tmp/build-features/ [22:12:56] info --> 1edba4755780 [22:12:56] info STEP 4/11: RUN chmod -R 0755 /tmp/build-features && ls /tmp/build-features [22:12:56] info buildah-oci-runtime: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsubid.so.4) [22:12:56] info buildah-oci-runtime: /tmp/.mount_gearlegXxe21/usr/lib/libselinux.so.1: no version information available (required by /lib64/libsemanage.so.2) [22:12:56] info error running container: from /usr/bin/crun creating container for [/bin/sh -c chmod -R 0755 /tmp/build-features && ls /tmp/build-features]: /usr/bin/crun: /tmp/.mount_gearlegXxe21/usr/lib/libsystemd.so.0: version `LIBSYSTEMD_246' not found (required by /usr/bin/crun) [22:12:56] info : exit status 1 [22:12:56] info time="2024-03-21T22:12:56+02:00" level=error msg="did not get container create message from 
subprocess: EOF" [22:12:56] info Error: building at STEP "RUN chmod -R 0755 /tmp/build-features && ls /tmp/build-features": while running runtime: exit status 1 [22:12:56] info exit status 1 [22:12:56] info build image [22:12:56] info github.com/loft-sh/devpod/pkg/driver/docker.(*dockerDriver).buildxBuild [22:12:56] info /home/runner/work/devpod/devpod/pkg/driver/docker/build.go:301 [22:12:56] info github.com/loft-sh/devpod/pkg/driver/docker.(*dockerDriver).BuildDevContainer [22:12:56] info /home/runner/work/devpod/devpod/pkg/driver/docker/build.go:75 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).buildImage [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:323 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).extendImage [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:132 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).build [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:101 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).runSingleContainer [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/single.go:70 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).Up [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/run.go:120 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).devPodUp [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:386 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).up [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:160 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).Run [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:94 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.NewUpCmd.func1 [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:52 [22:12:56] info 
github.com/spf13/cobra.(*Command).execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:916 [22:12:56] info github.com/spf13/cobra.(*Command).ExecuteC [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:1044 [22:12:56] info github.com/spf13/cobra.(*Command).Execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:968 [22:12:56] info github.com/loft-sh/devpod/cmd.Execute [22:12:56] info /home/runner/work/devpod/devpod/cmd/root.go:90 [22:12:56] info main.main [22:12:56] info /home/runner/work/devpod/devpod/main.go:8 [22:12:56] info runtime.main [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/proc.go:250 [22:12:56] info runtime.goexit [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/asm_amd64.s:1598 [22:12:56] info buildx build [22:12:56] info github.com/loft-sh/devpod/pkg/driver/docker.(*dockerDriver).BuildDevContainer [22:12:56] info /home/runner/work/devpod/devpod/pkg/driver/docker/build.go:77 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).buildImage [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:323 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).extendImage [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:132 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).build [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/build.go:101 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).runSingleContainer [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/single.go:70 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).Up [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/run.go:120 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).devPodUp [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:386 
[22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).up [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:160 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).Run [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:94 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.NewUpCmd.func1 [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:52 [22:12:56] info github.com/spf13/cobra.(*Command).execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:916 [22:12:56] info github.com/spf13/cobra.(*Command).ExecuteC [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:1044 [22:12:56] info github.com/spf13/cobra.(*Command).Execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:968 [22:12:56] info github.com/loft-sh/devpod/cmd.Execute [22:12:56] info /home/runner/work/devpod/devpod/cmd/root.go:90 [22:12:56] info main.main [22:12:56] info /home/runner/work/devpod/devpod/main.go:8 [22:12:56] info runtime.main [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/proc.go:250 [22:12:56] info runtime.goexit [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/asm_amd64.s:1598 [22:12:56] info build image [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).runSingleContainer [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/single.go:78 [22:12:56] info github.com/loft-sh/devpod/pkg/devcontainer.(*runner).Up [22:12:56] info /home/runner/work/devpod/devpod/pkg/devcontainer/run.go:120 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).devPodUp [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:386 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).up [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:160 [22:12:56] info 
github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).Run [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:94 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.NewUpCmd.func1 [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:52 [22:12:56] info github.com/spf13/cobra.(*Command).execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:916 [22:12:56] info github.com/spf13/cobra.(*Command).ExecuteC [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:1044 [22:12:56] info github.com/spf13/cobra.(*Command).Execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:968 [22:12:56] info github.com/loft-sh/devpod/cmd.Execute [22:12:56] info /home/runner/work/devpod/devpod/cmd/root.go:90 [22:12:56] info main.main [22:12:56] info /home/runner/work/devpod/devpod/main.go:8 [22:12:56] info runtime.main [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/proc.go:250 [22:12:56] info runtime.goexit [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/asm_amd64.s:1598 [22:12:56] info devcontainer up [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.(*UpCmd).Run [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:96 [22:12:56] info github.com/loft-sh/devpod/cmd/agent/workspace.NewUpCmd.func1 [22:12:56] info /home/runner/work/devpod/devpod/cmd/agent/workspace/up.go:52 [22:12:56] info github.com/spf13/cobra.(*Command).execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:916 [22:12:56] info github.com/spf13/cobra.(*Command).ExecuteC [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:1044 [22:12:56] info github.com/spf13/cobra.(*Command).Execute [22:12:56] info /home/runner/work/devpod/devpod/vendor/github.com/spf13/cobra/command.go:968 [22:12:56] info 
github.com/loft-sh/devpod/cmd.Execute [22:12:56] info /home/runner/work/devpod/devpod/cmd/root.go:90 [22:12:56] info main.main [22:12:56] info /home/runner/work/devpod/devpod/main.go:8 [22:12:56] info runtime.main [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/proc.go:250 [22:12:56] info runtime.goexit [22:12:56] info /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/asm_amd64.s:1598 [22:12:56] debug Connection to SSH Server closed [22:12:56] debug Done creating devcontainer [22:12:56] fatal Process exited with status 1 run agent command github.com/loft-sh/devpod/pkg/devcontainer/sshtunnel.ExecuteCommand.func2 /home/runner/work/devpod/devpod/pkg/devcontainer/sshtunnel/sshtunnel.go:122 runtime.goexit /opt/hostedtoolcache/go/1.20.5/x64/src/runtime/asm_amd64.s:1598

Other info:
I tried disabling SELinux with setenforce 0
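For what it's worth, the `no version information available` and ``LIBSYSTEMD_246' not found`` lines in the log suggest the AppImage's bundled libraries (under /tmp/.mount_.../usr/lib) leak into the environment podman, buildah, and crun run in, via LD_LIBRARY_PATH. A hypothetical workaround sketch (the `podman-clean` name and location are made up for illustration): point the provider's "docker Path" at a wrapper that clears the variable before exec'ing the host podman.

```shell
# Hypothetical wrapper: clear the AppImage-injected library path so
# /usr/bin/crun resolves the host's libsystemd/libselinux instead of
# the copies bundled inside the AppImage mount.
mkdir -p "$HOME/.local/bin"
cat > "$HOME/.local/bin/podman-clean" <<'EOF'
#!/bin/sh
unset LD_LIBRARY_PATH LD_PRELOAD
exec /usr/bin/podman "$@"
EOF
chmod +x "$HOME/.local/bin/podman-clean"
# Then set "docker Path" in the DevPod docker provider to this wrapper.
```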

@m2Giles

m2Giles commented Mar 21, 2024

I also ran into this using the appimage.

I was able to start a container using podman from a native package, but it would fail to open in VS Code. This behaviour continued when using docker as well; the error was ENAMETOOLONG. It would, however, open in the VS Code browser session. I'm also on an rpm-ostree desktop.

[screenshot]

@89luca89
Contributor

Thanks for reporting @twaananen, will investigate 👍

@jpotts10

> I also ran into this using the appimage.
>
> I was able to start a container using podman from a native package, but it would fail to open in VS Code. This behaviour continued when using docker as well; the error was ENAMETOOLONG. It would, however, open in the VS Code browser session. I'm also on an rpm-ostree desktop.
>
> [screenshot]

Just wanted to comment saying I also experienced this issue on Bluefin-dx GTS on a fresh install in a VM two days ago.

@m2Giles

m2Giles commented Apr 8, 2024

Still seeing this issue with the appimage. The image below just keeps looping. It works with a locally installed version of devpod.

[screenshot]

@pascalbreuninger
Member

@m2Giles we haven't released a new version yet; it should work if you run from source. We'll cut one towards the end of the week.

@ghost

ghost commented Apr 12, 2024

I'm currently using version v0.5.5 and still get this error.

I'm on Bazzite as well and also trying to use Podman.

I get the exact same error as @twaananen.

@MileaRobertStefan

MileaRobertStefan commented Apr 15, 2024

v0.5.5

12:56:39 info main.main
12:56:39 info   D:/a/devpod/devpod/main.go:8
12:56:39 info runtime.main
12:56:39 info   C:/hostedtoolcache/windows/go/1.21.8/x64/src/runtime/proc.go:267
12:56:39 info runtime.goexit
12:56:39 info   C:/hostedtoolcache/windows/go/1.21.8/x64/src/runtime/asm_amd64.s:1650

devcontainer.json:

// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.231.5/containers/elixir-phoenix-postgres
{
  "name": "App Elixir",

  "service": "app",

  // "build": {
  //   "dockerfile": "../Dockerfile",
  //   "context": ".."
  // },

  "dockerComposeFile": "../docker-compose.yml",
  "shutdownAction": "stopCompose",

  "workspaceFolder": "/app",

  "features": {
    // "ghcr.io/devcontainers/features/docker-in-docker:2": [],
    // "ghcr.io/devcontainers/features/go:1": [],
    "ghcr.io/devcontainers-contrib/features/elixir-asdf:2": [],
    "ghcr.io/devcontainers/features/github-cli:1": [],
    "ghcr.io/devcontainers/features/sshd:1": []
  },

  "customizations": {
    "vscode": {
      "settings": {
        "terminal.integrated.defaultProfile.linux": "bash"
      },
      "extensions": [
        "streetsidesoftware.code-spell-checker",
        "GitHub.copilot",
        "GitHub.copilot-chat",
        "github.vscode-pull-request-github",
        "jakebecker.elixir-ls",
        "phoenixframework.phoenix",
        "redhat.vscode-yaml"
      ]
    }
  },

  "postCreateCommand": "",

  // Use 'forwardPorts' to make a list of ports inside the container available locally.
  // This can be used to network with other containers or with the host.
  "forwardPorts": [4000, 8080, 80]

  // Use 'postCreateCommand' to run commands after the container is created.
  // "postCreateCommand": "mix deps.get",

  // "remoteUser": "vscode"
}

I'm trying to set up an Elixir environment.
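One side note on the snippet above: in the devcontainer spec, the value for each entry under "features" is an options object, so the empty arrays may be rejected by some implementations; `{}` is the conventional "no options" value. A hedged corrected fragment:

```jsonc
"features": {
  "ghcr.io/devcontainers-contrib/features/elixir-asdf:2": {},
  "ghcr.io/devcontainers/features/github-cli:1": {},
  "ghcr.io/devcontainers/features/sshd:1": {}
}
```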

@deniseschannon deniseschannon added the bug Something isn't working label May 8, 2024 — with Linear
@deniseschannon deniseschannon removed bug Something isn't working kind/bug labels May 8, 2024
@Pra3t0r5

I'm having the same issue on Bluefin (Fedora Silverblue).
[screenshot: 2024-05-27 16-33-47]

@zentasumu

zentasumu commented Jul 24, 2024

Issue is still present.

AppImage version 0.5.18, running on Fedora 40 Atomic Sway (Sericea).

@midoriiro

Still present in 0.5.19 on Silverblue. I did not encounter this issue on Sericea a few days ago on 0.5.18 and 0.5.19. Until this is fixed, a workaround is to set the Docker binary in your Docker provider to podman-remote. Since it does not depend on conmon, that should do the trick. Not ideal, but it works fine.

@3timeslazy

3timeslazy commented Aug 19, 2024

For everyone struggling with an SELinux-related "Permission denied" error, try adding the following to your devcontainer.json:

{
    // some fields

    "workspaceMount": "",
    "workspaceFolder": "/workspaces/${localWorkspaceFolderBasename}",
    "runArgs": [
        // other args
        "--volume=${localWorkspaceFolder}:/workspaces/${localWorkspaceFolderBasename}:Z"
    ]
}

This will make DevPod (and, I would guess, any other devcontainer implementation) mount everything with the :Z suffix, which relabels the files so the container can access them.
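To see what the runArgs entry amounts to outside DevPod, here is the equivalent bind mount spelled out as a plain podman invocation (the image name is reused from the issue; the command is printed rather than executed so the sketch is side-effect-free):

```shell
# ":Z" asks podman to relabel the bound host files with a private SELinux
# context so the container is allowed to read and write them.
WORKSPACE="$PWD"
TARGET="/workspaces/$(basename "$WORKSPACE")"
CMD="podman run --rm --volume=$WORKSPACE:$TARGET:Z mcr.microsoft.com/devcontainers/java:0-17"
echo "$CMD"   # printed, not executed, so this runs on any machine
```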
