
Out of memory when building multi-platform images #621

Closed
tk-nguyen opened this issue May 21, 2022 · 7 comments

@tk-nguyen

Hello, I'm having some issues building arm64 images with this action. Strangely, the build was fine in some previous jobs.

Expected behaviour

Build should finish normally.

Actual behaviour

Build got killed, probably because of memory issues.

Configuration

Here's the workflow:

on:
  pull_request:
  push:
    paths:
      - "src/**"
      - "Cargo*"
      - "Dockerfile"
    tags:
      - "v*.*.*"

name: Build and push image to GitHub Container Registry

jobs:
  build-and-push:
    name: Build and push image
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          submodules: "recursive"

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Set some variables for the image
        run: |
          echo "IMAGE_VERSION=${GITHUB_REF_NAME#v}" >> $GITHUB_ENV
          echo "IMAGE_NAME=${GITHUB_REPOSITORY,,}" >> $GITHUB_ENV
      - name: Build and push the image
        uses: docker/build-push-action@v3
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          cache-from: |
            type=gha
          cache-to: |
            type=gha
          tags: |
            ghcr.io/${{ env.IMAGE_NAME }}:latest
            ghcr.io/${{ env.IMAGE_NAME }}:${{ env.IMAGE_VERSION }}

Logs

2022-05-21T15:25:01.4848764Z #18 [linux/arm64 build 4/4] RUN cargo build --release
2022-05-21T15:25:53.8174393Z #18 1988.6    Compiling dtoa v0.4.8
2022-05-21T15:26:00.8748428Z #18 1995.6    Compiling rustc-demangle v0.1.21
2022-05-21T15:26:40.5438038Z #18 2035.3    Compiling crc32fast v1.3.2
2022-05-21T15:26:47.6095100Z #18 2042.4    Compiling rustls v0.20.6

<lots of logs.......>

2022-05-21T16:33:55.5166199Z #18 6070.3    Compiling color-eyre v0.5.11
2022-05-21T16:34:15.0620335Z #18 6089.9    Compiling reqwest v0.11.10
2022-05-21T16:34:52.0416710Z #18 6126.9    Compiling reqwest v0.9.24
2022-05-21T16:46:00.6039013Z #18 6795.4    Compiling ddg v0.5.0 (https://github.com/tk-nguyen/ddg#caf14098)
2022-05-21T16:47:39.2254588Z #18 6894.1    Compiling scraper v0.13.0
2022-05-21T16:53:17.2139199Z #18 7232.0 Killed
2022-05-21T16:53:17.3063516Z #18 ERROR: process "/bin/sh -c cargo build --release" did not complete successfully: exit code: 137
2022-05-21T16:53:17.3822671Z ------
2022-05-21T16:53:17.3823888Z  > [linux/arm64 build 4/4] RUN cargo build --release:
2022-05-21T16:53:17.3824683Z  tracing-error v0.1.2
2022-05-21T16:53:17.3825783Z   Compiling color-spantrace v0.1.6
2022-05-21T16:53:17.3826179Z   Compiling hyper-rustls v0.23.0
2022-05-21T16:53:17.3826561Z #18 6058.7    Compiling hyper-rustls v0.17.1
2022-05-21T16:53:17.3826928Z   Compiling color-eyre v0.5.11
2022-05-21T16:53:17.3827185Z #18 6089.9    Compiling reqwest v0.11.10
2022-05-21T16:53:17.3827454Z #18 6126.9    Compiling reqwest v0.9.24
2022-05-21T16:53:17.3828217Z   Compiling ddg v0.5.0 (https://github.com/tk-nguyen/ddg#caf14098)
2022-05-21T16:53:17.3828546Z #18 6894.1    Compiling scraper v0.13.0
2022-05-21T16:53:17.3828790Z #18 7232.0 Killed

I also tried to build locally using buildx, and the arm64 builds also got killed because the host machine ran out of memory. I saw people getting similar errors in docker/buildx#292 and docker/buildx#359. There is a proposal here, but there have been no updates since. Is there any other way I can limit the memory of the builders?

@yaleman

yaleman commented Oct 21, 2022

I'm having this issue when running "cargo fetch", the container gets OOMkilled before it even gets through that. Has anyone figured out a workaround?

luander added a commit to luander/ddnsrs that referenced this issue Oct 26, 2022
szabodanika added a commit to szabodanika/microbin that referenced this issue Nov 7, 2022
Make builds run in parallel because build is failing with out of memory error. See docker/build-push-action#654 (comment) and docker/build-push-action#621
szabodanika added a commit to szabodanika/microbin that referenced this issue Nov 7, 2022
Try to fix out of memory error on GH Actions build. See docker/build-push-action#654 (comment) and docker/build-push-action#621
szabodanika added a commit to szabodanika/microbin that referenced this issue Nov 7, 2022
Fix out of memory error on GH Actions build and automate GitHub Release with artifacts (#47). See docker/build-push-action#654 (comment) and docker/build-push-action#621
@MrHash

MrHash commented Nov 29, 2022

I found your blog post, @yaleman, and for me CARGO_NET_GIT_FETCH_WITH_CLI=true also resolves the OOM (exit code 137) errors during the cargo fetch stage when using buildx.

@tk-nguyen
Author

For anyone who stumbles across this issue: it seems to be a problem with libgit2 for Rust. Either set CARGO_NET_GIT_FETCH_WITH_CLI=true as above, or put

[net]
git-fetch-with-cli = true 

in the .cargo/config.toml file at the root of your project.
Reference: https://doc.rust-lang.org/cargo/reference/config.html#netgit-fetch-with-cli
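
For the environment-variable route, a minimal Dockerfile sketch (the base image tag and project layout are placeholders; slim and Alpine images additionally need git installed, as discussed further down the thread):

# The full Debian-based rust images ship git, which the CLI fetch shells out to.
FROM rust:1.67 AS build

# Make cargo invoke the git binary instead of its built-in libgit2, which is
# what is reported above to get OOM-killed during cargo fetch under emulation.
ENV CARGO_NET_GIT_FETCH_WITH_CLI=true

WORKDIR /app
COPY . .
RUN cargo build --release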

cbs228 added a commit to cbs228/sameold that referenced this issue Feb 19, 2023
Add workaround for docker/build-push-action#621. Fixes failing
ARM Linux builds.

See Also

1. <docker/build-push-action#621>
cbs228 added a commit to cbs228/sameold that referenced this issue Feb 20, 2023
Add workaround for docker/build-push-action#621. Fixes failing
ARM Linux builds.

Longer-term, we may want to explore pre-downloading with
`cargo vendor` or seeing if the Rust `cross` command is any
better.

See Also

1. <docker/build-push-action#621>
cbs228 added a commit to cbs228/sameold that referenced this issue Feb 25, 2023
Instead of downloading sources once per job, or trying to keep
cargo caches for them separately, use *one* dependency cache
that is generated with `cargo vendor`. The vendored sources
are platform-independent and support all targets and all
features.

This is a substantial speed boost for our Docker Actions on
non-x86_64 platforms, and it is also a workaround for
docker/build-push-action#621.

See Also

1. <docker/build-push-action#621>
@cbs228

cbs228 commented Feb 25, 2023

Either set CARGO_NET_GIT_FETCH_WITH_CLI=true as above

This solution works, but the Git executable needs to be part of the Docker image used for the build. You will either need to use a full Debian (not slim) image or add Git at build-time via the system package manager.

There is a faster way: execute cargo vendor on the Github Runner first. Then make sure the .cargo/ and vendor/ directories become part of your source tree for the Docker build. The vendor command downloads sources for all targets and features, and the resulting directories are OS- and architecture-independent.

My workflow takes advantage of the Github cache action. I run cargo vendor in its own job, before my main run matrix, and cache the result for later jobs. This substantially decreases the load on the Cargo infrastructure… and it's also faster.

Some things of note:

  • If your repo contains Cargo.lock, you can pin your cache to it. You must protect your Cargo.lock from EOL conversion—perhaps with .gitattributes.

  • If it does not, you will have to generate one and include it with your cache. Pin your cache to something else, like the Git SHA1.

  • If you build with a Dockerfile, it may be prudent to support building without a vendor cache as well.

EDIT: The full Debian images have Git, but the -slim and alpine images do not.
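
A rough sketch of that workflow shape (job names, cache keys, and the actions/cache usage here are illustrative rather than my exact setup):

jobs:
  vendor:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Cache the vendored sources, keyed on the lockfile.
      - uses: actions/cache@v3
        id: vendor-cache
        with:
          path: |
            vendor
            .cargo
          key: cargo-vendor-${{ hashFiles('Cargo.lock') }}

      # Download sources for every target and feature once. cargo vendor
      # prints the [source] replacement snippet, which needs to end up in
      # .cargo/config.toml so later builds read from vendor/.
      - if: steps.vendor-cache.outputs.cache-hit != 'true'
        run: |
          mkdir -p .cargo
          cargo vendor >> .cargo/config.toml

  build-and-push:
    needs: vendor
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Restore vendor/ and .cargo/ so they become part of the Docker
      # build context alongside the source tree.
      - uses: actions/cache@v3
        with:
          path: |
            vendor
            .cargo
          key: cargo-vendor-${{ hashFiles('Cargo.lock') }}

      # ...QEMU, buildx, login and docker/build-push-action steps as in the
      # workflow at the top of this issue.

Inside the Dockerfile, cargo build --release can then add --offline (or simply rely on the vendored source replacement), so nothing is fetched over the network under emulation.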

@tk-nguyen
Author

This solution works, but the Git executable needs to be part of the Docker image used for the build. The official rust images do not have Git. While it can be added, this introduces another dependency on the Debian or other packaging servers.

I don't think that's correct. The official Rust Docker image (Debian-based) is built from buildpack-deps:bullseye, which is based on buildpack-deps:bullseye-scm, which installs git, as seen here: https://github.com/docker-library/buildpack-deps/blob/598c0e2a0ff9ea3a4b8cb30e95007157d542da74/debian/bullseye/scm/Dockerfile#L10-L11. This is the case for all Debian versions.

Maybe you meant the Alpine-based Rust image? As far as I can see, it doesn't install git by default: https://github.com/rust-lang/docker-rust/blob/ee45fe4468bb81df3006fbfbef57adaa5df5af83/1.67.1/alpine3.17/Dockerfile#L3-L5

@cbs228

cbs228 commented Feb 26, 2023

Maybe you meant the Alpine-based Rust image? As far as I can see, it doesn't install git by default: https://github.com/rust-lang/docker-rust/blob/ee45fe4468bb81df3006fbfbef57adaa5df5af83/1.67.1/alpine3.17/Dockerfile#L3-L5

Thanks for checking this for me. You are correct! My build was using both the slim-buster and alpine variants, and neither of them has git. (slim-bullseye doesn't have git, either.) The full Debian images do have git, so CARGO_NET_GIT_FETCH_WITH_CLI=true should work "out of the box."
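
If you do want the CLI fetch on those variants, adding git in the build stage is a one-line change per base (a sketch using the standard package managers):

# Debian slim variants
RUN apt-get update \
    && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

# Alpine variants
RUN apk add --no-cache git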

I still believe my approach will result in faster and more efficient CI runs in many cases—especially when dependencies need to be (re)fetched or rebuilt. It is, however, slightly more complex.

@crazy-max
Member

You might also be interested in https://github.com/crazy-max/rust-docker-cross/blob/main/Dockerfile to build multi-platform images using cross-compilation with xx-cargo. More info: https://github.com/tonistiigi/xx/#rust
