Permission denied creating tar for Apt files #324
Comments
It has nothing to do with the problem itself, but you should think a little more about caching. Downloading and decompressing are also costly. Considering those, is 35 seconds really long? In my opinion, it's not.
And the current setup doesn't guarantee that the cached contents are up to date.
Hmm, I suppose 35 seconds isn't that long, but I'm not sure what the total Apt time will be if we start installing more packages. Semaphore (another CI/CD platform) recommends 10 minutes as the maximum time for continuous integration; any longer, and developers start changing their behaviour to work around the build system. So depending on what your current times are, I think that saving 30 seconds or 1 minute can help.
In my opinion, there is basically no need to run an update at all; it's probably faster if you don't do it. Also, the package manager is not a build system; it probably won't be as slow as you think (unless you explicitly make it so).
I fixed a similar issue by setting the restore target's permissions before the cache step. This took a 16-minute build down to 1 minute, because a full compile step for a utility binary could be skipped; there was no need to install the language's runtime, clone the remote repo, then build and install the binary. Saving time matters for GitHub Actions: anything that reduces the time for them to complete, even if it's only a few seconds, can have a big impact.
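For illustration only, here is a minimal sketch of that kind of permissions fix, assuming the cached path is a root-owned directory such as `/var/cache/apt/archives`; the exact command and path this commenter used are not shown in the thread:

```yaml
# Hand the root-owned restore target to the runner user before the cache
# step, so both the restore (extract) and the post-job save (tar) can
# touch it without sudo.
- name: Fix cache target permissions
  run: sudo chown -R "$(id -u):$(id -g)" /var/cache/apt/archives

- name: Cache apt archives
  uses: actions/cache@v3
  with:
    path: /var/cache/apt/archives
    key: ${{ runner.os }}-apt
```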
I was able to at least successfully save a cache of the downloaded packages:

```yaml
- name: Cache APT packages
  uses: actions/cache@v2
  with:
    path: |
      /var/cache/apt/archives/**.deb
      !/var/cache/apt/archives/partial
      !/var/cache/apt/archives/lock
    key: ${{ runner.os }}-apt
```

However, when attempting to restore this cache, the attempt to extract these cached files still results in a permission-denied error.
If we could just have an option for running the `tar` step with elevated permissions, that would solve this.
I have a workflow that opens a Nix shell. It has a lot of small dependencies, and it takes about 4 minutes to download them from various Nix binary caches. It seems like it would be faster to restore from GitHub's cache. Is there some reason the restore can't just run as root?
Caching system dependencies installed using `apt` is a feature that would be very useful to a lot of people! 👍
Hitting the same problem with the nix store, I found this workaround, which has unblocked us for the moment: https://github.com/WP2Static/wp2static-integration-tests/blob/900fb6e0f8d2c2b8656dfa3d9ac475b66d119806/.github/workflows/test.yml#L40-L49
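The linked workflow isn't reproduced here, but the general shape of that kind of workaround is to hand the Nix store to the runner user before the cache step; the store path and key below are assumptions for illustration, and the linked file may differ in detail:

```yaml
# Give the runner user ownership of /nix so the cache action's tar can
# read it during save and write it during restore.
- name: Make the Nix store writable by the runner user
  run: sudo chown -R "$(id -u):$(id -g)" /nix

- name: Cache Nix store
  uses: actions/cache@v3
  with:
    path: /nix/store
    key: ${{ runner.os }}-nix-${{ hashFiles('**/*.nix') }}
```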
This issue is stale because it has been open for 200 days with no activity. Leave a comment to avoid closing this issue in 5 days.
I wonder if anyone has tried changing the file permissions on the host before caching occurs, so the current user can rewrite the system cache. It is a bit of a security nightmare, but it might work, and it would only happen on CI.
I feel @joshmgross's comment here explains why we can't use `sudo` for `tar`.
Because of permissions, I feel this is not possible. A workaround can be to download the binaries into a directory and cache that; when restoring, install the binaries from that directory. Or you can do what @ssbarnea suggested.
FWIW, I saw that this action seems to offer caching for apt installs: https://github.com/awalsh128/cache-apt-pkgs-action. I'm not affiliated and don't know how the action implements caching, but I think it might be of interest here.
A quick look at that seems to indicate that it tries to cache the package files after installation, which seems extremely risky. I'd like to simply cache the downloaded files instead.
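For anyone evaluating that action, usage looks roughly like the following; the input names are quoted from memory of its README and should be treated as assumptions to verify, not a documented API reference:

```yaml
# Hypothetical usage of awalsh128/cache-apt-pkgs-action; check the
# action's README for the current inputs before relying on this.
- uses: awalsh128/cache-apt-pkgs-action@latest
  with:
    packages: doxygen graphviz
    version: 1.0   # bump to invalidate the cache
```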
It seems flakes are going to take over the world, so we're going to try them out for this. The thing that seems actually interesting is that since flakes require the `nix` command (rather than the set of `nix-*` commands we were using), it means we might be able to use the `--store` argument to specify a directory that can be used for caching. The idea is that maybe we can get around the issues pointed out in actions/cache#749 and actions/cache#324 (comment). That's the hope, anyway. Let's see if we can make it happen.
It seems flakes are going to take over the world, so we're going to try them out for this. The thing that is actually interesting about this change is that since flakes require the `nix` command (rather than the set of `nix-*` commands we were using), we can use the `--store` argument to specify a directory that can be used for caching. This way, we can get around the issues pointed out in actions/cache#749 and actions/cache#324 (comment). We'll move on to caching other stuff after this. But for now, just caching the `nix` stuff seems to knock about a minute off of CI. It was between 3m 30s and 4m; now it's sitting around 2m 45s. That's a ~20% decrease in CI time! Here's hoping future caching efforts yield other useful gains.
Branch: joneshf/switch-over-to-using-nix-flakes
Pull-Request: #83
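A rough sketch of what that `--store` approach could look like in a workflow; the store location, cache key, and flake layout here are assumptions for illustration, not the actual setup from that branch:

```yaml
# Keep the Nix store under the runner's home directory so the cache
# action can tar it without root. Assumes Nix with flakes is already
# installed earlier in the job.
- name: Restore local Nix store
  uses: actions/cache@v3
  with:
    path: ~/nix-store
    key: ${{ runner.os }}-nix-store-${{ hashFiles('flake.lock') }}

# --store points nix at that user-writable store root instead of /nix.
- name: Build the flake
  run: nix build --store "$HOME/nix-store"
```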
This issue is stale because it has been open for 200 days with no activity. Leave a comment to avoid closing this issue in 5 days.
Everything runs inside a Docker container anyway, so there is no reason not to permit this; we should be able to do what needs to be done.
This is also affecting us. I also agree that some sort of option to handle this would help.
Also getting hit by this; I don't know why this directory is being hit. It would be awesome to be able to pass args to tar or otherwise ignore files that can't be cached. The default behavior should be to just skip those files or warn about them. Seems like an easy change.
With the separate save/restore actions, you can now do something like:

```yaml
# Key the cache on a hash of the current apt package lists.
- name: Calculate apt sources hash
  id: apt-list
  run: |
    echo -n hash= >> $GITHUB_OUTPUT
    tar -cf - --sort=name /var/lib/apt/lists | sha256sum | cut -f 1 -d ' ' >> $GITHUB_OUTPUT

# Restore previously downloaded .deb files into a user-owned staging directory.
- name: Restore apt archives
  uses: actions/cache/restore@v3
  with:
    path: apt-archives
    key: ${{ env.ImageOS }}-apt-${{ steps.apt-list.outputs.hash }}
    restore-keys: |
      ${{ env.ImageOS }}-apt-

# Seed apt's cache from the staging directory, install, then copy any
# newly downloaded .deb files back out for the save step.
- name: Install OS dependencies
  run: |
    mkdir -p apt-archives
    sudo cp -a apt-archives/. /var/cache/apt/archives/
    sudo apt-get install -y my-package
    cp -a /var/cache/apt/archives/*.deb apt-archives

- name: Save apt archives
  uses: actions/cache/save@v3
  with:
    path: apt-archives
    key: ${{ env.ImageOS }}-apt-${{ steps.apt-list.outputs.hash }}
```
I've tried to cache these files in my workflow as well, but I hit the same error. How do I fix it?
… path list" This reverts commit 900a453. Refs: <actions/cache#324>
Based on @joshmgross's latest comment in #133 (comment), here's my use case, the same as @evandrocoan's in #133 (comment): caching Apt list and package files in Ubuntu.
Example workflow
The Apt step here takes about 35 seconds without caching. I'm hoping the cache will improve that. I'm also not 100% sure about the paths because I haven't been able to test it yet.
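Since the original example isn't reproduced above, here is a minimal sketch of the kind of workflow being described, assuming the goal is to cache `/var/lib/apt/lists` and `/var/cache/apt/archives` with `actions/cache`; the package name and key are placeholders, not the author's actual file. Both paths are root-owned on the Ubuntu runners, which is what leads to the permission error in the post job below.

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Cache both the package index and the downloaded .deb archives.
      # Both directories are owned by root, so the post-job save step's
      # tar invocation hits "Permission denied" without extra work.
      - name: Cache apt files
        uses: actions/cache@v3
        with:
          path: |
            /var/lib/apt/lists
            /var/cache/apt/archives
          key: ${{ runner.os }}-apt

      - name: Install dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y build-essential   # placeholder package
```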
Error during post job