Golang caching does not cache dependencies on action builds #1535
Hey! Thanks for filing this. For everyone's context, we currently have two ways of consuming the digger action. The first is downloading prebuilt binaries and invoking them. We build digger binaries on release, and we have a convention that release names start with "v", so that is what we check for in the action as well: if the ref starts with a "v" we download the available binary, otherwise we clone the repo and build from source. If you want to always use the latest digger, we have a release called "vLatest" which always points to the latest release branch.

Now to your issue, which is about slow builds due to no caching. I haven't investigated much, but I have noticed how long the builds take when you are working on and testing a feature. I sometimes create a temporary "v-xyz" release just so my tests run quickly, but I know that's not great. I'm not sure about using a third-party action for this because I have some reservations there, especially with something as critical as the toolchain itself. I would much rather have the solution built as a native GHA step, or something we custom-build ourselves. Thoughts?
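For readers outside the codebase, the branching described above amounts to roughly the following composite-action step. This is a minimal illustrative sketch, not the actual action.yml: the step name, clone target, and build path are assumptions.

```yaml
# Illustrative sketch of the two consumption paths described above (not the real action.yml)
- name: Get digger binary
  shell: bash
  env:
    actionref: ${{ github.action_ref }}
  run: |
    if [[ "${actionref}" == v* ]]; then
      # Release ref: download the prebuilt binary for this runner
      curl -sL "https://github.com/diggerhq/digger/releases/download/${actionref}/digger-cli-${{ runner.os }}-${{ runner.arch }}" -o digger
    else
      # Branch/SHA ref: clone and build from source (the slow, uncached path)
      git clone --depth 1 https://github.com/diggerhq/digger digger-src
      (cd digger-src && go build -o "$GITHUB_WORKSPACE/digger" ./cli)  # build path is an assumption
    fi
    chmod +x digger
```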
I hear you about adding another custom action from an unknown source; I am not a fan of that either for production areas. I was thinking more that, since the action is relatively simple, we could port the steps either into a composite in digger itself or apply the steps directly in the workflow. Adjacent to the issue above is how pushing versions to workflows is handled. Why do it this way? Here is how we are going about this inside the action: first I addressed the Golang build caching, though this really only helps builds after the change. This is what the code looks like (I redacted our info):

```yaml
- name: run digger
  if: ${{ startsWith(github.action_ref, 'v') || github.action_ref == 'latest' }}
  env:
    actionref: ${{ github.action_ref }}
    PLAN_UPLOAD_DESTINATION: ${{ inputs.upload-plan-destination }}
    GOOGLE_STORAGE_LOCK_BUCKET: ${{ inputs.google-lock-bucket }}
    GOOGLE_STORAGE_PLAN_ARTEFACT_BUCKET: ${{ inputs.upload-plan-destination-gcp-bucket }}
    AWS_S3_BUCKET: ${{ inputs.upload-plan-destination-s3-bucket }}
    ACTIVATE_VENV: ${{ inputs.setup-checkov == 'true' }}
    DISABLE_LOCKING: ${{ inputs.disable-locking == 'true' }}
    DIGGER_TOKEN: ${{ inputs.digger-token }}
    DIGGER_ORGANISATION: ${{ inputs.digger-organisation }}
    DIGGER_HOSTNAME: ${{ inputs.digger-hostname }}
    DIGGER_FILENAME: ${{ inputs.digger-filename }}
    ACCUMULATE_PLANS: ${{ inputs.post-plans-as-one-comment == 'true' }}
    REPORTING_STRATEGY: ${{ inputs.reporting-strategy }}
    INPUT_DIGGER_PROJECT: ${{ inputs.project }}
    INPUT_DIGGER_MODE: ${{ inputs.mode }}
    INPUT_DIGGER_COMMAND: ${{ inputs.command }}
    INPUT_DRIFT_DETECTION_SLACK_NOTIFICATION_URL: ${{ inputs.drift-detection-slack-notification-url }}
    NO_BACKEND: ${{ inputs.no-backend }}
    TF_PLUGIN_CACHE_DIR: ${{ github.workspace }}/cache
    TERRAGRUNT_PROVIDER_CACHE: ${{ inputs.cache-dependencies == 'true' && 1 || 0 }}
    TERRAGRUNT_PROVIDER_CACHE_DIR: ${{ github.workspace }}/cache
  id: digger
  shell: bash
  run: |
    if [[ "${actionref}" == "latest" ]]; then
      # Resolve the asset URL for the latest release, then download it
      RELEASE_URL=$(curl -s -H "Authorization: token $REDACTED_GITHUB_TOKEN" \
        https://api.github.com/repos/our_org/our_repo/releases/latest | \
        jq -r '.assets[] | select(.name | contains("digger-cli-${{ runner.os }}-${{ runner.arch }}")) | .url')
      curl -sL -H "Authorization: token $REDACTED_GITHUB_TOKEN" -H "Accept: application/octet-stream" -H "X-GitHub-Api-Version: 2022-11-28" $RELEASE_URL -o digger
    else
      # Versioned ref: download the matching release asset directly
      curl -sL -H "Authorization: token $REDACTED_GITHUB_TOKEN" https://github.com/our_org/our_repo/releases/download/${actionref}/digger-cli-${{ runner.os }}-${{ runner.arch }} -o digger
    fi
    chmod +x digger
    PATH=$PATH:$(pwd)
    cd $GITHUB_WORKSPACE
    digger
```
This does require an extra step of creating an org secret that has read access to the internal digger fork and passing it as a new env variable in the digger_workflow.yaml.
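For illustration, that wiring would look something like the following in the caller workflow. The secret name and version tag here are placeholders, not the actual ones:

```yaml
# Hypothetical wiring: expose an org secret (read access to the fork) to the digger step
- name: digger
  uses: our_org/our_repo@vX.Y.Z  # placeholder ref
  env:
    REDACTED_GITHUB_TOKEN: ${{ secrets.DIGGER_FORK_READ_TOKEN }}  # placeholder secret name
```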
I have an alternative idea for your case. Since you want to keep some kind of "latest" release always up to date, what if you created an automation action for releasing? Instead of releasing via GitHub releases directly, you would invoke the "releaser action". That way your teams can consume the latest by simply using "latest" or "v-latest", and you still have a stable release process based on GitHub. I saw this done in opentofu and I'm thinking of doing the same here as well. Currently we have a release called "vLatest" which I make sure to update on every release, but I'm planning to automate that part too.
@motatoes that's what we do, but you already have this working in your repo. Oddly, you already had this automated in 0.4.32, then removed it in 0.4.33? It did exactly what I wanted in creating a latest tag: https://github.com/diggerhq/digger/blob/v0.4.32/.github/workflows/cli_latest.yml That workflow does what you suggest. The issue we had was that the action always did a build instead of just downloading that binary, which is what I fixed in the action.
I did update it a bit to use modern output directives in GHA, and to handle private repos:

```yaml
name: Update latest tag for every new latest release
on:
  release:
    types:
      - released
permissions:
  contents: write
  actions: read
  pull-requests: write
  issues: read
  discussions: read
jobs:
  update_latest_tag:
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Check if this is the latest release
        id: check_latest_release
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          latest_release=$(curl -s -H "Authorization: token $GITHUB_TOKEN" https://api.github.com/repos/${{ github.repository }}/releases/latest | jq -r '.tag_name')
          echo "Latest release: $latest_release"
          echo "Current release: ${{ github.ref }}"
          if [[ "refs/tags/$latest_release" == "${{ github.ref }}" ]]; then
            echo "is_latest=true" >> $GITHUB_OUTPUT
          else
            echo "is_latest=false" >> $GITHUB_OUTPUT
          fi
      - name: Update latest tag
        if: steps.check_latest_release.outputs.is_latest == 'true'
        uses: EndBug/latest-tag@latest
        with:
          ref: latest
          description: Latest tag
          force-branch: false
      - name: Upload release asset
        if: steps.check_latest_release.outputs.is_latest == 'true'
        uses: softprops/action-gh-release@v2
        with:
          files: LICENSE
          token: ${{ secrets.GITHUB_TOKEN }}
```
@ben-of-codecraft this is cool! We removed it because I thought nobody was using the latest tag and we were planning to do that thing with the "vLatest" release, but clearly we were mistaken. We can bring it back using what you have up there if there's interest. As for caching: if we manage to extract what we need from that third-party action and have some native logic in our action.yml, I'm all for it too. Contributions welcome!
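For reference, extracting that logic natively would roughly mean caching the paths Go itself reports, keyed on go.sum. Below is a minimal sketch using plain actions/cache, assuming Go is already on the runner and the digger source has been cloned; it is not the project's agreed design:

```yaml
# Sketch: native Go caching in action.yml, no third-party toolchain action
- name: Resolve Go cache paths
  id: go-paths
  shell: bash
  run: |
    echo "build=$(go env GOCACHE)" >> "$GITHUB_OUTPUT"
    echo "mod=$(go env GOMODCACHE)" >> "$GITHUB_OUTPUT"

- name: Cache Go build and module caches
  uses: actions/cache@v4
  with:
    path: |
      ${{ steps.go-paths.outputs.build }}
      ${{ steps.go-paths.outputs.mod }}
    key: go-${{ runner.os }}-${{ hashFiles('**/go.sum') }}
    restore-keys: |
      go-${{ runner.os }}-
```

The key invalidates whenever go.sum changes, while restore-keys still allows partial reuse from an older cache.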
I'll see what I can do to help out with the caching part. For the latest tag, I'd recommend that approach, as it's prevalent in package management systems and package registries, so it will be familiar to most programmers.
Also, please bring back the latest tag action 🙏
If the action is not invoked with a versioned ref, Golang will not cache. This creates longer builds, as dependencies have to be downloaded every time. This is a bug in how composite actions and setup-go work together, and it has already been filed.
I found an alternate action that does effectively the same thing without using the built-in setup-go caching:
https://github.com/magnetikonline/action-golang-cache/blob/main/action.yaml
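For context, consuming that action looks roughly like this. The input name is quoted from memory of its README, so treat it as an assumption and verify it against the linked action.yaml:

```yaml
# Hedged sketch of using the alternate caching action; pin a version you have vetted
- name: Set up Go with working cache
  uses: magnetikonline/action-golang-cache@v5
  with:
    go-version: "1.22"  # assumed input name and format, per that action's README
```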
Unexpected Result:
(screenshot unavailable: the original image link has expired)
Updated action code:
With the changes above, the result:
Overall, this took repeat builds from ~2-3 minutes down to 30-40 seconds.