Add Windows and ARM support to prerelease pipeline (NR-52415) #147
Conversation
Thanks for all the effort!
I simply left a couple of nits.
The pre-release action was tested by setting up the workflow file to trigger on pushes to this branch and using the S3 testing environment (see details in 9aa18c4). The pre-release was created and the assets were sent to the testing S3 bucket, as shown here: https://github.com/newrelic/newrelic-prometheus-exporters-packages/actions/runs/3836093769. The installability test failed because it checks whether the version corresponding to the tag was installed; however, that latest version is not in the staging S3 bucket because the testing one was used instead. Both the complete pre-release workflow and the release workflow (which was modified to include the s3-publish-schema generation) will be fully tested when we release one of the integrations, in a follow-up PR.
I am not approving this PR since I was the one who worked on it for a couple of weeks, but I am leaving the comments I can see now with fresh eyes.
Thanks for taking care of it!
LGTM, I just left a small nit.
@sigilioso Since we have all the context now, can you check whether we are already addressing #104 here?
It was not completely addressed, but after 8fa0e82 it is.
```yaml
with:
  go-version: ${{ env.GO_VERSION }}
  version: v1.13.1
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
should we replace this as well?
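One way to address this, sketched below, is to hoist the pinned version into a workflow-level env var alongside `GO_VERSION`. This is only an illustration of the suggestion; the `TOOL_VERSION` variable name and the example value are hypothetical, not taken from this PR:

```yaml
# Sketch only: pin tool versions in workflow-level env vars.
# TOOL_VERSION is a hypothetical name, not from this PR.
env:
  GO_VERSION: '1.19'   # example value
  TOOL_VERSION: v1.13.1

# ...then reference them in the step:
#   with:
#     go-version: ${{ env.GO_VERSION }}
#     version: ${{ env.TOOL_VERSION }}
```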
Sure, I missed that one and also missed this comment before merging; I'll replace it in a follow-up PR. Thanks!
```yaml
name: Create the publish schema
runs-on: ubuntu-latest
if: ${{ needs.check_exporter_preconditions.outputs.CREATE_RELEASE == 'true' }}
needs: [ check_exporter_preconditions ]
steps:
  - uses: actions/checkout@v3
  - run: make create-publish-schema-${{ needs.check_exporter_preconditions.outputs.INTEGRATION_NAME }}
  - uses: actions/upload-artifact@v3
    with:
      upload_url: ${{ needs.create_prerelease.outputs.UPLOAD_URL }}
      asset_path: ${{ github.workspace }}\src\repo\exporters\${{ steps.vars.outputs.NAME }}\target\packages\${{ steps.vars.outputs.PACKAGE_NAME }}-${{ matrix.goarch }}.${{ steps.vars.outputs.VERSION }}.msi
      asset_name: ${{ steps.vars.outputs.PACKAGE_NAME }}-${{ matrix.goarch }}.${{ steps.vars.outputs.VERSION }}.msi
      asset_content_type: msi package
      name: publish-schema
      retention-days: 1
      path: |
        scripts/pkg/s3-publish-schema-tmp.yml
```
Why create the publish schema in a separate job, upload it, and then download it again?
Isn't it easier to just call this directly, later on:
`make create-publish-schema-${{ needs.check_exporter_preconditions.outputs.INTEGRATION_NAME }}`?
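For reference, the inlined alternative being suggested would look roughly like this. This is only a sketch; the job name and step layout are illustrative, not taken from the PR:

```yaml
# Sketch of the suggested inlined alternative (job id is illustrative).
publish:
  runs-on: ubuntu-latest
  needs: [ check_exporter_preconditions ]
  steps:
    - uses: actions/checkout@v3
    # Generate the schema in the same job that consumes it,
    # instead of uploading/downloading it as an artifact.
    - run: make create-publish-schema-${{ needs.check_exporter_preconditions.outputs.INTEGRATION_NAME }}
```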
Another long story...
The publish action did not support `local-custom` when I worked on the pipeline; in fact, I had to create that feature: newrelic/infrastructure-publish-action#142
The first iteration was to upload the artifact using the action, use the API to get the URL of the artifact, and pass it to the publish action. A nasty, quick-and-dirty hack, but one that should have done the trick.
I learned that we cannot do that. The artifacts you upload are not immediately accessible via the API: if you list the artifacts before and after uploading one with the action, there are no changes. Artifacts are only available AFTER the workflow has finished.
Here is the link to the docs, but I'll save you a click:
> **Downloading or deleting artifacts**
> During a workflow run, you can use the download-artifact action to download artifacts that were previously uploaded in the same workflow run.
> After a workflow run has been completed, you can download or delete artifacts on GitHub or using the REST API.

Note the "After a workflow run has been completed".
How does the download action fetch the artifacts? After reverse-engineering it: the action calls a URL like `${config_variables_1.getRuntimeUrl()}_apis/pipelines/workflows/${config_variables_1.getWorkFlowRunId()}/artifacts?api-version=${getApiVersion()}`. The function `getRuntimeUrl()` simply looks for the environment variable `ACTIONS_RUNTIME_URL`.
The environment variable `ACTIONS_RUNTIME_URL` is only available INSIDE an action, not in a step that runs a script: if I run a step with `set`, that variable does not exist. These internal environment variables are undocumented: nektos/act#329
Someone created an action that dumps the environment variables from inside an action, and you can see that variable there: https://gist.github.com/sonots/070f1059b71dd2dec8e43ec513c3f8f0
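A minimal sketch of the difference described above; the step is illustrative, not taken from the PR:

```yaml
# Illustrative sketch: ACTIONS_RUNTIME_URL is injected into the runtime
# of actions, but a plain `run:` step does not see it.
steps:
  - name: Check for ACTIONS_RUNTIME_URL in a run step
    run: |
      # Expected to print "not set", since shell steps
      # do not receive this internal variable.
      echo "${ACTIONS_RUNTIME_URL:-not set}"
```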
Then you'll ask: why did this approach survive after that failure?
It was really useful to keep uploading the schema for debugging purposes, as you can fetch it and see how it templates, just in case the `yq` you have locally is not the same `yq` that GitHub is using (yes, there are two different tools called `yq`, one written in Python and one in Go).
It also parallelizes the jobs (a bit) and removes the need to check out the code (again) in the last job, where the artifacts are uploaded, making that job half a minute shorter.
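For completeness, the consuming job would fetch the schema with the matching download action within the same run, roughly like this. This is a sketch; the job id and `needs` entry are hypothetical placeholders:

```yaml
# Illustrative sketch of the consuming job: download the artifact
# uploaded by the "Create the publish schema" job in the same run.
publish:
  runs-on: ubuntu-latest
  needs: [ create_publish_schema ]   # hypothetical job id
  steps:
    - uses: actions/download-artifact@v3
      with:
        name: publish-schema
        path: scripts/pkg/
```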
thanks for taking care of it!
Force-pushed from 696b499 to 052bcaf
There are lots of things I had to tweak to make this pipeline work.
It has been tested in a fork repo, where the pipeline worked all the way up to uploading everything to the S3 bucket.
According to the CAOS team, that action tests that the repo signing is correct, and the pipeline cannot verify this because the fork used a fake GPG key made for this PR.
I am creating this PR without that part working correctly, as it is hard to test: verifying the repo signing requires the package-signing secrets.
Further effort is still needed to make this work.