Fixed code coverage aggregation and reporting #1868
Conversation
This improved coverage, but apparently Codecov is still flaky... There are quite frequent 404 errors in the coverage reporting. This happened in the ros2 tests, for instance. See https://community.codecov.com/t/upload-issues-unable-to-locate-build-via-github-actions-api/3954. We have
The upload also failed for the Arduino tests: https://github.com/lf-lang/lingua-franca/actions/runs/5389668535/jobs/9783967097?pr=1868
I added a retry mechanism; let's see how that goes.
2000 milliseconds seems like a pretty short retry interval. Maybe try increasing that by an order of magnitude or two? (Edit: This is motivated by the failure in the Zephyr tests, which appears to be a transient network issue.) Also, we might want to set
Thanks for addressing this, @cmnrd!
The problem seems to be more than a simple network issue. See https://community.codecov.com/t/upload-issues-unable-to-locate-build-via-github-actions-api/3954. They mention two "solutions":
I was under the impression that we were already doing 1, but apparently we were not. This PR now does both 1 and 2. You might be right about the retry delay; I increased it to 20 seconds and 5 retries.
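In shell terms, the retry behavior described here (5 attempts, 20 seconds apart) could be sketched roughly like this. This is an illustrative sketch, not the actual workflow code from this PR; `upload_coverage` is a placeholder for whatever command performs the Codecov upload.

```shell
# Sketch of a generic retry helper: retry <attempts> <delay-seconds> <command...>
# Returns 0 as soon as the command succeeds, 1 after exhausting all attempts.
retry() {
  attempts=$1
  delay=$2
  shift 2
  i=1
  while true; do
    if "$@"; then
      return 0
    fi
    if [ "$i" -ge "$attempts" ]; then
      echo "failed after $attempts attempts" >&2
      return 1
    fi
    echo "attempt $i failed; retrying in ${delay}s..." >&2
    sleep "$delay"
    i=$((i + 1))
  done
}

# Example usage (upload_coverage is a hypothetical placeholder):
# retry 5 20 upload_coverage report.xml
```

With a 20-second delay and 5 attempts, a transient 404 from Codecov's side has up to about 80 seconds to clear before the job actually fails.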
Recently, Codecov hasn't been very reliable, which caused problems in particular on master because the coverage reported in the badge on the main page jumped up and down. We didn't really notice because the jobs did not fail. If the jobs fail, we know that we need to restart them. I think this is also good behavior in PRs. We should pay more attention to code coverage, but for that we need to be able to trust the numbers.
I am a bit stuck here. After endless debugging, I learned that GitHub does not give workflows access to secrets in pull requests (https://securitylab.github.com/research/github-actions-preventing-pwn-requests/). That is why the Codecov token is not used... We have two options:
1. is bad, but not too bad... It basically allows anyone to upload reports to our Codecov repo. Attackers could mess with our coverage data, but this doesn't seem like a very attractive attack scenario. @lhstrh Do you think it is OK to put the Codecov token in plain text into the workflow configuration?
I'm inclined to just remove the optimization where we skip CI on master.
Yes, but this would mean we don't see coverage information in code review. I think that we should start paying more attention to test coverage when reviewing PRs. (Don't add untested code.) By the way, I am not sure if this is clear, but the skipping of tests on master was not responsible for the issues with code coverage that we saw.
I agree... But we did see code coverage in the PRs before. What has changed?
Yes, that is clear.
Codecov has issues with rate limits on GitHub's API, so some of the coverage reports are not properly submitted. We can ignore this (which we did before), but then the resulting coverage number is unreliable because we don't know whether it is complete. Codecov recommends specifying the token because this allows them to make fewer API calls (they use the token to match the repository and don't need to query GitHub to find it). This reduces the likelihood of errors significantly.
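For reference, passing the token to the upload step might look roughly like the following. This is a sketch assuming the commonly used codecov/codecov-action; the action version and the secret name `CODECOV_TOKEN` are assumptions, not taken from this PR.

```yaml
# Sketch of a workflow step that passes the repository token to Codecov.
# Note: in PRs from forks, GitHub does not expose secrets to the workflow,
# which is exactly the limitation discussed above.
- name: Report to Codecov
  uses: codecov/codecov-action@v3
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
    fail_ci_if_error: true  # make the job fail when the upload fails
```

Setting `fail_ci_if_error: true` matches the behavior discussed earlier: a failed upload fails the job instead of silently producing an incomplete coverage number.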
OK, I guess it's unlikely that anyone would be interested in manipulating our code coverage report, so there wouldn't be much harm in putting the token in the YAML, provided that it does not grant access to other, more sensitive privileges.
LGTM! How do we see whether it works?
It works if there are no CI errors.
There are some small problems in our current test coverage reporting. For instance, the diagram tests implemented in `cli/lfd` do not contribute to the coverage in `core` (where the diagram synthesis is implemented). This PR updates the Gradle setup to use the JaCoCo aggregation plugin. While this plugin automates test coverage aggregation across subprojects, it gives less control than the simple JaCoCo plugin. Therefore, we now filter packages out during the reporting to Codecov.
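A minimal sketch of what such an aggregation setup can look like with Gradle's built-in `jacoco-report-aggregation` plugin is shown below. This is not the actual build script from this PR; the subproject names are illustrative placeholders.

```kotlin
// build.gradle.kts of an aggregating project (illustrative sketch).
// The jacoco-report-aggregation plugin merges the JaCoCo execution data
// of the listed subprojects into a single XML report, which can then be
// uploaded to Codecov.
plugins {
    base
    id("jacoco-report-aggregation")
}

dependencies {
    // Subprojects whose coverage should be merged; names are placeholders.
    jacocoAggregation(project(":core"))
    jacocoAggregation(project(":cli:lfd"))
}

reporting {
    reports {
        val testCodeCoverageReport by creating(JacocoCoverageReport::class) {
            testType.set(TestSuiteType.UNIT_TEST)
        }
    }
}
```

Running `./gradlew testCodeCoverageReport` then produces an aggregated XML report under the aggregating project's `build/reports/jacoco/` directory. The trade-off mentioned above is that this plugin decides the aggregation for you, so package-level exclusions have to happen later, at reporting/upload time.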