We don't seem to get detail on test results? #183

Closed

max-sixty opened this issue Nov 6, 2021 · 11 comments

@max-sixty

I really liked the idea of this action, and added it to xarray.

Unfortunately — unless I'm missing it — we don't seem to be getting one of the big benefits of this: immediate detail on what failed in tests.

If we're doing something wrong in our configs, I'd love to know.

ref pydata/xarray#5946

Thank you!

@EnricoMi
Owner

EnricoMi commented Nov 6, 2021

Sure, can you point me to the commit that has test failures? Then I can have a look at what went wrong.

@max-sixty
Author

Thanks a lot @EnricoMi !

Here's an example: pydata/xarray#5873

I see there that there are no test failures reported in the comment, so possibly our config is incorrect. I thought I copied it from your (very clear) readme, but it's likely I made a mistake. I just had another look but can't see anything wrong; if you spot anything, that would be greatly appreciated.

@EnricoMi
Owner

EnricoMi commented Nov 6, 2021

Yeah, found another one: pydata/xarray#5734

The comment got overwritten by a later run, which had no test failures. Check the third last edit of the comment:
[screenshot]

It links to a check, and that one should contain some failures, but it does not: https://github.com/pydata/xarray/runs/4122341392

The log of the respective action run does not show any problems:
https://github.com/pydata/xarray/runs/4122335672?check_suite_focus=true#step:4:18

The strange thing is that the action creates multiple annotations (as can be seen in the log), but the check shows only 4. The first visible annotation says 13454 to 14159, so clearly all earlier annotations got lost, and the first annotation would have been the failure.

So the GitHub API somehow swallows some annotations. This is not good.

[screenshot]

@EnricoMi
Owner

EnricoMi commented Nov 6, 2021

In your example, the comment also got overwritten by a later run; the second to last edit has a failure:
[screenshot]
It links to this check: https://github.com/pydata/xarray/runs/3941131935

And that one has failure annotations:
[screenshot]

That check has all annotations.

@EnricoMi
Owner

EnricoMi commented Nov 6, 2021

@max-sixty your setup looks good; it looks like an issue with the GitHub API. However, I recommend using the latest setup: pydata/xarray#5947
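
For reference, a minimal sketch of the basic single-workflow setup from the action's README; the step name and the `files` glob are illustrative assumptions, not xarray's actual config:

```yaml
# Publish JUnit XML results after the test step, even when tests failed.
- name: Publish Unit Test Results
  uses: EnricoMi/publish-unit-test-result-action@v1
  if: always()                     # publish results even on test failure
  with:
    files: test-results/**/*.xml   # assumed location of the JUnit XML files
```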

@max-sixty
Author

@EnricoMi thank you so much! Let's give this a whirl!

@EnricoMi
Owner

EnricoMi commented Nov 7, 2021

I reckon this is a transient issue with the GitHub API. Let's monitor the annotations for a while to see if there are more instances where some are missing (for commits with and without failures).

@EnricoMi
Owner

@max-sixty I have looked through a few dozen check results in your project and all have the expected annotations. If you spot one run that produces incomplete annotations, simply rerun the publish workflow. This can be done without rerunning the tests themselves, as the workflows are separated (see the sketch below). If this occurs too frequently, the issue becomes reproducible and we could do some debugging.
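
To illustrate that separation, a hedged sketch: the test workflow uploads its results as an artifact, and this publish workflow, triggered by `workflow_run`, can be rerun on its own. Workflow, artifact, and path names are assumptions, and the cross-workflow artifact download here uses the third-party `dawidd6/action-download-artifact` rather than whatever xarray actually uses:

```yaml
name: Publish test results

on:
  workflow_run:
    workflows: ["CI"]        # assumed name of the workflow that runs the tests
    types: [completed]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      # Fetch the artifact uploaded by the triggering test workflow.
      - uses: dawidd6/action-download-artifact@v2
        with:
          run_id: ${{ github.event.workflow_run.id }}
          name: test-results   # assumed artifact name
          path: test-results
      # Publish against the commit the tests actually ran on.
      - uses: EnricoMi/publish-unit-test-result-action@v1
        with:
          commit: ${{ github.event.workflow_run.head_sha }}
          files: test-results/**/*.xml
```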

@max-sixty
Author

Great! Thanks a lot @EnricoMi ! I'll close this for now and reopen if we see this again, if that works for you.

@EnricoMi
Owner

@max-sixty I think I have found the issue that explains your observation: #215. When there are more than 50 annotations (test failures, test name list, ...), only the last (count modulo 50) annotations appear. It is now clear how this happened, and it is fixed in master. I am about to release a new version.
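
For the mechanics behind this: the GitHub Checks API caps `output.annotations` at 50 per request, and each check-run update appends its batch to the annotations already on the check run, so publishing more than 50 annotations requires repeated updates of at most 50 each. A minimal Python sketch of that batching idea (not the action's actual code; the function and its parameters are illustrative):

```python
import requests

def publish_annotations(owner, repo, check_run_id, token, title, summary, annotations):
    """Update a check run, sending at most 50 annotations per request.

    The Checks API appends each batch to the annotations already on the
    check run, so repeated PATCH calls accumulate all of them.
    """
    url = f"https://api.github.com/repos/{owner}/{repo}/check-runs/{check_run_id}"
    headers = {
        "Authorization": f"token {token}",
        "Accept": "application/vnd.github.v3+json",
    }
    for start in range(0, len(annotations), 50):
        batch = annotations[start:start + 50]
        payload = {"output": {"title": title, "summary": summary, "annotations": batch}}
        requests.patch(url, json=payload, headers=headers).raise_for_status()
```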

@max-sixty
Author

Great! Thanks @EnricoMi !
