
ci: set a timeout for testing step of ten minutes #1133

Merged · 1 commit merged into eclipse-thingweb:master on Oct 24, 2023

Conversation

JKRhb
Member

@JKRhb JKRhb commented Oct 19, 2023

This might require some discussion/fine-tuning, but since some of our tests currently hang on a regular basis before eventually being canceled after reaching the six-hour maximum, this PR proposes limiting the actual testing step to a maximum of five minutes, so that the CI workflow is aborted much earlier than it is now.

I chose five minutes for now because, in general, the testing step seems to take about three minutes at most, but we could also consider setting it to a higher value – I guess anything lower than 360 minutes would already be an improvement 😅

I think a change like this will also help us receive feedback on failed tests much earlier than is currently the case, as any failing test present in a PR might trigger this hanging behavior.
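For illustration, the mechanism proposed here is GitHub Actions' per-step `timeout-minutes` setting. A minimal sketch (job and step names are illustrative, not necessarily those of node-wot's actual workflow file; the value matches the five minutes initially proposed):

```yaml
# Sketch only: illustrative job/step names, not node-wot's actual workflow.
name: ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - name: Run tests
        run: npm test
        # Abort this single step after 5 minutes instead of waiting for
        # GitHub Actions' 6-hour default job timeout.
        timeout-minutes: 5
```

Note that `timeout-minutes` can be set either on an individual step, as sketched here, or on the whole job.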

@codecov

codecov bot commented Oct 19, 2023

Codecov Report

All modified and coverable lines are covered by tests βœ…

Comparison is base (c14d4dd) 77.17% compared to head (0274659) 77.14%.
Report is 8 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1133      +/-   ##
==========================================
- Coverage   77.17%   77.14%   -0.03%     
==========================================
  Files          80       80              
  Lines       16245    16247       +2     
  Branches     1531     1538       +7     
==========================================
- Hits        12537    12534       -3     
- Misses       3685     3690       +5     
  Partials       23       23              

see 12 files with indirect coverage changes

β˜” View full report in Codecov by Sentry.
πŸ“’ Have feedback on the report? Share it here.

@JKRhb
Member Author

JKRhb commented Oct 19, 2023

The new behavior is very well demonstrated by this failed first attempt: https://github.com/eclipse-thingweb/node-wot/actions/runs/6580508670/attempts/1

@danielpeintner
Member

danielpeintner commented Oct 20, 2023

> I chose five minutes for now because in general, the testing step seems to take about three minutes at maximum, but we could also consider setting it to a higher value – I guess anything lower than 360 would already be an improvement 😅

I think the windows runs take about 7 to 8 minutes. Maybe a timeout of 10 to 15 minutes makes more sense. What do you think?

@JKRhb
Member Author

JKRhb commented Oct 20, 2023

>> I chose five minutes for now because in general, the testing step seems to take about three minutes at maximum, but we could also consider setting it to a higher value – I guess anything lower than 360 would already be an improvement 😅
>
> I think the windows runs take about 7 to 8 minutes. Maybe a timeout of 10 to 15 minutes makes more sense. What do you think?

Note that the timeout added here is only for the testing step, where the last run in this PR had the following durations:

  • MacOS, Node 18: 3m 8s
  • MacOS, Node 20: 3m 26s
  • Windows, Node 18: 2m 8s
  • Windows, Node 20: 1m 58s
  • Linux, Node 18: 1m 35s
  • Linux, Node 20: 1m 16s

So I think 5 minutes would be fine here; however, increasing the timeout to 10 minutes (or maybe something like 7?) wouldn't hurt, I suppose.

@JKRhb
Member Author

JKRhb commented Oct 20, 2023

Or would a timeout of 10 minutes for the whole run be better here?

@relu91
Member

relu91 commented Oct 21, 2023

CI timeouts should be a safety net in case anything else fails. 5 min is reasonably high given the timings we know now, but I would be happier increasing it to 10 or 15 minutes. A 1-minute timeout could be used as a default for unit tests. This hanging problem is definitely out of the ordinary and should be solved in the tests themselves.

@JKRhb JKRhb changed the title ci: set a timeout for testing step of five minutes ci: set a timeout for testing step of ten minutes Oct 21, 2023
@JKRhb
Member Author

JKRhb commented Oct 21, 2023

> CI timeouts should be a safety net in case anything else fails. 5 min is reasonably high given the timings we know now, but I would be happier increasing it to 10 or 15 minutes. A 1-minute timeout could be used as a default for unit tests. This hanging problem is definitely out of the ordinary and should be solved in the tests themselves.

I completely agree with you here! I have now set the timeout to 10 minutes for the testing step; I hope this is a good compromise as a fail-safe that prevents the tests from hanging for multiple hours after failing.

In general, I think that theoretically all of the tests should already time out after one minute or so, but for some reason the testing step does not abort the process and keeps on going until GitHub Actions pulls the plug, so to speak. This is definitely something we need to investigate – maybe it is somehow related to this problem with using asynchronous callbacks with EventListeners, or a similar obscure phenomenon in Node 🤔
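On the per-test timeouts mentioned above: assuming Mocha is the test runner (an assumption; adjust to the project's actual setup), a default timeout and a forced exit can be configured in `.mocharc.yml`. Mocha's `exit` option is relevant to the hanging behavior described here, since open handles (sockets, timers) can keep the Node process alive even after all tests have finished:

```yaml
# Sketch of a .mocharc.yml; assumes Mocha is the test runner.
timeout: 60000   # fail any single test still running after 60 s
exit: true       # force the process to exit once the run completes,
                 # even if stray handles would otherwise keep it alive
```

The same settings are available as the `--timeout` and `--exit` CLI flags.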

Member

@danielpeintner danielpeintner left a comment


The timeout looks good to me.

Anyhow, we still need to investigate what causes the issue.

Member

@relu91 relu91 left a comment


Good to go, thanks @JKRhb

@relu91 relu91 merged commit 247b0d4 into eclipse-thingweb:master Oct 24, 2023
12 checks passed
@JKRhb JKRhb deleted the timeout-test branch October 24, 2023 09:28