Limit number of parallel GHA runs, print env info in tests #52

Merged: 9 commits merged into ray-project:master from fix-pyarrow on Feb 9, 2021

Conversation

krfricke (Collaborator) commented Feb 4, 2021

Unfortunately, GHA runs are sometimes aborted due to memory limits. Reducing the number of concurrent trials roughly doubles our test time, but at least these errors stop coming up.
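For context, a matrix workflow on GHA can cap simultaneous jobs with `jobs.<job_id>.strategy.max-parallel`. For the "print env info in tests" part, here is a minimal sketch of the kind of hook this could be, assuming a pytest `conftest.py`; the hook body and the exact fields printed are illustrative, not necessarily what this PR adds:

```python
# Illustrative conftest.py sketch (not necessarily this PR's actual code):
# print basic environment info once at session start, so CI logs show what
# resources were available if a run later gets killed.
import os
import platform

import ray


def pytest_sessionstart(session):
    # Standard pytest hook, invoked once before test collection starts.
    print(f"Python:    {platform.python_version()}")
    print(f"Platform:  {platform.platform()}")
    print(f"CPU count: {os.cpu_count()}")
    print(f"Ray:       {ray.__version__}")
```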

krfricke requested a review from amogkam on February 4, 2021 at 14:12
krfricke (Collaborator, Author) commented Feb 4, 2021

Well, it seems this doesn't solve the problem. Consider this a draft PR; I'll look into how to fix it later.

amogkam (Contributor) left a comment


Interesting, great catch!

amogkam (Contributor) commented Feb 4, 2021

Though it still looks like 2 of the jobs are failing.

krfricke (Collaborator, Author) commented Feb 4, 2021

Yes :-( Maybe we can move this to buildkite?

amogkam (Contributor) commented Feb 4, 2021

Ah sorry, I missed your comment about it not solving the problem. Do you think it has to do with using the ray cluster fixture with unittest.TestCase? Perhaps we should try switching back to pytest for test_colocation.
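For reference, a plain-pytest version of such a test would look roughly like the sketch below; the fixture and test names are illustrative and not the actual test_colocation code:

```python
# Rough sketch of a plain pytest fixture for a colocation-style test, instead
# of combining the ray cluster fixture with unittest.TestCase. Names here are
# illustrative only.
import pytest
import ray


@pytest.fixture
def ray_start_4_cpus():
    # Start a local Ray instance for the test and tear it down afterwards.
    info = ray.init(num_cpus=4)
    yield info
    ray.shutdown()


def test_colocation(ray_start_4_cpus):
    # A plain pytest test function consuming the fixture directly.
    assert ray.is_initialized()
```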

krfricke (Collaborator, Author) commented Feb 4, 2021

I don't think this is related to that; the tests were flaky before that change. I mean, we can try. I'll create a revert PR just in case.

krfricke (Collaborator, Author) commented Feb 4, 2021

That doesn't seem to be possible from within GitHub. I might try it manually tomorrow.

amogkam (Contributor) commented Feb 9, 2021

@krfricke should we merge this?

krfricke (Collaborator, Author) commented Feb 9, 2021

Yeah let's do it, though it doesn't solve the problem completely. But it is a good start in any case. Thanks for reminding me!

krfricke merged commit 3d13c95 into ray-project:master on Feb 9, 2021
krfricke deleted the fix-pyarrow branch on February 9, 2021 at 21:16