Restore spark tests #2201
Comments
@datapythonista can you give a link to read on why the tests were turned off? I only see there were some "space concerns"... I'm interested in bringing back OmniSci testing; can I help with something here?
I don't have a particular link. We got the CI red and had to merge 6 or 7 PRs to fix it, since several problems were going on at the same time. Two different things were affecting the omnisci tests: disk space and the conda solver.

For disk space, the approach was to split the tests into groups, so not all docker images were loaded in the same build; #2194 was the initial fix to that problem.

Then, when splitting the tests, we hit the second problem with omnisci: resolving the conda environment with `pymapd` was taking far too long. After getting the CI green, I had a look, and apparently requiring the latest version of `pymapd` was the cause. I readded the tests in #2205, but for some reason #2205 is failing, and I haven't identified why.

If you want to work on the branch of #2205 and see if you can identify the problem, that would be great. Otherwise I'll try to have a look myself later this week.
Spark tests are running again as of #2937. Closing.
To fix the CI, we needed to temporarily remove the tests for spark and omnisci, in #2194.
The original problem was that the CI was using more than the 10 GB of available disk space. After splitting the tests into two groups, to avoid downloading too many backend dependencies in a single build, omnisci and spark still give problems.
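The split described above can be sketched as a CI matrix, so each job only installs the backends it actually tests. This is a hypothetical GitHub Actions-style fragment; the job names, environment files, and marker names are illustrative assumptions, not the actual configuration used here:

```yaml
# Hypothetical sketch: one job per backend group, so a single build never
# has to download every docker image and conda environment at once.
jobs:
  test-backends:
    strategy:
      matrix:
        include:
          - group: pandas                       # illustrative group names
            env-file: environment.yml
          - group: omnisci
            env-file: omnisci-environment.yml   # assumed per-group env file
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: conda env create -f ${{ matrix.env-file }}
      - run: pytest -m ${{ matrix.group }}      # assumes tests carry per-group markers
```

Each matrix entry runs in its own runner with its own 10 GB of disk, which is what keeps any one build under the limit.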
In the case of omnisci, the problem is that installing `pymapd` on top of the rest of the libraries increases the conda environment resolution time by 30 minutes or more. We have around 50 dependencies, and after spending a decent amount of time checking whether pinning something would solve the problem, I couldn't find anything that did.

For spark, we get some errors that seem unrelated to the split. It's difficult to tell whether these tests were working recently, since the CI has been having problems for a while.
An example of spark failure: