CI: Restore PySpark tests #2212
Conversation
It looks like the CI will fail before it even gets to the tests until #2211 is in. In the meantime, some troubleshooting progress notes: currently, these PySpark tests fail (locally, but probably also will in the CI here):
In requirements-3.6-dev.yml and requirements-3.7-dev.yml, changing
to
makes the tests pass (locally) for 3.6 and 3.7. However, when … Will continue troubleshooting.
@timothydijamco so what we want to do is have a separate .yaml for each of the 3 backend sets, so there are no conflicts :-> IOW simply add one with as permissive deps as you can.
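jreback's suggestion could look roughly like the sketch below: a dedicated environment file per backend set, with the Spark one kept as loosely pinned as possible. The file name, env name, and every entry here are hypothetical illustrations, not taken from the repo:

```yaml
# requirements-spark-dev.yml -- hypothetical per-backend env file
# (name and pins are illustrative, not from the actual repo)
name: dev-spark
channels:
  - conda-forge
dependencies:
  - python=3.7
  - pyspark   # deliberately unpinned: "as permissive deps as you can"
  - pytest
```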
I'm refactoring all this code in #2205. The problem with Spark is not so much adding it back, but that the Spark tests are failing (that's why they were disabled). Maybe you want to have a look at that?
@@ -81,7 +96,10 @@ jobs:
    displayName: 'Setup BigQuery credentials'
    condition: eq(variables['System.PullRequest.IsFork'], 'False')

  - bash: make start PYTHON_VERSION=$PYTHON_VERSION BACKENDS="${BACKENDS}"
  - bash: |
      if [ ! -z "${BACKENDS}" ]; then
I think `-n` is the opposite of `-z` ;)
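That point can be checked directly in a POSIX shell: `[ -n "$x" ]` and `[ ! -z "$x" ]` are equivalent tests for a non-empty string, so the guard in the pipeline step could be written either way. A minimal sketch (the sample values are illustrative, not real `BACKENDS` settings):

```shell
# Show that [ -n "$v" ] and [ ! -z "$v" ] agree for empty
# and non-empty values (sample strings are illustrative).
for v in "" "spark impala" "pyspark"; do
  if [ -n "$v" ];   then n=yes;  else n=no;  fi
  if [ ! -z "$v" ]; then nz=yes; else nz=no; fi
  echo "value='$v' -n=$n !-z=$nz"
done
```

Both forms print the same yes/no answer for every value, which is why the review comment is purely a readability suggestion.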
@timothydijamco I think you'll want to close this PR. #2205 has been merged, and I don't think the changes here are relevant anymore. To restore Spark, it should be "as easy" as fixing the Spark tests and then just removing the … Closing this, but let me know if you disagree and want to continue here; happy to reopen.
@datapythonista I'll do this (below) now for the re-enabling aspect since #2205 is in:
@jreback
Sure, but I think it'll be easier to start in a new PR; the conflicts here won't be trivial to fix.
@datapythonista |
See #2201.
This PR is for restoring/troubleshooting the PySpark tests in CI (but not the Spark tests, which have issues of their own).
Locally I'm seeing the same failing PySpark tests that were mentioned in #2201, and I expect the CI for this PR to fail on the same tests for now.
I'm trying to do this in parallel with #2205, which has other CI config improvements I'd guess we want (probably some merging in the future).