
[SPARK-48116][INFRA][3.5] Run pyspark-pandas* only in PR builder and Daily Python CIs #46482

Closed

Conversation

dongjoon-hyun
Member

@dongjoon-hyun dongjoon-hyun commented May 8, 2024

What changes were proposed in this pull request?

This PR aims to run pyspark-pandas* of branch-3.5 only in the PR builder and Daily Python CIs. In other words, only the commit builder will skip it by default. Please note that PR builders do not consume ASF resources, and they provide lots of test coverage every day.

The branch-3.5 Python Daily CI runs all Python tests, including pyspark-pandas, like the following:

"pyspark": "true",
"pyspark-pandas": "true"

Why are the changes needed?

To reduce GitHub Action usage to meet ASF INFRA policy.

Although pandas is an optional package in PySpark, it is essential for PySpark users, and we have 6 test pipelines which require lots of resources. We need to optimize the job concurrency level to less than or equal to 20 while keeping as much test coverage as possible. The four affected pipelines are listed below; a configuration sketch follows the list.

# PySpark dependencies (optional)
numpy
pyarrow<13.0.0
pandas

  • pyspark-pandas
  • pyspark-pandas-slow
  • pyspark-pandas-connect
  • pyspark-pandas-slow-connect
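
As a rough illustration of the intended effect (a sketch only, not the actual diff of this PR), each of these pipelines can be gated in the reusable build workflow on the module map computed from the jobs input, so the default commit builder skips them while the PR builder and the daily Python CI, which enable "pyspark-pandas": "true", still run them. The job, output, and step names below are assumed.

# Sketch only: job and output names are illustrative, not the real workflow.
pyspark-pandas:
  name: "Build modules: pyspark-pandas"
  needs: precondition
  # Run only when the computed module map enables pyspark-pandas,
  # i.e. in the PR builder and the daily Python CI, not the commit builder.
  if: fromJson(needs.precondition.outputs.required)['pyspark-pandas'] == 'true'
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    # Actual test invocation elided; only the gating condition matters here.
    - run: echo "pyspark-pandas tests would run here"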

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Manual review.

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions github-actions bot added the INFRA label May 8, 2024
@dongjoon-hyun
Member Author

Could you review this backport PR, @viirya?

Since this was applied to the master branch successfully, I'm backporting it to branch-3.5.

@dongjoon-hyun
Member Author

Thank you so much, @viirya!

dongjoon-hyun added a commit that referenced this pull request May 8, 2024
[SPARK-48116][INFRA][3.5] Run pyspark-pandas* only in PR builder and Daily Python CIs

### What changes were proposed in this pull request?

This PR aims to run `pyspark-pandas*` of `branch-3.5` only in the PR builder and Daily Python CIs. In other words, only the commit builder will skip it by default. Please note that PR builders do not consume ASF resources, and they provide lots of test coverage every day.

The `branch-3.5` Python Daily CI runs all Python tests, including `pyspark-pandas`, like the following:

https://github.com/apache/spark/blob/21548a8cc5c527d4416a276a852f967b4410bd4b/.github/workflows/build_branch35_python.yml#L43-L44

### Why are the changes needed?

To reduce GitHub Action usage to meet ASF INFRA policy.
- https://infra.apache.org/github-actions-policy.html

    > All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices.

Although `pandas` is an **optional** package in PySpark, it is essential for PySpark users, and we have **6 test pipelines** which require lots of resources. We need to optimize the job concurrency level to `less than or equal to 20` while keeping as much test coverage as possible.

https://github.com/apache/spark/blob/a762f3175fcdb7b069faa0c2bfce93d295cb1f10/dev/requirements.txt#L4-L7

- pyspark-pandas
- pyspark-pandas-slow
- pyspark-pandas-connect
- pyspark-pandas-slow-connect

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual review.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #46482 from dongjoon-hyun/SPARK-48116-3.5.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
@dongjoon-hyun
Member Author

Merged to branch-3.5.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-48116-3.5 branch May 8, 2024 20:46