
[SPARK-48204][INFRA] Fix release script for Spark 4.0+ #46484

Closed
cloud-fan wants to merge 1 commit into apache:master from cloud-fan:re

Conversation

cloud-fan (Contributor)

What changes were proposed in this pull request?

Before Spark 4.0, Scala 2.12 was the primary Scala version and Scala 2.13 the secondary one. The release scripts build the full set of packages (hadoop3, without-hadoop, pyspark, sparkr) for the primary Scala version but only one package for the secondary. Spark 4.0 removes Scala 2.12 support, so the release script needs to be updated accordingly.
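To make the package-matrix logic concrete, here is a minimal, hypothetical sketch of the branching the script needs. This is not the actual release script (the real change lives in the `dev/create-release` scripts); all variable names and the `echo` placeholders are illustrative only:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the primary/secondary Scala split described above.
# Not the actual release script; names and structure are illustrative only.
SPARK_VERSION="${SPARK_VERSION:-4.0.0}"

if [[ "${SPARK_VERSION%%.*}" -ge 4 ]]; then
  PRIMARY_SCALA="2.13"   # Spark 4.0+ drops Scala 2.12, so 2.13 is primary
  SECONDARY_SCALA=""     # no secondary Scala version anymore
else
  PRIMARY_SCALA="2.12"
  SECONDARY_SCALA="2.13"
fi

# The full package matrix is built only for the primary Scala version...
for pkg in hadoop3 without-hadoop pyspark sparkr; do
  echo "Building package '${pkg}' with Scala ${PRIMARY_SCALA}"
done

# ...while the secondary Scala version, if any, gets a single package.
if [[ -n "${SECONDARY_SCALA}" ]]; then
  echo "Building package 'hadoop3' with Scala ${SECONDARY_SCALA}"
fi
```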

Why are the changes needed?

To make the release scripts work for Spark 4.0 releases.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Manual testing.

Was this patch authored or co-authored using generative AI tooling?

No.

@cloud-fan (Contributor, Author)

Note: this does not fix all the issues. The next one I'm debugging is a PySpark version-number mismatch: the script produces PySpark packages with version 4.0.0.dev0, but at the end it tries to find 4.0.0.dev (sketched below).

cc @HyukjinKwon @dongjoon-hyun
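For context on the mismatch above, a hedged illustration of the kind of version-string translation involved. This is hypothetical, not the actual script code; it assumes PySpark uses a PEP 440-style dev version such as `4.0.0.dev0` while the Maven build uses `4.0.0-SNAPSHOT`:

```bash
# Hypothetical illustration of the version mismatch, not the actual script.
MAVEN_VERSION="4.0.0-SNAPSHOT"

# Translation that matches what the PySpark build actually produces:
BUILT_VERSION="${MAVEN_VERSION/-SNAPSHOT/.dev0}"    # -> 4.0.0.dev0

# A translation that drops the trailing "0" would look for a package
# that was never built:
LOOKED_UP_VERSION="${MAVEN_VERSION/-SNAPSHOT/.dev}" # -> 4.0.0.dev

echo "built:      pyspark-${BUILT_VERSION}.tar.gz"
echo "looked for: pyspark-${LOOKED_UP_VERSION}.tar.gz"  # not found
```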

@dongjoon-hyun (Member)

Ack!

@dongjoon-hyun (Member) left a review comment


+1, LGTM.

@dongjoon-hyun (Member) commented on May 8, 2024

Feel free to merge and proceed to the remaining release work, @cloud-fan.
I can help you actively.

@dongjoon-hyun (Member)

Merged to master.

JacobZheng0927 pushed a commit to JacobZheng0927/spark that referenced this pull request on May 11, 2024:
Closes apache#46484 from cloud-fan/re.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>