
Conversation

@dongjoon-hyun
Member

What changes were proposed in this pull request?

This PR aims to increase `spark.kubernetes.allocation.batch.size` from 10 to 20 in Apache Spark 4.2.0.

Why are the changes needed?

Since Apache Spark 4.0.0, Apache Spark has used `10` as the default executor allocation batch size. This PR raises it further in 2025 to speed up executor provisioning on Kubernetes.
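To illustrate why a larger batch size speeds up allocation: executor pods are requested in batches, so the number of allocation rounds shrinks as the batch grows. The sketch below is illustrative arithmetic only, not Spark's actual allocator code; the function name is hypothetical.

```python
import math

def allocation_rounds(target_executors: int, batch_size: int) -> int:
    """Illustrative only: number of allocation rounds needed to request
    `target_executors` pods when they are requested `batch_size` at a time."""
    return math.ceil(target_executors / batch_size)

# For a 100-executor job, doubling the batch size halves the rounds:
print(allocation_rounds(100, 10))  # 10 rounds with the old default
print(allocation_rounds(100, 20))  # 5 rounds with the new default
```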

Does this PR introduce any user-facing change?

Yes. Users will see faster resource allocation for Spark jobs. The migration guide is updated accordingly.
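Users who prefer the previous behavior can still override the default at submit time. A minimal sketch, assuming a Kubernetes cluster deployment; the API server URL, container image, and example jar path are placeholders:

```shell
# Override the new default (20) back to 10 at submit time.
# <k8s-apiserver> and <spark-image> are placeholders for your cluster.
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kubernetes.allocation.batch.size=10 \
  local:///opt/spark/examples/jars/spark-examples.jar
```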

How was this patch tested?

Pass the CIs.

Was this patch authored or co-authored using generative AI tooling?

No.

@dongjoon-hyun
Member Author

Thank you, @HyukjinKwon . Merged to master for Apache Spark 4.2.0.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-54422 branch November 19, 2025 22:31
huangxiaopingRD pushed a commit to huangxiaopingRD/spark that referenced this pull request Nov 25, 2025
…to 20

- apache#49681

Closes apache#53134 from dongjoon-hyun/SPARK-54422.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
