
Conversation

@pratham76

What changes were proposed in this pull request?

This PR adds documentation for the minimum allowable executor memory value (450M) to prevent configuration errors, especially when using off-heap memory configurations.

Why are the changes needed?

Currently, Spark enforces a minimum executor memory of 450M (1.5 × the 300M reserved system memory) to prevent OOMs, via the validation logic in the snippets below from UnifiedMemoryManager.scala:

```scala
private val RESERVED_SYSTEM_MEMORY_BYTES = 300 * 1024 * 1024

val minSystemMemory = (reservedMemory * 1.5).ceil.toLong
```

(see https://github.com/apache/spark/blob/06013a90b5857b9981cedac099875e8d95b9f59e/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L264 and https://github.com/apache/spark/blob/06013a90b5857b9981cedac099875e8d95b9f59e/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L466)
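
For a quick sanity check, here is a minimal sketch of the arithmetic these two lines encode (the surrounding object is illustrative scaffolding, not Spark code):

```scala
// Minimal sketch of the minimum-executor-memory arithmetic.
// The object and names here are illustrative, not Spark internals.
object MinExecutorMemoryMath {
  def main(args: Array[String]): Unit = {
    val reservedSystemMemoryBytes = 300L * 1024 * 1024           // 314572800 bytes (300 MiB)
    val minSystemMemory = (reservedSystemMemoryBytes * 1.5).ceil.toLong
    println(minSystemMemory)                  // 471859200 bytes
    println(minSystemMemory / (1024 * 1024))  // 450 MiB, i.e. the 450M floor
  }
}
```

471859200 is exactly the threshold quoted in the INVALID_EXECUTOR_MEMORY error message later in this thread.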

However, this constraint is documented neither in the configuration guide nor in the code-level configuration definition. Without that documentation, users sizing off-heap memory (spark.memory.offHeap.size) alongside executor memory only discover the floor when validation fails at startup.
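
For illustration, a minimal configuration sketch (the app name and sizes are arbitrary assumptions): moving memory off-heap does not relax the floor, so spark.executor.memory must still be at least 450m on its own.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only; sizes are arbitrary. The 450m floor applies to
// spark.executor.memory even when most memory is allocated off-heap.
val spark = SparkSession.builder()
  .appName("offheap-sizing-sketch")
  .config("spark.executor.memory", "1g")          // must stay >= 450m
  .config("spark.memory.offHeap.enabled", "true")
  .config("spark.memory.offHeap.size", "2g")      // does not count toward the minimum
  .getOrCreate()
```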

Does this PR introduce any user-facing change?

Only doc changes.

How was this patch tested?

CI

Was this patch authored or co-authored using generative AI tooling?

No

pratham76 marked this pull request as ready for review November 12, 2025 17:13
@pratham76
Author

@dongjoon-hyun Could you please review? Thanks!

Member

@dongjoon-hyun left a comment


Thank you for making a PR. However, I'm not sure what is hard for users because we have a directive error message already, as you mentioned, like at least 471859200.

> The lack of documentation makes it hard for users to configure off-heap memory (spark.memory.offHeap.size) along with executor memory, leading to validation failures.

```
org.apache.spark.SparkIllegalArgumentException:
[INVALID_EXECUTOR_MEMORY] Executor memory 1048576 must be at least 471859200.
```

@pratham76
Author

> Thank you for making a PR. However, I'm not sure what is hard for users because we have a directive error message already, as you mentioned, like at least 471859200.
>
> > The lack of documentation makes it hard for users to configure off-heap memory (spark.memory.offHeap.size) along with executor memory, leading to validation failures.
>
> ```
> org.apache.spark.SparkIllegalArgumentException:
> [INVALID_EXECUTOR_MEMORY] Executor memory 1048576 must be at least 471859200.
> ```

Yes, I agree that the error message already tells users about this minimum requirement, but I thought it would be better to document it as well so that users are aware of it beforehand.

@dongjoon-hyun
Member

Got it.

```diff
 private[spark] val EXECUTOR_MEMORY = ConfigBuilder(SparkLauncher.EXECUTOR_MEMORY)
-  .doc("Amount of memory to use per executor process, in MiB unless otherwise specified.")
+  .doc("Amount of memory to use per executor process, in MiB unless otherwise specified, " +
+    "and with minimum allowable value of 450M")
```

450M -> 450m.


with minimum allowable value -> with the minimum value

```diff
 Amount of memory to use per executor process, in the same format as JVM memory strings with
-a size unit suffix ("k", "m", "g" or "t") (e.g. <code>512m</code>, <code>2g</code>).
+a size unit suffix ("k", "m", "g" or "t") (e.g. <code>512m</code>, <code>2g</code>,
+and with a minimum allowable value of <code>450M</code>).
```

450M -> 450m.


Ditto: with minimum allowable value -> with the minimum value

@pratham76
Author

Thank you for your comments. I have addressed them.

Member

@dongjoon-hyun left a comment


+1, LGTM. Thank you, @pratham76. Merged to master for Apache Spark 4.2.0.

zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025

Closes apache#53017 from pratham76/fix-docs.

Authored-by: Pratham Manja <prathammanja@ibm.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>