SPARK-2400: fix spark.yarn.max.executor.failures explanation #1282
Conversation
Merged build triggered.

Merged build started.

Merged build finished. All automated tests passed.

All automated tests passed.
docs/running-on-yarn.md (Outdated)
We haven't been listing deprecated configs, since we don't want people to continue using them. It would be nice if we had somewhere people could go to see the list of deprecated configs and their new mappings, but that is a separate JIRA.
Please file a JIRA to match this change.
Merged build triggered.

Merged build started.

Merged build finished. All automated tests passed.

All automated tests passed.
Thanks, looks good. +1
According to
```scala
private val maxNumExecutorFailures = sparkConf.getInt("spark.yarn.max.executor.failures",
sparkConf.getInt("spark.yarn.max.worker.failures", math.max(args.numExecutors * 2, 3)))
```
the default value is `numExecutors * 2`, with a minimum of 3, and the deprecated config `spark.yarn.max.worker.failures` is read as a fallback.
Author: CrazyJvm <crazyjvm@gmail.com>
Closes apache#1282 from CrazyJvm/yarn-doc and squashes the following commits:
1a5f25b [CrazyJvm] remove deprecated config
c438aec [CrazyJvm] fix style
86effa6 [CrazyJvm] change expression
211f130 [CrazyJvm] fix html tag
2900d23 [CrazyJvm] fix style
a4b2e27 [CrazyJvm] fix configuration spark.yarn.max.executor.failures
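For reference, a minimal standalone sketch of the fallback chain described above. The object and map-based config are hypothetical stand-ins (the real code reads a `SparkConf` inside the YARN ApplicationMaster); only the two config keys and the `math.max(numExecutors * 2, 3)` default come from the snippet quoted in the description.

```scala
// Hypothetical reproduction of the default-value logic, not the actual Spark code.
object MaxExecutorFailuresDefault {
  // Fallback chain: new key -> deprecated key -> computed default.
  def maxNumExecutorFailures(conf: Map[String, Int], numExecutors: Int): Int =
    conf.getOrElse("spark.yarn.max.executor.failures",
      conf.getOrElse("spark.yarn.max.worker.failures",
        math.max(numExecutors * 2, 3)))

  def main(args: Array[String]): Unit = {
    println(maxNumExecutorFailures(Map.empty, 1))   // 3: the minimum wins
    println(maxNumExecutorFailures(Map.empty, 10))  // 20: numExecutors * 2
    println(maxNumExecutorFailures(
      Map("spark.yarn.max.worker.failures" -> 7), 10)) // 7: deprecated key used as fallback
  }
}
```

This is why the docs change matters: the old documentation implied a fixed default, while the actual default scales with the number of requested executors and never drops below 3.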