[SPARK-20605][Core][Yarn][Mesos] Deprecate not used AM and executor port configuration #17866
Conversation
Change-Id: I1280b8d803e22bd2084bdb4f49580c7955a2f476
Test build #76475 has finished for PR 17866 at commit

CC @squito

gosh I don't know about this off the top of my head -- maybe @tgravescs or @vanzin know? if not I'll take a closer look next week.
```diff
@@ -579,7 +579,9 @@ private[spark] object SparkConf extends Logging {
         "are no longer accepted. To specify the equivalent now, one may use '64k'."),
       DeprecatedConfig("spark.rpc", "2.0", "Not used any more."),
       DeprecatedConfig("spark.scheduler.executorTaskBlacklistTime", "2.1.0",
-        "Please use the new blacklisting options, spark.blacklist.*")
+        "Please use the new blacklisting options, spark.blacklist.*"),
+      DeprecatedConfig("spark.yarn.am.port", "2.2.1", "Not used any more"),
```
We can "backdate" this to 2.0.0 since that's when akka was removed, making these options obsolete.
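To illustrate the deprecation mechanism being discussed, here is a minimal, self-contained sketch of a deprecated-config table with the backdated version the reviewer suggests. The names mirror `SparkConf.deprecatedConfigs`, but this is an illustrative simplification, not the actual Spark implementation:

```scala
// Simplified, hypothetical sketch of Spark's deprecated-config lookup.
case class DeprecatedConfig(key: String, version: String, deprecationMessage: String)

object ConfDeprecations {
  // Backdated to 2.0.0, when the Akka-based RPC (and these ports) went away.
  private val configs: Map[String, DeprecatedConfig] = Seq(
    DeprecatedConfig("spark.yarn.am.port", "2.0.0", "Not used any more"),
    DeprecatedConfig("spark.executor.port", "2.0.0", "Not used any more")
  ).map(c => c.key -> c).toMap

  // Returns the warning to log when a user still sets a deprecated key.
  def warningFor(key: String): Option[String] =
    configs.get(key).map { c =>
      s"The configuration key '${c.key}' has been deprecated as of Spark " +
        s"${c.version} and may be removed in the future. ${c.deprecationMessage}"
    }
}
```

Setting a deprecated key then produces a warning rather than an error, which keeps old job submissions working while signaling the option is dead.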
```diff
@@ -221,7 +220,7 @@ private[spark] object CoarseGrainedExecutorBackend extends Logging {
       }

       val env = SparkEnv.createExecutorEnv(
-        driverConf, executorId, hostname, port, cores, cfg.ioEncryptionKey, isLocal = false)
+        driverConf, executorId, hostname, -1, cores, cfg.ioEncryptionKey, isLocal = false)
```
How about just removing the parameter from `createExecutorEnv`?
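The suggestion above can be sketched with hypothetical before/after signatures (illustrative names only, not the real `SparkEnv` API): instead of threading a dummy `-1` through, the unused parameter disappears entirely.

```scala
// Hypothetical stand-in for the executor-side environment.
case class ExecEnv(executorId: String, hostname: String, cores: Int)

// Before (sketch): callers had to pass a port that was never used.
def createExecutorEnvOld(executorId: String, hostname: String,
                         port: Int, cores: Int): ExecEnv =
  ExecEnv(executorId, hostname, cores) // port silently ignored

// After (sketch): the dead parameter is removed from the signature.
def createExecutorEnv(executorId: String, hostname: String, cores: Int): ExecEnv =
  ExecEnv(executorId, hostname, cores)

val env = createExecutorEnv("exec-1", "localhost", 4)
```

Dropping the parameter makes the "port is unused" fact visible in the API rather than hiding it behind a sentinel value.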
```diff
@@ -429,8 +429,7 @@ private[spark] class ApplicationMaster(
   }

   private def runExecutorLauncher(securityMgr: SecurityManager): Unit = {
-    val port = sparkConf.get(AM_PORT)
-    rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, port, sparkConf, securityMgr,
+    rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, -1, sparkConf, securityMgr,
```
It might be better to replace the two `create` parameters (`port` and `clientMode`) with a single `serverPort: Option[Int]` now; if it's set, a server is started, if it's not, it operates in client-only mode.

Probably ok to punt on that one though, since it will touch a lot more places I think.
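The refactor suggested above can be sketched as follows. All names here are illustrative stand-ins, not the actual `RpcEnv` API: the point is that `Option[Int]` makes the port/client-mode invariant unrepresentable as an inconsistent pair.

```scala
// Hypothetical sketch: collapse (port: Int, clientMode: Boolean) into one field.
// None => client-only mode, Some(p) => start a server on port p.
case class RpcEnvConfig(name: String, host: String, serverPort: Option[Int]) {
  def clientMode: Boolean = serverPort.isEmpty
}

def create(name: String, host: String, serverPort: Option[Int]): RpcEnvConfig = {
  // A real implementation would bind a transport server when serverPort is set.
  RpcEnvConfig(name, host, serverPort)
}

// The AM path in this PR would then become an explicit pure client:
val amEnv = create("sparkYarnAM", "localhost", serverPort = None)
```

With the current two-parameter shape, a caller can pass a real port together with `clientMode = true` and the port is silently ignored; the `Option[Int]` encoding removes that sentinel-value ambiguity (the `-1` in the diff above).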
This will touch a lot of places; I would be inclined to leave that `create` as it was.
Change-Id: Ia1ae58b27edce1283b507026cdc4c0bd3b35817c
Test build #76562 has started for PR 17866 at commit

Jenkins, retest this please.

Test build #76564 has finished for PR 17866 at commit

retest this please

Test build #76581 has finished for PR 17866 at commit

The SparkR tests seem broken for everybody. Merging to master.
[SPARK-20605][Core][Yarn][Mesos] Deprecate not used AM and executor port configuration

Author: jerryshao <sshao@hortonworks.com>

Closes apache#17866 from jerryshao/SPARK-20605.
What changes were proposed in this pull request?
After SPARK-10997, a client-mode Netty RpcEnv no longer needs to start a server, so these port configurations are not used any more; this change proposes removing the two configurations "spark.executor.port" and "spark.am.port".
How was this patch tested?
Existing UTs.
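The rationale in the description can be sketched concretely. This is an assumed, heavily simplified model of the client-mode behavior, not the real Netty `RpcEnv`: in client mode no server socket is bound, so any configured port is dead weight.

```scala
// Hypothetical sketch of why the port configs became no-ops after SPARK-10997.
case class RpcEnvSketch(name: String, boundPort: Option[Int])

def createRpcEnv(name: String, port: Int, clientMode: Boolean): RpcEnvSketch =
  if (clientMode) RpcEnvSketch(name, None)   // port argument is never consulted
  else RpcEnvSketch(name, Some(port))        // server mode binds the given port

// Executors and the YARN AM run in client mode, so values read from
// spark.executor.port / spark.yarn.am.port had no observable effect.
val executorEnv = createRpcEnv("sparkExecutor", 12345, clientMode = true)
```

Since the port argument is unreachable on the client path, deprecating the configurations (rather than keeping them as silent no-ops) is the user-visible cleanup this PR performs.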