
[SPARK-20605][Core][Yarn][Mesos] Deprecate not used AM and executor port configuration #17866

Closed
wants to merge 2 commits

Conversation

jerryshao
Contributor

What changes were proposed in this pull request?

After SPARK-10997, the client-mode Netty RpcEnv no longer needs to start a server, so these port configurations are no longer used. This PR proposes deprecating and removing these two configurations: "spark.executor.port" and "spark.yarn.am.port".

How was this patch tested?

Existing UTs.

Change-Id: I1280b8d803e22bd2084bdb4f49580c7955a2f476
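The reasoning above (a client-mode RpcEnv only makes outbound connections and never binds a server socket, so any configured port is dead weight) can be sketched as follows. The names here are simplified stand-ins for Spark's internals, not the actual API:

```scala
// Minimal sketch: in client mode no server socket is bound, so a
// configured port value is never consulted. `RpcEnvLike` and `create`
// are hypothetical simplifications of Spark's RpcEnv machinery.
case class RpcEnvLike(name: String, boundPort: Option[Int])

def create(name: String, port: Int, clientMode: Boolean): RpcEnvLike = {
  if (clientMode) {
    // Client mode: only outbound connections; the port argument is ignored.
    RpcEnvLike(name, boundPort = None)
  } else {
    // Server mode: bind the requested port.
    RpcEnvLike(name, boundPort = Some(port))
  }
}

// Whatever "spark.executor.port" was set to, a client-mode env ignores it.
val env = create("sparkExecutor", port = 12345, clientMode = true)
assert(env.boundPort.isEmpty)
```

This is why the PR can simply pass a sentinel (or drop the value) wherever these configs were read.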
@SparkQA

SparkQA commented May 5, 2017

Test build #76475 has finished for PR 17866 at commit 3c9120e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented May 5, 2017

CC @squito

@squito
Contributor

squito commented May 5, 2017

gosh I don't know about this off the top of my head -- maybe @tgravescs or @vanzin know? if not I'll take a closer look next week.

@@ -579,7 +579,9 @@ private[spark] object SparkConf extends Logging {
"are no longer accepted. To specify the equivalent now, one may use '64k'."),
DeprecatedConfig("spark.rpc", "2.0", "Not used any more."),
DeprecatedConfig("spark.scheduler.executorTaskBlacklistTime", "2.1.0",
"Please use the new blacklisting options, spark.blacklist.*")
"Please use the new blacklisting options, spark.blacklist.*"),
DeprecatedConfig("spark.yarn.am.port", "2.2.1", "Not used any more"),
Contributor


We can "backdate" this to 2.0.0 since that's when akka was removed, making these options obsolete.
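The `DeprecatedConfig` entries above feed a registry that SparkConf consults to warn users. A loose sketch of that mechanism, with simplified names (not the actual `SparkConf` implementation), incorporating the backdated "2.0.0" version from the comment above:

```scala
// Sketch of a deprecated-config registry, modeled loosely on
// SparkConf.deprecatedConfigs; field names and the warning text
// are illustrative simplifications.
case class DeprecatedConfig(key: String, version: String, message: String)

val deprecated: Map[String, DeprecatedConfig] = Seq(
  DeprecatedConfig("spark.yarn.am.port", "2.0.0", "Not used anymore"),
  DeprecatedConfig("spark.executor.port", "2.0.0", "Not used anymore")
).map(c => c.key -> c).toMap

// Returns a warning for deprecated keys, None for everything else.
def warnIfDeprecated(key: String): Option[String] =
  deprecated.get(key).map { c =>
    s"The configuration key '${c.key}' has been deprecated as of " +
      s"Spark ${c.version} and may be removed in the future. ${c.message}"
  }
```

Deprecated keys keep working (with a warning) rather than failing hard, which is why deprecation is preferred over outright removal here.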

@@ -221,7 +220,7 @@ private[spark] object CoarseGrainedExecutorBackend extends Logging {
}

val env = SparkEnv.createExecutorEnv(
driverConf, executorId, hostname, port, cores, cfg.ioEncryptionKey, isLocal = false)
driverConf, executorId, hostname, -1, cores, cfg.ioEncryptionKey, isLocal = false)
Contributor


How about just removing the parameter from createExecutorEnv?
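The reviewer's suggestion is to drop the now-meaningless parameter entirely rather than thread a sentinel `-1` through. A hypothetical before/after with heavily simplified signatures (the real `createExecutorEnv` takes more arguments):

```scala
// Illustrative stand-in for the executor environment; not Spark's SparkEnv.
case class ExecutorEnv(executorId: String, hostname: String, cores: Int)

// Before (simplified): a port parameter that every caller must fill in,
// even though a client-mode RpcEnv never uses it.
def createExecutorEnvOld(executorId: String, hostname: String,
                         port: Int, cores: Int): ExecutorEnv =
  ExecutorEnv(executorId, hostname, cores) // port silently ignored

// After (simplified): no port parameter at all, so the dead value
// cannot leak into call sites.
def createExecutorEnv(executorId: String, hostname: String,
                      cores: Int): ExecutorEnv =
  ExecutorEnv(executorId, hostname, cores)

val env = createExecutorEnv("1", "worker-1", 4)
```

Removing the parameter makes the API honest about what it uses, at the cost of touching every caller.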

@@ -429,8 +429,7 @@ private[spark] class ApplicationMaster(
}

private def runExecutorLauncher(securityMgr: SecurityManager): Unit = {
val port = sparkConf.get(AM_PORT)
rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, port, sparkConf, securityMgr,
rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, -1, sparkConf, securityMgr,
Contributor

@vanzin vanzin May 5, 2017


It might be better to replace the two create parameters (port and clientMode) with a single serverPort: Option[Int] now; if it's set, a server is started, if it's not, it operates in client-only mode.

Probably ok to punt on that one though, since it will touch a lot more places I think.
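The suggested API change can be sketched like this: folding `port` and `clientMode` into one `serverPort: Option[Int]` makes the two modes mutually exclusive by construction. Names are illustrative, not Spark's actual `RpcEnv.create` signature:

```scala
// Sketch of the suggested API: a single serverPort: Option[Int] replaces
// the (port, clientMode) pair. `RpcEnvStub` is a hypothetical stand-in.
case class RpcEnvStub(name: String, serverPort: Option[Int]) {
  // Client mode is now simply the absence of a server port; the two
  // previously-independent parameters can no longer disagree.
  def isClientMode: Boolean = serverPort.isEmpty
}

def create(name: String, serverPort: Option[Int]): RpcEnvStub =
  RpcEnvStub(name, serverPort) // Some(p): start a server on p; None: client-only

val am = create("sparkYarnAM", serverPort = None) // AM no longer binds a port
assert(am.isClientMode)
```

As the comment notes, this refactor touches every `RpcEnv.create` call site, which is why it was punted here.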

Contributor Author


This will touch a lot of places; I'm inclined to leave that create as it was.

Change-Id: Ia1ae58b27edce1283b507026cdc4c0bd3b35817c
@SparkQA

SparkQA commented May 8, 2017

Test build #76562 has started for PR 17866 at commit ac710c7.

@jerryshao
Contributor Author

Jenkins, retest this please.

@SparkQA

SparkQA commented May 8, 2017

Test build #76564 has finished for PR 17866 at commit ac710c7.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented May 8, 2017

retest this please

@SparkQA

SparkQA commented May 8, 2017

Test build #76581 has finished for PR 17866 at commit ac710c7.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented May 8, 2017

The SparkR tests seem broken for everybody. Merging to master.

@asfgit asfgit closed this in 829cd7b May 8, 2017
liyichao pushed a commit to liyichao/spark that referenced this pull request May 24, 2017
…ort configuration

## What changes were proposed in this pull request?

After SPARK-10997, the client-mode Netty RpcEnv no longer needs to start a server, so these port configurations are no longer used. This PR proposes deprecating and removing these two configurations: "spark.executor.port" and "spark.yarn.am.port".

## How was this patch tested?

Existing UTs.

Author: jerryshao <sshao@hortonworks.com>

Closes apache#17866 from jerryshao/SPARK-20605.