
[SPARK-26700][Core][FOLLOWUP] Add config spark.network.maxRemoteBlockSizeFetchToMem #27463

Closed

xuanyuanking wants to merge 2 commits into apache:master from xuanyuanking:SPARK-26700-follow

Conversation

@xuanyuanking
Member

What changes were proposed in this pull request?

Add a new config, spark.network.maxRemoteBlockSizeFetchToMem, that falls back to the old config spark.maxRemoteBlockSizeFetchToMem.

Why are the changes needed?

For naming consistency.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Existing tests.
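As a rough illustration of the fallback behavior described above, here is a standalone sketch of a config key that reads the new name first and falls back to deprecated alternates. This is a hypothetical model, not Spark's actual classes: ConfLookup and its get method are invented names, though the key names and deprecation versions match the PR.

```scala
// Hypothetical standalone model of a config key with deprecated alternate
// keys, mirroring spark.network.maxRemoteBlockSizeFetchToMem falling back
// to spark.maxRemoteBlockSizeFetchToMem. Not Spark's actual implementation.
case class AlternateConfig(key: String, version: String)

object ConfLookup {
  // New key -> deprecated alternate keys, with the Spark version in which
  // each old key was deprecated.
  val alternatives: Map[String, Seq[AlternateConfig]] = Map(
    "spark.network.maxRemoteBlockSizeFetchToMem" -> Seq(
      AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"),
      AlternateConfig("spark.maxRemoteBlockSizeFetchToMem", "3.0")))

  // Return the value for `key`, falling back to the first alternate key
  // that is present in the user's settings.
  def get(settings: Map[String, String], key: String): Option[String] =
    settings.get(key).orElse(
      alternatives.getOrElse(key, Nil)
        .flatMap(alt => settings.get(alt.key)).headOption)
}
```

With only the old key set, for example Map("spark.maxRemoteBlockSizeFetchToMem" -> "200m"), looking up the new key still resolves the value, which is the user-facing compatibility the PR preserves.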

Contributor

shall we move it to SparkConf.deprecatedConfigs?

Member Author

Thanks, done in 2d2637b.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117880 has finished for PR 27463 at commit 4201a93.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117886 has finished for PR 27463 at commit 1939c3f.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117890 has finished for PR 27463 at commit 4201a93.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117888 has finished for PR 27463 at commit 77714a3.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@Ngone51 (Member) left a comment

Should we consider using configsWithAlternatives for this case? It seems more convenient here.

@xuanyuanking
Member Author

Makes sense, done in 5c1f201.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117932 has finished for PR 27463 at commit 2d2637b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117940 has finished for PR 27463 at commit 5c1f201.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun
Member

Retest this please.

@SparkQA

SparkQA commented Feb 5, 2020

Test build #117946 has finished for PR 27463 at commit 5c1f201.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


private[spark] val MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM =
  ConfigBuilder("spark.maxRemoteBlockSizeFetchToMem")
private[spark] val NETWORK_MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM =
Contributor

we can keep using MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM

Member Author

Thanks, done in 8f7540e.

  AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3")),
NETWORK_MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM.key -> Seq(
  AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"),
  AlternateConfig("spark.maxRemoteBlockSizeFetchToMem", "3.0")),
Contributor

Will Spark log a warning message when these two alternatives are used? If not, we still need to put them in deprecatedConfigs.

@Ngone51 (Member) commented Feb 6, 2020

It will when the user uses an old key, since the deprecation warning also covers configsWithAlternatives:

def logDeprecationWarning(key: String): Unit = {
  deprecatedConfigs.get(key).foreach { cfg =>
    logWarning(
      s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
        s"may be removed in the future. ${cfg.deprecationMessage}")
    return
  }
  allAlternatives.get(key).foreach { case (newKey, cfg) =>
    logWarning(
      s"The configuration key '$key' has been deprecated as of Spark ${cfg.version} and " +
        s"may be removed in the future. Please use the new key '$newKey' instead.")
    return
  }
}

Member Author

I tested locally; Spark logs the warning messages.
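The allAlternatives lookup consulted by logDeprecationWarning above is, roughly, a reverse index from each deprecated key to its new key. A minimal self-contained sketch of how such an index can be derived from configsWithAlternatives (the exact derivation here is an assumption, not Spark's verbatim code):

```scala
// Hypothetical sketch of deriving a deprecated-key index from a
// new-key -> alternates registry; mirrors the shape discussed above
// but is not Spark's verbatim code.
case class AlternateConfig(key: String, version: String)

// New key -> its deprecated alternate keys, as registered in
// configsWithAlternatives.
val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
  "spark.network.maxRemoteBlockSizeFetchToMem" -> Seq(
    AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"),
    AlternateConfig("spark.maxRemoteBlockSizeFetchToMem", "3.0")))

// Reverse index: deprecated key -> (new key, alternate metadata).
// This is what the deprecation-warning path looks up when a user
// sets one of the old keys.
val allAlternatives: Map[String, (String, AlternateConfig)] =
  configsWithAlternatives.flatMap { case (newKey, alts) =>
    alts.map(alt => alt.key -> (newKey, alt))
  }
```

Looking up the old key spark.maxRemoteBlockSizeFetchToMem in this index yields the new key plus the version in which the old one was deprecated, which is exactly the information the warning message needs.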

@SparkQA

SparkQA commented Feb 6, 2020

Test build #117973 has finished for PR 27463 at commit 8f7540e.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

retest this please

@SparkQA

SparkQA commented Feb 6, 2020

Test build #117981 has finished for PR 27463 at commit 8f7540e.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

retest this please

@SparkQA

SparkQA commented Feb 6, 2020

Test build #117984 has finished for PR 27463 at commit 8f7540e.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

retest this please

@SparkQA

SparkQA commented Feb 6, 2020

Test build #117986 has finished for PR 27463 at commit 8f7540e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

thanks, merging to master/3.0!

@cloud-fan cloud-fan closed this in d861357 Feb 6, 2020
cloud-fan pushed a commit that referenced this pull request Feb 6, 2020
…kSizeFetchToMem`

### What changes were proposed in this pull request?
Add new config `spark.network.maxRemoteBlockSizeFetchToMem` fallback to the old config `spark.maxRemoteBlockSizeFetchToMem`.

### Why are the changes needed?
For naming consistency.

### Does this PR introduce any user-facing change?
No.

### How was this patch tested?
Existing tests.

Closes #27463 from xuanyuanking/SPARK-26700-follow.

Authored-by: Yuanjian Li <xyliyuanjian@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
(cherry picked from commit d861357)
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
sjincho pushed a commit to sjincho/spark that referenced this pull request Apr 15, 2020
