
[SPARK-6490][Docs] Add docs for rpc configurations #5607

Closed · wants to merge 5 commits from zsxwing/SPARK-6490-docs

Conversation

@zsxwing (Member) commented Apr 21, 2015

Added docs for rpc configurations and also fixed two places that should have been fixed in #5595.
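For readers skimming this thread, the following is a minimal sketch, not part of the PR itself, of supplying the settings the PR documents through SparkConf. The values shown are illustrative assumptions; only the configuration keys come from the PR.

    import org.apache.spark.SparkConf

    // Illustrative values only; the keys are the ones documented by this PR.
    val conf = new SparkConf()
      .setAppName("rpc-config-example")        // hypothetical application name
      .set("spark.rpc.askTimeout", "120s")     // timeout for RPC ask operations
      .set("spark.rpc.retry.wait", "3s")       // wait between retries of an RPC ask
      .set("spark.rpc.numRetries", "3")        // attempts before giving up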

@srowen (Member) commented Apr 21, 2015

LGTM

@SparkQA commented Apr 21, 2015

Test build #30659 timed out for PR 5607 at commit 4f07174 after a configured wait of 150m.

@zsxwing (Member, Author) commented Apr 21, 2015

retest this please

@SparkQA commented Apr 21, 2015

Test build #30667 has finished for PR 5607 at commit 4f07174.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • case class Data(boundary: Double, prediction: Double)
    • class DateConverter(object):
    • class DatetimeConverter(object):
  • This patch does not change any dependencies.

<td><code>spark.rpc.retry.wait</code></td>
<td>3s</td>
<td>
How long for an RPC ask operation to wait before starting the next retry.

Contributor: nit: "before retrying"

Contributor: Duration for an ...
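As a side note on the "3s" default above, the value is a time string. A small sketch, with assumed values not taken from the PR, of reading it back through SparkConf's time helpers (the same family of helpers used in the RpcUtils change further down):

    import org.apache.spark.SparkConf

    // Assumed values for illustration; only the key and its "3s" default come from the PR.
    val conf = new SparkConf()                        // spark.rpc.retry.wait not set yet
    conf.getTimeAsMs("spark.rpc.retry.wait", "3s")    // 3000L, the documented default
    conf.set("spark.rpc.retry.wait", "500ms")         // user override with an explicit unit
    conf.getTimeAsMs("spark.rpc.retry.wait", "3s")    // 500L after the override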

@vanzin (Contributor) commented Apr 21, 2015

LGTM.

<tr>
<td><code>spark.rpc.numRetries</code></td>
<td>3</td>
How many times for an RPC ask operation to retry before giving up.

Contributor: Number of times to retry before an RPC task gives up.

(We should also indicate whether 1 = retry once, or 1 = run it once in total)
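To make the ambiguity in that comment concrete, here is an illustrative sketch (not Spark's actual RPC code) in which numRetries = N means N attempts in total, so numRetries = 1 runs the operation once and never retries; the docs would need to state which reading Spark uses.

    import scala.util.control.NonFatal

    // Illustrative only: one possible reading of spark.rpc.numRetries.
    def askWithRetries[T](numRetries: Int, retryWaitMs: Long)(attempt: () => T): T = {
      require(numRetries >= 1, "numRetries must be at least 1")
      var lastError: Throwable = null
      var result: Option[T] = None
      var i = 0
      while (result.isEmpty && i < numRetries) {
        i += 1
        try {
          result = Some(attempt())                         // success on attempt i
        } catch {
          case NonFatal(e) =>
            lastError = e
            if (i < numRetries) Thread.sleep(retryWaitMs)  // spark.rpc.retry.wait between attempts
        }
      }
      result.getOrElse(throw lastError)                    // every attempt failed
    }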

@rxin (Contributor) commented Apr 21, 2015

Can you also change the default timeout? Thanks.

@zsxwing (Member, Author) commented Apr 21, 2015

Updated docs and the timeout.

@@ -48,11 +48,13 @@ object RpcUtils {

   /** Returns the default Spark timeout to use for RPC ask operations. */
   def askTimeout(conf: SparkConf): FiniteDuration = {
-    conf.getTimeAsSeconds("spark.rpc.askTimeout", "30s") seconds
+    conf.getTimeAsSeconds("spark.rpc.askTimeout",
+      conf.get("spark.network.timeout", "30s")) seconds

Contributor: Can we change it to the same timeout we set for spark.network.timeout elsewhere? I think we use a number higher than 30s.
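A quick sketch of what the fallback in this hunk means for users, with assumed values; the "30s" literal matches this revision of the patch and was later raised to 120s.

    import org.apache.spark.SparkConf

    // spark.rpc.askTimeout is not set, so the ask timeout falls back to spark.network.timeout.
    val conf = new SparkConf().set("spark.network.timeout", "100s")
    val askTimeoutSecs =
      conf.getTimeAsSeconds("spark.rpc.askTimeout", conf.get("spark.network.timeout", "30s"))
    // askTimeoutSecs == 100; with neither key set it would be 30 in this revision.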

@zsxwing (Member, Author) commented Apr 22, 2015

Increased the default timeout to 120s

@rxin (Contributor) commented Apr 22, 2015

LGTM

@SparkQA commented Apr 22, 2015

Test build #30713 has finished for PR 5607 at commit 6e37c30.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@rxin (Contributor) commented Apr 22, 2015

Thanks. I've merged this.

@asfgit closed this in 3a3f710 on Apr 22, 2015
@SparkQA commented Apr 22, 2015

Test build #30714 has finished for PR 5607 at commit 25a6736.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch adds the following new dependencies:
    • commons-math3-3.4.1.jar
    • snappy-java-1.1.1.7.jar
  • This patch removes the following dependencies:
    • commons-math3-3.1.1.jar
    • snappy-java-1.1.1.6.jar

@zsxwing deleted the SPARK-6490-docs branch on April 22, 2015 at 05:07
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015
Added docs for rpc configurations and also fixed two places that should have been fixed in apache#5595.

Author: zsxwing <zsxwing@gmail.com>

Closes apache#5607 from zsxwing/SPARK-6490-docs and squashes the following commits:

25a6736 [zsxwing] Increase the default timeout to 120s
6e37c30 [zsxwing] Update docs
5577540 [zsxwing] Use spark.network.timeout as the default timeout if it presents
4f07174 [zsxwing] Fix unit tests
1c2cf26 [zsxwing] Add docs for rpc configurations