Use spark.network.timeout as the default timeout if it is present
zsxwing committed Apr 21, 2015
1 parent 4f07174 commit 5577540
Showing 2 changed files with 7 additions and 4 deletions.
core/src/main/scala/org/apache/spark/util/RpcUtils.scala (4 additions, 2 deletions)
@@ -48,11 +48,13 @@ object RpcUtils {

   /** Returns the default Spark timeout to use for RPC ask operations. */
   def askTimeout(conf: SparkConf): FiniteDuration = {
-    conf.getTimeAsSeconds("spark.rpc.askTimeout", "30s") seconds
+    conf.getTimeAsSeconds("spark.rpc.askTimeout",
+      conf.get("spark.network.timeout", "30s")) seconds
   }
 
   /** Returns the default Spark timeout to use for RPC remote endpoint lookup. */
   def lookupTimeout(conf: SparkConf): FiniteDuration = {
-    conf.getTimeAsSeconds("spark.rpc.lookupTimeout", "30s") seconds
+    conf.getTimeAsSeconds("spark.rpc.lookupTimeout",
+      conf.get("spark.network.timeout", "30s")) seconds
   }
 }
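
For illustration, here is a minimal sketch of the fallback behaviour this change introduces, written against the public SparkConf API rather than the internal RpcUtils helper. The property names are the real Spark ones; the surrounding code is only an example, not part of the commit.

import org.apache.spark.SparkConf
import scala.concurrent.duration._

// Sketch only: resolve the RPC ask timeout the way the patched code does.
// spark.rpc.askTimeout wins if set; otherwise spark.network.timeout is used;
// otherwise the hard-coded "30s" default applies.
val conf = new SparkConf().set("spark.network.timeout", "120s")

val askTimeout: FiniteDuration =
  conf.getTimeAsSeconds("spark.rpc.askTimeout",
    conf.get("spark.network.timeout", "30s")).seconds   // 120 seconds here

// An explicit spark.rpc.askTimeout still overrides the shared default.
val conf2 = conf.clone().set("spark.rpc.askTimeout", "10s")
val askTimeout2: FiniteDuration =
  conf2.getTimeAsSeconds("spark.rpc.askTimeout",
    conf2.get("spark.network.timeout", "30s")).seconds  // 10 seconds

The practical effect is that a single spark.network.timeout setting now also covers the RPC ask and lookup timeouts unless they are configured explicitly.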
docs/configuration.md (3 additions, 2 deletions)
@@ -963,8 +963,9 @@ Apart from these, the following properties are also available, and may be useful
   <td>
     Default timeout for all network interactions. This config will be used in place of
     <code>spark.core.connection.ack.wait.timeout</code>, <code>spark.akka.timeout</code>,
-    <code>spark.storage.blockManagerSlaveTimeoutMs</code> or
-    <code>spark.shuffle.io.connectionTimeout</code>, if they are not configured.
+    <code>spark.storage.blockManagerSlaveTimeoutMs</code>,
+    <code>spark.shuffle.io.connectionTimeout</code>, <code>spark.rpc.askTimeout</code> or
+    <code>spark.rpc.lookupTimeout</code> if they are not configured.
   </td>
 </tr>
 <tr>
