[SPARK-9497] [SPARK-9509] [CORE] Use ask instead of askWithRetry
`RpcEndpointRef.askWithRetry` throws `SparkException` rather than `TimeoutException`. Replace it with `ask`, since no retry is needed here; waiting on the returned future with an explicit timeout lets the caller catch the `TimeoutException` directly.

Author: zsxwing <zsxwing@gmail.com>

Closes apache#7824 from zsxwing/SPARK-9497 and squashes the following commits:

7bfc2b4 [zsxwing] Use ask instead of askWithRetry
zsxwing authored and kayousterhout committed Jul 31, 2015
1 parent fc0e57e commit 04a49ed
Showing 1 changed file with 3 additions and 2 deletions.
```diff
@@ -27,7 +27,7 @@ import org.apache.spark.deploy.{ApplicationDescription, ExecutorState}
 import org.apache.spark.deploy.DeployMessages._
 import org.apache.spark.deploy.master.Master
 import org.apache.spark.rpc._
-import org.apache.spark.util.{ThreadUtils, Utils}
+import org.apache.spark.util.{RpcUtils, ThreadUtils, Utils}
 
 /**
  * Interface allowing applications to speak with a Spark deploy cluster. Takes a master URL,
@@ -248,7 +248,8 @@ private[spark] class AppClient(
   def stop() {
     if (endpoint != null) {
       try {
-        endpoint.askWithRetry[Boolean](StopAppClient)
+        val timeout = RpcUtils.askRpcTimeout(conf)
+        timeout.awaitResult(endpoint.ask[Boolean](StopAppClient))
       } catch {
         case e: TimeoutException =>
           logInfo("Stop request to Master timed out; it may already be shut down.")
```
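The pattern this commit introduces, firing an `ask` and waiting on the resulting future with an explicit timeout so that `TimeoutException` reaches the caller's `catch` block, can be sketched with plain Scala futures. This is an illustrative stand-in using `scala.concurrent`, not Spark's actual `RpcTimeout`/`RpcEndpointRef` implementation; the never-completed promise simulates a Master that has already shut down:

```scala
import java.util.concurrent.TimeoutException
import scala.concurrent.{Await, Promise}
import scala.concurrent.duration._

object StopWithTimeoutSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in for endpoint.ask[Boolean](StopAppClient) against a dead
    // Master: a future whose reply never arrives.
    val reply = Promise[Boolean]().future

    try {
      // Analogous to timeout.awaitResult(...): block for at most the
      // configured ask timeout, then give up with a TimeoutException.
      Await.result(reply, 100.millis)
    } catch {
      case _: TimeoutException =>
        // With plain `ask` the caller sees the TimeoutException directly;
        // `askWithRetry` would instead surface a SparkException after
        // exhausting its retries, which this handler would not match.
        println("Stop request to Master timed out; it may already be shut down.")
    }
  }
}
```

Retrying a stop request against an endpoint that may already be gone adds no value, which is why a single timed wait is the better fit here.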
