[SPARK-24981][CORE] ShutdownHook timeout causes job to fail when succeeded when SparkContext stop() not called by user program

**Description**
The issue is described in [SPARK-24981](https://issues.apache.org/jira/browse/SPARK-24981).

**How does this PR fix the issue?**
This PR catches the exception thrown while SparkContext.stop() is running when it is called by the ShutdownHookManager, as shown in the diff below.

**How was this patch tested?**
I manually tested it by adding a delay (60s) inside stop(). This makes the ShutdownHookManager interrupt the thread that is running stop(). The InterruptedException was caught and the job succeeded. A standalone sketch of this scenario follows.
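
The sketch below is a minimal, standalone Scala illustration of the failure mode the manual test reproduced and of how the try/catch in this patch guards against it. It is not Spark code: the object name ShutdownHookTimeoutSketch, the slowStop helper, and the sleep durations are hypothetical, and the watchdog thread only approximates the timeout enforcement that Hadoop's ShutdownHookManager performs in the real code path.

// Standalone sketch (not Spark code) of the scenario the manual test reproduced:
// a "shutdown hook" thread runs a slow stop(), and a watchdog interrupts it after
// a timeout, the way Hadoop's ShutdownHookManager does. Wrapping stop() in
// try/catch keeps the InterruptedException from escaping the hook body.
object ShutdownHookTimeoutSketch {

  // Stand-in for SparkContext.stop() with an artificial delay (the manual test
  // used 60s; scaled down here so the sketch finishes quickly).
  def slowStop(): Unit = {
    Thread.sleep(5000) // throws InterruptedException when the hook thread is interrupted
  }

  def main(args: Array[String]): Unit = {
    val hook = new Thread(() => {
      try {
        slowStop()
      } catch {
        case e: Throwable =>
          // Mirrors the fix: log and swallow so the shutdown sequence completes normally.
          println(s"Ignoring exception while stopping from shutdown hook: $e")
      }
    })
    hook.start()

    // Watchdog standing in for the ShutdownHookManager hook timeout.
    Thread.sleep(1000)
    hook.interrupt()
    hook.join()
    println("Hook thread finished without propagating the exception")
  }
}

Without the try/catch, the InterruptedException would propagate out of the hook body and, in the real code path, cause a job that had already succeeded to be reported as failed.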

Author: Hieu Huynh <“Hieu.huynh@oath.com”>
Author: Hieu Tri Huynh <hthieu96@gmail.com>

Closes #21936 from hthuynh2/SPARK_24981.
Hieu Huynh authored and tgravescs committed Aug 6, 2018
1 parent c1760da commit 35700bb
Showing 1 changed file with 6 additions and 1 deletion.
core/src/main/scala/org/apache/spark/SparkContext.scala: 6 additions & 1 deletion
@@ -571,7 +571,12 @@ class SparkContext(config: SparkConf) extends Logging {
     _shutdownHookRef = ShutdownHookManager.addShutdownHook(
       ShutdownHookManager.SPARK_CONTEXT_SHUTDOWN_PRIORITY) { () =>
       logInfo("Invoking stop() from shutdown hook")
-      stop()
+      try {
+        stop()
+      } catch {
+        case e: Throwable =>
+          logWarning("Ignoring Exception while stopping SparkContext from shutdown hook", e)
+      }
     }
   } catch {
     case NonFatal(e) =>
