[SPARK-14454][1.6] Better exception handling while marking tasks as failed #12272
Conversation
Test build #55427 has finished for PR 12272 at commit
Test build #2772 has finished for PR 12272 at commit
Test build #2773 has finished for PR 12272 at commit
cc @JoshRosen, any idea what's going on?
We need to backport my
LGTM. |
Since this is a cherry-pick from master (after fixing some conflicts), I will not wait for the build to come back and am merging this into branch-1.6 now. Thanks!
…failed

Backports #12234 to 1.6. Original description below:

## What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks when the code within the block itself throws an exception. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem by implementing a utility function, `Utils.tryWithSafeCatch`, that suppresses (via `Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.

## How was this patch tested?

No new functionality added.

Author: Sameer Agarwal <sameer@databricks.com>

Closes #12272 from sameeragarwal/fix-exception-1.6.
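As background for the fix above, the suppression mechanism it relies on is the standard `java.lang.Throwable.addSuppressed` / `getSuppressed` API (available since Java 7). A minimal standalone demonstration, unrelated to Spark's actual code:

```scala
// Demonstrates Throwable.addSuppressed: a secondary exception (e.g. one thrown
// during cleanup) is attached to the primary exception instead of replacing it.
object SuppressedDemo {
  def main(args: Array[String]): Unit = {
    val primary = new RuntimeException("original cause")
    // Attach the cleanup failure without losing the primary exception.
    primary.addSuppressed(new IllegalStateException("failure during cleanup"))
    // getSuppressed exposes the attached exceptions; the primary is intact.
    primary.getSuppressed.foreach(s => println(s.getMessage))
  }
}
```

This is the same mechanism Java's try-with-resources uses when `close()` throws while another exception is already in flight.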
Could you close this PR? |
yes, thanks |
(cherry picked from commit c12db0d)
Backports #12234 to 1.6. Original description below:

## What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks when the code within the block itself throws an exception. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem by implementing a utility function, `Utils.tryWithSafeCatch`, that suppresses (via `Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.

## How was this patch tested?

No new functionality added.
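The pattern described above can be sketched as a small standalone helper. This is a simplified stand-in, not Spark's actual `Utils.tryWithSafeCatch` implementation: the object name, signature, and body here are assumptions for illustration only.

```scala
// Sketch (assumed, not Spark's real code) of the catch-block suppression pattern:
// run cleanup work inside a catch block; if the cleanup itself throws, attach
// that exception to the original cause via addSuppressed and rethrow the
// original, so the root failure is never lost.
object SafeCatch {
  def tryWithSafeCatch(cause: Throwable)(cleanup: => Unit): Nothing = {
    try {
      cleanup
    } catch {
      // Do not let a cleanup failure mask the original task failure.
      case t: Throwable => cause.addSuppressed(t)
    }
    throw cause
  }
}
```

In the `WriterContainer` example, the catch block would wrap `markTaskFailed`, `currentWriter.close()`, and `abortTask()` in such a call, so a failing `close()` surfaces as a suppressed exception on the original `SparkException` rather than replacing it.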