
[SPARK-14454][1.6] Better exception handling while marking tasks as failed #12272

Closed


sameeragarwal
Member

Backports #12234 to 1.6. Original description below:

What changes were proposed in this pull request?

This patch improves exception handling inside catch blocks when the cleanup code within the block itself throws. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem with a utility function, `Utils.tryWithSafeCatch`, that suppresses (via `Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.
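The actual `Utils.tryWithSafeCatch` is internal to Spark; the snippet below is a minimal, self-contained sketch of the idea, with illustrative names and signature rather than the real Spark API:

```scala
object SafeCatch {
  // Run `block` from inside a catch handler. If `block` itself throws,
  // attach the new exception to the original `cause` via addSuppressed
  // instead of letting it mask the original failure.
  def tryWithSafeCatch[T](cause: Throwable)(block: => T): Unit = {
    try {
      block
    } catch {
      // Guard against suppressing an exception onto itself, which
      // Throwable.addSuppressed rejects with IllegalArgumentException.
      case t: Throwable if t ne cause =>
        cause.addSuppressed(t)
    }
  }
}

object Demo extends App {
  val cause = new RuntimeException("task failed while writing rows")
  // Simulate cleanup (e.g. markTaskFailed or writer.close) blowing up:
  SafeCatch.tryWithSafeCatch(cause) {
    throw new IllegalStateException("writer.close() failed")
  }
  // The cleanup failure is recorded, but the original cause survives:
  assert(cause.getSuppressed.length == 1)
  assert(cause.getSuppressed.head.getMessage == "writer.close() failed")
}
```

The caller can then rethrow `cause` (wrapped in a `SparkException`, as in the snippet above) knowing the cleanup failure is still visible in the stack trace as a suppressed exception rather than silently replacing the original one.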

How was this patch tested?

No new functionality added

@sameeragarwal
Member Author

cc @davies @rxin

@sameeragarwal sameeragarwal changed the title [SPARK-14454][SQL][1.6] Better exception handling while marking tasks as failed [SPARK-14454][1.6] Better exception handling while marking tasks as failed Apr 9, 2016
@SparkQA

SparkQA commented Apr 9, 2016

Test build #55427 has finished for PR 12272 at commit 0defe22.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 9, 2016

Test build #2772 has finished for PR 12272 at commit 0defe22.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 10, 2016

Test build #2773 has finished for PR 12272 at commit 0defe22.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@rxin
Contributor

rxin commented Apr 10, 2016

cc @JoshRosen any idea what's going on?

@JoshRosen
Contributor

We need to backport my mvn download change to branch-1.6, which is not as simple as a cherry-pick because those branches have diverged.

@davies
Contributor

davies commented Apr 11, 2016

LGTM.

@davies
Contributor

davies commented Apr 11, 2016

Since this is a cherry-pick from master (after fixing some conflicts), I will not wait for the build to come back. Merging this into branch-1.6 now, thanks!

asfgit pushed a commit that referenced this pull request Apr 11, 2016
…failed

Backports #12234 to 1.6. Original description below:

## What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks if the code within the block throws an exception. For instance here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem with a utility function, `Utils.tryWithSafeCatch`, that suppresses (via `Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.

## How was this patch tested?

No new functionality added

Author: Sameer Agarwal <sameer@databricks.com>

Closes #12272 from sameeragarwal/fix-exception-1.6.
@davies
Contributor

davies commented Apr 11, 2016

Could you close this PR?

@sameeragarwal
Member Author

yes, thanks

zzcclp pushed a commit to zzcclp/spark that referenced this pull request Apr 12, 2016
(cherry picked from commit c12db0d)