[SPARK-11195][CORE] Use correct classloader for TaskResultGetter #9779

Closed · wants to merge 2 commits into master from choochootrain:spark-11195-master

Conversation

choochootrain

Make sure we are using the context classloader when deserializing failed TaskResults instead of the Spark classloader.

The issue is that `enqueueFailedTask` was using the incorrect classloader, which results in a `ClassNotFoundException`.

Adds a test in `TaskResultGetterSuite` that compiles a custom exception, throws it on the executor, and asserts that Spark deserializes the TaskResult properly instead of reporting `UnknownReason`.

See #9367 for previous comments.
See SPARK-11195 for a full repro.
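A minimal, self-contained sketch of the classloader issue described above (illustrative only, not the actual patch; the real change adjusts which classloader `enqueueFailedTask` uses, and the `ContextClassLoaderObjectInputStream` / `deserializeTaskFailure` names below are invented for this example):

```scala
import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

// Resolve classes through the *context* classloader, which can see user jars
// added via --jars, instead of only the classloader that loaded Spark itself.
class ContextClassLoaderObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] = {
    val loader = Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(getClass.getClassLoader) // fall back to the defining classloader
    Class.forName(desc.getName, false, loader)
  }
}

// Deserialize bytes that may contain a user-defined exception class. With the
// default class resolution, this is where ClassNotFoundException surfaces and
// the task's failure reason degrades to UnknownReason.
def deserializeTaskFailure(bytes: Array[Byte]): AnyRef = {
  val ois = new ContextClassLoaderObjectInputStream(new ByteArrayInputStream(bytes))
  try ois.readObject() finally ois.close()
}
```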

Make sure we are using the context classloader when deserializing failed
TaskResults instead of the Spark classloader.

Adds a test in TaskResultGetterSuite that compiles a custom exception,
throws it on the executor, and asserts that Spark handles the
TaskResult deserialization properly.
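A rough sketch of the shape of that test; `compileExceptionJar` and `repro.MyException` are placeholders for the TestUtils-based setup the actual suite performs:

```scala
test("failed task deserialized with the correct classloader (SPARK-11195)") {
  // Placeholder helper: compile an exception class that is on the application
  // classpath (shipped as a jar) but not on Spark's own classpath.
  val jarPath = compileExceptionJar("repro.MyException")
  sc.addJar(jarPath)

  // Throw the freshly compiled exception inside a task; reflection is needed
  // because the class does not exist when this suite itself is compiled.
  val e = intercept[SparkException] {
    sc.parallelize(Seq(1)).foreach { _ =>
      val cls = Thread.currentThread().getContextClassLoader.loadClass("repro.MyException")
      throw cls.getConstructor().newInstance().asInstanceOf[Exception]
    }
  }

  // Before the fix the driver hit ClassNotFoundException while deserializing the
  // TaskEndReason and reported UnknownReason; with the fix the real class name
  // shows up in the failure message.
  assert(e.getMessage.contains("repro.MyException"))
}
```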
@choochootrain
Author

@yhuai any other comments?

@yhuai
Contributor

yhuai commented Nov 18, 2015

ok to test

@SparkQA

SparkQA commented Nov 18, 2015

Test build #46124 has finished for PR 9779 at commit 93939c3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
      • class HashingTF(override val uid: String)
      • class Interaction @Since("1.6.0") (override val uid: String) extends Transformer
      • class Normalizer(override val uid: String)
      • class SQLTransformer @Since("1.6.0") (override val uid: String) extends Transformer with Writable
      • class Tokenizer(override val uid: String)
      • case class ArrayConversion(elementConversion: JDBCConversion) extends JDBCConversion
      • abstract class JdbcDialect extends Serializable

```scala
 * Before this fix, enqueueFailedTask would throw a ClassNotFoundException when deserializing
 * the exception, resulting in an UnknownReason for the TaskEndResult.
 */
test("failed task deserialized with the correct classloader") {
```
Contributor

I would put the (SPARK-11195) in the test name to be consistent with other tests.

Author

++

@SparkQA

SparkQA commented Nov 18, 2015

Test build #46192 has finished for PR 9779 at commit 8bd78af.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yhuai
Contributor

yhuai commented Nov 18, 2015

LGTM. Merging to master and branch 1.6. I will try to cherry-pick it to 1.5. If there is no major conflict, we will not need a 1.5 specific pr.

@yhuai
Contributor

yhuai commented Nov 18, 2015

Thank you for the fix!

asfgit pushed a commit that referenced this pull request Nov 18, 2015
Make sure we are using the context classloader when deserializing failed TaskResults instead of the Spark classloader.

The issue is that `enqueueFailedTask` was using the incorrect classloader which results in `ClassNotFoundException`.

Adds a test in TaskResultGetterSuite that compiles a custom exception, throws it on the executor, and asserts that Spark handles the TaskResult deserialization instead of returning `UnknownReason`.

See #9367 for previous comments
See SPARK-11195 for a full repro

Author: Hurshal Patel <hpatel516@gmail.com>

Closes #9779 from choochootrain/spark-11195-master.

(cherry picked from commit 3cca5ff)
Signed-off-by: Yin Huai <yhuai@databricks.com>
asfgit closed this in 3cca5ff Nov 18, 2015
asfgit pushed a commit that referenced this pull request Nov 18, 2015
Make sure we are using the context classloader when deserializing failed TaskResults instead of the Spark classloader.

The issue is that `enqueueFailedTask` was using the incorrect classloader which results in `ClassNotFoundException`.

Adds a test in TaskResultGetterSuite that compiles a custom exception, throws it on the executor, and asserts that Spark handles the TaskResult deserialization instead of returning `UnknownReason`.

See #9367 for previous comments
See SPARK-11195 for a full repro

Author: Hurshal Patel <hpatel516@gmail.com>

Closes #9779 from choochootrain/spark-11195-master.

(cherry picked from commit 3cca5ff)
Signed-off-by: Yin Huai <yhuai@databricks.com>

Conflicts:
	core/src/main/scala/org/apache/spark/TestUtils.scala
@yhuai
Contributor

yhuai commented Nov 18, 2015

I have merged this PR to master, branch-1.6, and branch-1.5. There was a minor conflict with the 1.5 branch, but I fixed it.
