
[SPARK-12339] [WebUI] Added a null check that was removed in SPARK-11206 #10405

Closed
wants to merge 1 commit into from

Conversation

ajbozarth
Member

Updates made in SPARK-11206 missed an edge case which causes a NullPointerException when a task is killed. In some cases, when a task ends in failure, taskMetrics is initialized as null (see JobProgressListener.onTaskEnd()). To address this, a null check was added. Before the changes in SPARK-11206, this null check was performed at the start of the updateTaskAccumulatorValues() function.
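The guard described above can be sketched as follows. This is a minimal illustration of the pattern, not the actual patch: the class and method bodies are simplified stand-ins for the real Spark internals.

```scala
// Simplified stand-in for Spark's TaskMetrics; the real class is richer.
case class TaskMetrics(accumulatorValues: Map[Long, Any])

object AccumulatorUpdater {
  // Before SPARK-11206, the real updateTaskAccumulatorValues returned
  // immediately when taskMetrics was null; this sketch restores that guard.
  def updateTaskAccumulatorValues(taskMetrics: TaskMetrics): Seq[Long] = {
    if (taskMetrics == null) {
      // A killed or failed task may report null metrics
      // (see JobProgressListener.onTaskEnd), so bail out early.
      return Seq.empty
    }
    taskMetrics.accumulatorValues.keys.toSeq
  }
}
```

With the guard in place, a null taskMetrics simply yields an empty result instead of an NPE, matching the pre-SPARK-11206 behavior.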

@ajbozarth
Member Author

Looping in those involved with SPARK-11206: @carsonwang @JoshRosen @vanzin

@ajbozarth
Member Author

FYI the line in JobProgressListener.onTaskEnd that initializes the null value is 387.

@SparkQA

SparkQA commented Dec 21, 2015

Test build #48089 has finished for PR 10405 at commit 0a46559.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@carsonwang
Contributor

Thanks for catching this. I think the null check here is necessary, and it seems the code that actually passes a null taskMetrics is TaskSetManager line 796, where a task is resubmitted because of an executor loss.

@srowen
Member

srowen commented Dec 21, 2015

Maybe a dumb question, but is it right that calling updateTaskAccumulatorValues is meaningless if there are no taskMetrics? That is, should the call not happen at all, or should a different arg be passed rather than the one that causes an NPE? Otherwise LGTM.

@ajbozarth
Member Author

@srowen I thought the same thing, so I looked into the git history and git blame and found that this is how updateTaskAccumulatorValues has always worked: its original code, up until the change in SPARK-11206, had a null check that immediately returned at the start of the function.

@andrewor14
Contributor

OK, merging into master, thanks.

@asfgit asfgit closed this in b0849b8 Dec 21, 2015
@ajbozarth ajbozarth deleted the spark12339 branch December 21, 2015 22:21
@Naresh523

I see this issue has been fixed in v2.0.0. Is there any possibility of fixing it in v1.6.x?

@srowen
Member

srowen commented Feb 21, 2017

@Naresh523 this particular issue doesn't appear to exist in 1.6, because the change that this fixes went into 2.0.

@Naresh523

Naresh523 commented Feb 21, 2017

I'm getting similar error logs [1], and there were no changes in the application.
I'm using Spark v1.6.0.

[1] https://issues.apache.org/jira/browse/SPARK-12339
