
[SPARK-19607] Finding QueryExecution that matches provided executionId #16940

Closed
ala wants to merge 1 commit into apache:master from ala:execution-id

Conversation

@ala (Contributor) commented Feb 15, 2017

What changes were proposed in this pull request?

Implements a mapping between an executionId and its corresponding QueryExecution in SQLExecution, so that the QueryExecution matching a provided executionId can be looked up.
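
For context, the shape of such a mapping can be sketched as below. This is a minimal, illustrative sketch only: the stand-in QueryExecution class and the names SQLExecutionSketch, executionIdToQueryExecution, getQueryExecution, and withExecutionRegistered are assumptions, not code quoted from the patch.

```scala
import java.util.concurrent.ConcurrentHashMap

// Stand-in for org.apache.spark.sql.execution.QueryExecution (assumed shape).
class QueryExecution

object SQLExecutionSketch {
  // Maps a currently running executionId to its QueryExecution.
  private val executionIdToQueryExecution =
    new ConcurrentHashMap[Long, QueryExecution]()

  // Returns the QueryExecution registered for executionId, or null if none is running.
  def getQueryExecution(executionId: Long): QueryExecution =
    executionIdToQueryExecution.get(executionId)

  // Registers the mapping for the duration of `body`, then removes it again.
  def withExecutionRegistered[T](executionId: Long, qe: QueryExecution)(body: => T): T = {
    executionIdToQueryExecution.put(executionId, qe)
    try body finally executionIdToQueryExecution.remove(executionId)
  }
}
```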

How was this patch tested?

Adds a unit test.

@ala (Contributor, Author) commented Feb 15, 2017

@rxin

@rxin (Contributor) commented Feb 15, 2017

LGTM (pending Jenkins).

@SparkQA commented Feb 15, 2017

Test build #72937 has finished for PR 16940 at commit ee452a6.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@rxin (Contributor) commented Feb 15, 2017

Merging in master (since the failing test case is unrelated).

@dongjoon-hyun (Member) commented

Hi, @rxin and @ala.
This seems to cause test failures.
Could you review the hotfix #16943?

ghost pushed a commit to dbtsai/spark that referenced this pull request Feb 15, 2017
…ecutionId

## What changes were proposed in this pull request?

apache#16940 adds a test case which does not stop the Spark job, causing many other test cases to fail.

- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/2403/consoleFull
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7/2600/consoleFull

```
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
```

## How was this patch tested?

Pass the Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes apache#16943 from dongjoon-hyun/SPARK-19607-2.
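
For reference, the failure mode described in the commit message above typically comes from a test that builds its own SparkSession/SparkContext and never stops it, so the context is still alive when later suites try to start their own. The sketch below illustrates the usual remedy; it is an assumption about the general pattern, not the hotfix itself, and the object name and test body are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object ExecutionIdSuiteSketch {
  def main(args: Array[String]): Unit = {
    // A test that builds its own SparkSession...
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("execution-id-test")
      .getOrCreate()
    try {
      // ...would exercise the executionId -> QueryExecution lookup here...
    } finally {
      // ...and must stop the session, otherwise the SparkContext leaks and later
      // suites fail with "Only one SparkContext may be running in this JVM".
      spark.stop()
    }
  }
}
```
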
cmonkey pushed a commit to cmonkey/spark that referenced this pull request Feb 16, 2017
## What changes were proposed in this pull request?

Implementing a mapping between executionId and corresponding QueryExecution in SQLExecution.

## How was this patch tested?

Adds a unit test.

Author: Ala Luszczak <ala@databricks.com>

Closes apache#16940 from ala/execution-id.
cmonkey pushed a commit to cmonkey/spark that referenced this pull request Feb 16, 2017
…ecutionId

## What changes were proposed in this pull request?

apache#16940 adds a test case which does not stop the Spark job, causing many other test cases to fail.

- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/2403/consoleFull
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7/2600/consoleFull

```
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
```

## How was this patch tested?

Pass the Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes apache#16943 from dongjoon-hyun/SPARK-19607-2.
liancheng pushed a commit to liancheng/spark that referenced this pull request Mar 17, 2017
Implementing a mapping between executionId and corresponding QueryExecution in SQLExecution.

Adds a unit test.

Author: Ala Luszczak <ala@databricks.com>

Closes apache#16940 from ala/execution-id.

(cherry picked from commit b55563c)
Signed-off-by: Reynold Xin <rxin@databricks.com>
liancheng pushed a commit to liancheng/spark that referenced this pull request Mar 17, 2017
…ecutionId

## What changes were proposed in this pull request?

This is a backport of apache@59dc26e

apache#16940 adds a test case which does not stop the Spark job, causing many other test cases to fail.

- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/2403/consoleFull
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7/2600/consoleFull

```
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
```

## How was this patch tested?

Pass the Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes apache#234 from ala/execution-id-fix.