
[SPARK-1550] [PySpark] Allow SparkContext creation after failed attempts #1606

Closed

Conversation

JoshRosen (Contributor)

This addresses a PySpark issue where a failed attempt to construct SparkContext would prevent any future SparkContext creation.
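
The shape of the fix can be illustrated with a short sketch. This is a simplified, hypothetical reconstruction, not the actual pyspark source (the names Context, _active, and _do_init are illustrative): the constructor first registers the instance as the active context, then wraps the failure-prone initialization so that an error releases the registration, letting a later attempt succeed.

    import threading

    class Context(object):
        _active = None               # handle to the single running context
        _lock = threading.Lock()

        def __init__(self, fail=False):
            with Context._lock:
                if Context._active is not None:
                    raise ValueError("Cannot run multiple contexts at once")
                Context._active = self
            try:
                self._do_init(fail)  # the failure-prone part (launching the JVM, etc.)
            except Exception:
                self.stop()          # clean up so a retry can succeed
                raise

        def _do_init(self, fail):
            if fail:
                raise RuntimeError("simulated initialization failure")

        def stop(self):
            with Context._lock:
                Context._active = None

    # A failed attempt no longer blocks later ones:
    try:
        Context(fail=True)
    except RuntimeError:
        pass
    ctx = Context()                  # succeeds after the failed attempt
    ctx.stop()

Note that a ValueError raised while another context is genuinely running deliberately skips the cleanup path, so the running context's registration is left intact.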

SparkQA commented Jul 27, 2014

QA tests have started for PR 1606. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17232/consoleFull

@@ -249,17 +258,14 @@ def defaultMinPartitions(self):
        """
        return self._jsc.sc().defaultMinPartitions()

    def __del__(self):
JoshRosen (Contributor, Author) commented on this line:

Because we keep a reference to this object in SparkContext._active_spark_context, this method never gets called except when cleaning up after a SparkContext creation attempt that failed because another context was already running. In that case, the call to sc.stop() clears SparkContext._active_spark_context and we lose track of the active context, which can allow the creation of multiple running contexts.
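
The hazard is easy to reproduce in miniature. Below is a hypothetical sketch (simplified names, not the actual pyspark source) of how a __del__ that calls stop() lets the garbage-collected remains of a failed creation attempt wipe out the class-level handle that still belongs to the running context:

    class BadContext(object):
        _active = None   # class-level handle to the running context

        def __init__(self):
            if BadContext._active is not None:
                raise ValueError("Cannot run multiple contexts at once")
            BadContext._active = self

        def stop(self):
            BadContext._active = None

        def __del__(self):
            self.stop()          # also fires for *failed* instances

    def failed_attempt():
        try:
            BadContext()         # rejected: another context is running
        except ValueError:
            pass                 # the rejected instance becomes garbage here

    first = BadContext()
    failed_attempt()             # collecting the dead instance calls stop()
    print(BadContext._active)    # None: `first` is no longer tracked, so...
    second = BadContext()        # ...a second "running" context is permitted

Removing __del__ and cleaning up explicitly in the constructor's error path instead closes this hole; exact finalization timing varies by interpreter, but CPython typically runs __del__ on the rejected instance as soon as the exception is cleared.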

SparkQA commented Jul 27, 2014

QA results for PR 1606:
- This patch PASSES unit tests.
- This patch merges cleanly.
- This patch adds no public classes.

For more information, see the test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17232/consoleFull

mattf (Contributor) commented Jul 27, 2014

+1 lgtm

I've been hitting this issue repeatedly, but assumed it was a corner case that wouldn't get much attention. My assumption is that spark-submit and pyspark are the primary ways to get a context.

mateiz (Contributor) commented Jul 28, 2014

Thanks Josh, merged this.

asfgit closed this in a7d145e on Jul 28, 2014
xiliu82 pushed a commit to xiliu82/spark that referenced this pull request Sep 4, 2014
This addresses a PySpark issue where a failed attempt to construct SparkContext would prevent any future SparkContext creation.

Author: Josh Rosen <joshrosen@apache.org>

Closes apache#1606 from JoshRosen/SPARK-1550 and squashes the following commits:

ec7fadc [Josh Rosen] [SPARK-1550] [PySpark] Allow SparkContext creation after failed attempts
sunchao pushed a commit to sunchao/spark that referenced this pull request Jun 2, 2023
(cherry picked from commit d6a02ec59cd1997858a2e63e85e834361a769dc3)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>