Conversation

@falaki (Contributor) commented Mar 25, 2017

What changes were proposed in this pull request?

Instead of creating a new JavaSparkContext, we use SparkContext.getOrCreate.

How was this patch tested?

Existing tests
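
For context, here is a minimal sketch of the kind of change described above (a hypothetical simplification, not the PR's exact diff; the method names are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.api.java.JavaSparkContext

// Before: always constructs a fresh context, which fails if one is already active.
def createJavaSparkContextOld(conf: SparkConf): JavaSparkContext =
  new JavaSparkContext(conf)

// After: reuse the active SparkContext if one exists; create it only if needed.
def createJavaSparkContextNew(conf: SparkConf): JavaSparkContext =
  new JavaSparkContext(SparkContext.getOrCreate(conf))
```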

@SparkQA commented Mar 25, 2017

Test build #75193 has finished for PR 17423 at commit f07e3a9.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon (Member)

(I happened to notice that this PR does not trigger the AppVeyor build. Let me leave a build that I triggered from my own account.)
Build started: [SparkR] ALL PR-17423
Diff: master...spark-test:5B5DC229-F605-4598-AF9B-55C98C96F4D0

@felixcheung (Member)

This is already checked on the R side; we should never call createSparkContext more than once.

@yhuai (Contributor) commented Mar 26, 2017

@felixcheung SparkContext.getOrCreate is the preferred way to create a SparkContext, so even though we have the check on the R side, it is still better to use getOrCreate.
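
(As an aside, a minimal illustration of getOrCreate's contract, assuming a local-mode run: it returns the active SparkContext if one exists and creates one only otherwise.)

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("demo").setMaster("local[*]")

// First call: no active context exists, so one is created.
val sc1 = SparkContext.getOrCreate(conf)

// Second call: the active context is returned; no new context is constructed.
val sc2 = SparkContext.getOrCreate(conf)

assert(sc1 eq sc2) // same instance
```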

@felixcheung (Member)

Sure; that was just context around the change.

@yhuai (Contributor) commented Mar 26, 2017

got it. Thanks :)

@mengxr (Contributor) commented Mar 27, 2017

LGTM. Merged into master. The failed tests are unrelated to this PR; they were fixed in a2ce0a2.

@asfgit closed this in 0588dc7 on Mar 27, 2017