
[BEAM-1405] Spark runner should not close the SparkContext when the context is provided#1934

Merged
asfgit merged 2 commits into apache:master from iemejia:BEAM-1405-spark-stop-context
Feb 7, 2017

Conversation

@iemejia (Member) commented Feb 7, 2017

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

  • Make sure the PR title is formatted like:
    [BEAM-<Jira issue #>] Description of pull request
  • Make sure tests pass via mvn clean verify. (Even better, enable
    Travis-CI on your fork and ensure the whole test matrix passes).
  • Replace <Jira issue #> in the title with the actual Jira issue
    number, if there is one.
  • If this contribution is large, please file an Apache
    Individual Contributor License Agreement.

@iemejia (Member Author) commented Feb 7, 2017

R: @amitsela

@asfbot commented Feb 7, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/beam_PreCommit_Java_MavenInstall/7139/

Build result: FAILURE

[...truncated 7504 lines...]
    at hudson.remoting.UserRequest.perform(UserRequest.java:50)
    at hudson.remoting.Request$2.run(Request.java:332)
    at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.MojoExecutionException: Archetype IT 'basic' failed: Execution failure: exit code = 1
    at org.apache.maven.archetype.mojos.IntegrationTestMojo.execute(IntegrationTestMojo.java:269)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
    ... 31 more
2017-02-07T11:03:09.543 [ERROR]
2017-02-07T11:03:09.543 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
2017-02-07T11:03:09.543 [ERROR]
2017-02-07T11:03:09.543 [ERROR] For more information about the errors and possible solutions, please read the following articles:
2017-02-07T11:03:09.543 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
2017-02-07T11:03:09.543 [ERROR]
2017-02-07T11:03:09.543 [ERROR] After correcting the problems, you can resume the build with the command
2017-02-07T11:03:09.543 [ERROR] mvn -rf :beam-sdks-java-maven-archetypes-starter
channel stopped
Setting status of 3b29d67 to FAILURE with url https://builds.apache.org/job/beam_PreCommit_Java_MavenInstall/7139/ and message: 'Build finished.' Using context: Jenkins: Maven clean install

@amitsela (Member) left a comment

@iemejia looks good though I had a couple of comments on the tests.

 * Provided Spark Context tests.
 */
public class ProvidedSparkContextTest {
  private static final String[] WORDS_ARRAY = {
@amitsela (Member):
Those should stay up here I think (all static members declarations + getSparkContextOptions)
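The layout being asked for can be sketched like this (a hypothetical excerpt, not the PR's actual diff: the fixture data is illustrative, and the `SparkContextOptions` setter names are assumed from the Spark runner's options interface):

```java
// Hypothetical sketch: static fixtures and the getSparkContextOptions
// helper stay together at the top of ProvidedSparkContextTest.
// Assumes Beam's SparkContextOptions API; the data is illustrative.
public class ProvidedSparkContextTest {
  private static final String[] WORDS_ARRAY = {"hi there", "hi", "hi sue bob"};
  private static final List<String> WORDS = Arrays.asList(WORDS_ARRAY);

  private static SparkContextOptions getSparkContextOptions(JavaSparkContext jsc) {
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);
    return options;
  }

  // tests follow...
}
```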

@iemejia (Member Author):
Ok, I won't move the declarations.

public void testWithProvidedContext() throws Exception {
  JavaSparkContext jsc = new JavaSparkContext("local[*]", "Existing_Context");
  testWithValidProvidedContext(jsc);
  if (!jsc.sc().isStopped()) {
@amitsela (Member):
Asserting here that the context is not stopped would be good, and I believe it would make testReuseProvidedContext redundant: you want to make sure the context remains alive after executing a pipeline with a provided context, and that also guarantees it is reusable.
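The suggested assertion can be sketched as follows. Since the real check needs a live Spark context, this runnable sketch uses a hypothetical FakeSparkContext stub (not Beam or Spark API) to demonstrate the ownership rule this PR implements and the assertion pattern being suggested:

```java
// Runnable sketch of the review suggestion. FakeSparkContext is a
// hypothetical stand-in for JavaSparkContext's stop()/isStopped();
// the real test would assert on jsc.sc().isStopped() instead.
public class ProvidedContextAssertionSketch {

    /** Minimal stub mimicking the Spark context lifecycle methods. */
    static class FakeSparkContext {
        private boolean stopped = false;
        void stop() { stopped = true; }
        boolean isStopped() { return stopped; }
    }

    /** The rule this PR implements: only stop a runner-owned context. */
    static void teardown(FakeSparkContext ctx, boolean provided) {
        if (!provided) {
            ctx.stop(); // runner created the context, so it may stop it
        }
    }

    public static void main(String[] args) {
        FakeSparkContext jsc = new FakeSparkContext();

        teardown(jsc, true); // a pipeline ran with a provided context
        if (jsc.isStopped()) {
            throw new AssertionError("provided context must stay alive");
        }

        teardown(jsc, true); // still alive, so a second run can reuse it
        if (jsc.isStopped()) {
            throw new AssertionError("context must survive reuse");
        }
        System.out.println("provided context stays alive across runs");
    }
}
```

Asserting liveness right after the first run covers reuse as well, which is why the separate testReuseProvidedContext becomes redundant.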

@iemejia (Member Author):
Understood!

 * @throws Exception
 */
@Test
public void testReuseProvidedContext() throws Exception {
@amitsela (Member):
This could be avoided by asserting the context stays alive after executing a provided context pipeline (mentioned above).

@amitsela (Member) commented Feb 7, 2017

Failure seems to be unrelated; let's wait for the next build (after addressing the review comments).

@jbonofre (Member) commented Feb 7, 2017

CC: @jbonofre

@amitsela (Member) commented Feb 7, 2017

@jbonofre stop! be on vacation, we're fine here 😉

@jbonofre (Member) commented Feb 7, 2017

Doing my best 😀

@iemejia (Member Author) commented Feb 7, 2017

@jbonofre Amit is right, enjoy the beach!

@iemejia iemejia force-pushed the BEAM-1405-spark-stop-context branch from 3b29d67 to 5edcdff Compare February 7, 2017 13:52
@iemejia (Member Author) commented Feb 7, 2017

I made the changes and rebased as requested.

@amitsela (Member) commented Feb 7, 2017

LGTM pending tests.

@coveralls

Coverage Status

Coverage decreased (-0.01%) to 69.585% when pulling 5edcdff on iemejia:BEAM-1405-spark-stop-context into 9dad73c on apache:master.

@asfbot commented Feb 7, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/beam_PreCommit_Java_MavenInstall/7141/

@iemejia iemejia changed the title [BEAM-1405} Spark runner should not close the spark context when set to reuse it [BEAM-1405} Spark runner should not close the SparkContext when the context is provided Feb 7, 2017
@iemejia iemejia changed the title [BEAM-1405} Spark runner should not close the SparkContext when the context is provided [BEAM-1405] Spark runner should not close the SparkContext when the context is provided Feb 7, 2017
@asfgit asfgit merged commit 5edcdff into apache:master Feb 7, 2017
asfgit pushed a commit that referenced this pull request Feb 7, 2017
@iemejia iemejia deleted the BEAM-1405-spark-stop-context branch February 20, 2017 21:22