[SPARK-2290] Worker should directly use its own sparkHome instead of appDesc.sparkHome when LaunchExecutor #1244

Closed
wants to merge 4 commits

Conversation

YanTangZhai
Contributor

Worker should directly use its own sparkHome instead of appDesc.sparkHome when LaunchExecutor
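
To make the change concrete, here is a minimal, self-contained sketch (illustrative names, not Spark's actual Worker/ExecutorRunner code) of the behavior being fixed:

```scala
// Illustrative model only: the worker resolves sparkHome from its own
// installation instead of trusting a path shipped by the driver, whose
// directory layout may differ from the worker's.
case class AppDescription(name: String) // sparkHome field intentionally absent

class Worker(sparkHome: String) {
  // Before SPARK-2290 the worker did roughly: launch(appDesc.sparkHome),
  // which breaks whenever the driver's Spark is installed elsewhere.
  def launchExecutor(appDesc: AppDescription): Unit =
    println(s"Launching executor for ${appDesc.name} using sparkHome=$sparkHome")
}

object Demo extends App {
  new Worker("/opt/spark").launchExecutor(AppDescription("my-app"))
}
```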

@AmplabJenkins

Can one of the admins verify this patch?

@pwendell
Contributor

If we are going to remove this feature, we should just take the sparkHome field out of ApplicationDescription entirely.
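
A hedged sketch of what that removal could look like; the surrounding fields only approximate the class from this era of the codebase:

```scala
// Command is a placeholder for the executor launch command the description carries.
case class Command(mainClass: String, arguments: Seq[String])

case class ApplicationDescription(
    name: String,
    maxCores: Option[Int],
    memoryPerSlave: Int,
    command: Command,
    appUiUrl: String)
// sparkHome: Option[String] removed: no code path can now ship a driver-side
// path to workers whose directory layout differs.
```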

@YanTangZhai
Contributor Author

The sparkHome field has been taken out of ApplicationDescription entirely. Please review again. Thanks.

@pwendell
Contributor

pwendell commented Jul 8, 2014

LGTM pending tests. This is something that has confused people before, so I think it's best to just leave it out.

@pwendell
Contributor

pwendell commented Jul 8, 2014

Jenkins, test this please.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16401/

@pwendell
Contributor

pwendell commented Jul 9, 2014

Jenkins, retest this please.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16440/

@andrewor14
Contributor

Jenkins, test this please

@andrewor14
Contributor

Changes look reasonable to me. There were a few questions from the mailing list about this, so it'll be good to get this in.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@SparkQA

SparkQA commented Jul 10, 2014

QA tests have started for PR 1244. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16518/consoleFull

@SparkQA

SparkQA commented Jul 10, 2014

QA results for PR 1244:
- This patch FAILED unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16518/consoleFull

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16518/

@andrewor14
Contributor

@YanTangZhai This doesn't compile... could you fix it?

@YanTangZhai
Contributor Author

I've fixed the compilation problem. Please review and test again. Thanks very much.

@CodingCat
Contributor

Instead of discarding the sparkHome parameter entirely, shall we just prioritize the worker's local SPARK_HOME env and use appDesc.sparkHome only if SPARK_HOME is not set on the worker side?
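
A sketch of that alternative (resolveSparkHome is a hypothetical helper, not code from this PR):

```scala
object SparkHomeResolution {
  // Prefer the worker's own SPARK_HOME env var; only fall back to the
  // value shipped by the application when the worker has none.
  def resolveSparkHome(appSparkHome: Option[String]): String =
    sys.env.get("SPARK_HOME")
      .orElse(appSparkHome) // may point at the wrong layout, as noted below
      .getOrElse(sys.error("SPARK_HOME unset and the application sent none"))
}
```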

@andrewor14
Contributor

@CodingCat We only want to do this if the driver shares the same directory structure as the executors. This is an assumption that is incorrect in many deployment settings. Really, we should have something like spark.executor.home that is not the same as SPARK_HOME.

I am not 100% sure if we can just rip this functionality out actually. I am under the impression that Mesos still depends on something like this, so we should double check before we remove it.
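
For illustration, a sketch of what such a dedicated setting could look like; spark.executor.home is hypothetical here (later Spark releases added spark.mesos.executor.home for the Mesos case):

```scala
import org.apache.spark.SparkConf

object ExecutorHome {
  // "spark.executor.home" is a hypothetical key per the comment above:
  // an executor-side setting decoupled from the driver's SPARK_HOME.
  def resolve(conf: SparkConf): String =
    conf.getOption("spark.executor.home")
      .orElse(sys.env.get("SPARK_HOME"))
      .getOrElse(throw new IllegalArgumentException(
        "Neither spark.executor.home nor SPARK_HOME is set on this node"))
}
```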

@CodingCat
Contributor

@andrewor14 Yeah, I agree with you. I just thought that somewhere (documentation in earlier versions? I can't find it now) the user had to set this env variable. That's why I suggested prioritizing the worker-side SPARK_HOME; if that is not set, Spark would try to read the application's SPARK_HOME (which may generate errors if the directory structure is not the same).

I also noticed this JIRA https://issues.apache.org/jira/browse/SPARK-2454 (left some comments there)
