
[SPARK-26563]Fix java SimpleApp documentation #23487

Closed
wants to merge 1 commit into from

Conversation

@LeoColman

As is, the example provided by the documentation will crash with `org.apache.spark.SparkException: A master URL must be set in your configuration`. This error happens because no Spark master is configured.

When a Spark master is created locally, the simple application works correctly, as [this StackOverflow answer](https://stackoverflow.com/a/40555616/4257162) explains.

This change is important: the original example will not work without it.
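
For context, a minimal sketch of the kind of change proposed here, assuming the quickstart's SparkSession-based SimpleApp (the `local[*]` master and the file path are illustrative, not part of the actual commit):

import org.apache.spark.sql.SparkSession;

public class SimpleApp {
  public static void main(String[] args) {
    // Setting a master programmatically lets the example run directly,
    // without spark-submit; "local[*]" runs Spark locally on all cores.
    SparkSession spark = SparkSession.builder()
        .appName("Simple Application")
        .master("local[*]")
        .getOrCreate();

    // Same shape as the quickstart: read a text file and count its lines.
    long numLines = spark.read().textFile("README.md").count();
    System.out.println("Lines: " + numLines);

    spark.stop();
  }
}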
@AmplabJenkins

Can one of the admins verify this patch?

@LeoColman changed the title from "Fix java SimpleApp documentation" to "[SPARK-26563]Fix java SimpleApp documentation" on Jan 7, 2019
@HyukjinKwon
Member

The master is specified at submit time in the example: "then use the spark-submit script to run our program."

# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master local[4] \
  target/scala-2.11/simple-project_2.11-1.0.jar

That's also explained in the Stack Overflow link you pointed out:

To be clear, this is not what you should do in a production environment. In a production environment, spark.master should be specified in one of a couple other places: either in $SPARK_HOME/conf/spark-defaults.conf (this is where cloudera manager will put it), or on the command line when you submit the app. (ex spark-submit --master yarn).
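
For illustration, the first option amounts to a one-line entry in the defaults file (a sketch; `yarn` is just an example master value):

# $SPARK_HOME/conf/spark-defaults.conf
spark.master    yarn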

@dongjoon-hyun
Member

+1 for @HyukjinKwon's comment.

@LeoColman
Author

I do see that, and I did understand it.

However, as I understand it, a self-contained application should run by itself. Since this is a hello world/quickstart application, I took it to be lock-n-load, ready to execute.

Without configuring the master (the way it's done in this commit), it's not possible to execute the example without also having Spark installed. In my opinion, this violates the definition of a self-contained application.

If I'm thinking about this the wrong way, please let me know.

@LeoColman
Author

Maybe I misunderstood the documentation.
I have to confess that this is my first time fiddling with Apache Spark, and I got confused while trying to execute my Hello World application. When I figured it out, I thought that maybe I could spare others this confusion in the future.

@HyukjinKwon
Member

That's okay; it's clearly documented. I see no issue as long as the documentation is followed.

@dongjoon-hyun
Member

Thank you for understanding, @Kerooker.
No problem at all. I'm looking forward to seeing you in another PR.
For now, let me close this PR.

@LeoColman LeoColman deleted the patch-1 branch January 8, 2019 17:14