Update initializing-sparkcontext.md #129

Merged · 1 commit · Jun 18, 2015
2 changes: 1 addition & 1 deletion spark/examples/initializing-sparkcontext.md
@@ -4,7 +4,7 @@ Initializing SparkContext to Scala and Java
The [Spark Programming Guide](https://spark.apache.org/docs/latest/programming-guide.html) from the official documentation provides many examples and notes around creating a Spark application in Scala, Java and Python. This document strives to provide a very basic template for minimal code required to write a Spark application.


- Important: When initiating the SparkContext do *not* use a constructor or method to set the master value from within the code. There is a number of examples available on the Internet (not official Spark examples) that take this action which will end up overriding the cofniguration provided by `spark-submit` and may cause the application to fail or not perform as expected when used with a cluster manager.
+ Important: When initiating the SparkContext do *not* use a constructor or method to set the master value from within the code. There is a number of examples available on the Internet (not official Spark examples) that take this action which will end up overriding the configuration provided by `spark-submit` and may cause the application to fail or not perform as expected when used with a cluster manager.


## Java
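The advice in the changed line can be illustrated with a minimal Java sketch (this example is not part of the PR; the class name `MinimalApp` is hypothetical, and it assumes the standard Spark Java API). The key point is that the `SparkConf` sets only the application name and never calls `setMaster()`, so the master comes from the `--master` flag passed to `spark-submit`:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MinimalApp {
    public static void main(String[] args) {
        // Set only the app name here. Do NOT call conf.setMaster(...):
        // hard-coding the master would override whatever spark-submit
        // (and the cluster manager) provides at launch time.
        SparkConf conf = new SparkConf().setAppName("MinimalApp");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... application logic goes here ...

        sc.stop();
    }
}
```

Launched, for example, with `spark-submit --master yarn --class MinimalApp app.jar`, the same jar runs unmodified against a local master, a standalone cluster, or YARN, which is exactly why the master should not be fixed in code.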