Get spark.home from SparkConf preferentially #615
Conversation
Can one of the admins verify this patch?
Just wondering, could you describe the use case where you want to set this via a Spark property and not by setting SPARK_HOME on the cluster?
Since Spark provides SparkConf to manage its config system, we should first get the option from SparkConf and then fall back to the system env (to stay compatible with old versions). We'd better avoid relying on the system env directly.
Actually, SparkContext already gets the Spark home like this:
private[spark] def getSparkHome(): Option[String] = {
  conf.getOption("spark.home").orElse(Option(System.getenv("SPARK_HOME")))
}
This also keeps things consistent.
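For illustration only, here is a minimal standalone sketch (not part of this patch) of the same precedence rule; the SparkHomeLookup object and the /opt/spark path are invented for the example.

import org.apache.spark.SparkConf

// Minimal sketch: prefer the spark.home property on SparkConf, fall back to SPARK_HOME.
object SparkHomeLookup {
  def sparkHome(conf: SparkConf): Option[String] =
    conf.getOption("spark.home").orElse(Option(System.getenv("SPARK_HOME")))

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false).set("spark.home", "/opt/spark")
    // Prints Some(/opt/spark): the property wins even if SPARK_HOME is also set.
    println(sparkHome(conf))
  }
}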
SparkConf is intended for application-wide configurations. SPARK_HOME is actually a node-dependent configuration, so we'd prefer to keep it outside of SparkConf. If you look, we've done this for other node-specific configs like SPARK_LOCAL_DIR.
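As a hedged sketch of the distinction being drawn here (not Spark's actual implementation): application-wide values travel with the SparkConf object, while node-dependent values such as SPARK_LOCAL_DIR are read from whichever node's environment the code runs on. The ConfigKinds object and the /tmp default are invented for the example.

import org.apache.spark.SparkConf

object ConfigKinds {
  // Application-wide: the same value is seen everywhere the conf is shipped.
  def appName(conf: SparkConf): String = conf.get("spark.app.name", "default-app")

  // Node-dependent: resolved from the local environment of the node running this code.
  def localDir(): String = Option(System.getenv("SPARK_LOCAL_DIR")).getOrElse("/tmp")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false).set("spark.app.name", "demo")
    println(appName(conf))
    println(localDir())
  }
}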
Right, using SparkConf for application-wide configuration is sensible.
scala -version will raise an error on 2.11.0 even when there is no real error. Ignore the error.
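A hedged sketch of one way to tolerate this (not the actual change in this PR): run scala -version through scala.sys.process, collect the banner it prints to stderr, and deliberately ignore the exit status. The VersionProbe object is invented for the example.

import scala.sys.process._

object VersionProbe {
  def main(args: Array[String]): Unit = {
    val out = new StringBuilder
    // `scala -version` prints its banner to stderr, so capture both streams.
    val logger = ProcessLogger(line => out.append(line), line => out.append(line))
    // Keep the exit code only for display; on 2.11.0 it can be non-zero with no real error.
    val exit = Process(Seq("scala", "-version")).!(logger)
    println(s"${out.toString.trim} (exit=$exit, ignored)")
  }
}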