SPARK_CLASSPATH is deprecated in Spark 1.0+. #580
Conversation
SPARK_CLASSPATH is deprecated in Spark 1.0+; using --driver-class-path is preferred:

15/10/27 16:55:04 WARN SparkConf:
SPARK_CLASSPATH was detected (set to '/Users/poiuytrez/Documents/programs/spark-1.5.1/lib/elasticsearch-spark_2.11-2.1.1.jar').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
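For anyone landing here, a minimal sketch of the replacement invocation, assuming the same jar as in the warning above and a hypothetical application script (app.py):

    # Instead of exporting SPARK_CLASSPATH, pass the jar on the driver classpath
    # and, if executors also need it, via spark.executor.extraClassPath:
    ./bin/spark-submit \
      --driver-class-path /Users/poiuytrez/Documents/programs/spark-1.5.1/lib/elasticsearch-spark_2.11-2.1.1.jar \
      --conf spark.executor.extraClassPath=/Users/poiuytrez/Documents/programs/spark-1.5.1/lib/elasticsearch-spark_2.11-2.1.1.jar \
      app.py

Both flags are standard spark-submit options; the same settings can go in spark-defaults.conf as spark.driver.extraClassPath and spark.executor.extraClassPath.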
I have signed the document but I still get "Commit author has not signed the CLA and is not a member of Elasticsearch".
Hi, thanks for the contribution. You likely need to send the patch to Spark itself (since their docs are out of date as well).
I confirm that I used the email guillaume@dataXXX in the CLA.
CLA still fails. I don't understand why.
Probably a hiccup regarding the email. I'll take your word for it and apply the PR. Thanks!
@poiuytrez I've double-checked the database, and neither the email you mentioned nor your GitHub user is in there. Maybe the CLA was not properly submitted.
Weird, I signed it twice. I have the document in my emails.
The transaction id is: XUZR5277E483J5Z
I am getting the below error on my Windows machine. How can I resolve it? I am using PySpark and a Jupyter notebook and trying to pull data from Oracle.

15/10/27 16:55:04 WARN SparkConf:
SPARK_CLASSPATH was detected (set to '/Users/poiuytrez/Documents/programs/spark-1.5.1/lib/elasticsearch-spark_2.11-2.1.1.jar').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
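Not an official answer, but a common workaround for Jupyter is to pass the classpath through PYSPARK_SUBMIT_ARGS before launching the notebook, instead of setting SPARK_CLASSPATH; the Oracle JDBC jar path below is a placeholder:

    REM Windows, run before launching Jupyter; C:\oracle\ojdbc7.jar is hypothetical
    set PYSPARK_SUBMIT_ARGS=--driver-class-path C:\oracle\ojdbc7.jar --jars C:\oracle\ojdbc7.jar pyspark-shell
    jupyter notebook

PYSPARK_SUBMIT_ARGS is the environment variable PySpark reads for extra spark-submit options; note that it must end with pyspark-shell for the shell to start.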