Elasticsearch: Cannot detect ES version #791
Comments
If you only specify the host, es-hadoop will use the default port (namely 9200). Cheers,
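For context, a minimal sketch of what that looks like in a Spark job (`es.nodes` and `es.port` are standard es-hadoop settings; the host is a placeholder):

```scala
import org.apache.spark.SparkConf

// Only the host is set; es-hadoop falls back to the default port, 9200.
val conf = new SparkConf()
  .setAppName("es-example")
  .set("es.nodes", "es-host.example.com") // placeholder host
```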
@costin Thanks for your assistance.
Please stop your firewall on the Elasticsearch host.
I have the same question here, and my ES cluster is deployed alongside my Hadoop cluster (ES version 5.5.0). The exception is:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 1 times, most recent failure: Lost task 2.0 in stage 0.0 (TID 2, localhost): org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'

Can anyone help me?
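For anyone hitting the WAN/Cloud variant of this message, a minimal sketch of enabling the setting the exception names (`es.nodes.wan.only` is a documented es-hadoop property; the endpoint is a placeholder):

```scala
import org.apache.spark.SparkConf

// With a cloud/WAN endpoint, es-hadoop must talk only to the address given
// and not try to reach the cluster's internal node IPs.
val conf = new SparkConf()
  .setAppName("es-example")
  .set("es.nodes", "my-es-endpoint.example.com") // placeholder endpoint
  .set("es.nodes.wan.only", "true")
```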
The issue is that we were overwriting the Spark default configuration in MapR under /opt/mapr/spark/spark-2.1.0/conf (we are using the MapR distribution), and the Spark configuration we passed in our application was not bound to the SparkConf, so it pointed to localhost (127.0.0.1:9200) during index creation; check your exception log to see whether you hit the same thing. I changed the configuration details in the application, passed them while creating the SparkSession object, and tested the application. Now the application works fine and I am able to create the index in Elasticsearch and load the data. sparkConfig passed while creating the sparkSession:
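A sketch along the lines this comment describes, starting from `val sparkConf = new SparkConf()` (standard es-hadoop property names; the endpoint and app name are placeholders, not the commenter's actual code):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Set the ES connection details on the SparkConf explicitly so they are not
// lost to the cluster-level defaults under /opt/mapr/spark/.../conf.
val sparkConf = new SparkConf()
  .setAppName("es-index-loader")                 // placeholder
  .set("es.nodes", "my-es-endpoint.example.com") // placeholder
  .set("es.port", "9200")

// Bind the conf while creating the SparkSession.
val spark = SparkSession.builder().config(sparkConf).getOrCreate()
```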
You have to add your ES port to your sparkConf; it is probably 9243 or 443, as your ES is running on AWS. You can also redirect your calls to ES to your local machine:
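A sketch of the port setup this describes (the endpoint is a placeholder; `es.net.ssl` is enabled here on the assumption that a 9243/443 endpoint is HTTPS). The local redirect could be, for example, an SSH tunnel forwarding a local port to the AWS endpoint.

```scala
import org.apache.spark.SparkConf

// AWS-hosted Elasticsearch typically listens on 443 or 9243 over HTTPS,
// not on the default 9200.
val conf = new SparkConf()
  .setAppName("es-example")
  .set("es.nodes", "my-domain.us-east-1.es.amazonaws.com") // placeholder
  .set("es.port", "443")
  .set("es.net.ssl", "true")        // assumption: HTTPS endpoint
  .set("es.nodes.wan.only", "true") // cloud endpoint, not a local cluster
```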
Hi,
I am using the Amazon Elasticsearch service and a separate Spark cluster on EMR, and I am trying to execute the elasticsearch-hadoop Apache Spark writing example. However, on submitting the job from my local machine, I first get the following exception as info,
and later on I get the following exception.
Stack trace:
Following is the code that I am running:
Code:
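A minimal sketch in the spirit of the es-hadoop Spark writing example the issue refers to, not the reporter's original code (the endpoint and the `spark/docs` index are placeholders, and the SSL/WAN settings are assumptions for an AWS endpoint):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // adds saveToEs to RDDs

val conf = new SparkConf()
  .setAppName("es-write-example")
  .set("es.nodes", "my-domain.us-east-1.es.amazonaws.com") // placeholder
  .set("es.port", "443")
  .set("es.net.ssl", "true")        // assumption: HTTPS endpoint
  .set("es.nodes.wan.only", "true") // Amazon ES is a cloud endpoint

val sc = new SparkContext(conf)

// Two simple documents, as in the es-hadoop docs example.
val numbers  = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")

sc.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs") // placeholder index
```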
Maven dependency:
Is it a bug, or am I missing something here?