URLDecoder: Illegal hex characters in escape (%) pattern - For input string: " S" #747
Any code sample to reproduce this?
Hi @costin, thanks for the quick response. Code sample:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.elasticsearch.spark.sql._  // brings in SQLContext.esDF

object ESDataTest {
  def main(args: Array[String]): Unit = {
    // ELASTIC_NODES and ELASTIC_PORT are defined elsewhere in the project
    val conf = new SparkConf(false)
      .setMaster("local[*]")
      .set("es.nodes", ELASTIC_NODES)
      .set("es.port", ELASTIC_PORT)
      .setAppName("Text File Test")
    val sc: SparkContext = new SparkContext(conf)
    val ssc: SQLContext = new SQLContext(sc)
    val df = ssc.esDF("pharmadata/test")
    df.printSchema()
    df.show()
  }
}
```

Ingest the file below into ES and try to read it using the code sample above. Thanks.
costin added a commit that referenced this issue on May 2, 2016:
relates #747 (cherry picked from commit 5e3742ae41c03786c5473c5e6d618a430e621dc8)
Fixed in master and 2.x. The problem was caused by the Spark infrastructure not URI-escaping the field names which, when using special fields (like […]
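The root cause can be reproduced outside Spark with plain `java.net.URLDecoder`: decoding a string that contains a bare `%` fails with exactly the exception in the issue title, while URI-encoding the name first lets it round-trip safely. A minimal sketch (standalone, not elasticsearch-hadoop code; the field name `100% Sales` is a hypothetical example chosen to produce the " S" input from the error message):

```scala
import java.net.{URLDecoder, URLEncoder}

object EscapeDemo {
  def main(args: Array[String]): Unit = {
    val field = "100% Sales" // hypothetical field name containing a bare '%'

    // Decoding the raw name fails: after '%', the decoder expects two hex
    // digits but finds " S", which is not a valid hex escape.
    try {
      URLDecoder.decode(field, "UTF-8")
    } catch {
      case e: IllegalArgumentException =>
        // → URLDecoder: Illegal hex characters in escape (%) pattern - For input string: " S"
        println(e.getMessage)
    }

    // Encoding the name first produces a string that decodes back cleanly.
    val encoded = URLEncoder.encode(field, "UTF-8")
    println(encoded)                             // 100%25+Sales
    println(URLDecoder.decode(encoded, "UTF-8")) // 100% Sales
  }
}
```

This mirrors the fix described above: escape field names before they are embedded in a URI, so the decoder on the other side never sees an unescaped `%`.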
costin added a commit that referenced this issue on May 2, 2016.
"org.elasticsearch" % "elasticsearch-spark_2.11" % "2.3.0"