Writing index names dynamically with non-lowercase fields causes error #446
Comments
Hi, any comment on this? Is this really an unnecessary restriction, or is there a hidden reason for it? Thanks in advance!
@rbraley @Highbrainer Maybe I'm missing something here, but Elasticsearch doesn't allow uppercase characters in index names.
The connector does early validation so one doesn't start a job only to have things fail later. Unless I'm missing something, I'm closing this down as won't fix. If you think this isn't the case, please comment.
@Highbrainer Have you tried using uppercase/camelCase fields and run into an error? If so, please open a new issue. This is supported, tested, and part of the test suite.
I think I finally understand what the issue is about: the field name is camelCase and is used dynamically, but the value is not. I'll submit a fix shortly.
Hi costin, this fix only works if there is a single { } pattern in the index portion. If we have an index_{fieldOne}_{fieldTwo}, it will error out.
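To illustrate why a single-pattern fix breaks here: a pattern like index_{fieldOne}_{fieldTwo} contains several placeholders, so the parser has to loop over every {field} occurrence rather than stop at the first. A minimal sketch (this is an illustrative helper, not the connector's actual parsing code; the class and method names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: pull out every {field} placeholder from a
// dynamic index pattern, so index_{fieldOne}_{fieldTwo} yields both fields.
public class IndexPatternParser {
    private static final Pattern FIELD = Pattern.compile("\\{([^}]+)\\}");

    public static List<String> extractFields(String indexPattern) {
        List<String> fields = new ArrayList<>();
        Matcher m = FIELD.matcher(indexPattern);
        while (m.find()) {              // loop, not a single find()
            fields.add(m.group(1));     // capture group = field name without braces
        }
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(extractFields("index_{fieldOne}_{fieldTwo}"));
        // prints [fieldOne, fieldTwo]
    }
}
```

A fix that only locates the first `{`/`}` pair would treat everything after fieldOne as a literal suffix and then fail validation on the remaining braces.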
rdd.saveToEs("index_{camelCaseField}")
yields the following error:
org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Invalid index [index_{camelCaseField}] - needs to be lowercase
The field in my documents is not lowercase and cannot be changed. I "fixed" this by removing
Assert.isTrue(StringUtils.isLowerCase(index), String.format("Invalid index [%s] - needs to be lowercase", index));
in mr/src/main/java/org/elasticsearch/hadoop/rest/Resource.java.
It has been working for me, and it seems like an unnecessary restriction on the source JSON documents anyhow.
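Rather than deleting the assertion outright, a middle ground would be to validate lowercase only on the literal parts of the pattern, since the text inside {field} placeholders names a document field whose case Elasticsearch does not restrict (only the resolved value must be lowercase). A hedged sketch of such a check (class and method names are hypothetical, not the connector's actual code):

```java
// Hypothetical sketch: check lowercase only on the literal segments of a
// dynamic index pattern, skipping anything inside {field} placeholders.
// "index_{camelCaseField}" passes; "Index_{camelCaseField}" does not.
public class IndexNameValidator {
    public static boolean literalPartsLowercase(String index) {
        StringBuilder literal = new StringBuilder();
        int depth = 0;                     // > 0 while inside a {…} placeholder
        for (char c : index.toCharArray()) {
            if (c == '{') {
                depth++;
            } else if (c == '}') {
                depth--;
            } else if (depth == 0) {
                literal.append(c);         // only literal characters are checked
            }
        }
        String s = literal.toString();
        return s.equals(s.toLowerCase(java.util.Locale.ROOT));
    }
}
```

This keeps the early-validation benefit costin describes while lifting the restriction on field names; the field's resolved value would still need to be lowercase at write time.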
I would like a longer-term solution than forking elasticsearch-hadoop :)