SELECT * FROM tabletest WHERE col1 IN (0,10,5,27) #615
Comments
When I try the same but with a column of type double or date, a similar exception is thrown.
What version of es-hadoop are you using?
I'm using version 2.1.1 of es-hadoop and 1.7.3 of Elasticsearch.
Please use 2.1.2, as it likely fixed your issue.
Thanks, this is fixed in version 2.1.2, but it still does not work with the DATE type.
ES MAPPING:
{
  "databasetest": {
    "mappings": {
      "tabletest": {
        "properties": {
          "date": {"type": "date", "format": "dateOptionalTime"},
          "ident": {"type": "long"},
          "money": {"type": "double"},
          "name": {"type": "string"},
          "new": {"type": "boolean"}
        }
      }
    }
  }
}
CUCUMBER TEST:
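For reference, the mapping quoted above can be inspected programmatically. A minimal sketch (the mapping JSON is copied from this comment; nothing else is assumed):

```python
import json

# The tabletest mapping quoted above, as returned by
# GET databasetest/_mapping on Elasticsearch 1.7.x.
mapping = json.loads("""
{"databasetest":{"mappings":{"tabletest":{"properties":{
 "date":{"type":"date","format":"dateOptionalTime"},
 "ident":{"type":"long"},
 "money":{"type":"double"},
 "name":{"type":"string"},
 "new":{"type":"boolean"}}}}}}
""")

properties = mapping["databasetest"]["mappings"]["tabletest"]["properties"]
field_types = {name: props["type"] for name, props in properties.items()}
print(field_types)
# The "date" field is mapped as type "date" with format "dateOptionalTime";
# that is the column for which the IN filter fails in this report.
```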
I do not know whether this could be a Spark issue.
Maybe a comma is missing here?
When using Date types with the Spark IN filter, apply a terms query instead of match (relates #615)
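The linked change swaps the query used when pushing Spark's In filter down to Elasticsearch. A minimal sketch of the two query shapes (field name and values are illustrative, taken from the issue title; this is not es-hadoop's actual code):

```python
def match_queries(field, values):
    # Pre-fix shape (sketch): one match query per IN value. Per this
    # issue, this shape was reported not to work for date columns.
    return [{"match": {field: v}} for v in values]

def terms_query(field, values):
    # Post-fix shape (sketch): a single terms query, which matches
    # exact values and works for date fields as well.
    return {"terms": {field: values}}

# The IN list from the issue title: col1 IN (0,10,5,27)
print(terms_query("col1", [0, 10, 5, 27]))
# {'terms': {'col1': [0, 10, 5, 27]}}
```

A terms query also sends one clause instead of one clause per value, which keeps the pushed-down query compact for long IN lists.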
When I try to execute a query with an IN operator, if the column is of type LONG, the datasource throws the following exception:
The column is of type LONG.
The executed test is the following (Cucumber format):