What kind of an issue is this?

- Bug report. If you’ve found a bug, please provide a code snippet or test to reproduce it below. The easier it is to track down the bug, the faster it is solved.
- Feature Request. Start by telling us what problem you’re trying to solve. Often a solution already exists! Don’t send pull requests to implement new features without first getting our support. Sometimes we leave features out on purpose to keep the project small.
Issue description
The ES type-mapping parser does not work properly when the mapping contains a field of type object named 'properties'. I checked the code and verified that the parser is unable to distinguish between a field of type object named 'properties' and the 'properties' key that ES uses to define an object's sub-fields.
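To illustrate the ambiguity, here is a minimal Python sketch (not es-hadoop's actual code; both function names are hypothetical). A parser that descends into every key named 'properties' silently drops a field that is itself named 'properties', while a parser that tracks which structural level it is on handles it correctly:

```python
# Minimal sketch of the ambiguity -- NOT es-hadoop's actual parser;
# both function names below are hypothetical.

def naive_fields(node, prefix=""):
    """Broken: treats *every* key named 'properties' as the ES
    sub-field container, even when it is a field's own name."""
    fields = {}
    for key, value in node.items():
        if key == "properties" and isinstance(value, dict):
            fields.update(naive_fields(value, prefix))       # always descend
        elif isinstance(value, dict) and isinstance(value.get("type"), str):
            fields[prefix + key] = value["type"]             # leaf field
        elif isinstance(value, dict):
            fields.update(naive_fields(value, prefix + key + "."))
    return fields

def level_aware_fields(props, prefix=""):
    """Correct: 'props' is always the dict *under* a 'properties' key,
    so every key here is a field name -- even one called 'properties'."""
    fields = {}
    for name, definition in props.items():
        if "properties" in definition:                       # object field
            fields.update(level_aware_fields(definition["properties"],
                                             prefix + name + "."))
        else:
            fields[prefix + name] = definition.get("type", "object")
    return fields

mapping = {
    "properties": {                  # ES container key
        "name": {"type": "string"},
        "properties": {              # a real field named 'properties'
            "type": "object",
            "properties": {"color": {"type": "string"}},
        },
    }
}

print(naive_fields(mapping))
# {'name': 'string', 'color': 'string'}  <- 'properties' field lost, 'color' mis-nested
print(level_aware_fields(mapping["properties"]))
# {'name': 'string', 'properties.color': 'string'}
```

The naive walk loses the 'properties' field entirely and hoists its sub-field to the top level, which matches the empty-document symptom described below.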
Steps to reproduce
Code:

1. Create the ES index, mapping, and document:
2. Query using elasticsearch-hadoop (Python code):
3. The output shows an empty document:
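The original snippets were not captured here; below is a sketch of what the reproduction might look like, with hypothetical index, type, and field names (`test`, `doc`, `name`, `properties.color`) that are assumptions, not taken from the report:

```python
import json

# Hypothetical index mapping: an object field literally named "properties".
mapping = {
    "mappings": {
        "doc": {
            "properties": {
                "name": {"type": "string"},
                "properties": {              # field named 'properties'
                    "type": "object",
                    "properties": {"color": {"type": "string"}},
                },
            }
        }
    }
}

# Hypothetical document that exercises that field.
document = {"name": "widget", "properties": {"color": "red"}}

# Against a live cluster these bodies would be sent over the REST API:
#   PUT  /test       with `mapping`  as the body
#   POST /test/doc   with `document` as the body
print(json.dumps(mapping, indent=2))
print(json.dumps(document, indent=2))

# The query step (sketch only; needs Spark 1.6 with the es-hadoop jar on
# the classpath and a running cluster):
#   df = sqlContext.read.format("org.elasticsearch.spark.sql").load("test/doc")
#   df.show()    # the 'properties' column comes back empty
```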
Stack trace:
No exceptions are thrown, but the DEBUG log indicates that although the mapping was retrieved correctly, it was not parsed correctly.
Version Info
OS: Ubuntu 14.04 LTS
JVM : java version "1.8.0_91"
Hadoop/Spark: spark-1.6.2-bin-hadoop2.6
ES-Hadoop : elasticsearch-hadoop-5.0.0-alpha4
ES : elasticsearch-2.2.0