Druid fails to ingest a nested object value from Kafka with the ingestion spec and test input data below.
Sample input data:
{
  "edata": {
    "visits": [
      { "objid": "312583946656169984213975", "index": 0 },
      { "objid": "312583929625190400114333", "index": 1 },
      { "objid": "31261937011242598421918", "index": 2 }
    ]
  }
}
Kafka ingestion spec:
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "datasource-1",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "flattenSpec": {
          "useFieldDiscovery": false,
          "fields": [
            {
              "type": "path",
              "name": "edata_visits_index",
              "expr": "$.edata.visits[*].index"
            }
          ]
        },
        "dimensionsSpec": {
          "dimensions": [
            {
              "type": "long",
              "name": "edata_visits_index"
            }
          ],
          "dimensionExclusions": []
        },
        "timestampSpec": {
          "column": "ets",
          "format": "auto"
        }
      }
    },
    "metricsSpec": [],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "day",
      "queryGranularity": "none",
      "rollup": false
    }
  },
  "ioConfig": {
    "topic": "telemetry",
    "consumerProperties": {
      "bootstrap.servers": "localhost:9092"
    },
    "taskCount": 2,
    "replicas": 1,
    "taskDuration": "PT100S",
    "useEarliestOffset": true
  },
  "tuningConfig": {
    "type": "kafka",
    "reportParseExceptions": false
  }
}
Error output:
2019-02-11T12:05:50,450 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.kafka.IncrementalPublishingKafkaIndexTaskRunner - Encountered exception while running task.
java.lang.UnsupportedOperationException: Numeric columns do not support multivalue rows.
at org.apache.druid.segment.LongDimensionIndexer.processRowValsToUnsortedEncodedKeyComponent(LongDimensionIndexer.java:45) ~[druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.LongDimensionIndexer.processRowValsToUnsortedEncodedKeyComponent(LongDimensionIndexer.java:37) ~[druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.incremental.IncrementalIndex.toIncrementalIndexRow(IncrementalIndex.java:674) ~[druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.incremental.IncrementalIndex.add(IncrementalIndex.java:609) ~[druid-processing-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.realtime.plumber.Sink.add(Sink.java:181) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.add(AppenderatorImpl.java:246) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver.append(BaseAppenderatorDriver.java:403) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
at org.apache.druid.segment.realtime.appenderator.StreamAppenderatorDriver.add(StreamAppenderatorDriver.java:180) ~[druid-server-0.13.0-incubating.jar:0.13.0-incubating]
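For context on what the flattenSpec produces: the JSONPath `$.edata.visits[*].index` matches every element of the `visits` array, so a single event yields several values for `edata_visits_index`. A plain-Python sketch (no Druid involved, trailing comma removed so the sample parses) of what that path extracts from one event:

```python
import json

# Sample event from the question
raw = """
{
  "edata": {
    "visits": [
      {"objid": "312583946656169984213975", "index": 0},
      {"objid": "312583929625190400114333", "index": 1},
      {"objid": "31261937011242598421918", "index": 2}
    ]
  }
}
"""

event = json.loads(raw)

# Equivalent of the JSONPath $.edata.visits[*].index: one value per array element
indexes = [visit["index"] for visit in event["edata"]["visits"]]
print(indexes)  # [0, 1, 2] -- a list, i.e. a multi-value row for a single event
```

So each ingested event carries a list of values for the long-typed dimension, which matches the "Numeric columns do not support multivalue rows" message in the stack trace.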
Is something wrong with my ingestion spec? Why are we getting this error?