Problem
It seems that Elasticsearch will reject a JSON-in-JSON log which contains the escape character \ , for example:
{"id":"5","log":{"level_2":"{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\"}"}}
Obviously, for the key log, its value is another JSON object, that is:
"level_2":"{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\"}"
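To make the problem concrete: after the first parse, the value of level_2 is not a nested object but a plain string that happens to contain escaped JSON, so a second parse is needed to reach the inner object. A minimal Ruby sketch (illustration only, not part of the original report):

```ruby
require 'json'

# The record as Fluentd's json parser produces it: the inner
# JSON stays a plain string with escaped quotes.
line = '{"id":"5","log":{"level_2":"{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\"}"}}'
record = JSON.parse(line)

# The nested value is still a String, not a Hash.
record['log']['level_2'].is_a?(String)  # => true

# Only a second JSON.parse on the inner string yields the nested object.
inner = JSON.parse(record['log']['level_2'])
inner['kind']  # => "CronJob"
```

This is why the escape characters survive into the record Fluentd sends to Elasticsearch: nothing in the pipeline performs that second parse.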
I would expect:
- the JSON parser to remove the extra escape characters \ ,
- Elasticsearch to accept the JSON-in-JSON log containing escape characters
However, I got this error in the Fluentd log:
2019-01-25 05:59:47 +0000 [warn]: #0 dump an error event: error_class=Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError error="400 - Rejected by Elasticsearch [error type]: mapper_parsing_exception [reason]: 'failed to parse field [log] of type [text]'" location=nil tag="audit" time=2019-01-25 05:59:36.636265100 +0000 record={"id"=>"5", "log"=>{"level_2"=>"{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\"}"}}
...
Steps to replicate
My fluentd.conf is as follows:
<source>
  @type tail
  @id in_tail_container_logs
  path /var/log/containers/sample.log
  tag audit
  @log_level debug
  read_from_head true
  <parse>
    @type json
  </parse>
</source>
<match audit>
  @type elasticsearch
  @log_level debug
  host elasticsearch.logging.svc.cluster.local
  port 9200
  logstash_format true
</match>
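For what it's worth, one way to get a real nested object into Elasticsearch, rather than an escaped string, is to re-parse the inner field before the match block. This is only a sketch, not a tested fix: it assumes the bundled parser filter accepts record_accessor syntax ($.log.level_2) for key_name, which may require a newer Fluentd v1 release than 1.3.2.

```
<filter audit>
  @type parser
  # record_accessor-style key pointing at the nested string field (assumption)
  key_name $.log.level_2
  # keep the rest of the record alongside the parsed fields
  reserve_data true
  <parse>
    @type json
  </parse>
</filter>
```

With something like this in place, the level_2 string would be parsed into real JSON fields before the record reaches the elasticsearch output.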
Expected Behavior or What you need to ask
I can see a single-layered JSON document in Kibana even if its value contains the escape character \ . However, no luck with a JSON-in-JSON log.

...
Using Fluentd and ES plugin versions
- Kubernetes 1.10.5
- Fluentd 1.3.2
- ES plugin: fluent-plugin-elasticsearch (2.11.11)
- Result of fluent-gem list:
elasticsearch (6.1.0)
elasticsearch-api (6.1.0)
elasticsearch-transport (6.1.0)