The field "time" is missing #1833
Comments
They exist: https://docs.fluentd.org/v1.0/articles/parse-section#parse-parameters
Thanks, it works. But there is a minor issue. I generated the following example log entry, but Docker converted it to: It's acceptable, and I configured the following filter.
The issue is that the field log.time is lost, while the field "time" added by Docker is preserved. How can I preserve log.time? Of course, it's a minor issue.
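To illustrate the nesting problem described above: Docker's json-file logging driver typically wraps each stdout line in its own JSON envelope with a top-level "time" field, so the application's own "time" ends up as a string nested inside "log". A minimal sketch (the record below is illustrative, not taken from the thread):

```ruby
require 'json'

# A record shaped the way Docker's json-file driver typically writes it:
# the application's JSON line is a string under "log", and Docker adds
# its own top-level "time" and "stream" fields.
docker_record = '{"log":"{\"time\":\"2018-01-27T02:38:16\",\"msg\":\"hello\"}\n","stream":"stdout","time":"2018-01-27T02:38:16.123Z"}'

outer = JSON.parse(docker_record)
inner = JSON.parse(outer["log"])

puts outer["time"]  # Docker's own timestamp (top level)
puts inner["time"]  # the application's timestamp, nested inside "log"
```

A parser filter that replaces the "log" string with its parsed contents has to decide what to do with the inner "time" key, which is where the field goes missing.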
@repeatedly Something seems broken with time for me as well. Here's a sample configuration I'm testing:
Then I run fluentd via: and query it with: I get the following output:
So it looks as if the time field is not being parsed at all, and it's definitely ignoring the keep_time_key option. There are no errors parsing my config file or anything. And lastly, I've double-checked that my time_format is valid in Ruby:
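The format-string check the commenter mentions can be done with Ruby's `Time.strptime`, which accepts the same strptime-style directives that fluentd's `time_format` uses (the sample timestamp below is an assumption for illustration):

```ruby
require 'time'

# Verify that the time_format used in the fluentd config actually parses
# a sample timestamp in plain Ruby.
t = Time.strptime("2018-01-27T02:38:16", "%Y-%m-%dT%H:%M:%S")
puts t.year  # 2018
puts t.hour  # 2
```

If `Time.strptime` raises `ArgumentError` here, the format string itself is wrong; if it parses cleanly (as above), the problem lies in the fluentd configuration rather than the format.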
Also worth noting that no "time" field in my curl command will trigger an error or warning in the output, and no bogus value that I pass in the conf file for time_format or any of the other time options will trigger an error or warning either.
@ahrtr FWIW, your particular problem (the "time" field being discarded while parsing) can be resolved. Check the following example configuration for how to set the option:

```
<filter debug.**>
  @type parser
  format json
  key_name log
  reserve_data true
  hash_value_field log
  time_format "%Y-%m-%dT%H:%M:%S"
  keep_time_key true
</filter>
```

How I've checked it works: I confirmed this actually works in my environment (Fluentd v1.1.0) using the following data record:

```
{"log":"{\"time\":\"2018-01-27T02:38:16\"}"}
```

Before turning on keep_time_key:
Then, after turning on keep_time_key:
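The effect of the configuration above can be sketched in plain Ruby. This is not fluentd's actual implementation, just an illustration of the behavior: the parser filter promotes the parsed "time" key to the event timestamp and, unless keep_time_key is set, removes it from the record payload (the function and record names here are assumptions for illustration):

```ruby
require 'json'
require 'time'

# Rough sketch of what the parser filter does with key_name log,
# hash_value_field log, and keep_time_key (illustrative, not fluentd code).
def parse_log_field(record, time_format, keep_time_key: false)
  parsed = JSON.parse(record["log"])
  event_time = Time.strptime(parsed["time"], time_format)
  # Without keep_time_key, the time key is dropped from the payload once
  # it has been promoted to the event timestamp.
  parsed.delete("time") unless keep_time_key
  [event_time, record.merge("log" => parsed)]
end

record = { "log" => '{"time":"2018-01-27T02:38:16","msg":"hello"}' }

_, dropped = parse_log_field(record, "%Y-%m-%dT%H:%M:%S")
_, kept    = parse_log_field(record, "%Y-%m-%dT%H:%M:%S", keep_time_key: true)

puts dropped["log"].key?("time")  # false
puts kept["log"]["time"]          # "2018-01-27T02:38:16"
```

The `record.merge` mirrors reserve_data true (original fields are kept), and replacing "log" with the parsed hash mirrors hash_value_field log.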
|
@fujimotos, thanks for the info. I will try it out sometime later.
The log entries collected by fluentd can be sent to Elasticsearch successfully, and eventually I can see them in Kibana. But the issue is that the field "time" is missing.
I found a related issue (below), but I could not find the parameters "time_key" and "keep_time_key" in the official online documentation. What did I miss?
#1360