The actual output from the application:
In the Docker log this becomes the following, to be parsed by Fluent Bit's in_tail input: (example differs from the one above)
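To make the splitting concrete, here is a minimal sketch (not Fluent Bit code; the message text and timestamp are made up) of how Docker's json-file driver wraps each line of a multiline application message into its own JSON record, which is why a stack trace arrives at in_tail already broken apart:

```python
import json

# Hypothetical multiline message as the application writes it to stdout.
app_output = (
    "ERROR something failed\n"
    "  at com.example.Foo(Foo.java:42)\n"
    "  at com.example.Bar(Bar.java:7)\n"
)

# Docker's json-file driver emits one JSON record per line, each with
# "log", "stream" and "time" keys (the timestamp here is a placeholder).
records = [
    json.dumps({"log": line + "\n", "stream": "stdout",
                "time": "2019-01-01T00:00:00.000000000Z"})
    for line in app_output.rstrip("\n").split("\n")
]

for r in records:
    print(r)
```

Each record carries only one line of the original message in its "log" key, so any multiline pattern has to be applied after the JSON is unwrapped.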
I would expect it to apply to this case as well; however, it does not. My configuration is below.
Describe the solution you'd like
in_tail/docker_mode should be able to read Docker's JSON log as a stream of the original text. The JSON parser here would act only as a pre-processor that buffers the "log" key, so multiline regex patterns can be applied afterwards.
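The requested behaviour can be sketched as follows. This is not Fluent Bit's implementation, just an illustration under an assumed multiline convention (a line starting with non-whitespace opens a new event; indented lines are continuations):

```python
import json
import re

# Hypothetical start-of-record pattern: a non-indented line begins a new
# logical event; indented lines are continuations of the previous one.
START = re.compile(r"^\S")

def docker_mode(json_lines):
    """Sketch of the requested in_tail/docker_mode behaviour: the JSON
    parser only extracts the "log" key from each record, and a multiline
    regex then concatenates continuation lines into one event."""
    events, buf = [], ""
    for raw in json_lines:
        text = json.loads(raw)["log"]      # buffer only the "log" key
        if START.match(text) and buf:      # a new event begins
            events.append(buf)
            buf = text
        else:
            buf += text
    if buf:
        events.append(buf)
    return events

lines = [
    '{"log":"ERROR boom\\n","stream":"stdout","time":"t1"}',
    '{"log":"  at Foo.java:1\\n","stream":"stdout","time":"t2"}',
    '{"log":"  at Bar.java:2\\n","stream":"stdout","time":"t3"}',
    '{"log":"INFO recovered\\n","stream":"stdout","time":"t4"}',
]
print(docker_mode(lines))
```

The three stack-trace records come out as one event and the following INFO line as another, which is exactly what a downstream multiline parser needs.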
Describe alternatives you've considered
I considered transforming the records with filters, but Fluent Bit FILTERS are applied after parsing, so they cannot transform the stream early enough.
The Fluent Bit config I am using:
input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Skip_Long_Lines   Off
        Docker_Mode       On
        Refresh_Interval  10
        Chunk_Size        32k
        Buffer_Max_Size   2M

filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc.cluster.local:443
        Merge_Log           On
        K8S-Logging.Parser  On

    [PARSER]
        Name         docker
        Format       json
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L
        Time_Keep    On
        # Command          | Decoder      | Field | Optional Action
        # =================|==============|=======|================
        Decode_Field_As      escaped_utf8   log     do_next
        Decode_Field_As      escaped        log     do_next
        Decode_Field_As      json           log
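As a side note on the docker parser above: its Time_Format uses Fluent Bit's %L token for fractional seconds. A rough Python equivalent (using strptime's %f, with a made-up timestamp) shows what the parser extracts from the "time" key:

```python
from datetime import datetime

# Fluent Bit's %L (fractional seconds) roughly corresponds to Python's
# %f; the timestamp below is an illustrative value, not real log data.
record_time = "2019-10-01T12:34:56.789Z"
ts = datetime.strptime(record_time.rstrip("Z"), "%Y-%m-%dT%H:%M:%S.%f")
print(ts.isoformat())
```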
Hey, I'm struggling with the same issue right now. Is there any planned feature or bug fix for this?