How to concatenate long logs (>16K are split) #4220
Unanswered
beckyjmcdabq
asked this question in
Q&A
We occasionally have log messages greater than 16K that, when scraped by fluentd and forwarded to Elasticsearch, appear as two separate documents. This often happens when Java exceptions/stack traces are logged.
How can I configure fluentd to concatenate the split parts back into one log message before forwarding to Elasticsearch?
Google searching pointed me to the "concat" filter, but I'm struggling to find good documentation or an example showing how to get all the fields correct. We are using the "tail" source to parse Docker logs. If someone can help with the configuration needed to support long logs that get split, that would be most helpful.
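For reference, here is a minimal sketch of what a `fluent-plugin-concat` configuration for this case might look like. It relies on the behavior that Docker's json-file driver terminates complete lines with `\n` but omits it on 16K partial chunks; the `docker.**` tag pattern and the `log` field name are assumptions about the tail source setup and should be adjusted to match yours:

```
# Requires the fluent-plugin-concat plugin (gem install fluent-plugin-concat).
# Assumes the tail source emits Docker json-file records tagged docker.*
# with the message text in the "log" field (both are assumptions here).
<filter docker.**>
  @type concat
  key log                     # field whose values are joined together
  separator ""                # the 16K chunks are already contiguous text
  multiline_end_regexp /\n$/  # a complete line ends with a newline;
                              # partial 16K chunks do not, so they are buffered
  flush_interval 5            # flush an incomplete buffer after 5s of silence
</filter>
```

The filter buffers records until it sees a chunk ending in `\n`, then emits the concatenation as a single event, so downstream Elasticsearch output sees one document per logical log line.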