Consistently corrupt log data for long (>16kB) log lines #12197
Comments
Ah, finally managed to find an old issue on the topic: #2281. Edit: Scratch that, the old issue mentions that logs are being split, which is not the case here. The logs are indeed split by Docker but seem to be combined again by promtail; it just does it incorrectly, by the looks of things, as the timestamp should not be there.
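For reference, a minimal sketch of what reassembling Docker json-file entries without leaking the per-chunk timestamps could look like. This assumes the json-file layout with log/stream/time fields and that only the final chunk of a split line carries the trailing newline; it is illustrative only, not promtail's actual code:

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// jsonFileEntry mirrors the fields Docker's json-file driver writes per entry.
type jsonFileEntry struct {
	Log    string `json:"log"`
	Stream string `json:"stream"`
	Time   string `json:"time"`
}

func main() {
	scanner := bufio.NewScanner(os.Stdin)
	scanner.Buffer(make([]byte, 0, 64*1024), 1024*1024)

	var payload strings.Builder
	var firstTime string

	for scanner.Scan() {
		var e jsonFileEntry
		if err := json.Unmarshal(scanner.Bytes(), &e); err != nil {
			continue
		}
		if firstTime == "" {
			// Keep the timestamp of the first chunk only; the timestamps of
			// continuation chunks must not end up inside the payload.
			firstTime = e.Time
		}
		payload.WriteString(e.Log)
		// Docker splits lines longer than 16KiB into multiple entries; only
		// the final chunk ends with the newline, which closes the line here.
		if strings.HasSuffix(e.Log, "\n") {
			fmt.Printf("%s %s", firstTime, payload.String())
			payload.Reset()
			firstTime = ""
		}
	}
}
```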
@cstyan Are you planning to port this fix to Grafana Agent or Alloy?
+1, we're also experiencing this issue and would benefit from having the fix ported to Alloy.
This is still an issue in Alloy; are there any plans to port this fix there?
Describe the bug
When a log is >16kB in size, we see "random" injected ISO 8601 timestamps in the data. These timestamps appear every 16k characters (`1<<14`) exactly, and they are equal to the Loki timestamp (to the nanosecond), so they are not actually random but fully consistent.
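As an illustration of the pattern described above (not from the original report), here is a small sketch that builds a line with this kind of corruption and locates the injected timestamps; with the bug present they sit at exact multiples of 16384 bytes of payload:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// rfc3339 matches ISO 8601 / RFC 3339 timestamps such as the ones found
// inside the corrupted payload, e.g. 2024-03-13T09:15:30.123456789Z.
var rfc3339 = regexp.MustCompile(`\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z`)

func main() {
	// Synthetic example of the corruption: 32k of 'a' with the entry's
	// timestamp injected after every 1<<14 characters.
	const chunk = 1 << 14
	ts := "2024-03-13T09:15:30.123456789Z"
	corrupted := strings.Repeat("a", chunk) + ts + strings.Repeat("a", chunk)

	// Report where the timestamps appear inside the line.
	for _, loc := range rfc3339.FindAllStringIndex(corrupted, -1) {
		fmt.Printf("injected timestamp at offset %d\n", loc[0])
	}
}
```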
To Reproduce
I have attached a tar file with a small Docker Compose reproducer project. Using it, it is enough to run `docker compose up -d` and inspect, using Grafana's Explore functionality on `localhost:3000`, the logs from the example_logger service, which will be corrupt.
However, the steps can also be explained in slightly more detail. In Explore, run:
{container=~`.*logger.*`} |~ `\w2024-`
This will highlight the timestamps that have been injected.
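The attached tarball contains the actual reproducer; as a rough sketch (the real example_logger service may differ), an equivalent logger boils down to emitting a single oversized line:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Emit one log line of 32768 'a' characters. Anything longer than 16KiB
	// (1<<14 bytes) is enough to make Docker split the line and trigger the
	// corruption described above.
	fmt.Println(strings.Repeat("a", 32*1024))
}
```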
Expected behavior
Logs should appear in their original form, i.e. in my reproducer compose project, I would expect 32k 'a' characters uninterrupted by ISO timestamps.
Environment:
Screenshots, Promtail config, or terminal output
badlogs.tar.gz