High-volume JSON-formatted logs collected by agent are truncated in DataDog #5764
Comments
I faced the same issue. Is there any update?
This issue is still present.
We are having the same issue with: […]
Hi all, please make sure that: […]
If you're still having issues, please contact our support team and send them an Agent flare along with the raw logs that are affected (after removing any sensitive information from them).
I'm having this issue as well.
Also having this issue. Any updates?
Facing the same issue; a single JSON log line in my case is larger than 400 kB. This limit needs to be exposed as a config: https://github.com/DataDog/datadog-agent/blob/main/pkg/logs/internal/decoder/decoder.go#L23
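For context on the constant referenced in that link: the decoder caps a single log line at a fixed byte length and flags every split with a marker, which is what produces the 'TRUNCATED' text described in this issue. The Go sketch below is a simplified illustration of that mechanism, not the agent's actual code; the 256 kB cap and the constant names are assumptions based on the linked source.

```go
// Minimal sketch of per-line truncation, assuming a fixed ~256 kB cap.
// Constant names and values are illustrative, not the agent's exact code.
package main

import "fmt"

const contentLenLimit = 256 * 1000 // assumed per-line byte cap

var truncatedFlag = []byte("...TRUNCATED...")

// splitLine breaks an over-long line into cap-sized chunks and marks
// each boundary so downstream consumers can tell the record was cut.
func splitLine(line []byte) [][]byte {
	var chunks [][]byte
	for len(line) > contentLenLimit {
		chunk := append(append([]byte{}, line[:contentLenLimit]...), truncatedFlag...)
		chunks = append(chunks, chunk)
		line = append(append([]byte{}, truncatedFlag...), line[contentLenLimit:]...)
	}
	return append(chunks, line)
}

func main() {
	long := make([]byte, 400*1000) // a 400 kB single-line record, as in the comment above
	for i := range long {
		long[i] = 'x'
	}
	for i, c := range splitLine(long) {
		fmt.Printf("chunk %d: %d bytes\n", i, len(c))
	}
}
```

With a 400 kB input this yields two chunks, each carrying the truncation marker, which matches the marker-prefixed, concatenated records described in this issue.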
I'm having the same issue.
This is still happening. Three years and still no solution.
Having the same issue. All my JSON logs have fewer than 400 characters.
@DataDog team, why was this magic number chosen? Shouldn't this be a very simple PR to fix (which I'd happily open)? What is the holdup?
@DataDog Any update on this? :-(
Same issue! It seems nobody cares!
Experiencing the same. How can this be fixed?
It seems it's been fixed here: […]
Even when maxing out the […]. Keeping log files smaller than […]. For my use case with Apache Flink, I can have Log4j2 rotate files based on size to ensure they're smaller than […].
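A size-based Log4j2 rotation along the lines of that workaround might look like the sketch below. The file paths, the 200 KB threshold, and the use of JsonLayout are assumptions for illustration; the commenter's actual values were lost from the thread.

```xml
<!-- Illustrative Log4j2 config: rotate by size so each file stays small.
     Paths and sizes are assumptions; JsonLayout needs Jackson on the classpath. -->
<Configuration>
  <Appenders>
    <RollingFile name="JsonFile"
                 fileName="/var/log/flink/app.json.log"
                 filePattern="/var/log/flink/app.json.log.%i">
      <JsonLayout compact="true" eventEol="true"/>
      <Policies>
        <SizeBasedTriggeringPolicy size="200 KB"/>
      </Policies>
      <DefaultRolloverStrategy max="5"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="JsonFile"/>
    </Root>
  </Loggers>
</Configuration>
```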
Output of the info page (if this is a bug)
Describe what happened:
I recently converted The Muse's Kafka Connect service log output from standard log4j console output to JSON-formatted output using logback. I confirmed this works for other services, such as Kafka brokers. With Kafka Connect, it works fine for INFO-level logs and above, but a majority of DEBUG logs end up with the marker 'TRUNCATED' preceding the log data, and the log data itself is a concatenation of dozens of unconverted JSON log records. The more logs there are in a short period of time, the more truncation I see. See the screenshots below.
Describe what you expected:
Logs output as expected, that is, one deserialized log record per row in the DataDog Log UI.
Steps to reproduce the issue:
NOTE: I can create a sample Kafka Connect setup for you to test, if that will be helpful. I would just need to modify my existing setup to remove any employer-specific information.
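For reference, a logback-based JSON conversion like the one described above typically puts a JSON encoder on a console appender. The sketch below assumes the logstash-logback-encoder library; the reporter's actual configuration is not included in the issue.

```xml
<!-- Illustrative logback config emitting one JSON object per line.
     LogstashEncoder comes from logstash-logback-encoder (an assumption;
     the original config is not shown in this issue). -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```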
Additional environment details (Operating System, Cloud provider, etc):
Cloud Provider: AWS
Cloud Service: ECS
Agent version: 6.17.1
Agent containerized? Yes.
Log output driver: json-file [non-blocking mode, 4 MB intermediate ring buffer, 5-file rotation, 500 kB per file]
Log collection method: Agent [SSL encrypted TCP, uncompressed]
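For anyone reproducing this environment, the json-file driver settings above map to standard Docker logging options roughly as follows. This is a sketch of an ECS task definition fragment: the option values are taken from the list above, but the surrounding structure is an assumption.

```json
{
  "logConfiguration": {
    "logDriver": "json-file",
    "options": {
      "mode": "non-blocking",
      "max-buffer-size": "4m",
      "max-size": "500k",
      "max-file": "5"
    }
  }
}
```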