Too many decompressed-data20220209-1054-****** files are created by td-agent (4.3.0). #3653
Comments
@repeatedly - Can you please help us with this issue?
This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove stale label or comment or this issue will be closed in 30 days.
This issue was automatically closed because of stale in 30 days |
Can this be reopened? We are seeing the same thing, around 2.5GB of these decompressed data files being created per minute, leading to us having a full volume in 100 minutes. Fluentd then blocks because we've configured it to throw exceptions if the buffer is full. My config is something like
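(For context, a hypothetical config of the kind described here, with gzip buffer compression and an exception thrown when the buffer is full; the match pattern, paths, and sizes are illustrative assumptions, not the commenter's actual settings:)

```
# Illustrative sketch only, not the commenter's real config.
<match app.**>
  @type elasticsearch
  <buffer>
    @type file
    path /var/log/td-agent/buffer/es   # assumed buffer path
    compress gzip                      # gzip-compressed buffer chunks
    total_limit_size 2GB               # assumed limit; volume fills fast at ~2.5GB/min
    overflow_action throw_exception    # raise when the buffer is full
  </buffer>
</match>
```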
Can this be reopened please? @vikranth06 did you end up fixing this?
Does fluentd/lib/fluent/plugin/buffer/chunk.rb, line 209 (commit 5844f72), relate to this?
We see the same issue. This is our config:
As we use gzip compression, I wonder why we see those tmp files. According to the code, shouldn't we avoid this path when the data is compressed?
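To illustrate the kind of code path being asked about, here is a minimal sketch of my own (an assumption for illustration, not fluentd's actual chunk.rb): decompressing a gzip'd chunk into a Tempfile whose name starts with "decompressed-data". If callers never unlink such files, they accumulate in /tmp exactly as reported.

```ruby
require "tempfile"
require "zlib"
require "stringio"

# Hypothetical sketch of the pattern in question: gunzip a chunk's bytes
# into an on-disk temp file. Tempfile.create (without a block) returns a
# File that is NOT removed automatically, so every call leaves a
# /tmp/decompressed-data... file behind unless the caller unlinks it.
def decompress_to_tempfile(gzipped_bytes)
  io = Tempfile.create("decompressed-data")
  Zlib::GzipReader.wrap(StringIO.new(gzipped_bytes)) do |gz|
    IO.copy_stream(gz, io) # stream-decompress without loading all into memory
  end
  io.rewind
  io
end

# Callers are responsible for cleanup, e.g.:
#   file = decompress_to_tempfile(data)
#   begin
#     payload = file.read
#   ensure
#     file.close
#     File.unlink(file.path)
#   end
```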
Describe the bug
We are seeing too many decompressed-data20220209-1054-****** files created by td-agent (4.3.0) in the /tmp folder of our Ubuntu 20.04 machine.
This started after it was upgraded to 4.3.0 from 4.0.1.
We have downgraded it back to 4.0.1 but the files are still getting created.
These files consist of the data that is to be pushed to ElasticSearch.
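As a stopgap while the root cause is open (a workaround, not a fix), stale temp files matching this pattern could be cleaned up periodically. The path, name pattern, and age threshold below are assumptions to adjust for your environment, and a file still being written by td-agent must not be deleted.

```shell
# Hypothetical cleanup: remove decompressed-data temp files in /tmp that
# have not been modified for over 60 minutes. Run from cron or a timer.
# Verify td-agent is not actively using a file before relying on this.
find /tmp -maxdepth 1 -name 'decompressed-data*' -mmin +60 -delete
```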
To Reproduce
Td-agent was upgraded while applying the latest system updates with the sudo apt update command.
No other actions were performed.
Expected behavior
We don't see a need to create so many decompressed-data files.
Your Environment
Your Configuration
Your Error Log
Additional context
No response