[tail] in_tail plugin doesn't refresh newly added files during processing. #573
Could you show me your configuration?

In fact, file refresh only occurs once the processing loop is over for all current inputs (am I right?) EDIT: Input configuration for info:

It depends on the Ruby implementation; Fluentd doesn't guarantee "alphanumerical" order. Fluentd focuses on streaming log processing, not parallel processing. It depends on the order of triggered events.
Hello.
I'm using Fluentd for quite heavy processing (between 5 and 10 plugins in the pipeline) on large files.
I have an in_tail input, with path set to "input*.log".
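For concreteness, a minimal sketch of such a source in the 0.12-era config syntax (the directory, pos_file path, tag, and format are illustrative assumptions, not taken from the actual setup):

```
<source>
  type tail                        # 0.12-style syntax ("type", not "@type")
  path /var/log/app/input*.log     # glob picks up input1.log, input2.log, ...
  pos_file /var/log/fluentd/input.pos
  tag app.input
  format none
  refresh_interval 60              # seconds between re-expansions of the glob
</source>
```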
Today, I tested a scenario.
Each 50k lines take about 3 minutes to process.
The result: the 100k lines of "input1.log" are processed in 6 minutes.
During that processing, "input2.log" and "input3.log" are not detected.
After 6 minutes (end of the first processing), the two other files are finally detected, but their existing content is never processed: Fluentd only waits for new lines appended to these files from that point on.
In my environment, a new file is created every hour. If Fluentd is busy processing when the file appears, some log lines will be lost.
I tried different refresh_interval values, but it made no difference.
Am I correct in saying that Fluentd should at least register newly created files (and set their cursor position to 0) as soon as the refresh_interval elapses?
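For anyone hitting the same behaviour: a possible mitigation sketch, assuming a Fluentd version whose in_tail supports the read_from_head parameter (paths and tag are again illustrative):

```
<source>
  type tail
  path /var/log/app/input*.log
  pos_file /var/log/fluentd/input.pos
  tag app.input
  format none
  refresh_interval 10      # re-scan the glob more frequently
  read_from_head true      # read newly discovered files from position 0
</source>
```

This does not change when the glob is re-scanned relative to the processing loop, but it should avoid skipping the content a new file already holds once it is detected.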
Thank you.
EDIT: Ubuntu 14.04, Fluentd 0.12.7, old-style config