Losing log info due to timeout flush #4

Closed
mpas opened this issue Jun 2, 2016 · 13 comments

@mpas commented Jun 2, 2016

I have a situation where log information is getting lost due to timeout errors. I have set the flush interval to 30 seconds, but I am still getting timeout flushes...

I am sending the following log information to Fluentd:

2016-06-02 12:32:53.077  INFO 1 --- [           main] content line 1
2016-06-02 12:32:53.097  INFO 1 --- [           main] content line 2
2016-06-02 12:32:53.371  INFO 1 --- [           main] content line 3
2016-06-02 12:32:54.368  INFO 1 --- [           main] content line 4
2016-06-02 12:32:54.858  INFO 1 --- [           main] content line 5
2016-06-02 12:32:54.862  INFO 1 --- [           main] content line 6
2016-06-02 12:32:55.089  INFO 1 --- [           main] content line 7

This is the result in the console of Fluentd:

2016-06-02 12:32:53 +0000 content line 1
2016-06-02 12:32:53 +0000 [warn]: dump an error event: error_class=Fluent::ConcatFilter::TimeoutError error="Timeout flush: --> for content line 2
2016-06-02 12:32:53 +0000 [info]: Timeout flush: --> for content line 2
2016-06-02 12:32:53 +0000 content line 3
2016-06-02 12:32:54 +0000 [warn]: dump an error event: error_class=Fluent::ConcatFilter::TimeoutError error="Timeout flush: --> for content line 4
2016-06-02 12:32:54 +0000 [info]: Timeout flush: --> for content line 4
2016-06-02 12:32:54 +0000 content line 5
2016-06-02 12:32:54 +0000 content line 6
2016-06-02 12:32:55 +0000 [warn]: dump an error event: error_class=Fluent::ConcatFilter::TimeoutError error="Timeout flush: --> for content line 7
2016-06-02 12:32:55 +0000 [info]: Timeout flush: --> for content line 7

So it seems that I am losing log information (lines 2 / 4 / 7). Or is there another way to pick up the flushed information?

My configuration for Fluentd is:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<filter *.docker.*>
  @type concat
  key log
  stream_identity_key container_id
  multiline_start_regexp /^(\d{4})-(\d{2})-(\d{2}) (\d{2}):(\d{2}):(\d{2})[^\s]+/
  flush_interval 30s
</filter>
<match *.docker.*>
    @type stdout
</match>

This issue seems to happen every second, and I could not find a reason for it.

@okkez (Member) commented Jun 2, 2016

You can use the @ERROR label to handle timeout events.
See http://docs.fluentd.org/articles/config-file#error-label

I will add documentation about @ERROR to README.md next week.
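
For example, a minimal sketch of an @ERROR section that picks up the timeout-flushed records and writes them to stdout (the *.docker.* pattern is an assumption carried over from the configuration above; adjust it to your setup):

<label @ERROR>
  <match *.docker.*>
    @type stdout
  </match>
</label>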

@mpas (Author) commented Jun 2, 2016

Just to be clear, is it the intention to flush every second and raise an error? How is this normally handled? Is setting flush_interval not a solution for this? If not, could you please explain what the purpose of the flush interval is?

@mpas (Author) commented Jun 2, 2016

Thanks for the @ERROR tip :)! I finally seem to be getting things going!

@mpas (Author) commented Jun 3, 2016

Would it be possible to give some pointers on how I can use the @ERROR label to process the timeout errors?

@repeatedly (Contributor) commented Jun 4, 2016

I have set the flush interval to 30 seconds, but I am still getting timeout flushes...

@okkez It seems to be a bug in fluent-plugin-concat. I commented on the commit. Please check.

@okkez (Member) commented Jun 6, 2016

OK, I will check the code.

@okkez (Member) commented Jun 6, 2016

I have set the flush interval to 30 seconds, but I am still getting timeout flushes...

This is my bug, sorry 🙇
I've fixed it in master. I will release a new version soon 🚀

@okkez (Member) commented Jun 6, 2016

Released 0.4.1!

@mpas (Author) commented Jun 6, 2016

@okkez @repeatedly Thanks a lot for the help! It is working as expected now! I tested the new release and it works!

@mpas closed this as completed Jun 6, 2016

@okkez (Member) commented Jun 6, 2016

Thank you for reporting the issue, @mpas.

@bwnyasse commented Aug 7, 2016

Hi @okkez

I'm still getting the timeout flush. I'm using the latest release version (v0.5.0).
Do we need to specify extra configuration?

Thanks

@mpas (Author) commented Aug 7, 2016

The timeout flush is just an indication that a flush has happened. Use the timeout label to process entries where the flush has occurred.

Small example:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<filter *.docker.*>
  @type concat
  ...
  timeout_label @processdata
</filter>

<label @processdata>
  ...
</label>
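
For instance, a minimal sketch of the @processdata section, assuming the flushed records should simply be written to stdout (the *.docker.* pattern is an assumption carried over from the configuration at the top of the thread):

<label @processdata>
  <match *.docker.*>
    @type stdout
  </match>
</label>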


@bwnyasse commented Aug 9, 2016

@mpas Thank you. It's working as described.
