This repository has been archived by the owner on Jun 1, 2024. It is now read-only.

Not all logs writing to Elasticsearch #125

Closed
ZOXEXIVO opened this issue Oct 5, 2017 · 21 comments

Comments

@ZOXEXIVO

ZOXEXIVO commented Oct 5, 2017

Some logs stay only in the buffer and are never synced to Elasticsearch.
We have messages that sync to Elasticsearch only at random and always remain present in the buffer.

@rumatavz

Me too

@frg

frg commented Oct 15, 2017

What I've found is that if you've got multiple instances, only one file is synced until the day ends. When the day ends, it then starts syncing the other instances' files. I think we need a sub-bookmark per file, and the main bookmark would rotate between them.

bookmark 1:~ sub-bookmark 1:1 buffers

This would help distribute the syncing opportunity for each file.

@mivano
Contributor

mivano commented Oct 21, 2017

The durable sink has some quirks, certainly when issues from ES pop back up. See PR #127. It might need a rewrite there.

@kbatman37

Same issue here. Is there a workaround/fix?

@mivano
Contributor

mivano commented Dec 2, 2017

Workarounds if you want persistent buffering: use a queue system such as RabbitMQ, or use a file buffer. Combine that with a system like Logstash and you can reliably submit the events to Elasticsearch.

@ZOXEXIVO
Author

ZOXEXIVO commented Jul 8, 2018

We have a workaround: stop using the Elasticsearch sink, write directly to files (Serilog rolling files), and ship them through Filebeat.
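A minimal sketch of the Serilog side of that workaround, using the current Serilog.Sinks.File package (the original comment likely used the older RollingFile sink; the file path here is illustrative, and Filebeat is configured separately to tail it):

```csharp
using Serilog;

class Program
{
    static void Main()
    {
        // Write events to daily rolling files instead of the ES sink;
        // Filebeat tails these files and ships them to Elasticsearch.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.File(
                path: "logs/app-.log",              // hypothetical path; Filebeat would watch logs/*.log
                rollingInterval: RollingInterval.Day)
            .CreateLogger();

        Log.Information("Shipped via Filebeat rather than the Elasticsearch sink");

        // Flush remaining buffered events before the process exits.
        Log.CloseAndFlush();
    }
}
```

Because the file buffer survives process restarts, Filebeat can pick up events that were written but not yet shipped when the application went down.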

@jordanbrowneb

I also had this issue and was able to resolve it by instantiating Serilog.Core.Logger inside of a using statement. This allowed the logger to flush before closing.
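A minimal sketch of that pattern, assuming the standard Serilog configuration API and the Elasticsearch sink's options type (the node URL is illustrative):

```csharp
using System;
using Serilog;
using Serilog.Core;
using Serilog.Sinks.Elasticsearch;

class Program
{
    static void Main()
    {
        // Creating the Logger inside a using block means Dispose() runs
        // when the block exits, which flushes any batched events.
        using (Logger log = new LoggerConfiguration()
            .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(
                new Uri("http://localhost:9200")))   // illustrative node URL
            .CreateLogger())
        {
            log.Information("This event is flushed before the block exits");
        } // Dispose() flushes the sink here
    }
}
```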

@kilroyFR

kilroyFR commented Jan 5, 2019

Log.CloseAndFlush() worked perfectly for me

@siberianguy

We're still using the previous release (6.3) and see logs getting regularly lost. We're considering upgrading to the latest release (7.1). Could anyone comment on whether there are any improvements in this regard in the current version? I see the codebase was significantly reworked and now uses the hopefully more mature code from Seq's sink. I also see there's a Durable sink now (not sure if it existed before), but I can't tell whether it's designed to address this problem.

@siberianguy

After looking through the code, I see the Durable sink is used if you specify BufferBaseFileName. Is that the recommended approach? Is the alternative approach (without BufferBaseFileName) not reliable?

I've also found a conditional compilation directive, #if Durable, in the code, and I'm not sure when it applies.

@erik-wramner

We are having this problem now and as we are using Kubernetes (i.e. containers) we can't use files. They disappear when the application stops. It would be very valuable to have a reliable way to force all outstanding logs to be written before the application shuts down.

@siberianguy

Same here. We've enabled all the possible options related to durability but still find logs missing from time to time. It's not critical for us at the moment, so we just live with it, but it worries me a lot and I haven't found any decent alternative.

@erik-wramner

Added code to call Log.CloseAndFlush() from IApplicationLifetime's ApplicationStopped callback. Not sure if it helps yet, but if it does it is an OK solution.
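A minimal sketch of that hook against ASP.NET Core 2.x, where IApplicationLifetime exposes ApplicationStopped as a CancellationToken you can register a callback on:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Serilog;

public class Startup
{
    public void Configure(IApplicationBuilder app, IApplicationLifetime lifetime)
    {
        // ApplicationStopped fires once the host has fully stopped;
        // registering CloseAndFlush here drains any buffered log events
        // on graceful shutdown. It cannot help if the process is killed.
        lifetime.ApplicationStopped.Register(Log.CloseAndFlush);
    }
}
```

In later ASP.NET Core versions the equivalent interface is IHostApplicationLifetime, but the registration pattern is the same.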

@siberianguy

siberianguy commented Apr 26, 2019 via email

@willsoares

Does this problem still exist? Is there a decent workaround?

@erik-wramner

I wanted to gain some experience before replying. Adding CloseAndFlush was definitely an improvement. Now we always get logs for successful jobs when they are done and in many cases we get the logs for failures as well (.NET calls the handler if the application crashes). It doesn't help when the application is killed. Based on our experience so far I would say that it is a decent solution.

@mivano
Contributor

mivano commented May 8, 2019

Logging should never break or impact your application, so it is possible that log events are lost in favor of keeping the system running. Beyond that, there are multiple parts: Serilog, the sink (the durable sink), and ES itself. All of these elements can fail.

CloseAndFlush is a generic Serilog function that makes sure all batched sinks are asked to flush their data. So make sure to call it when you gracefully close your application.

That might not always be possible, and if you have the requirement that all events be persisted, then you might need to look for a sink that does not use a buffer (even an in-memory one) and persists the log event as quickly as possible to a store (like a queue). Asynchronously you can then process the events and put them in ES. As with all options, it depends on what you need.

Serilog has an AuditTo functionality that ensures reliable logging. However, this sink does not support it (yet).

@cdaven

cdaven commented Nov 5, 2019

Have you read Lifecycle of Loggers? Log.CloseAndFlush() only works if you're logging on the global Log object. If you create local Logger objects, they are flushed when you dispose them.

@erik-wramner

According to the life cycle page:

Call Log.ForContext(...) to receive an ILogger with additional properties attached; this doesn't need any special close/flush logic, as this will be handled by the parent logger

We always use an IOC container to inject an ILogger, I think that is common practice. Thus CloseAndFlush should work.

@mivano
Contributor

mivano commented Dec 16, 2019

Old issue, cleaning up

@mivano mivano closed this as completed Dec 16, 2019
@Kamil-Zakiev

Applied the approach that @erik-wramner mentioned (Log.CloseAndFlush() from IApplicationLifetime's ApplicationStopped callback) and it works fine: now I can see the logs related to graceful shutdown!

Thank you, Erik!
