Hi, I'm using this plugin to read logs from the journal and save them to files, split by container name.
The configuration I'm using looks like this:
# Logs from docker-systemd
<source>
  @type systemd
  @id in_systemd_docker
  matches [{ "_SYSTEMD_UNIT": "docker.service" }]
  <storage>
    @type local
    persistent true
    path /var/log/fluentd/journald-docker-cursor.json
  </storage>
  read_from_head false
  tag docker.systemd
</source>

<match docker.systemd>
  @type copy
  <store>
    @type file
    @id out_file_docker
    path /file-logs/${$.CONTAINER_TAG}/%Y/%m/%d/std${$.PRIORITY}
    append true
    <format>
      @type single_value
      message_key MESSAGE
    </format>
    <buffer $.CONTAINER_TAG,$.PRIORITY,time>
      @type file
      path /var/log/fluentd/file-buffers/
      timekey 1d
      flush_thread_interval 10
      flush_mode interval
      flush_interval 10s
      flush_at_shutdown true
    </buffer>
  </store>
  <store>
    @type prometheus
    <metric>
      name fluentd_output_status_num_records_total
      type counter
      desc The total number of outgoing records
    </metric>
  </store>
</match>
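For the counter above to actually be scraped, fluent-plugin-prometheus also needs an exporter endpoint declared as a source; a minimal sketch, assuming the plugin's default port 24231 and metrics path:

```
# Expose the registered metrics over HTTP for Prometheus to scrape
<source>
  @type prometheus
  bind 0.0.0.0
  port 24231
  metrics_path /metrics
</source>
```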
Using the Docker fluentd logging driver disables the docker logs command, which would be very frustrating if the logging system itself encounters an issue (for example a network issue; we ran into a case like that a few days ago). I guess that's a reason some people use this plugin as an alternative.
Btw, this plugin prevents Fluentd from using multiple workers. It would be very nice if an enhancement could be made to enable multiprocessing.
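If multi-worker support were added, the usual Fluentd pattern would be to raise workers in the system directive and pin any non-multiprocess-safe plugin to a single worker; a hypothetical sketch (the systemd input does not currently support this, so the config below would not work today):

```
<system>
  workers 4
</system>

# Pin the journal reader to worker 0 so only one process holds the cursor;
# outputs outside the <worker> block would be shared across all workers.
<worker 0>
  <source>
    @type systemd
    @id in_systemd_docker
    tag docker.systemd
  </source>
</worker>
```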
With this setup I'm only getting a throughput of ~1000 lines per second, while according to https://docs.fluentd.org/deployment/performance-tuning-single-process, Fluentd should be able to handle 5000 lines per second.
A few additional details:
I'm running Fluentd inside a Docker container with 4 GB of memory and 4096 CPU shares
I tried local storage as well as shared storage for the cursor
I tried removing the file output and using only the Prometheus metrics output
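One knob from the performance-tuning page that is absent from the buffer section above is flush_thread_count; a hedged sketch of the same buffer with it raised, assuming the file output's single flush thread is the bottleneck:

```
<buffer $.CONTAINER_TAG,$.PRIORITY,time>
  @type file
  path /var/log/fluentd/file-buffers/
  timekey 1d
  flush_thread_count 4   # default is 1; allows parallel chunk flushes
  flush_thread_interval 10
  flush_mode interval
  flush_interval 10s
  flush_at_shutdown true
</buffer>
```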