Remove references to min_events in bulk_max_size docs. (#38634) (#38637)
* Remove references to min_events in bulk_max_size docs.

As of https://github.com/elastic/beats/pull/37795/files in 8.13.0
queue.flush.min_events is no longer relevant.

* Fix whitespace

Co-authored-by: Pierre HILBERT <pierre.hilbert@elastic.co>

---------

Co-authored-by: Pierre HILBERT <pierre.hilbert@elastic.co>
(cherry picked from commit 989d36f)

Co-authored-by: Craig MacKenzie <craig.mackenzie@elastic.co>
mergify[bot] and cmacknz committed Mar 27, 2024
1 parent 4afb9c1 commit 05b0b3e
Showing 3 changed files with 6 additions and 12 deletions.
6 changes: 2 additions & 4 deletions libbeat/outputs/elasticsearch/docs/elasticsearch.asciidoc
@@ -666,10 +666,8 @@ endif::[]

 The maximum number of events to bulk in a single Elasticsearch bulk API index request. The default is 1600.
 
-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.
 
 Specifying a larger batch size can improve performance by lowering the overhead of sending events.
 However big batch sizes can also increase processing times, which might result in
6 changes: 2 additions & 4 deletions libbeat/outputs/logstash/docs/logstash.asciidoc
@@ -381,10 +381,8 @@ endif::[]

 The maximum number of events to bulk in a single {ls} request. The default is 2048.
 
-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.
 
 Specifying a larger batch size can improve performance by lowering the overhead of sending events.
 However big batch sizes can also increase processing times, which might result in
6 changes: 2 additions & 4 deletions libbeat/outputs/redis/docs/redis.asciidoc
@@ -216,10 +216,8 @@ endif::[]

 The maximum number of events to bulk in a single Redis request or pipeline. The default is 2048.
 
-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.
 
 Specifying a larger batch size can improve performance by lowering the overhead
 of sending events. However big batch sizes can also increase processing times,
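For context, `bulk_max_size` is set per output in the Beats configuration. A minimal sketch for the Elasticsearch output, using the default value stated in the docs above (the `hosts` value is a placeholder):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]
  # Default is 1600 for this output; batches read from the queue
  # that are larger than this are split into multiple bulk requests.
  bulk_max_size: 1600
```

The Logstash and Redis outputs take the same option, with a default of 2048 per their docs above.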
