
Add optional AIMD batch size growth strategy#29

Merged
kjnilsson merged 2 commits into master from basic-aimd on Mar 31, 2026
Conversation


@kjnilsson kjnilsson commented Mar 31, 2026

Introduce a `batch_size_growth` start option (default: `exponential`, which preserves the existing doubling behaviour) that accepts `{aimd, Step}` to switch to Additive Increase / Multiplicative Decrease growth: the batch size grows by `Step` each time a full batch is completed, and halves whenever the mailbox is found empty.

Fixes #23
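The growth rules described above can be sketched as follows. This is an illustrative Python sketch, not gen_batch_server's actual code; `next_batch_size` and its parameters are hypothetical names chosen for this example:

```python
# Hypothetical sketch of the two batch-size growth strategies described
# in this PR; names and signature are illustrative, not the library API.

def next_batch_size(current, strategy, *, full_batch, mailbox_empty,
                    min_batch=32, max_batch=8192):
    """Compute the next batch size for 'exponential' or ('aimd', step)."""
    if strategy == "exponential":
        if full_batch:
            # Existing behaviour: double after a full batch, capped.
            return min(max_batch, current * 2)
    elif isinstance(strategy, tuple) and strategy[0] == "aimd":
        _, step = strategy
        if mailbox_empty:
            # Multiplicative decrease: halve when the mailbox is empty.
            return max(min_batch, current // 2)
        if full_batch:
            # Additive increase: grow by step after a full batch.
            return min(max_batch, current + step)
    return current
```

Under this sketch, AIMD grows linearly and so tends to settle on smaller batches than the doubling strategy, which is consistent with the smaller per-batch heap usage in the benchmark below.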


@lhoguin lhoguin left a comment


I don't know whether the strategy works well, and if so under which scenarios / use cases (though I've only ever read gen_batch_server, not used it, so there's also that).

In any case, the new option is missing from the README/documentation.

It looks fine otherwise.

@kjnilsson kjnilsson (Contributor, Author) commented

> I don't know if the strategy works well and if so under which scenarios / use cases (but I've only ever looked at gen_batch_server, not used, so there's also that).

Yes, this is a fair question. I have run some benchmarks which suggest that AIMD can't beat exponential for throughput, but AIMD is gentler on heap size growth (and thus perhaps on GC). So it probably shouldn't be the default, but it's good to have the option to experiment with for certain roles.

Parameters: 100k messages, 4 KB payload, 3 ms base delay + 3 ms/MB variable, `min_batch=32`, `max_batch=8192`

| Metric | Exponential | AIMD step=32 | AIMD step=64 | AIMD step=128 |
| --- | --- | --- | --- | --- |
| Elapsed | 1243.0 ms | 1422.9 ms | 1352.8 ms | 1306.4 ms |
| Throughput | 80,454 msg/s | 70,280 msg/s | 73,919 msg/s | 76,544 msg/s |
| Batches | 20 | 79 | 56 | 40 |
| Batch min | 32 | 32 | 32 | 32 |
| Batch mean | 5,000 | 1,266 | 1,786 | 2,500 |
| Batch p50 | 8,192 | 1,280 | 1,760 | 2,464 |
| Batch p95 | 8,192 | 2,368 | 3,296 | 4,640 |
| Batch max | 8,192 | 2,496 | 3,488 | 4,896 |
| GC minor / major | 0 / 0 | 0 / 0 | 0 / 0 | 0 / 0 |
| Heap peak | 2,485.8 KB | 138.5 KB | 362.7 KB | 949.5 KB |
| Heap/batch mean | 137,685 words | 23,414 words | 40,463 words | 49,685 words |
| Heap/batch p50 | 121,536 words | 17,731 words | 46,422 words | 46,422 words |
| Heap/batch max | 318,187 words | 75,113 words | 121,536 words | 121,536 words |

@lhoguin

lhoguin commented Mar 31, 2026

Big difference there indeed.

@kjnilsson kjnilsson marked this pull request as ready for review March 31, 2026 14:53
@kjnilsson kjnilsson merged commit 99a20f9 into master Mar 31, 2026
3 checks passed


Development

Successfully merging this pull request may close these issues:

- Optionally use "Basic AIMD" batch size growth.
