
Throughput and latency? #179

Closed
marvin-hansen opened this issue Feb 16, 2024 · 2 comments

Comments

@marvin-hansen
Hi,

I know it's a new project, but what are the estimated numbers of throughput and latency on a normal dev machine?

The documentation doesn't mention anything, but it would really help to have at least a paragraph to get an idea of the order of magnitude.

Are we talking milliseconds or microseconds latency?

What's the typical throughput for 128-byte messages vs. 512-byte messages?

Big kudos for the default support of RPC; it's a very meaningful feature.

@marvin-hansen
Author

Ran the benchmark:

cargo run --release -- --num-of-streams 1 --message-size 32 --num-of-messages 10000000

Benchmark Results

Number of Messages: 10,000,000
Number of Streams: 1
Message Size (Bytes): 32
Batching Enabled: false
Compression Enabled: false

| Duration | Total Transferred | Avg. Throughput | Avg. Latency |
|-------------|-------------------|-----------------|--------------|
| 8.5470 Secs | 305.18 MB | 35.71 MB/s | 854.70 ns |

That's about 1.17 million msg/sec (10,000,000 messages / 8.5470 s).
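For anyone who wants to double-check, the reported figures are internally consistent. A minimal sketch in Rust that re-derives the table from the run parameters (the constants are just the numbers reported above; MB here is taken as 1024 × 1024 bytes, which matches the reported 305.18 MB):

```rust
fn main() {
    // Run parameters from the benchmark invocation above.
    let num_messages: u64 = 10_000_000;
    let message_size: u64 = 32; // bytes
    let duration_secs: f64 = 8.5470;

    // Total transferred, assuming MB = 1024 * 1024 bytes.
    let total_mb = (num_messages * message_size) as f64 / (1024.0 * 1024.0);
    // Average throughput over the whole run.
    let throughput_mb_s = total_mb / duration_secs;
    // Average per-message latency: total duration divided by message count.
    let latency_ns = duration_secs * 1e9 / num_messages as f64;
    // Message rate.
    let msgs_per_sec = num_messages as f64 / duration_secs;

    println!("Total transferred: {total_mb:.2} MB");          // ~305.18 MB
    println!("Avg. throughput:   {throughput_mb_s:.2} MB/s"); // ~35.71 MB/s
    println!("Avg. latency:      {latency_ns:.2} ns");        // ~854.70 ns
    println!("Messages/sec:      {msgs_per_sec:.0}");         // ~1.17 million
}
```

Note that the "Avg. Latency" here is simply duration / message count, i.e. the inverse of the message rate, not a measured per-message round-trip time.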

@petehayes102
Member

Thanks for the feedback! We do need to address this more clearly in documentation to set expectations. We've been hesitant to do this prior to reaching 1.0 as we anticipate some improvement, but it's probably worth publishing now and revising every so often.
