Need the ability to produce/consume messages beyond 1MB in size. #2342
Comments
This value is not configurable because large records (greater than 1 MB) impact performance.

For now, we can increase the default size.

The record size must be less than both the batch size and the segment size.
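The sizing rule above can be sketched as a quick check. This is illustrative only: the constants and the helper function are assumptions for the example, not part of Fluvio's API.

```rust
// Hypothetical check for the rule: a record must fit within both the
// configured batch size and the segment size. The limits below are
// example values, not Fluvio defaults.
const BATCH_SIZE: usize = 10_000_000; // bytes, as set via batch_size(...)
const SEGMENT_SIZE: usize = 1_000_000_000; // segments are typically much larger

fn record_fits(record_len: usize) -> bool {
    record_len < BATCH_SIZE && record_len < SEGMENT_SIZE
}

fn main() {
    // The 5_881_006-byte record from the panic report fits a 10 MB batch.
    println!("{}", record_fits(5_881_006)); // prints "true"
    // A 12 MB record would not.
    println!("{}", record_fits(12_000_000)); // prints "false"
}
```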
Hi @AJ, I've applied compression but the record is still too large (7131124 bytes uncompressed, 1253783 bytes compressed). We cannot control the input because it comes from an RPC outside our control. This use case is a real showstopper for us.
@thinkrapido we have changed the configuration to process 2.0 MB records, so you should be able to produce/consume your records if you compress them. Are you currently running a local cluster or against InfinyOn Cloud? If you are running a local cluster, you can deploy a development build to test it: https://github.com/infinyon/fluvio/blob/master/DEVELOPER.md Otherwise, this feature will be available in the next release of InfinyOn Cloud.
Keep in mind that, if you are using the Rust API, you should increase the producer's batch size:

```rust
let config = TopicProducerConfigBuilder::default()
    .batch_size(10_000_000)          // allow batches up to ~10 MB
    .compression(Compression::Gzip)  // compress records before sending
    .build()?;
let producer = fluvio.topic_producer_with_config("my-topic", config).await?;
producer.send(RecordKey::NULL, large_record).await?;
```
still not working |
It looks like the record is even larger than 10_000_000 bytes before compression. Could you try a batch_size larger than that, maybe 15_000_000?
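One way to pick a batch_size is to leave headroom over the largest expected record, since batching overhead and record headers also count against the buffer. The helper below is a hypothetical sizing sketch, not part of Fluvio's API; the 50% headroom figure is an assumption for illustration.

```rust
// Hypothetical sizing helper: add 50% headroom over the largest expected
// record (uncompressed) and round up to the next megabyte.
fn suggested_batch_size(max_record_len: usize) -> usize {
    let with_headroom = max_record_len + max_record_len / 2;
    let mb = 1_000_000;
    ((with_headroom + mb - 1) / mb) * mb
}

fn main() {
    // For the 7131124-byte uncompressed record reported above:
    println!("{}", suggested_batch_size(7_131_124)); // prints "11000000"
}
```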
@thinkrapido we have some ideas outside of raising the buffer size but we need to understand the shape of your data. Let's schedule some time to talk. |
Try buffer sizes up to 32 MB of raw data.
Closed by #2356 |
Currently, when a client produces a record of 5881006 bytes, the producer panics: