I keep getting MessageSizeTooLargeError; the error message reports a size much bigger than the actual message given to the producer #966
How do you serialize with Avro? Avro is a format where a schema is required for the producer to serialize and for the consumer to deserialize; otherwise it is just gibberish bytes. The usual strategy with Kafka is to store schemas in some sort of central registry, then put a reference to the schema used to produce the message in a Kafka header. This is what Confluent does with their Schema Registry. As far as I know …
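The reference-plus-payload framing described above can be sketched as follows. One caveat: Confluent's wire format actually prefixes the message body (a zero magic byte plus a big-endian 4-byte schema ID) rather than using a Kafka header, and the Avro encoding itself would come from a library such as fastavro, which is not shown here:

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte big-endian schema ID


def frame_with_schema_id(schema_id: int, avro_payload: bytes) -> bytes:
    """Prefix an already Avro-encoded payload with a reference to
    the schema that produced it, following the Confluent convention."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload


def split_frame(message: bytes) -> tuple[int, bytes]:
    """Recover the schema ID so a consumer can fetch the schema from
    the registry and deserialize the remaining bytes."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not a schema-registry framed message")
    return schema_id, message[5:]
```

The 5-byte prefix is why a consumer without registry access sees "gibberish bytes": the payload alone carries no self-describing schema.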
@vmaurin I'm using the kafkit library for serialization and for communication with the schema registry.
Maybe try to dump the message you serialized before passing it to …
@vmaurin Can I specify a max_request_size for the producer bigger than the broker's corresponding value if I have compression enabled?
You mean max.message.bytes on the broker/topic? It might work: that limit seems to be applied after compression, but it also applies to a whole batch of messages, while the check in …
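The asymmetry being discussed can be illustrated with a rough sketch. Assumptions: zlib stands in for whichever compression codec the producer uses, the record-batch framing overhead is ignored, and the limits shown are the common 1 MB defaults. The client-side check sees one uncompressed serialized message, while the broker's check sees the whole compressed batch:

```python
import zlib

MAX_REQUEST_SIZE = 1_048_576    # client-side, per message, pre-compression
MAX_MESSAGE_BYTES = 1_048_576   # broker/topic-side, per batch, post-compression


def producer_side_check(serialized: bytes) -> bool:
    """The client rejects a single message by its uncompressed size."""
    return len(serialized) <= MAX_REQUEST_SIZE


def broker_side_check(batch: list[bytes]) -> bool:
    """The broker sees the batch after compression (sketched with zlib)."""
    compressed = zlib.compress(b"".join(batch))
    return len(compressed) <= MAX_MESSAGE_BYTES
```

A highly compressible 2 MB message would fail the client-side check even though the compressed batch would sail past the broker, which is why raising only one of the two limits can give surprising results.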
@vmaurin Yes, but I use send_and_wait() to send immediately, so I hope the batch will not exceed …
@ant0nk @vmaurin did you figure this out? I'm having a similar issue.
Got around the problem by disabling …
@Symas1 Your approach may not work if you send messages quickly enough: aiokafka combines multiple messages into batches, and raising this setting may lead to huge requests being rejected by the broker.
I'm using Avro serialization, and nevertheless I'm receiving errors like "The message is 1699136 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration", even though the original message size was around 800 KB, and after serialization it should still be under 1 MB (the max_request_size default). Does the producer try to combine several messages into a batch and exceed max_request_size?
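One way to check whether serialization itself produces the ~1.7 MB payload is to log the exact byte count right before handing it to the producer, as suggested earlier in the thread. A minimal sketch, where `serialize` is a stand-in for whatever kafkit serializer call is actually in use:

```python
def debug_serialize(serialize, value):
    """Wrap any serializer callable and report the exact byte count
    the producer will check against max_request_size."""
    payload = serialize(value)
    print(f"serialized size: {len(payload)} bytes")
    return payload
```

If the printed size already exceeds the limit for a single message, the problem is in serialization (e.g. double-encoding), not in batching.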