Spring Cloud 2020.0.4 breaks Spring Cloud Stream Kafka AVRO Message Conversion when Sleuth is on the classpath #2051
Comments
Does the issue go away if you set the property spring.sleuth.function.enabled=false?
@sobychacko I didn't think of doing that. I can confirm spring.sleuth.function.enabled=false fixes the issue.
@davidmelia I'll talk to @marcingrzejszczak tomorrow. I know there was a lot of work in that area recently.
Just ran into the same issue here. The problem is that the function is wrapped for the Brave/Sleuth tracing advice, and the wrapper copies the target function's properties instead of delegating to it, so settings like the output-conversion flag end up out of sync. IMHO the properties of the wrapper/decorator should be delegated through to the original instance and not copied. When can we expect this to get fixed?
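A minimal sketch of the delegation the comment above argues for. All class and interface names here are hypothetical stand-ins (the real classes are Spring Cloud Function's wrapper types); the point is only the difference between a decorator that forwards property access to the wrapped instance and one that copies the value at construction time.

```java
// Hypothetical sketch: delegating a property through a decorator
// instead of copying it, so wrapper and target never disagree.
interface ConvertingFunction {
    void setSkipOutputConversion(boolean skip);
    boolean isSkipOutputConversion();
}

class TargetFunction implements ConvertingFunction {
    private boolean skipOutputConversion;
    public void setSkipOutputConversion(boolean skip) { this.skipOutputConversion = skip; }
    public boolean isSkipOutputConversion() { return skipOutputConversion; }
}

class TracingWrapper implements ConvertingFunction {
    private final ConvertingFunction target;
    TracingWrapper(ConvertingFunction target) { this.target = target; }
    // Delegate instead of copy: reads and writes go straight to the
    // original instance, so later changes stay visible on both sides.
    public void setSkipOutputConversion(boolean skip) { target.setSkipOutputConversion(skip); }
    public boolean isSkipOutputConversion() { return target.isSkipOutputConversion(); }
}

public class DelegationSketch {
    public static void main(String[] args) {
        TargetFunction target = new TargetFunction();
        TracingWrapper wrapper = new TracingWrapper(target);
        target.setSkipOutputConversion(true);
        // The wrapper reflects the target's state because it delegates.
        System.out.println(wrapper.isSkipOutputConversion());
    }
}
```

Had the wrapper copied the flag into its own field in the constructor, the later `setSkipOutputConversion(true)` on the target would be invisible through the wrapper, which is the shape of the inconsistency described above.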
You can disable Sleuth's function instrumentation with spring.sleuth.function.enabled=false.
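For reference, the workaround confirmed earlier in this thread, as it would appear in application.properties (or the equivalent YAML):

```properties
# Disables Sleuth's function instrumentation so the tracing wrapper
# no longer interferes with Spring Cloud Stream's message conversion.
spring.sleuth.function.enabled=false
```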
Hi @olegz, I am experiencing exactly the same issue, though my case is a little different. We have a function that takes in an object deserialized from JSON, and its output object needs to be serialized to XML. We also want to use Spring Cloud Sleuth with our functions, especially because we use the traceId from the propagated B3 headers inside them. But because the FunctionAroundWrapper calls targetFunction.setSkipOutputConversion(true), our XML converter is never used and the function always returns JSON.
Hi, the problem is that it happens randomly, not for all messages.
@marcingrzejszczak please suggest a version of Spring Boot and Spring Cloud where this works, or a workaround.
I am going to be closing this issue.
@olegz the problem is not with reactive, it is with Spring Cloud Function.
Can you provide an example that reproduces it so I can have a look? The original example, with the stack trace included at the beginning of this issue, is reactive.
It keeps happening in production randomly and we weren't able to reproduce it in the local environment, but we do have the message and the stack trace from the DLQ. The strange thing is that when we retried the same message from the DLQ, it worked with no error. Part of the stack trace:
And sometimes this stack trace:
I would not call it random, or a similar problem to the original post. It's clear you have some recursion in your JSON, and that explains the randomness, since it happens per message.
Again, once we retry the same message it works; I would expect that it shouldn't.
There is no such thing as a bug that cannot be reproduced; it simply means there is no bug.
I have no sample message, no meaningful stack trace, and no approximate instructions on how to even attempt to reproduce it. What would you like us to do?
Describe the bug
When upgrading from Spring Cloud 2020.0.3 to 2020.0.4, AVRO message conversion is ignored in Spring Cloud Stream Kafka (Kafka 2.7), giving the following error:
It looks like the Avro message converters are being ignored in favour of the Jackson ones.
To solve the error, one of the workarounds described under the sample below works.
Sample
https://github.com/davidmelia/spring-boot-webflux-avro-source-only
If you compile and run this project (it assumes a local Confluent Schema Registry on http://localhost:8081 and Kafka on localhost:9092) and hit http://localhost:8080/dave, a message is sent to Kafka and you will see the above error.
Downgrading to Spring Cloud Stream 3.1.3 (uncomment spring-cloud-stream-dependencies in the pom.xml), removing Spring Cloud Sleuth from the pom.xml, or downgrading to Spring Cloud 2020.0.3 fixes this problem.
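As a sketch, the Spring Cloud Stream downgrade mentioned above would look roughly like this in the pom.xml's dependencyManagement section (the exact placement in the sample project may differ):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin Spring Cloud Stream back to 3.1.3, overriding the
         version pulled in by the Spring Cloud 2020.0.4 BOM. -->
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-stream-dependencies</artifactId>
      <version>3.1.3</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```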
I previously raised this against Spring Cloud Sleuth, but after realising that downgrading Spring Cloud Stream fixes the problem, I am not sure where the problem lies (#2048).
Thanks