Existing TraceId not propagated in a Spring Cloud Stream Kafka Producer #1731
I found something similar with Hoxton.SR8 and RabbitMQ; moving to SR2 works. I tested an older version based on this other ticket #1748
This seems to be working in the 2020.0.0-M3 Spring Cloud release. Unfortunately I can't upgrade: first, because it is a milestone release, and second, because there is an issue with Spring Cloud Stream in that version.
@jarias can you check that everything is working fine with
@marcingrzejszczak I tried and had no luck; for some reason spans were not even sent to Zipkin, and looking at the logs the IDs didn't match between the producer service and the consumer service. Working stack:
Not working stack:
I will try to create 2 reduced sample apps so you can test it out.
@marcingrzejszczak here is a repo that reproduces the issue: https://github.com/jarias/spring-cloud-sleuth-issue-1731. If you downgrade to Spring Boot 2.3.5.RELEASE and Spring Cloud Hoxton.SR2, things work.
A bit more info: I started debugging this and found that the
Managed to get it working, see the latest in my repo, but I wonder why in SR2 it worked without any code changes and now it doesn't. @marcingrzejszczak, can you check my code and see if I'm using the APIs correctly? It seems kind of odd the way I had to use the
I guess I didn't read the docs fully, since they do mention the caveats for Spring Cloud Stream with Reactor.
Thanks @jarias for trying it out. I'm pointing to your commit here: jarias/spring-cloud-sleuth-issue-1731@3fa9228. Yes, there are issues with Stream and Reactor, and once you do things manually you're in full control of the context passing; then things work, even though you have to do some manual operations. I'm closing this issue. Thanks again for providing the information on how you fixed it.
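The manual fix discussed above boils down to carrying the trace context in message headers yourself: the producer copies the active traceId/spanId into the outgoing headers, and the consumer continues that trace instead of starting a fresh one. A minimal sketch of that idea, assuming B3-style header names and using no Sleuth/Brave types (all class and method names here are illustrative, not Sleuth APIs):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ThreadLocalRandom;

// Dependency-free sketch of manual trace-context propagation over message
// headers. Header names follow the B3 convention Sleuth uses by default;
// everything else is an illustrative stand-in.
public class ManualPropagationSketch {
    static final String TRACE_ID_HEADER = "X-B3-TraceId";
    static final String SPAN_ID_HEADER = "X-B3-SpanId";

    // Producer side: inject the active context into the outgoing headers.
    static Map<String, String> inject(String traceId, String spanId) {
        Map<String, String> headers = new HashMap<>();
        headers.put(TRACE_ID_HEADER, traceId);
        headers.put(SPAN_ID_HEADER, spanId);
        return headers;
    }

    // Consumer side: continue the trace found in the headers; only generate
    // a new traceId when no context was propagated.
    static String extractTraceId(Map<String, String> headers) {
        return headers.getOrDefault(TRACE_ID_HEADER, newTraceId());
    }

    static String newTraceId() {
        return Long.toHexString(ThreadLocalRandom.current().nextLong() | 1L);
    }

    public static void main(String[] args) {
        Map<String, String> headers = inject("1f321656197f4119", "1f321656197f4119");
        // Same traceId continues on the consumer side instead of a new one.
        System.out.println(extractTraceId(headers));
    }
}
```

The point of doing this explicitly is that in a reactive Spring Cloud Stream pipeline the thread-local trace context can be lost between operators, so nothing is injected automatically; once you pass the context by hand, you control exactly what ends up in the record headers.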
Describe the bug
I'm using Spring Boot 2.3.3 and Spring Cloud Hoxton.SR6.
I've found that an existing traceId is not propagated in the message; a new one is generated instead. See the sample below for more details.
Sample
I've uploaded a sample project in this repo: https://github.com/codependent/sleuth-kafka
Just start a local Kafka broker (2.5.1 in my case) and then run the `SleuthKafkaApplication` class. The application consists of a controller (`EventRestController`) that receives an event and sends it (`EventProducer`) to a Kafka topic. Then a consumer (`KafkaConfiguration.consumer()`) reads that same message. In every step I print a log to verify the correlation information.
Just execute this curl to see it in action:
The following logs show up:
As you can see, the controller generated this traceId and spanId:
1f321656197f4119,1f321656197f4119
That same info is present right before sending the message:
2020-09-03 14:04:34.253 INFO [sleuth-kafka,1f321656197f4119,1f321656197f4119,true] 9111 --- [ctor-http-nio-4] c.c.sleuthkafka.producer.EventProducer : send() - event Event(id=someid2, body=some event)
However the consumer gets a different traceId:
2020-09-03 14:04:34.259 INFO [sleuth-kafka,7c9ddc695caba01b,1daff4416e92b0fe,false] 9111 --- [container-0-C-1] uration$$EnhancerBySpringCGLIB$$8bb402c7 : consumer() - event Event(id=someid2, body=some event)
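The bracketed block in these log lines is Sleuth's default pattern, `[appName,traceId,spanId,exportable]`. A small helper (illustrative, not part of Sleuth) to pull the traceId out when comparing producer and consumer lines:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the traceId (second field) from a Sleuth-style log line of the
// form [appName,traceId,spanId,exportable]. Purely a debugging aid for
// comparing log lines; not a Sleuth API.
public class SleuthLogFields {
    private static final Pattern FIELDS =
            Pattern.compile("\\[([^,\\]]*),([^,\\]]*),([^,\\]]*),([^\\]]*)\\]");

    static String traceId(String logLine) {
        Matcher m = FIELDS.matcher(logLine);
        if (!m.find()) throw new IllegalArgumentException("no Sleuth fields in line");
        return m.group(2);
    }

    public static void main(String[] args) {
        String producer = "[sleuth-kafka,1f321656197f4119,1f321656197f4119,true]";
        String consumer = "[sleuth-kafka,7c9ddc695caba01b,1daff4416e92b0fe,false]";
        // The bug in a nutshell: these two should match but don't.
        System.out.println(traceId(producer).equals(traceId(consumer))); // prints "false"
    }
}
```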
Using kafkacat I can see that these last values (`7c9ddc695caba01b,1daff4416e92b0fe`) were the ones written into the topic. For some reason the existing context isn't being retrieved in `TracingChannelInterceptor`.