Conversation
Please do not merge. I think there is a problem similar to danielwegener/logback-kafka-appender#44
Well, the problem emerges with SLF4J: while it is sending, KafkaProducer requests a logger in the org.apache.kafka or javax.management package, and that creates a deadlock. It is also possible to work around this by configuring different appenders for those packages:

```xml
<root level="ALL">
    <appender-ref ref="gelfKafka" />
</root>
<logger name="org.apache.kafka" level="ALL" additivity="false">
    <appender-ref ref="gelfUdp" />
</logger>
<logger name="javax.management" level="ALL" additivity="false">
    <appender-ref ref="gelfUdp" />
</logger>
```

Even though I don't like it, that should be OK for everyone, because I believe Kafka would mostly be used in an application architecture as a failover for the scenario where Graylog is down, and when Graylog is down nobody will care about these logs. Adding it to the limitations section of the documentation would do the job. What do you think? @mp911de
Thanks a lot for your pull request. I added a few review comments. I see two major issues here:
- We shouldn't add another config system to configure Kafka but rather stick to how we configure e.g. the Redis sender and move all extended properties into the URL. We also should not extend appender config/properties for technology-specific extensions, as this requires changes to our public API and causes confusion about where to configure settings.
- Logger deadlock: The mentioned limitation/workaround is inevitable if we integrate with third-party dependencies that require a logger. We're already inside a logging component, so (re)using a logger can easily lead to problems; that's why we introduced ErrorReporter. For Kafka, the only thing we can do is provide proper documentation and a warning about the issue so people do not discover this shortcoming in production.

Care to have a look and address the review comments?
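To illustrate the ErrorReporter point above: internal failures get handed to a callback instead of a Logger, so the appender never re-enters the logging framework while it is already handling a log event. This is only a sketch of the idea; the interface shape and names below are assumptions, not the library's exact API.

```java
import java.util.ArrayList;
import java.util.List;

public class ErrorReporterSketch {

    // Callback for internal failures. Using this instead of an SLF4J
    // Logger avoids re-entrant logging from inside the appender.
    // (Hypothetical shape, not necessarily logstash-gelf's exact interface.)
    interface ErrorReporter {
        void reportError(String message, Exception e);
    }

    // A reporter that just collects messages; a real appender might
    // write to stderr or a framework status manager instead.
    static class CollectingReporter implements ErrorReporter {
        final List<String> errors = new ArrayList<>();

        public void reportError(String message, Exception e) {
            errors.add(message);
        }
    }

    public static void main(String[] args) {
        CollectingReporter reporter = new CollectingReporter();
        // A sender would call this instead of LoggerFactory.getLogger(...).error(...)
        reporter.reportError("Kafka send failed", new RuntimeException("broker down"));
        System.out.println(reporter.errors.size()); // 1
    }
}
```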
Review comments (outdated, now resolved) were left on:
- src/main/java/biz/paluch/logging/gelf/intern/sender/GelfKafkaSender.java
- src/main/java/biz/paluch/logging/gelf/intern/sender/GelfKafkaSenderProvider.java
- src/main/java/biz/paluch/logging/gelf/log4j2/GelfLogAppender.java
- src/main/java/biz/paluch/logging/gelf/intern/sender/QueryStringParser.java
Awesome work! Thanks a lot. Can you do the two following things to make merging a bit easier:
Force-pushed from 03cebb7 to a4e0570.
All done 👍
Log events can now be shipped using Kafka. Appenders can configure a Kafka URL in the host field according to the scheme kafka://broker[:port]?[producer_properties]#[log-topic], e.g. kafka://localhost#topic-log. Kafka internally uses logging and JMX components that log themselves, so these logging categories should be excluded to prevent circular, recursive log calls. Original pull request: #173.
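The URL scheme above maps cleanly onto the parts of a standard URI: the authority is the broker, the query string carries producer properties, and the fragment names the log topic. A minimal sketch of that decomposition using java.net.URI (the example URL and the acks/linger.ms properties are illustrative, not defaults of the library):

```java
import java.net.URI;

public class KafkaUriSketch {

    // Decompose a URL of the documented shape
    // kafka://broker[:port]?[producer_properties]#[log-topic]
    // into { broker, producer properties, topic }.
    static String[] parse(String url) {
        URI uri = URI.create(url);
        String broker = uri.getPort() == -1
                ? uri.getHost()
                : uri.getHost() + ":" + uri.getPort();
        return new String[] { broker, uri.getQuery(), uri.getFragment() };
    }

    public static void main(String[] args) {
        String[] parts = parse("kafka://localhost:9092?acks=all&linger.ms=5#topic-log");
        System.out.println(parts[0]); // localhost:9092
        System.out.println(parts[1]); // acks=all&linger.ms=5
        System.out.println(parts[2]); // topic-log
    }
}
```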
Remove Kafka transport documentation from readme as transports are documented on the site. Add timeouts to Kafka send futures. Simplify tests. Remove log4j test as log4j 1.x is out of maintenance. Make fields final where possible. Add author tags, fix javadoc tags. Original pull request: #173
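The "timeouts to Kafka send futures" change mentioned above boils down to bounding the Future returned by a send call so a dead broker cannot block the application's logging thread indefinitely. A sketch of the pattern, using a plain CompletableFuture to stand in for the Future<RecordMetadata> that KafkaProducer.send returns (the helper name and timeout value are assumptions for illustration):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class SendTimeoutSketch {

    // Wait at most timeoutMs for a send future; return false instead of
    // blocking forever when the broker is unreachable. A real appender
    // would report the failure via its error-reporting hook.
    static boolean awaitSend(Future<?> sendFuture, long timeoutMs) {
        try {
            sendFuture.get(timeoutMs, TimeUnit.MILLISECONDS);
            return true;
        } catch (TimeoutException e) {
            return false; // timed out: give up rather than stall the caller
        } catch (InterruptedException | ExecutionException e) {
            return false; // send failed or we were interrupted
        }
    }

    public static void main(String[] args) {
        Future<String> neverCompletes = new CompletableFuture<>();
        System.out.println(awaitSend(neverCompletes, 50));                          // false
        System.out.println(awaitSend(CompletableFuture.completedFuture("ok"), 50)); // true
    }
}
```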
Thanks a lot. That's merged and polished now.
This PR adds functionality to logstash-gelf to log to Kafka as a producer.
Make sure that: