Java upgrade to log4j2 #3861
Conversation
I am not a Java expert by far, and not even an expert on log4j, but I read this branch and it looks good to me. It migrates to the LogManager API, changes our Appender implementation to match the log4j2 API, and also creates the log4j configuration programmatically in a class, which gets injected using a -D argument to the JVM. With all that said, I just wanted to indicate that I looked at these commits; if this works for the Java-based end-to-end tests, this should be OK to go in.
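For readers unfamiliar with the approach described above, a programmatic log4j2 configuration injected via a -D argument usually means a custom ConfigurationFactory. The following is a minimal sketch only; the class name, plugin name, and the "SyslogNgAppender" appender type are hypothetical stand-ins, not the project's actual code.

```java
import java.net.URI;

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.config.ConfigurationFactory;
import org.apache.logging.log4j.core.config.ConfigurationSource;
import org.apache.logging.log4j.core.config.Order;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilder;
import org.apache.logging.log4j.core.config.builder.impl.BuiltConfiguration;
import org.apache.logging.log4j.core.config.plugins.Plugin;

// Hypothetical example; selected at JVM startup with:
//   -Dlog4j2.configurationFactory=org.example.SyslogNgConfigurationFactory
@Plugin(name = "SyslogNgConfigurationFactory", category = ConfigurationFactory.CATEGORY)
@Order(50)
public class SyslogNgConfigurationFactory extends ConfigurationFactory {

    @Override
    protected String[] getSupportedTypes() {
        // Handle every configuration request, so no config file is consulted.
        return new String[] { "*" };
    }

    @Override
    public Configuration getConfiguration(LoggerContext ctx, ConfigurationSource source) {
        return buildConfiguration();
    }

    @Override
    public Configuration getConfiguration(LoggerContext ctx, String name, URI location) {
        return buildConfiguration();
    }

    private Configuration buildConfiguration() {
        ConfigurationBuilder<BuiltConfiguration> builder = newConfigurationBuilder();
        builder.setConfigurationName("SyslogNg");
        // Route everything through a single custom appender; "SyslogNgAppender"
        // stands in for the project's real appender plugin name.
        builder.add(builder.newAppender("internal", "SyslogNgAppender"));
        builder.add(builder.newRootLogger(Level.DEBUG)
                           .add(builder.newAppenderRef("internal")));
        return builder.build();
    }
}
```

Because the factory claims all supported types, every logger obtained through LogManager inherits this root configuration, which is the property the review comments below rely on.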
The log4j allows us to replace the configurator object with our own, allowing the user to programmatically configure log4j. In our case we want to replace all appenders with our implementation, which is perfectly aligned with that. Note: with this, the user cannot change how log4j is logging. Signed-off-by: Kokan <kokaipeter@gmail.com>
Log4j 2.16 only supports Java 1.8+. This also restricts us to supporting only 1.8. Signed-off-by: Kokan <kokaipeter@gmail.com>
I've tested this PR with the Kafka Java destination using Java 1.8 and Kafka 3.0.0. It works correctly.
@@ -47,8 +48,7 @@
   public KafkaDestination(long handle) {
     super(handle);

-    Logger.getLogger("org.apache.kafka").setLevel(SyslogNgInternalLogger.getLevel());
Please do not remove this line, it keeps the Kafka logger's level in sync with ours. We might need org.apache.logging.log4j.core.config.Configurator.setLevel here.
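For reference, the log4j2 equivalent of the removed v1 setLevel call would look roughly like this (a sketch: SyslogNgInternalLogger is the project's own helper and is assumed here to return an org.apache.logging.log4j.Level):

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;

public final class KafkaLogLevelSync {
    private KafkaLogLevelSync() {}

    // Hypothetical helper: re-apply our internal log level to Kafka's
    // logger subtree. Configurator.setLevel is the log4j2 replacement
    // for the v1 Logger.getLogger(...).setLevel(...) call.
    public static void sync(Level syslogNgLevel) {
        Configurator.setLevel("org.apache.kafka", syslogNgLevel);
    }
}
```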
This definitely did not keep it in sync, as it did not follow up on log level changes at runtime.
But the idea is that it shouldn't be needed: the root logger is configured with the log level of syslog-ng before anybody else, and you shouldn't be able to obtain a logger with a different log level (unless it is forcefully set to a different value, which we are still not protected against).
When I tried, I got debug-level output only when the debug switch was present, so it seemed to work. Did you have a different experience?
This definitely did not keep it in sync, as it did not follow up on log level changes at runtime.
We can discuss the limitations of my original fix here, but I think I was fairly honest about its weaknesses in #3679, and it got accepted back then: "Currently, the log level is not updated when debug/trace is enabled or disabled at runtime (using syslog-ng-ctl)."
Nevertheless, even if it was not a good quality fix, it kept Kafka's internal logger in an operable state, and in sync with our loglevel (with a reload). Without it, we had no debug logs at all.
Removing this without stating anything in a well-separated commit message is not okay in my opinion; it hides the fact that a bugfix has been partially reverted.
When I tried, I got debug-level output only when the debug switch was present, so it seemed to work. Did you have a different experience?
I don't see any internal logs coming out of Kafka, I've retested it with Kafka 2.8.1. This is probably because we do not configure the Kafka logger properly.
log4j:WARN No appenders could be found for logger (org.apache.kafka.clients.producer.ProducerConfig).
log4j:WARN Please initialize the log4j system properly.
It is not guaranteed that all Java destination libraries use log4j2 for logging, so setting the root logger is not really enough.
For example, the original fix used the knowledge that we and Kafka both use log4j v1, and we set up their log level according to ours. (I guess logging should be configured separately in each library where logging is desired.)
It seems Kafka still uses log4j 1, which must be the reason I don't see anything logged (because we haven't configured their v1 logger):
https://issues.apache.org/jira/browse/KAFKA-13534
(If it's really broken due to the fact that they still use the v1 library, I think it's not worth fixing it again.)
I did not say it is a wrong fix, but it does not keep the log level in sync. Sure, I could have done more in the commit message, but that bugfix is not reverted at all in this PR!
In general, the code in question does not make sense in v2 with our configuration, as the root logger gets initialized before the libs, and thus any logger obtained is going to get the log level set in the configuration object. (This also does not provide an in-sync log level.)
As you found out, the kafka lib ships with log4j 1.x, which we cannot change. The PR here removes v1 and adds v2 as the syslog-ng Java logger. As such, initializing the Kafka logger the way the code did would mean I must initialize v1, and that would imply that syslog-ng must keep v1 as well.
I did something different, and poked the project a little. It turned out that Kafka actually ships log4j v1 but uses slf4j (https://github.com/apache/kafka/blob/trunk/clients/src/main/java/org/apache/kafka/clients/producer/KafkaProducer.java#L76), and simply removing log4j v1 and adding the proper log4j-slf4j-impl (https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-slf4j-impl/2.17.1) works as expected, without setting the log level.
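To illustrate why swapping the jars is enough: Kafka's client code logs through the slf4j facade, so whichever binding is on the classpath receives the events. With log4j-slf4j-impl (plus log4j-core) present instead of log4j v1, calls like the one below are routed into log4j2 and therefore into our appender, with no per-library level setting. This is a generic slf4j usage sketch, not Kafka's actual code.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Slf4jBridgeDemo {
    // slf4j facade; the concrete backend is chosen by the binding jar
    // on the classpath (here, assumed to be log4j-slf4j-impl).
    private static final Logger log = LoggerFactory.getLogger(Slf4jBridgeDemo.class);

    public static void main(String[] args) {
        // Emitted only if the log4j2 root logger (configured by syslog-ng)
        // allows debug, demonstrating the inherited level.
        log.debug("routed through log4j-slf4j-impl into log4j2");
    }
}
```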
In general, the code in question does not make sense in v2 with our configuration, as the root logger gets initialized before the libs, and thus any logger obtained is going to get the log level set in the configuration object. (This also does not provide an in-sync log level.)
That was the case with the v1 logger too, but we still needed the mentioned line because Kafka somehow did not inherit the root log level.
TL;DR: Kafka's internal logging will work out of the box when the user or Kafka upgrades to log4j v2.
LGTM, thank you for the fix!
Extending appenders: https://logging.apache.org/log4j/2.x/manual/extending.html#Appenders
Programmatic configuration: https://logging.apache.org/log4j/2.x/manual/customconfig.html
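Following the extending-appenders page linked above, a custom log4j2 appender is a plugin class extending AbstractAppender. The sketch below mirrors the docs' pattern; the plugin name "SyslogNgAppender" and the stderr hand-off are illustrative assumptions, not the project's real implementation (which presumably forwards into syslog-ng's internal log).

```java
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Core;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;

// Hypothetical example in the style of the linked log4j2 docs.
@Plugin(name = "SyslogNgAppender", category = Core.CATEGORY_NAME,
        elementType = Appender.ELEMENT_TYPE)
public final class SyslogNgAppender extends AbstractAppender {

    private SyslogNgAppender(String name) {
        // No filter, no layout, ignore exceptions, no extra properties.
        super(name, null, null, true, null);
    }

    @PluginFactory
    public static SyslogNgAppender createAppender(@PluginAttribute("name") String name) {
        return new SyslogNgAppender(name);
    }

    @Override
    public void append(LogEvent event) {
        // Stand-in hand-off point: the real appender would pass the
        // message to syslog-ng's internal logging instead of stderr.
        System.err.println(event.getMessage().getFormattedMessage());
    }
}
```

The appender is then referenced by its plugin name from the programmatic configuration (or a config file), exactly as any built-in appender would be.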