Kafka input sasl+ssl #7783

Closed
p53 opened this issue Mar 30, 2020 · 8 comments · Fixed by #10146

Comments

@p53

p53 commented Mar 30, 2020

Expected Behavior

Setting the SASL+SSL parameters and connecting to a Kafka SASL_SSL listener.

Current Behavior

It fails with the following exception, even though sasl.jaas.config is supplied in the configuration:

Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: You must pass java.security.auth.login.config in secure mode.

Possible Solution

Steps to Reproduce (for bugs)

  1. Log in to Graylog
  2. Choose the Kafka raw input
  3. Set the input properties and add the following custom properties (a standalone verification sketch follows this list):

custom_properties:
ssl.truststore.location=/etc/ssl/jsse/client.truststore.jks
ssl.truststore.password=xxxxx
sasl.enabled.mechanisms=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='xxxx' password='xxxxf';
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
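
These custom properties look correct for a Kafka client that is new enough to honour sasl.jaas.config (0.10.2 or later). As a sanity check outside Graylog, a minimal standalone consumer can be pointed at the same listener. This is only a sketch, assuming a kafka-clients 2.x dependency on the classpath; the host, topic, and credentials below are placeholders.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SaslSslCheck {
    public static void main(String[] args) {
        // Mirrors the custom properties above; sasl.jaas.config is only honoured by
        // kafka-clients 0.10.2 and later, so this sketch assumes a 2.x client.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-host:9094");  // placeholder broker
        props.put("group.id", "sasl-ssl-check");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"xxxx\" password=\"xxxx\";");
        props.put("ssl.truststore.location", "/etc/ssl/jsse/client.truststore.jks");
        props.put("ssl.truststore.password", "xxxxx");

        // If authentication works, this prints a record count instead of throwing
        // an exception while constructing the consumer.
        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test-topic"));  // placeholder topic
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            System.out.println("Fetched " + records.count() + " records over SASL_SSL");
        }
    }
}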

Context

Trying to make an encrypted, authenticated SASL_SSL connection to Kafka.

Your Environment

Using the official docker-compose setup from the documentation.

  • Graylog Version: 3.2/3.0
  • Elasticsearch Version:
  • MongoDB Version:
  • Operating System:
  • Browser version:

In the log output I don't see any of the custom SASL properties, only the Kerberos ones, which I didn't set. It looks like the SASL properties are being ignored. See:

graylog_1        | 2020-03-30 13:48:13,353 INFO : org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values: 
graylog_1        |      metric.reporters = []
graylog_1        |      metadata.max.age.ms = 300000
graylog_1        |      value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
graylog_1        |      group.id = graylog2-loc
graylog_1        |      partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
graylog_1        |      reconnect.backoff.ms = 50
graylog_1        |      sasl.kerberos.ticket.renew.window.factor = 0.8
graylog_1        |      max.partition.fetch.bytes = 1048576
graylog_1        |      bootstrap.servers = [kafka-stage-schrzagt1.mb.security.in.pan-net.eu:9094]
graylog_1        |      retry.backoff.ms = 100
graylog_1        |      sasl.kerberos.kinit.cmd = /usr/bin/kinit
graylog_1        |      sasl.kerberos.service.name = null
graylog_1        |      sasl.kerberos.ticket.renew.jitter = 0.05
graylog_1        |      ssl.keystore.type = JKS
graylog_1        |      ssl.trustmanager.algorithm = PKIX
graylog_1        |      enable.auto.commit = true
graylog_1        |      ssl.key.password = null
graylog_1        |      fetch.max.wait.ms = 100
graylog_1        |      sasl.kerberos.min.time.before.relogin = 60000
graylog_1        |      connections.max.idle.ms = 540000
graylog_1        |      ssl.truststore.password = [hidden]
graylog_1        |      session.timeout.ms = 30000
graylog_1        |      metrics.num.samples = 2
graylog_1        |      client.id = 
graylog_1        |      ssl.endpoint.identification.algorithm = null
graylog_1        |      key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
graylog_1        |      ssl.protocol = TLS
graylog_1        |      check.crcs = true
graylog_1        |      request.timeout.ms = 40000
graylog_1        |      ssl.provider = null
graylog_1        |      ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
graylog_1        |      ssl.keystore.location = null
graylog_1        |      heartbeat.interval.ms = 3000
graylog_1        |      auto.commit.interval.ms = 1000
graylog_1        |      receive.buffer.bytes = 32768
graylog_1        |      ssl.cipher.suites = null
graylog_1        |      ssl.truststore.type = JKS
graylog_1        |      security.protocol = SASL_SSL
graylog_1        |      ssl.truststore.location = /etc/ssl/jsse/client.truststore.jks
graylog_1        |      ssl.keystore.password = null
graylog_1        |      ssl.keymanager.algorithm = SunX509
graylog_1        |      metrics.sample.window.ms = 30000
graylog_1        |      fetch.min.bytes = 5
graylog_1        |      send.buffer.bytes = 131072
graylog_1        |      auto.offset.reset = latest
graylog_1        | 2020-03-30 13:48:13,573 ERROR: org.graylog2.shared.inputs.InputLauncher - The [org.graylog2.inputs.raw.kafka.RawKafkaInput] input with ID <5e81f89cefe6c600149a6a4a> misfired. Reason: You must pass java.security.auth.login.config in secure mode.
graylog_1        | org.graylog2.plugin.inputs.MisfireException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
graylog_1        |      at org.graylog2.plugin.inputs.MessageInput.launch(MessageInput.java:158) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.shared.inputs.InputLauncher$1.run(InputLauncher.java:84) [graylog.jar:?]
graylog_1        |      at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:181) [graylog.jar:?]
graylog_1        |      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_242]
graylog_1        |      at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_242]
graylog_1        |      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_242]
graylog_1        |      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_242]
graylog_1        |      at java.lang.Thread.run(Thread.java:748) [?:1.8.0_242]
graylog_1        | Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:648) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:542) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:524) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport$ConsumerRunnable.<init>(KafkaTransport.java:262) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.lambda$doLaunchConsumer$2(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110) ~[?:1.8.0_242]
graylog_1        |      at java.util.stream.IntPipeline$Head.forEach(IntPipeline.java:581) ~[?:1.8.0_242]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunchConsumer(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunch(KafkaTransport.java:219) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.plugin.inputs.transports.ThrottleableTransport.launch(ThrottleableTransport.java:76) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.plugin.inputs.MessageInput.launch(MessageInput.java:155) ~[graylog.jar:?]
graylog_1        |      ... 7 more
graylog_1        | Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: You must pass java.security.auth.login.config in secure mode.
graylog_1        |      at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:74) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:60) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:79) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:577) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:542) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:524) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport$ConsumerRunnable.<init>(KafkaTransport.java:262) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.lambda$doLaunchConsumer$2(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110) ~[?:1.8.0_242]
graylog_1        |      at java.util.stream.IntPipeline$Head.forEach(IntPipeline.java:581) ~[?:1.8.0_242]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunchConsumer(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunch(KafkaTransport.java:219) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.plugin.inputs.transports.ThrottleableTransport.launch(ThrottleableTransport.java:76) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.plugin.inputs.MessageInput.launch(MessageInput.java:155) ~[graylog.jar:?]
graylog_1        |      ... 7 more
graylog_1        | Caused by: java.lang.IllegalArgumentException: You must pass java.security.auth.login.config in secure mode.
graylog_1        |      at org.apache.kafka.common.security.kerberos.Login.login(Login.java:289) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:60) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:79) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:577) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:542) ~[graylog.jar:?]
graylog_1        |      at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:524) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport$ConsumerRunnable.<init>(KafkaTransport.java:262) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.lambda$doLaunchConsumer$2(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110) ~[?:1.8.0_242]
graylog_1        |      at java.util.stream.IntPipeline$Head.forEach(IntPipeline.java:581) ~[?:1.8.0_242]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunchConsumer(KafkaTransport.java:249) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.inputs.transports.KafkaTransport.doLaunch(KafkaTransport.java:219) ~[graylog.jar:?]
graylog_1        |      at org.graylog2.plugin.inputs.transports.ThrottleableTransport.launch(ThrottleableTransport.java:76) ~[graylog.jar:?]

@p53 p53 added the bug label Mar 30, 2020
@jdekoning

I am not entirely sure how this is supposed to work, but as far as I can see the Kafka client in use is still 0.9.0.1, which does not yet support org.apache.kafka.common.security.plain.PlainLoginModule, or at least not SASL_SSL. Maybe it is also time to update the Kafka client?

@p53
Author

p53 commented Apr 15, 2020

According to https://issues.apache.org/jira/browse/KAFKA-3166, 0.9.0.1 should support SASL_SSL. I can also find it in the 0.9 branch of Kafka: https://github.com/apache/kafka/search?q=sasl_ssl&unscoped_q=sasl_ssl

@jdekoning

OK, I guess SASL_SSL is available in some working form, but https://github.com/apache/kafka/blob/0.10.0/clients/src/main/java/org/apache/kafka/common/security/plain/PlainLoginModule.java does not exist in the 0.9.0 branch.

@mpfz0r
Member

mpfz0r commented Apr 17, 2020

Unfortunately we don't have the resources to look at this right now.
But maybe @muralibasani or @pbr0ck3r can share how they got TLS running? ❤️

@mpfz0r
Member

mpfz0r commented Apr 17, 2020

PS: We are also considering updating the Kafka client, but I can't give you a timeline for that.

@pbr0ck3r
Contributor

@muralibasani and I were able to get it working by setting the following properties:

ssl.keystore.location
ssl.keystore.password
ssl.key.password
ssl.truststore.location
ssl.truststore.password
ssl.enabled.protocols="TLSv1.2"
security.protocol="ssl"

I am no longer working with Graylog at a production level, so I am not able to actually troubleshoot or verify anything, but I did have that configuration running for several months without issues. We were on Kafka version 2.1.x.
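
For anyone else trying this, the same SSL-only setup can be sketched as custom properties on the input; the paths and passwords below are placeholders, not the values that were actually used.

ssl.truststore.location=/etc/ssl/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/ssl/client.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
ssl.enabled.protocols=TLSv1.2
security.protocol=SSL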

@p53
Author

p53 commented Apr 17, 2020

Yes, we are also using it with SSL now, but I want to use SASL_SSL.

@JordySipkema

Did some research on this:

  • GSSAPI (Kerberos) is supported >= 0.9.0.0 (Released November 23, 2015 )
  • PLAIN is supported >= 0.10.0.0 (Released May 22, 2016)
  • SCRAM is supported >= 0.10.2.0 (Released February 21, 2017)

The client version currently used by Graylog is 0.9.0.1, so only Kerberos authentication is supported. Because of this I cannot use the built-in Kafka client (we are using SCRAM at the moment).
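
For context on the error in the original report: the 0.9 client only reads SASL credentials from a static JAAS file named by the java.security.auth.login.config JVM system property and ignores sasl.jaas.config entirely, which is why the consumer fails with "You must pass java.security.auth.login.config in secure mode." A rough sketch of that older style, assuming the Docker image forwards JVM options via GRAYLOG_SERVER_JAVA_OPTS (the file path, keytab, and principal are placeholders):

/etc/graylog/kafka_client_jaas.conf:

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/graylog.keytab"
    principal="graylog@EXAMPLE.COM";
};

docker-compose environment entry:

GRAYLOG_SERVER_JAVA_OPTS: "-Djava.security.auth.login.config=/etc/graylog/kafka_client_jaas.conf"

Even then, only GSSAPI would work with the bundled 0.9.0.1 client, per the version list above; PLAIN and SCRAM need a newer client.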

Is there any update on when the client will be updated in Graylog?
As an alternative, I would like to know how third parties can create a plugin that overrides the client version used by Graylog. As seen here, it does not seem easy to specify a different client when creating a plugin.
