Self-signed certs: certificate verification error on connect #252

@gabe-sorensen

Description
Connecting to Kafka with self-signed certificates doesn't appear to work, even when the CA certificate is provided in the Fluentd config. I get the following error on connect:

2019-04-19 16:30:01 +0000 [info]: parsing config file is succeeded path="/fluentd/etc/fluent.conf"
2019-04-19 16:30:01 +0000 [info]: Will watch for topics infra-logs at brokers kafka-logs.domain:9092 and 'logs-consumer' group
2019-04-19 16:30:01 +0000 [info]: using configuration file: <ROOT>
  <source>
    @type kafka_group
    brokers "kafka-logs.domain:9092"
    ssl_ca_cert "/etc/ipa/ca.crt"
    ssl_client_cert "/etc/ipa/host.crt"
    ssl_client_cert_key "/etc/ipa/host.key"
    consumer_group "logs-consumer"
    topics "infra-logs"
    format "json"
    start_from_beginning false
  </source>
  <match **>
    @type stdout
  </match>
</ROOT>
2019-04-19 16:30:01 +0000 [info]: starting fluentd-1.4.2 pid=6 ruby="2.3.3"
2019-04-19 16:30:01 +0000 [info]: spawn command to main:  cmdline=["/usr/bin/ruby2.3", "-Eascii-8bit:ascii-8bit", "/usr/local/bin/fluentd", "-c", "/fluentd/etc/fluent.conf", "-p", "/fluentd/plugins", "--under-supervisor"]
2019-04-19 16:30:02 +0000 [info]: gem 'fluent-plugin-kafka' version '0.5.5'
2019-04-19 16:30:02 +0000 [info]: gem 'fluent-plugin-zookeeper' version '0.1.2'
2019-04-19 16:30:02 +0000 [info]: gem 'fluentd' version '1.4.2'
2019-04-19 16:30:02 +0000 [info]: adding match pattern="**" type="stdout"
2019-04-19 16:30:02 +0000 [info]: adding source type="kafka_group"
2019-04-19 16:30:02 +0000 [info]: #0 Will watch for topics infra-logs at brokers kafka-logs.domain:9092 and 'logs-consumer' group
2019-04-19 16:30:02 +0000 [info]: #0 starting fluentd worker pid=14 ppid=6 worker=0
2019-04-19 16:30:02 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed"
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/ssl_socket_with_timeout.rb:66:in `connect_nonblock'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/ssl_socket_with_timeout.rb:66:in `initialize'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:108:in `new'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:108:in `open'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:87:in `block in send_request'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/instrumenter.rb:21:in `instrument'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:86:in `send_request'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/broker.rb:30:in `fetch_metadata'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:198:in `block in fetch_cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:193:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:193:in `fetch_cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:181:in `cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:67:in `refresh_metadata!'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:48:in `add_target_topics'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/consumer_group.rb:24:in `subscribe'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/consumer.rb:86:in `subscribe'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:149:in `block in setup_consumer'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:148:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:148:in `setup_consumer'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:129:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/compat/call_super_mixin.rb:42:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:203:in `block in start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:192:in `block (2 levels) in lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:191:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:191:in `block in lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:178:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:178:in `lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:202:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/engine.rb:274:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/engine.rb:219:in `run'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:805:in `run_engine'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:549:in `block in run_worker'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:730:in `main_process'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:544:in `run_worker'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/command/fluentd.rb:316:in `<top (required)>'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/lib/ruby/2.3.0/rubygems/core_ext/kernel_require.rb:55:in `require'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/lib/ruby/2.3.0/rubygems/core_ext/kernel_require.rb:55:in `require'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/bin/fluentd:8:in `<top (required)>'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/local/bin/fluentd:22:in `load'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/local/bin/fluentd:22:in `<main>'
2019-04-19 16:30:02 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed"
  2019-04-19 16:30:02 +0000 [error]: #0 suppressed same stacktrace
2019-04-19 16:30:03 +0000 [info]: Worker 0 finished unexpectedly with status 1

I've also tried adding the CA entry as an array (ssl_ca_cert ["/etc/ipa/ca.crt"]), and get the same error.
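To check whether the problem is the CA/leaf relationship itself rather than the plugin, the chain can be verified outside Fluentd with Ruby's own OpenSSL bindings. The sketch below is self-contained (it generates a toy CA and a leaf certificate so it runs anywhere); in a real check you would instead load `/etc/ipa/ca.crt` and `/etc/ipa/host.crt` with `OpenSSL::X509::Certificate.new(File.read(...))` and verify the broker's certificate. This is a diagnostic suggestion, not something the plugin does internally.

```ruby
require 'openssl'

# Build a certificate; when issuer_cert/issuer_key are nil it is self-signed.
def build_cert(subject, issuer_cert, issuer_key, key, ca: false)
  cert = OpenSSL::X509::Certificate.new
  cert.version = 2                       # X.509 v3
  cert.serial = Random.rand(1 << 32)
  cert.subject = OpenSSL::X509::Name.parse(subject)
  cert.issuer = issuer_cert ? issuer_cert.subject : cert.subject
  cert.public_key = key.public_key
  cert.not_before = Time.now
  cert.not_after = Time.now + 3600
  ef = OpenSSL::X509::ExtensionFactory.new
  ef.subject_certificate = cert
  ef.issuer_certificate = issuer_cert || cert
  cert.add_extension(ef.create_extension('basicConstraints',
                                         ca ? 'CA:TRUE' : 'CA:FALSE', true))
  cert.sign(issuer_key || key, OpenSSL::Digest::SHA256.new)
  cert
end

# Toy CA and a leaf cert signed by it (stand-ins for ca.crt / host.crt).
ca_key    = OpenSSL::PKey::RSA.new(2048)
ca_cert   = build_cert('/CN=Test CA', nil, nil, ca_key, ca: true)
host_key  = OpenSSL::PKey::RSA.new(2048)
host_cert = build_cert('/CN=kafka-logs.domain', ca_cert, ca_key, host_key)

# Same check OpenSSL performs during the TLS handshake: does the trusted
# CA actually issue the presented certificate?
store = OpenSSL::X509::Store.new
store.add_cert(ca_cert)
puts store.verify(host_cert)   # expect true when the CA issued the cert
puts store.error_string        # "ok", or the reason verification failed
```

If `store.verify` returns `false` for the real files, `store.error_string` gives the same reason OpenSSL reports as "certificate verify failed" during the handshake (e.g. an incomplete chain or a CA file that didn't actually sign the broker's certificate).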

The same certificates and keys work fine with the fluent-bit Kafka output:

Copyright (C) Treasure Data
[2019/04/19 16:34:37] [ info] [storage] initializing...
[2019/04/19 16:34:37] [ info] [storage] in-memory
[2019/04/19 16:34:37] [ info] [storage] normal synchronization mode, checksum disabled
[2019/04/19 16:34:37] [ info] [engine] started (pid=1)
[2019/04/19 16:34:37] [ info] [in_systemd] seek_cursor=s=dea1eae7fbde4a93b99fc438eb30e9a1;i=edb... OK
[2019/04/19 16:34:37] [ info] [out_kafka] brokers='kafka-logs.domain:9092' topics='infra-logs'
[2019/04/19 16:34:37] [ info] [http_server] listen iface=127.0.0.1 tcp_port=2020
