Does ruby-kafka support Kerberos? #670

Closed
openchung opened this issue Oct 17, 2018 · 15 comments


@openchung

openchung commented Oct 17, 2018

I use fluentd to send JSON logs to a SASL_SSL-secured Cloudera Kafka cluster, but I get the warning below and the send fails. I have already verified my keytab and principal using kinit.
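Since the keytab was only verified interactively, a sketch of an out-of-band check worth repeating (the principal and keytab path here mirror the config below; treat them as placeholders for your own values):

```shell
# Hypothetical values, substitute the principal and keytab from your td-agent.conf.
PRINCIPAL="kafka@KAFKA.COM"
KEYTAB="/opt/kafka.keytab"

# The principal handed to the client must match a keytab entry exactly,
# including any host component of a service principal:
#   klist -kt "$KEYTAB"
# Obtain a ticket non-interactively from the keytab, the way a client
# library would, rather than typing a password into kinit:
#   kinit -kt "$KEYTAB" "$PRINCIPAL"
```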

2018-10-17 07:45:10 +0800 [warn]: suppressed same stacktrace
2018-10-17 07:45:10 +0800 fluent.warn: {"message":"Send exception occurred: gss_init_sec_context did not return GSS_S_COMPLETE"}
2018-10-17 07:45:10 +0800 fluent.warn: Exception Backtrace :
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/gssapi-1.2.0/lib/gssapi/simple.rb:95:in `init_context'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:72:in `initialize_gssapi_context'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:25:in `authenticate!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl_authenticator.rb:51:in `authenticate!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/connection_builder.rb:27:in `build_connection'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:184:in `connection'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:170:in `send_request'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:44:in `fetch_metadata'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:386:in `block in fetch_cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `each'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `fetch_cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-17 07:45:10 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-17 07:45:10 +0800 fluent.warn: {"next_retry":"2018-10-17 07:47:30 +0800","error_class":"GSSAPI::GssApiError","error":"gss_init_sec_context did not return GSS_S_COMPLETE","plugin_id":"object:3f84a1db517c","message":"temporarily failed to flush the buffer. next_retry=2018-10-17 07:47:30 +0800 error_class=\"GSSAPI::GssApiError\" error=\"gss_init_sec_context did not return GSS_S_COMPLETE\" plugin_id=\"object:3f84a1db517c\""}

Following is my td-agent.conf

<system>
  log_level debug
</system>
<source>
  @type tail
  format json
  read_lines_limit 200
  path /mnt/old/tx-failover.log*
  pos_file /opt/kafka_failover.log.pos
  tag audit.trail
</source>
<match audit.*>
  @type kafka_buffered
  brokers 192.168.5.129:9093
  client_id kafka
  buffer_type memory
  default_topic log4j
  output_data_type json
  ssl_ca_cert /opt/server.cer.pem
  #sasl__mechanism gssapi
  principal kafka@KAFKA.COM
  keytab /opt/kafka.keytab
</match>
<match **>
  @type stdout
  output_type json
</match>

Following are my installed gems:

actionmailer (4.2.8)
actionpack (4.2.8)
actionview (4.2.8)
activejob (4.2.8)
activemodel (4.2.8)
activerecord (4.2.8)
activesupport (4.2.8)
addressable (2.5.2, 2.5.1)
arel (6.0.4)
aws-sdk (2.10.45)
aws-sdk-core (2.10.45)
aws-sdk-resources (2.10.45)
aws-sigv4 (1.0.2)
base91 (0.0.1)
bigdecimal (default: 1.2.4)
bson (4.1.1)
builder (3.2.3)
bundler (1.14.5)
celluloid (0.15.2)
cool.io (1.5.1)
diff-lcs (1.3)
draper (1.4.0)
elasticsearch (5.0.5)
elasticsearch-api (5.0.5)
elasticsearch-transport (5.0.5)
erubis (2.7.0)
excon (0.62.0)
faraday (0.13.1)
ffi (1.9.25)
fluent-logger (0.7.1)
fluent-mixin-plaintextformatter (0.2.6)
fluent-plugin-elasticsearch (1.17.2)
fluent-plugin-genhashvalue (0.04)
fluent-plugin-kafka (0.7.9, 0.6.1)
fluent-plugin-mongo (0.8.1)
fluent-plugin-rewrite-tag-filter (1.5.6)
fluent-plugin-s3 (0.8.5)
fluent-plugin-scribe (0.10.14)
fluent-plugin-td (0.10.29)
fluent-plugin-td-monitoring (0.2.3)
fluent-plugin-webhdfs (0.7.1)
fluentd (0.12.40)
fluentd-ui (0.4.4)
font-awesome-rails (4.7.0.1)
globalid (0.4.0)
gssapi (1.2.0)
haml (4.0.7)
haml-rails (0.5.3)
hike (1.2.3)
hirb (0.7.3)
http_parser.rb (0.6.0)
httpclient (2.8.2.4)
i18n (0.8.1)
io-console (default: 0.4.3)
ipaddress (0.8.3)
jbuilder (2.6.3)
jmespath (1.3.1)
jquery-rails (3.1.4)
json (default: 1.8.1)
kramdown (1.13.2)
kramdown-haml (0.0.3)
loofah (2.0.3)
ltsv (0.1.0)
mail (2.6.4)
mime-types (3.1)
mime-types-data (3.2016.0521)
mini_portile2 (2.3.0, 2.1.0)
minitest (5.10.1, default: 4.7.5)
mixlib-cli (1.7.0)
mixlib-config (2.2.4)
mixlib-log (1.7.1)
mixlib-shellout (2.2.7)
mongo (2.2.7)
msgpack (1.1.0)
multi_json (1.12.1)
multipart-post (2.0.0)
nokogiri (1.8.1)
ohai (6.20.0)
oj (2.18.1)
parallel (1.8.0)
psych (default: 2.0.5)
public_suffix (3.0.0, 2.0.5)
puma (3.8.2)
rack (1.6.5)
rack-test (0.6.3)
rails (4.2.8)
rails-deprecated_sanitizer (1.0.3)
rails-dom-testing (1.0.8)
rails-html-sanitizer (1.0.3)
railties (4.2.8)
rake (default: 10.1.0)
rdoc (default: 4.1.0)
request_store (1.3.2)
ruby-kafka (0.6.8)
ruby-progressbar (1.8.3)
rubyzip (1.2.1, 1.1.7)
sass (3.2.19)
sass-rails (4.0.5)
settingslogic (2.0.9)
sigdump (0.2.4)
sprockets (2.12.4)
sprockets-rails (2.3.3)
string-scrub (0.0.5)
sucker_punch (1.0.5)
systemu (2.5.2)
td (0.15.2)
td-client (0.8.85)
td-logger (0.3.27)
test-unit (default: 2.1.10.0)
thor (0.19.4)
thread_safe (0.3.6)
thrift (0.8.0)
tilt (1.4.1)
timers (1.1.0)
tzinfo (1.2.3)
tzinfo-data (1.2017.2)
uuidtools (2.1.5)
webhdfs (0.8.0)
yajl-ruby (1.3.0)
zip-zip (0.3)

Please help me.

@0x2c7
Contributor

0x2c7 commented Oct 19, 2018

Hi @openchung, I'm not sure I understand your problem. Could you please describe your use case, your settings, and the stack you are using? At first glance, shouldn't this issue go to the fluentd bug tracker?

@openchung
Author

openchung commented Oct 19, 2018

Sorry. We use fluent-plugin-kafka, which is built on top of ruby-kafka. We use fluentd to collect logs in JSON and forward them to Cloudera Kafka over SASL_SSL (GSSAPI). The exception occurs inside ruby-kafka.

The following is the full trace log.
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-mixin-plaintextformatter' version '0.2.6'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-elasticsearch' version '1.17.2'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-genhashvalue' version '0.04'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-kafka' version '0.7.9'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-kafka' version '0.6.1'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-mongo' version '0.8.1'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '1.5.6'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-s3' version '0.8.5'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-scribe' version '0.10.14'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-td' version '0.10.29'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-td-monitoring' version '0.2.3'
2018-10-21 11:56:48 +0800 [info]: gem 'fluent-plugin-webhdfs' version '0.7.1'
2018-10-21 11:56:48 +0800 [info]: gem 'fluentd' version '0.12.40'
2018-10-21 11:56:48 +0800 [info]: adding match pattern="audit.*" type="kafka_buffered"
2018-10-21 11:56:48 +0800 [trace]: registered output plugin 'kafka_buffered'
2018-10-21 11:56:48 +0800 [info]: brokers has been set directly: ["192.168.5.129"]
2018-10-21 11:56:48 +0800 [info]: adding match pattern="**" type="stdout"
2018-10-21 11:56:48 +0800 [info]: adding source type="tail"
2018-10-21 11:56:48 +0800 [info]: using configuration file: <ROOT>
  <system>
    log_level trace
  </system>
  <source>
    @type tail
    format json
    read_lines_limit 200
    path /mnt/old/tx-failover.log
    pos_file /opt/kafka_failover.log.pos
    tag audit.trail
  </source>
  <match audit.*>
    @type kafka_buffered
    brokers 192.168.5.129
    client_id kafka
    buffer_type memory
    default_topic log4j
    output_data_type json
    ssl_ca_cert /opt/ca_cert.pem
    get_kafka_client_log true
    principal kafka/cipkafka1t.testesunbank.com.tw@KAFKA.COM
    keytab /opt/kafka.keytab
  </match>
  <match **>
    @type stdout
    output_type json
  </match>
</ROOT>
2018-10-21 11:56:48 +0800 [info]: initialized kafka producer: kafka
2018-10-21 11:56:48 +0800 [info]: following tail of /mnt/old/tx-failover.log
2018-10-21 11:57:03 +0800 [info]: detected rotation of /mnt/old/tx-failover.log; waiting 5 seconds
2018-10-21 11:57:03 +0800 [info]: following tail of /mnt/old/tx-failover.log
2018-10-21 11:57:03 +0800 fluent.info: {"message":"detected rotation of /mnt/old/tx-failover.log; waiting 5 seconds"}
2018-10-21 11:57:03 +0800 fluent.info: {"message":"following tail of /mnt/old/tx-failover.log"}
2018-10-21 11:57:48 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key:  and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 11:57:48 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key:  and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 11:57:48 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key:  and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 11:57:48 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key:  and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 11:57:48 +0800 [debug]: 4 messages send.
2018-10-21 11:57:48 +0800 [info]: New topics added to target list: log4j
2018-10-21 11:57:48 +0800 [info]: Fetching cluster metadata from kafka://192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Opening connection to 192.168.5.129:9092 with client id kafka...
2018-10-21 11:57:48 +0800 [debug]: Sending sasl_handshake API request 1 to 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Waiting for response 1 from 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Received response 1 from 192.168.5.129:9092
2018-10-21 11:57:48 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key:  and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 11:57:48 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key:  and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 11:57:48 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key:  and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 11:57:48 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key:  and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"4 messages send."}
2018-10-21 11:57:48 +0800 fluent.info: {"message":"New topics added to target list: log4j"}
2018-10-21 11:57:48 +0800 fluent.info: {"message":"Fetching cluster metadata from kafka://192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Opening connection to 192.168.5.129:9092 with client id kafka..."}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Sending sasl_handshake API request 1 to 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Waiting for response 1 from 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Received response 1 from 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 [debug]: GSSAPI: Initializing context with 192.168.5.129:9092, principal kafka/cipkafka1t.testesunbank.com.tw@KAFKA.COM
2018-10-21 11:57:48 +0800 [debug]: Sending topic_metadata API request 2 to 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Waiting for response 2 from 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Closing socket to 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [debug]: Closing socket to 192.168.5.129:9092
2018-10-21 11:57:48 +0800 [error]: Failed to fetch metadata from kafka://192.168.5.129:9092: Connection error EOFError: end of file reached
2018-10-21 11:57:48 +0800 [warn]: Send exception occurred: Could not connect to any of the seed brokers:
- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached
2018-10-21 11:57:48 +0800 [warn]: Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:407:in `fetch_cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 11:57:48 +0800 [info]: initialized kafka producer: kafka
2018-10-21 11:57:48 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2018-10-21 11:57:49 +0800 error_class="Kafka::ConnectionError" error="Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached" plugin_id="object:3ffadb6bb820"
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:407:in `fetch_cluster_info'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
  2018-10-21 11:57:48 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"GSSAPI: Initializing context with 192.168.5.129:9092, principal kafka/cipkafka1t.testesunbank.com.tw@KAFKA.COM"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Sending topic_metadata API request 2 to 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Waiting for response 2 from 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 11:57:48 +0800 fluent.error: {"message":"Failed to fetch metadata from kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 11:57:48 +0800 fluent.warn: {"message":"Send exception occurred: Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 11:57:48 +0800 fluent.warn: {"message":"Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:407:in `fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'"}
2018-10-21 11:57:48 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-21 11:57:48 +0800 fluent.warn: {"next_retry":"2018-10-21 11:57:49 +0800","error_class":"Kafka::ConnectionError","error":"Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached","plugin_id":"object:3ffadb6bb820","message":"temporarily failed to flush the buffer. next_retry=2018-10-21 11:57:49 +0800 error_class=\"Kafka::ConnectionError\" error=\"Could not connect to any of the seed brokers:\\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached\" plugin_id=\"object:3ffadb6bb820\""}

@0x2c7
Contributor

0x2c7 commented Oct 30, 2018

Hi @openchung, I'm not familiar with Kerberos, and I have never used Cloudera or fluentd with Kafka before, so it will take me a while to experiment and try to reproduce the situation. It would be a great help if you could create a repository, with sensitive information removed, that reproduces the exact errors.

@Thor77

Thor77 commented Feb 25, 2019

I'm running into the same issue trying to connect to a (Confluent Kafka) broker via SASL_SSL with GSSAPI. Did you find a solution, @openchung?

@Thor77

Thor77 commented Feb 26, 2019

Simple example to reproduce the issue:

require 'kafka'

kafka = Kafka.new(['kafka.host:9093'], ssl_ca_certs_from_system: true, sasl_gssapi_principal: 'principal@EXAMPLE.COM', sasl_gssapi_keytab: '/etc/keytabs/principal.keytab')
kafka.deliver_message('test', topic: 'test')

This raises:
/usr/lib/ruby/2.3.0/openssl/buffering.rb:178:in `sysread_nonblock': end of file reached (EOFError)
	from /usr/lib/ruby/2.3.0/openssl/buffering.rb:178:in `read_nonblock'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/ssl_socket_with_timeout.rb:102:in `read'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/protocol/decoder.rb:165:in `read'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/protocol/decoder.rb:59:in `int32'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/protocol/decoder.rb:136:in `bytes'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/sasl/gssapi.rb:56:in `send_and_receive_sasl_token'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/sasl/gssapi.rb:31:in `authenticate!'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/sasl_authenticator.rb:51:in `authenticate!'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/connection_builder.rb:27:in `build_connection'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/broker.rb:202:in `connection'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/broker.rb:188:in `send_request'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/broker.rb:44:in `fetch_metadata'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:375:in `block in fetch_cluster_info'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:370:in `each'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:370:in `fetch_cluster_info'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:350:in `cluster_info'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:98:in `refresh_metadata!'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:52:in `add_target_topics'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/cluster.rb:143:in `partitions_for'
	from /var/lib/gems/2.3.0/gems/ruby-kafka-0.7.5/lib/kafka/client.rb:146:in `deliver_message'
	from kafkatest.rb:4:in `<main>'

Looking at my Kerberos KDC logs, the client never even gets as far as authenticating; the connection just aborts immediately.
If you need any more information, let me know.
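One way to get more detail out of a repro like the one above: ruby-kafka accepts a `logger:` option, and a DEBUG-level logger prints each protocol round trip, which shows how far the SASL handshake gets before the EOFError. This is only a sketch; the broker host, principal, and keytab path are placeholders, not values from this thread's clusters.

```ruby
require "logger"

# A DEBUG logger makes ruby-kafka print every request/response it sends,
# so the last line before the EOFError pinpoints the failing handshake step.
debug_logger = Logger.new($stdout)
debug_logger.level = Logger::DEBUG

# Hypothetical connection options, substitute your own values.
options = {
  ssl_ca_certs_from_system: true,
  sasl_gssapi_principal: "principal@EXAMPLE.COM",
  sasl_gssapi_keytab: "/etc/keytabs/principal.keytab",
  logger: debug_logger,
}

# With the kafka gem installed, the repro becomes:
#   require "kafka"
#   kafka = Kafka.new(["kafka.host:9093"], **options)
#   kafka.deliver_message("test", topic: "test")
```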

@tracers2222

Anyone find a solution to this? I am having the same problem connecting to kafka using GSSAPI.

@github-actions

Issue has been marked as stale due to a lack of activity.

@Thor77

Thor77 commented Sep 30, 2019

The issue still persists; please reopen.

@dborysenko

+1 for re-opening.

@mihir2402

+1 for re-opening

@elafontaine

+1 for re-opening

@frencopei

+1 for re-opening

@rymonroe

rymonroe commented Jun 3, 2020

I'm still having this issue.

@roligupt

Running into the same issue.

@xidiandb

Same error here.
