Conversation

@pnowojski
Contributor

This is a change in documentation only.


One possible cause of this error is an ongoing leader election, for example after or while restarting a Kafka broker.
This is a retriable exception, so the Flink job should be able to restart and resume normal operation.
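
As a possible illustration of "restart and resume normal operation" (this is my sketch, not part of the PR; the restart attempt count, delay and placeholder pipeline are assumptions), a job can be given an explicit restart strategy so that a retriable Kafka failure only triggers a restart from the last checkpoint:

```java
import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.common.time.Time;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RestartOnRetriableFailure {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoints are what allow the Kafka connector to resume from the last
        // committed offsets after a restart.
        env.enableCheckpointing(60_000);

        // If the Kafka client surfaces a retriable exception (e.g. during a leader
        // election), the job fails and is restarted according to this strategy.
        env.setRestartStrategy(RestartStrategies.fixedDelayRestart(
                3,                               // restart attempts
                Time.of(10, TimeUnit.SECONDS))); // delay between attempts

        // Placeholder pipeline; in a real job this would be the Kafka source/sink.
        env.fromElements(1, 2, 3).print();

        env.execute("kafka-job-with-restart-strategy");
    }
}
```
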
Contributor

As a comment independent of this PR:
Have we thought about catching this exception and reassigning the newly elected partitions to the client?
I'm curious whether this is possible and a proper solution on Flink's side.

Contributor Author

Dunno, it would require some independent work to investigate. I don't know how severe or frequent the issue is, which makes it hard to weigh its priority. Probably not very frequent.

Contributor

Alright, as I mentioned, this shouldn't affect the PR. Let's keep it as is.

### Data loss

Depending on your Kafka configuration, even after Kafka acknowledges
writes you can still experience data loss. In particular keep in mind about following properties
Contributor

about "the" following properties

## Troubleshooting

<div class="alert alert-warning">
If you have a problem with Kafka when using Flink, keep in mind that Flink only wraps <tt>KafkaConsumer</tt> or <tt>KafkaProducer</tt>
Contributor

Can we include links to the Javadocs for KafkaConsumer and KafkaProducer?

Contributor

Moreover, the connectors for 0.8 and 0.9+ use different Kafka Java APIs. Might want to point that out.

Contributor Author

That would be an external link to the Kafka docs (for which version?).

What do you mean by different Kafka Java APIs? (I wasn't aware of that, so I don't know what I should point out.)

Contributor

Kafka 0.8 uses a lower-level client called SimpleConsumer: https://www.javadoc.io/doc/org.apache.kafka/kafka_2.10/0.8.0

Other versions use a higher-level client, the KafkaConsumer:
https://kafka.apache.org/10/javadoc/?org/apache/kafka/clients/consumer/KafkaConsumer.html
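
To make the contrast concrete, a minimal sketch of the higher-level client that the 0.9+ connectors build on (broker address, group id and topic name are placeholders, not values from this PR). The 0.8 SimpleConsumer instead leaves leader lookup, fetch requests and leader re-election handling to the caller:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class HighLevelConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "example-group");
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // The high-level consumer handles broker discovery, partition assignment
            // and leader changes internally; with the 0.8 SimpleConsumer all of that
            // is the caller's responsibility, which is why the 0.8 and 0.9+ Flink
            // connectors are built on different Kafka client APIs.
            consumer.subscribe(Collections.singletonList("example-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```
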

Contributor Author

Ok, done. Thanks for the review @tzulitai.
