[FLINK-10874][kafka-docs] Document likely cause of UnknownTopicOrPartitionException #7097
Conversation
> One possible cause of this error is when a new leader election is taking place,
> for example after or during restarting a Kafka broker.
> This is a retriable exception, so the Flink job should be able to restart and resume normal operation.
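For context on the "restart and resume" part: the job only recovers automatically if a restart strategy is configured. A minimal, hypothetical sketch (the attempt count, delay, and placeholder source are illustrative, not values from this PR):

```java
import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.common.time.Time;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RestartOnTransientKafkaErrors {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // A retriable broker error (e.g. UnknownTopicOrPartitionException during a leader
        // election) fails the job; with a restart strategy configured, Flink restarts the
        // job and resumes processing. Attempt count and delay below are illustrative only.
        env.setRestartStrategy(RestartStrategies.fixedDelayRestart(
                3,                                // restart attempts
                Time.of(10, TimeUnit.SECONDS)));  // delay between attempts

        // Placeholder source; a real job would read from a FlinkKafkaConsumer here.
        env.fromElements("a", "b", "c").print();

        env.execute("kafka-job-with-restart-strategy");
    }
}
```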
As a comment that is independent of the PR:
Have we thought about catching this exception and reassigning the newly elected partitions to the client?
I'm curious if this is possible and a proper solution on Flink's side.
Dunno, it would require some independent work to investigate. I don't know how severe or frequent that issue is, so it's hard to weigh its priority. Probably not very frequent.
Alright, as I mentioned, this shouldn't affect the PR. Let's keep it as is.
docs/dev/connectors/kafka.md (outdated diff)
> ### Data loss
>
> Depending on your Kafka configuration, even after Kafka acknowledges
> writes you can still experience data loss. In particular keep in mind about following properties
about "the" following properties
docs/dev/connectors/kafka.md (outdated diff)
> ## Troubleshooting
>
> <div class="alert alert-warning">
> If you have a problem with Kafka when using Flink, keep in mind that Flink only wraps <tt>KafkaConsumer</tt> or <tt>KafkaProducer</tt>
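To illustrate the "only wraps" point: properties placed in the `Properties` object are handed to the underlying Kafka client, so Kafka's own documentation for those keys applies. A minimal, hypothetical sketch using the 0.11 consumer (broker address, group id, and topic are placeholders):

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class WrappedConsumerExample {
    public static FlinkKafkaConsumer011<String> buildConsumer() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker-1:9092");  // placeholder address
        props.setProperty("group.id", "my-flink-job");            // placeholder group id

        // Any other KafkaConsumer setting is forwarded to the wrapped client;
        // its meaning is defined by Kafka, not by Flink.
        props.setProperty("fetch.max.bytes", "52428800");

        return new FlinkKafkaConsumer011<>("my-topic", new SimpleStringSchema(), props);
    }
}
```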
Can we include links to the Javadocs for KafkaConsumer and KafkaProducer?
Moreover, the connectors for 0.8 and 0.9+ use different Kafka Java APIs. Might want to point that out.
That would be an external link to the Kafka docs (for what version?).
What do you mean by different Kafka Java APIs? (I wasn't aware of that, so I don't know what I should point out.)
Kafka 0.8 uses a lower-level client called SimpleConsumer: https://www.javadoc.io/doc/org.apache.kafka/kafka_2.10/0.8.0
Other versions use a higher-level client called the KafkaConsumer:
https://kafka.apache.org/10/javadoc/?org/apache/kafka/clients/consumer/KafkaConsumer.html
Ok, done. Thanks for the review @tzulitai.
This is a change in documentation only.