NIFI-6836 Increased topic list length in ConsumeKafka and added validator #3885
bartektartanus wants to merge 1 commit into apache:master from bartektartanus:master
Conversation
Hello, can we start a discussion about this PR? :D

@bartektartanus It isn't clear what the impact would be in terms of performance and resource usage with a larger number. That needs testing and description. We also need this to be on all the relevant Kafka processors, like the 2.0 ones as well. Thanks

So maybe we can leave the limit as is (for now) and only add a validator? Because now this processor silently ignores some topics if you specify more than 100.
if (context.isExpressionLanguageSupported(subject) && context.isExpressionLanguagePresent(input)) {
    return new ValidationResult.Builder().subject(subject).input(input).explanation("Expression Language Present").valid(true).build();
}
if (input == null || input.trim().isEmpty()) {
StringUtils.isEmpty is better here.
static final AllowableValue TOPIC_NAME = new AllowableValue("names", "names", "Topic is a full topic name or comma separated list of names");
static final AllowableValue TOPIC_PATTERN = new AllowableValue("pattern", "pattern", "Topic is a regex using the Java Pattern syntax");

static final int TOPIC_LIST_MAX_LENGTH = 1000;
I agree with @joewitt that we cannot go up 10x without a serious performance evaluation.
if (input == null || input.trim().isEmpty()) {
    return new ValidationResult.Builder().subject(subject).input(input).explanation("List must have at least one non-empty element").valid(false).build();
}
final int listLength = input.split(",").length;
A better expression is input.chars().filter(c -> c == ',').count()
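To make the proposal concrete, here is a minimal, hypothetical sketch of the validation logic discussed above, with both review suggestions folded in: a blank check equivalent to StringUtils.isBlank in place of input == null || input.trim().isEmpty(), and the comma-counting expression instead of split(). The class and method names are illustrative, not the actual NiFi code; only TOPIC_LIST_MAX_LENGTH mirrors the diff. Note that a list of N elements contains N - 1 commas, so the count must be converted before comparing against the limit.

```java
public class TopicListValidatorSketch {

    // Limit proposed in this PR's diff; the pre-PR limit was 100.
    static final int TOPIC_LIST_MAX_LENGTH = 1000;

    /**
     * Hypothetical validation helper: returns null when the topic list is
     * valid, otherwise an explanation string (the real NiFi validator would
     * wrap this in a ValidationResult instead).
     */
    static String validateTopicList(final String input) {
        // Equivalent to Commons Lang StringUtils.isBlank(input).
        if (input == null || input.trim().isEmpty()) {
            return "List must have at least one non-empty element";
        }
        // Reviewer-suggested comma count; avoids allocating a String[] via split().
        final long commaCount = input.chars().filter(c -> c == ',').count();
        final long listLength = commaCount + 1;
        if (listLength > TOPIC_LIST_MAX_LENGTH) {
            return "List must not have more than " + TOPIC_LIST_MAX_LENGTH + " elements";
        }
        return null;
    }
}
```

This sketch would warn rather than silently drop topics: an oversized list fails validation with an explicit message instead of being truncated at runtime.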
We're marking this PR as stale due to lack of updates in the past few months. If after another couple of weeks the stale label has not been removed, this PR will be closed. This stale marker and eventual auto-close does not indicate a judgement of the PR, just lack of reviewer bandwidth, and helps us keep the PR queue more manageable. If you would like this PR re-opened, you can do so and a committer can remove the stale tag. Or you can open a new PR. Try to help review other PRs to increase PR review bandwidth, which in turn helps yours.
Description of PR
NIFI-6836 - this PR increases the topic length limit from 100 to 1000 and also adds a validator to warn the user when the limit is exceeded.
I submitted this issue a few days ago and prepared an initial version of the PR - fixed only for
KafkaConsumer1_0. This bug is in every Kafka Consumer, from 0_10 to 2_0. I can also make changes in all files, but maybe some refactoring should be done first? Some AbstractKafkaConsumer with all common properties? Or maybe the current approach is intended, as you do not want to mix code between different versions of Kafka Consumer?

For all changes:
Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
Has your PR been rebased against the latest commit within the target branch (typically
master)? Is your initial contribution a single, squashed commit? Additional commits in response to PR reviewer feedback should be made on this branch and pushed to allow change tracking. Do not
squash or use --force when pushing to allow for clean monitoring of changes.

For code changes:
Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder?