
NIFI-4046: If we are unable to parse out any records from a Kafka Message with ConsumeKafkaRecord, then we should route all of the bytes received to 'parse.failure' #1906

Closed · wants to merge 2 commits

Conversation

@markap14 (Contributor) commented Jun 9, 2017

If we are unable to parse out any records from a Kafka Message with ConsumeKafkaRecord, then we should route all of the bytes received to 'parse.failure'.
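In practical terms, the behavior described above amounts to something like the following sketch (hypothetical class and method names; the actual change lives in ConsumerLease and differs in detail):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;

// Hypothetical helper illustrating the intent of this PR: when no records can
// be parsed out of a Kafka message, the raw bytes are written to a FlowFile
// and routed to 'parse.failure' instead of failing the whole consumer lease.
class ParseFailureRouter {

    static final Relationship REL_PARSE_FAILURE = new Relationship.Builder()
            .name("parse.failure")
            .description("Kafka messages whose contents could not be parsed")
            .build();

    void routeToParseFailure(final ProcessSession session,
                             final ConsumerRecord<byte[], byte[]> message) {
        FlowFile flowFile = session.create();
        // Preserve the unparsed Kafka message value verbatim so no data is lost.
        flowFile = session.write(flowFile, out -> out.write(message.value()));
        session.transfer(flowFile, REL_PARSE_FAILURE);
    }
}
```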

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you to ensure the following steps have been taken:

For all changes:

  • Is there a JIRA ticket associated with this PR? Is it referenced
    in the commit message?

  • Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.

  • Has your PR been rebased against the latest commit within the target branch (typically master)?

  • Is your initial contribution a single, squashed commit?

For code changes:

  • Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder?
  • Have you written or updated unit tests to verify your changes?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly?
  • If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly?
  • If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties?

For documentation related changes:

  • Have you ensured that format looks appropriate for the output in which it is rendered?

Note:

Please ensure that once the PR is submitted, you check Travis CI for build issues and submit an update to your PR as soon as possible.

@ijokarumawak (Member) commented

Reviewing...

@ijokarumawak (Member) commented

I tested with the following schema:

{
  "type": "record",
  "name": "test",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "name", "type": [ "null", "string" ] }
  ]
}

And this message:

{"name": "does not have ID"}

Then I got the following exception, and the message was not routed to 'parse.failure':

2017-06-12 14:46:26,967 ERROR [Timer-Driven Process Thread-10] o.a.n.p.k.pubsub.ConsumeKafkaRecord_0_10 ConsumeKafkaRecord_0_10[id=9ad0582e-015c-1000-3b99-4fb95ea47d32] Exception while processing data from kafka so will close the lease org.apache.nifi.processors.kafka.pubsub.ConsumerPool$SimpleConsumerLease@2b2cc16c due to org.apache.nifi.processor.exception.ProcessException: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.NullPointerException: null of long in field id of test
        at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:528)
        at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.lambda$processRecords$2(ConsumerLease.java:320)
        at java.util.HashMap$KeySpliterator.forEachRemaining(HashMap.java:1548)
        at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
        at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.processRecords(ConsumerLease.java:307)
        at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.poll(ConsumerLease.java:168)
        at org.apache.nifi.processors.kafka.pubsub.ConsumeKafkaRecord_0_10.onTrigger(ConsumeKafkaRecord_0_10.java:327)
        at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
        at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
        at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.NullPointerException: null of long in field id of test
        at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:308)
        at org.apache.nifi.avro.WriteAvroResultWithSchema.writeRecord(WriteAvroResultWithSchema.java:59)
        at org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
        at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:506)
        ... 18 common frames omitted
Caused by: java.lang.NullPointerException: null of long in field id of test
        at org.apache.avro.generic.GenericDatumWriter.npe(GenericDatumWriter.java:132)
        at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:126)
        at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
        at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:60)
        at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:302)
        ... 21 common frames omitted
Caused by: java.lang.NullPointerException: null
        at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118)
        at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
        at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:153)
        at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:143)
        at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:105)
        ... 24 common frames omitted
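For what it's worth, the NPE reproduces with plain Avro outside NiFi. A minimal sketch, assuming only the avro artifact and its dependencies on the classpath:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

public class AvroNpeRepro {
    public static void main(String[] args) throws IOException {
        final Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"test\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"name\",\"type\":[\"null\",\"string\"]}]}");

        // 'id' is never populated, but the schema declares it as a non-nullable long.
        final GenericRecord record = new GenericData.Record(schema);
        record.put("name", "does not have ID");

        try (DataFileWriter<GenericRecord> writer =
                new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
            writer.create(schema, new ByteArrayOutputStream());
            // Throws DataFileWriter$AppendWriteException caused by
            // NullPointerException: "null of long in field id of test".
            writer.append(record);
        }
    }
}
```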

We probably need to wrap writer.write(record) as well:
https://github.com/apache/nifi/pull/1906/files#diff-5a7aa2af019388ff0cd5be33b8fbd660R506
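Something along these lines, reusing the hypothetical ParseFailureRouter sketched in the PR description above (names are illustrative, not the actual fix):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.serialization.RecordSetWriter;
import org.apache.nifi.serialization.record.Record;

class GuardedWrite {
    // Hypothetical wrapper around the write call in ConsumerLease.writeRecordData.
    void writeOrRouteToParseFailure(final RecordSetWriter writer, final Record record,
                                    final ProcessSession session,
                                    final ConsumerRecord<byte[], byte[]> message) {
        try {
            writer.write(record);
        } catch (final Exception e) {
            // A failed write is treated like a parse failure: the raw Kafka bytes
            // are preserved and routed instead of closing the whole consumer lease.
            new ParseFailureRouter().routeToParseFailure(session, message);
        }
    }
}
```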

Here is the template file that I used:
https://gist.github.com/ijokarumawak/b5943f83e291f4c08adec9cf4add2e26

@markap14 Can you take a look?

@markap14 (Contributor, Author) commented

@ijokarumawak that's a great catch! I pushed a new commit to address it. Thanks!

@ijokarumawak (Member) commented

@markap14 Thanks for the updates. All LGTM, +1! Merging to master.

@asfgit closed this in cdc154f on Jun 30, 2017
mattyb149 pushed a commit to mattyb149/nifi that referenced this pull request Nov 30, 2017
NIFI-4046: If we are unable to parse out any records from a Kafka Message with ConsumeKafkaRecord, then we should route all of the bytes received to 'parse.failure'

NIFI-4046: Addressed issue of Record Writer failing with ConsumeKafkaRecord

This closes apache#1906.

Signed-off-by: Koji Kawamura <ijokarumawak@apache.org>