
fixes #567: (Impact on BC/DR - Data loss) - High priority: Nodes with a PointValue property cannot be successfully sunk to Neo4j with the Neo4j Streams Plugin #584

Merged (4 commits, Oct 8, 2023)

Conversation

conker84 (Contributor)

Fixes #567 in branch 5.0 (Kafka Connect)


Proposed Changes (Mandatory)

A brief list of the proposed changes to fix the issue:

  • fixed the bug
  • added tests
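For context on the bug being fixed, here is a minimal, hedged sketch of the kind of payload involved. The field names and message shape below are illustrative assumptions, not the connector's actual wire format: a node event whose property carries a Neo4j WGS-84 spatial point, and a conversion of that property into a Cypher `point()` literal so the sink side can write a real spatial value rather than a plain map.

```python
import json

# Illustrative sink message: a node with a point-typed property.
# Field names here are assumptions, not the connector's exact schema.
event = {
    "labels": ["Place"],
    "properties": {
        "name": "HQ",
        "location": {"crs": "wgs-84", "latitude": 51.5, "longitude": -0.1},
    },
}

def to_cypher_value(value):
    """Render a property value as a Cypher literal, mapping point-like
    dicts to point({...}) instead of a plain map (the naive path would
    not produce a spatial value on the Neo4j side)."""
    if isinstance(value, dict) and "crs" in value:
        coords = {k: v for k, v in value.items() if k != "crs" and v is not None}
        inner = ", ".join(f"{k}: {v}" for k, v in coords.items())
        return f"point({{{inner}, crs: '{value['crs']}'}})"
    return json.dumps(value)

props = ", ".join(
    f"{k}: {to_cypher_value(v)}" for k, v in event["properties"].items()
)
statement = f"CREATE (n:{':'.join(event['labels'])} {{{props}}})"
print(statement)
```

The point here is only the special-casing of point-like values; the real connector does this mapping internally when building its sink statements.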

ali-ince (Contributor) left a comment:


Same question as in #583, what do we do for AVRO messages?

conker84 (Contributor, Author) commented Aug 30, 2023:

Same question as in #583, what do we do for AVRO messages?

As the Streams Transaction Event Handler does not support the AVRO format, we never thought about this. If we have to support it, it would be a new implementation that we need to discuss internally.

ali-ince (Contributor):

It looks like the Kafka Connect platform supports AVRO out of the box, and we even have it documented: https://neo4j.com/docs/kafka/kafka-connect/source/#_create_the_source_instance. Never mind: since that schema key is part of the message value, I think it will work with any serialization type supported by Kafka Connect.
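To illustrate the point about the schema travelling inside the message value, here is a hedged sketch in the style of Kafka Connect's JsonConverter envelope (with `schemas.enable=true`). The field shapes are simplified assumptions, not the connector's exact contract; the idea is only that a sink can recover the schema from the value itself, independently of the converter used on the wire.

```python
import json

# A JsonConverter-style envelope: schema and payload travel together
# inside the message value. Shapes below are simplified assumptions.
value = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "name", "type": "string"},
            {"field": "location", "type": "struct", "fields": [
                {"field": "latitude", "type": "double"},
                {"field": "longitude", "type": "double"},
            ]},
        ],
    },
    "payload": {
        "name": "HQ",
        "location": {"latitude": 51.5, "longitude": -0.1},
    },
}

# Round-trip through serialization, as a message would on the wire.
decoded = json.loads(json.dumps(value))

# The sink can read the schema straight from the decoded value:
field_names = [f["field"] for f in decoded["schema"]["fields"]]
print(field_names)  # ['name', 'location']
```

Because the schema is embedded in the value rather than negotiated out of band, the same logical content survives any serialization format the platform supports.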

conker84 (Contributor, Author):

It looks like the Kafka Connect platform supports AVRO out of the box, and we even have it documented: https://neo4j.com/docs/kafka/kafka-connect/source/#_create_the_source_instance. Never mind: since that schema key is part of the message value, I think it will work with any serialization type supported by Kafka Connect.

Yes, indeed. What I meant is that we never tested it in the past, because it wasn't supposed to work with AVRO messages, so any resulting bug is not covered by support. Back then we decided not to support AVRO for CDC messages, mostly because AVRO requires a schema to be defined up front, which does not fit well with Neo4j, a database that is mainly used schema-less; this mismatch can lead to problems with the before/after information.

conker84 merged commit 59bac57 into neo4j-contrib:5.0 on Oct 8, 2023
6 checks passed