JDBC sink fails to Postgres database #609
The JDBC Sink requires a schema for your data. I'm not sure why this is triggering the error you're seeing, but you definitely need to include a schema, either as part of your JSON or using Avro. See https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained
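For reference, when using `org.apache.kafka.connect.json.JsonConverter` with `value.converter.schemas.enable=true`, each message must carry a schema/payload envelope like the one below. This is only a minimal sketch; the `id` and `name` fields are illustrative, not from the reporter's data:

```json
{
  "schema": {
    "type": "struct",
    "optional": false,
    "fields": [
      { "field": "id",   "type": "int32",  "optional": false },
      { "field": "name", "type": "string", "optional": true }
    ]
  },
  "payload": { "id": 1, "name": "alice" }
}
```

Without the `schema` block (or an Avro/Schema Registry setup), the sink has no column definitions to map onto the target table.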
We'll keep this open so that we catch this earlier and give a much better exception message.
Facing the same issue with MongoDB as source and MySQL as sink. Requesting your help.
And if I have a schemaless origin topic, is it possible to create a separate Avro schema for the topic data and use it to have a well-defined schema?
Thanks @Cricket007
Some notes on this: https://rmoff.net/2020/01/22/kafka-connect-classcastexception/
@Cricket007 Thanks for the suggestion on writing a custom SMT. @fabiotc here is an SMT I wrote for appending a schema to a record: https://github.com/yousufdev/kafka-connect-append-schema, hope it helps.
Thanks @rmoff
@fabiotc yes, you can append a schema to a complex structure; just pass it in the 'schema' property of the SMT.
I have this issue, need some help here please: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.kafka.connect.data.Struct. This is my connector source (I'm pulling the string from a txt file). This is my JDBC connector configuration (I'm trying to sink to Postgres).
The JDBC sink requires structured data, not strings. The FileStream source only writes strings unless you use a HoistField transform, for example.
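For context, the stock `HoistField` transform that ships with Apache Kafka wraps the raw value in a named field. A minimal sketch of the source-connector properties, assuming the target field is called `line` (any name works):

```properties
transforms=HoistField
transforms.HoistField.type=org.apache.kafka.connect.transforms.HoistField$Value
transforms.HoistField.field=line
```

Note that hoisting alone produces a map/struct shape; the sink still needs a schema-aware converter (embedded JSON schema, or Avro with Schema Registry) to cast it to a Connect Struct.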
Hi, I have added the HoistField transform to the Connect source like this: transforms=HoistField. Now I'm able to see the string as JSON in the topic, but the JDBC sink connector now fails with: java.lang.ClassCastException: java.util.HashMap cannot be cast to org.apache.kafka.connect.data.Struct
That error has already been answered in this thread. How about using the JDBC source connector (or Debezium) from one database to another? The problem isn't the connector itself, it's the data you're sending through Connect, and the FileStream source just isn't a good example to use.
Thank you so much, I changed the way I was doing it. I'm now sending the message directly to the topic with kafka-console-producer, including the schema and payload, and the sink is able to write those fields to the DB. Thank you for your help; I'm building a proof of concept to present at work, thanks.
Kafka JDBC Sink Connector {
Keeping the schema as part of each message isn't recommended. Rather, you can use the JSON Schema, Avro, or Protobuf console producers plus their corresponding converters: https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained
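For example, with the Confluent Avro converter the schema lives in Schema Registry rather than being repeated in every message. A sketch of the sink-side converter settings; the registry URL is a placeholder for your own deployment:

```properties
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```

This keeps messages compact while still giving the JDBC sink the Struct schema it needs to build the target table.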
@OneCricketeer, @fabiotc can you please star my SMT repo again? My account got compromised and deleted. Thanks.
I encounter the following error when I sink a topic to my Postgres database:
My sink connector config:
ENVIRONMENT variables of my kafka-connect container:
I have a separate producer which can produce JSON data to Kafka without a schema defined.