
Create a Solr Source Kamelet #598

Closed
oscerd opened this issue Dec 2, 2021 · 3 comments · Fixed by #604
oscerd commented Dec 2, 2021

It will always use the producer side, but it will just repeat the same query. We could eventually do something like this as a consumer directly in the component.
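To make the distinction concrete: a timer-driven "source" built on the producer side can only re-run a fixed query and re-emit the same documents on every tick, while a consumer implemented in the component could track a cursor and fetch only new documents. A minimal sketch of the cursor idea (plain Python pseudocode, not the camel-solr API; the `timestamp_dt` field and helper names are illustrative):

```python
# Illustrative sketch only: `query_fn` stands in for a Solr query call.

def poll_new_docs(query_fn, last_seen):
    """Fetch only documents newer than the cursor, then advance it.

    A naive timer-driven source would call query_fn with a fixed query
    and re-emit the same documents on every tick; tracking `last_seen`
    avoids that.
    """
    # Solr range syntax: {last_seen TO *] excludes the lower bound.
    docs = query_fn(f"timestamp_dt:{{{last_seen} TO *]")
    if docs:
        last_seen = max(d["timestamp_dt"] for d in docs)
    return docs, last_seen


# In-memory stand-in for a Solr collection, to show the behaviour.
_DOCS = [
    {"id": "1", "timestamp_dt": "2021-12-01T00:00:00Z"},
    {"id": "2", "timestamp_dt": "2021-12-02T00:00:00Z"},
]

def fake_query(q):
    # Parse the lower bound out of "timestamp_dt:{<ts> TO *]".
    lower = q.split("{")[1].split(" TO")[0]
    return [d for d in _DOCS if d["timestamp_dt"] > lower]
```

Calling `poll_new_docs(fake_query, cursor)` twice with the advanced cursor returns the two documents once and then nothing, which is the behaviour a real consumer would need and a repeated fixed query cannot provide.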

@oscerd oscerd self-assigned this Dec 2, 2021
CreangelDev0001 commented:

Would you happen to have an example? I have tried many times to set up sink and source connectors with Kafka, but I am still struggling with them.


oscerd commented Oct 19, 2023

The kamelets related to Solr have been removed. What connector are you using? Camel-kafka-connector? What version? What have you tried?

CreangelDev0001 commented Oct 19, 2023

Hi @oscerd,

I have the following set of containers:
3 Kafka brokers
Kafka Connect
Schema Registry
UI for Apache Kafka

I'm trying to load data from documents on a file server into a Kafka topic using jcustenborder's spooldir connectors, and then index the topic messages into Druid or Solr. With the Apache Druid Kafka supervisors API this process worked. For Solr I'm trying to use the Apache Camel Kafka sink connector to send the data. I configured the connector with the following parameters:

{
  "connector.class": "org.apache.camel.kafkaconnector.solrsink.CamelSolrsinkSinkConnector",
  "topics": "indexedTopic",
  "value.converter.schemas.enable": false,
  "camel.kamelet.solr-sink.collection": "test_collection",
  "camel.kamelet.solr-sink.servers": "192.168.230.98:18085"
}

My messages are stored with the following structure:

{
  "schema": {
    "type": "struct",
    "fields": [
      { "type": "string", "optional": true, "field": "LicenseType" },
      { "type": "string", "optional": true, "field": "Breed" },
      { "type": "string", "optional": true, "field": "Color" },
      { "type": "string", "optional": true, "field": "DogName" },
      { "type": "string", "optional": true, "field": "OwnerZip" },
      { "type": "string", "optional": true, "field": "ExpYear" },
      { "type": "string", "optional": true, "field": "ValidDate" }
    ],
    "optional": false,
    "name": "com.github.jcustenborder.kafka.connect.model.Value"
  },
  "payload": {
    "LicenseType": "Dog Individual Spayed Female",
    "Breed": "BICHON FRISE",
    "Color": "WHITE",
    "DogName": "CHLOE",
    "OwnerZip": "15090",
    "ExpYear": "2017",
    "ValidDate": "12/15/2016 9:58"
  }
}

I got errors related to data parsing; in the best case the Camel connector did try to send the data to Solr, but it attempted an atomic update and Solr raised a parsing error. I think this is related to the nesting in the message data.

My Camel Kafka Connector version is 3.21, my Kafka Connect image is confluentinc/cp-kafka-connect-base:6.1.0, and my Kafka brokers use the latest image (I think it is 7.5.1).
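One possible cause, assuming standard Kafka Connect JsonConverter semantics (a guess, not verified against this setup): with `value.converter.schemas.enable=false` the converter forwards each record as-is, so the sink receives the whole nested `schema`/`payload` envelope instead of a flat Solr document, and Solr can misread the nested maps as atomic-update commands. The difference, shown in plain Python on an abbreviated version of the record above:

```python
import json

# Abbreviated Kafka Connect JSON envelope, as stored in the topic.
record = {
    "schema": {"type": "struct", "optional": False,
               "name": "com.github.jcustenborder.kafka.connect.model.Value"},
    "payload": {"DogName": "CHLOE", "Breed": "BICHON FRISE"},
}

# With value.converter.schemas.enable=false the converter passes the
# record through unchanged, so the sink sees the nested envelope:
nested_value = record

# With value.converter.schemas.enable=true it parses and strips the
# envelope, so the sink sees only the flat payload -- the shape a Solr
# document should have:
flat_value = record["payload"]

print(json.dumps(flat_value))
```

Since the spooldir messages already contain the envelope, setting `value.converter.schemas.enable` to `true` (with the JsonConverter) may be enough to hand the flat payload to the Solr sink; that is inferred from the converter's documented behaviour, not tested here.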
