Create a Solr Source Kamelet #598
Would you happen to have an example? I have tried many times to set up a sink and a source connector with Kafka, but I am still struggling with them.
The Kamelets related to Solr have been removed. Which connector are you using? Camel-kafka-connector? Which version? What have you tried?
Hi @oscerd, I have the following set of containers: I'm trying to load data from documents into a Kafka topic using jcustenborder's spooldir connectors, reading from a file server. Then I want to index the topic messages into Druid or Solr. With the Apache Kafka Druid supervisors API this pipeline already works; for Solr I'm trying to use the Apache Camel Kafka sink connector to send the data. I configured my topics with the following parameters.
My messages are stored with the following structure.
I got errors related to data parsing. In the best case the Camel connector did try to send data to Solr, but it attempted an atomic update and Solr raised a parsing error. I think this is related to the nesting in the message data. My Camel Kafka Connect component version is 3.21, my Kafka Connect image is confluentinc/cp-kafka-connect-base:6.1.0, and my Kafka brokers use the latest image (I think 7.5.1).
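For reference, a minimal sink connector configuration sketch for this kind of setup might look like the one below. This is only illustrative: the topic name, host, and collection are assumed, and the property names follow the usual camel-kafka-connector `camel.sink.*` naming convention, so they should be verified against the documentation for the specific connector version in use.

```properties
# Sketch only -- verify property names against your camel-kafka-connector version.
name=CamelSolrSinkConnector
connector.class=org.apache.camel.kafkaconnector.solr.CamelSolrSinkConnector
tasks.max=1
# Assumed topic name
topics=my-topic
# Plain string converters avoid Connect re-serializing the JSON payload
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# Assumed Solr host and collection in the endpoint path
camel.sink.path.url=solr:8983/solr/my-collection
```

If the payload is nested JSON, Solr can misread field values as atomic-update commands, so flattening the documents before they reach the sink (e.g. with a Connect transform) may avoid the parsing error described above.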
It will always use the producer side, but it would just repeat the same query. We could eventually do something like this as a consumer directly in the component.
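The "repeat the same query" idea could be sketched as a Kamelet that polls Solr's standard `/select` handler on a timer. The Kamelet name, properties, and the HTTP-based query step below are all hypothetical, not an official Kamelet; they only illustrate the shape such a source could take.

```yaml
# Sketch only: a hypothetical "solr-source" Kamelet polling /select on a timer.
apiVersion: camel.apache.org/v1alpha1
kind: Kamelet
metadata:
  name: solr-source
spec:
  definition:
    title: "Solr Source (sketch)"
    properties:
      solrHost:
        title: Solr host and port, e.g. solr:8983   # assumed property
        type: string
      query:
        title: Solr query string
        type: string
        default: "*:*"
      period:
        title: Poll interval in milliseconds
        type: integer
        default: 60000
  template:
    from:
      uri: "timer:solr-poll"
      parameters:
        period: "{{period}}"
      steps:
        # Query Solr over HTTP; the same query repeats on every tick,
        # which is the limitation mentioned above.
        - to: "http://{{solrHost}}/solr/my-collection/select?q={{query}}&wt=json"
        - to: "kamelet:sink"
```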