IMPROVEMENT REQUEST: Add a transformer mode to the connectors to make lightweight edits to the messages while they are being produced/consumed #410
Comments
Basically what I meant was just providing a specific transformer for this case in the connector. A general solution seems to be a bit too complex. But you're welcome to try. |
I can try out the specific transformer for this case. You are probably right about the general solution being complex |
I tried creating a project from the archetype, following the archetype page, but I cannot seem to find the correct archetype params. What I tried:
What I got: I understand my params are incorrect; can you perhaps tell me how to figure out the correct params? |
Please read how to use the archetype: https://camel.apache.org/camel-kafka-connector/latest/archetypes.html It's all explained there. You need to use the archetype reported on that page and then follow the instructions. In the project it creates you'll need to create the transformer. If you want to create the transformer in the actual codebase you need to modify the connector under the connectors folder. What you are trying to do cannot work: the connector artifact is not an archetype. |
There is only one archetype to use. |
@codexetreme try with:
mvn archetype:generate -DarchetypeGroupId=org.apache.camel.kafkaconnector.archetypes -DarchetypeArtifactId=camel-kafka-connector-extensible-archetype -DarchetypeVersion=0.4.0
then answer the usual maven questions and you should have the skeleton project created. |
Thanks, I will try out this command as well.
There are a couple of questions I have. I took a look at the aws2-s3 connector; there is a bit of code there for a transformer, however, I don't quite understand the flow, and when and how the transformer actually gets called. So, I have a few questions:
1. I created a class that implements `org.apache.kafka.connect.transforms.Transformation`, but after this I do not know how to proceed. Basically I want to add a config option where the end user of the library can specify their custom key extractor class via the newly created option.
2. How do I test my code? In the test code I do not see many tests, so do I write my own, or build locally, run, fix, repeat?
Regards
Yashodhan
|
Hello,
In the connector configuration you add the transforms properties, where you point to your new transformation class (for example, see the sketch below). In this way you'll use the transformation you implemented in the extended connector.
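A minimal sketch of what that configuration might look like, assuming a hypothetical transformation class com.example.MyCustomTransform (the alias MyTransform and the option name below are placeholders, not names from this thread; the rest of the connector configuration is omitted):
transforms=MyTransform
transforms.MyTransform.type=com.example.MyCustomTransform
# options exposed by the transformation are prefixed with its alias, e.g.:
# transforms.MyTransform.key.extractor.class=com.example.MyKeyExtractor
Kafka Connect passes every property prefixed with transforms.<alias>. (prefix stripped) to the transformation's configure() method, which is where a custom option such as a key extractor class would be read.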
|
Once the transformation class works and you're happy with the result, you could add that class to the aws2-sqs connector in the camel-kafka-connector repository and open a PR. |
I see, this makes the whole scenario a lot clearer. Let me experiment with this over the weekend; in case I hit any snags I will ask you for more details. One more question I have: right now the headers that are being sent are not a part of the connector, so I assume they are being autogenerated and sent from somewhere upstream of the code. How do I set the header in the connector code? If you can point me to sample code that might be present in the other connectors, that would also help. Cheers |
If you are using the source connector the headers will be generated from the camel component consumer under the hood, so you don't have to set any header. |
It's possible I'm not getting what you're asking. Anyway, what is the use case for you to set a header in the connector? What do you want to set? |
So, right now, as per my understanding and experiments, when the source connector runs, it inserts the SQS message ID as a header of the Kafka record, leaving the key of the record null.
My requirement is that I want to parse the data from the SQS message and, depending on the data in the message, set a key for the Kafka record.
If that is not possible, I then want to set a header to the extracted key, because either way I need the key that I get from the SQS message data.
Please let me know if my explanation is clear; if not, I can elaborate further.
|
I need to test the behavior to understand. I'll do that. |
For each connector we apply a mapping of the Camel headers and properties. So you should be able to get the message ID by looking at record.headers and checking for the name "CamelHeader.CamelAwsSqsMessageId", and then set it as the record key if you want. |
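A minimal sketch of such a transformation using Kafka Connect's Transformation API (the class name SqsMessageIdAsKey is a placeholder, not something from this thread):

import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.header.Header;
import org.apache.kafka.connect.transforms.Transformation;

public class SqsMessageIdAsKey<R extends ConnectRecord<R>> implements Transformation<R> {

    private static final String HEADER_NAME = "CamelHeader.CamelAwsSqsMessageId";

    @Override
    public R apply(R record) {
        // Look up the header that the connector mapped from the Camel exchange.
        Header header = record.headers().lastWithName(HEADER_NAME);
        if (header == null || header.value() == null) {
            return record; // header missing: pass the record through unchanged
        }
        // Re-create the record with the message id as its key; everything else stays as-is.
        return record.newRecord(
                record.topic(),
                record.kafkaPartition(),
                Schema.STRING_SCHEMA,
                header.value().toString(),
                record.valueSchema(),
                record.value(),
                record.timestamp());
    }

    @Override
    public ConfigDef config() {
        return new ConfigDef(); // no extra configuration options in this sketch
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // nothing to configure in this sketch
    }

    @Override
    public void close() {
        // no resources to release
    }
}

Wired into the connector configuration via the transforms and transforms.<alias>.type properties as described above, this would run on every record the source connector produces before it reaches the topic.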
Hello, I worked that out over the weekend, and now I have the key from the SQS message as required. Our use case is satisfied now. I can send a PR, but it is quite custom functionality where we parse the SQS message and build our key. I can instead send a PR where the transformer simply sets CamelHeader.CamelAwsSqsMessageId as the key of the message. Let me know how you want to proceed. |
Yes, it would be nice to have the transformer for setting the key, thanks. |
cool let me open a PR |
opened PR #431 please check |
Hello,
In our recent use case, we have to move messages from an AWS SQS queue to a Kafka topic. This is a perfect use case where we can leverage the power of the Camel connectors, more specifically the aws2-sqs connector. According to the docs, the connector adds the SQS messageId as a header to the Kafka record and inserts the record as-is into the specified Kafka topic.
A link to the documentation for context: Message headers set by the SQS producer
Now, in our use case, we need to set this header to something specific based on the data before it is inserted into the Kafka topic. In the sqs connector docs, I can see no mention of a place where a transformer can be hooked in and run before the data is sent to the target destination. The only way around this, as suggested by @oscerd in the Gitter channel, was to use the custom archetype to generate the connector source code and add functionality as required.
While this approach definitely works, in my experience a connector reading and making light edits to the data before inserting it into its target destination is a very common use case. In the case of Kafka specifically, the Connect API already provides a framework to add in your own transformations.
A link to the relevant Kafka documentation for context: Transformation with the Connect API
My request is to add functionality to these connectors so that the users of these connectors can chain/add lightweight transformers to make light edits to the data before the connector finally sends it off to its target destination.
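For context, this is roughly how such a chain already looks with the stock single message transforms (SMTs) that ship with Kafka Connect; the aliases and the field names below are only illustrative:
transforms=ExtractKey,InsertSource
transforms.ExtractKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.ExtractKey.fields=orderId
transforms.InsertSource.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.InsertSource.static.field=source
transforms.InsertSource.static.value=sqs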
PS: if this request is approved, I would love to help out and contribute in implementing this feature :)