Must be able to bind to different types in trigger functions #44
Comments
@brandonh-msft and @ryancrawcour please submit your feedback about this item. Once we agree, Brandon can start.
If I'm understanding the …
Looks good to me.
This is my understanding too.
Kafka stores the messages as byte[]. The intention is to get the raw data coming from the topic and do all the required serialisation in the function. The following code does that:

```csharp
[FunctionName(nameof(ByteArrayUser))]
public static void ByteArrayUser(
    [KafkaTrigger("LocalBroker", "users", ConsumerGroup = "azfunc_byte_array", ValueType = typeof(byte[]))] KafkaEventData[] kafkaEvents,
    ILogger logger)
{
    foreach (var kafkaEvent in kafkaEvents)
    {
        logger.LogInformation($"users message has {((byte[])kafkaEvent.Value).Length} length");
    }
}
```

The idea would be to be able to replace the parameter definition the following way:

```csharp
[KafkaTrigger("LocalBroker", "users", ConsumerGroup = "azfunc_byte_array")] byte[][] rawKafkaEvents
```

Does it make sense?
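To illustrate the custom deserialisation described above, here is a minimal sketch of turning the raw byte[] payload into a POCO inside the function body. The `UserRecord` type and the assumption that the payload is UTF-8 JSON are hypothetical, for illustration only; they are not part of this extension:

```csharp
// Hypothetical POCO representing a message on the "users" topic (payload assumed to be UTF-8 JSON).
public class UserRecord
{
    public string Name { get; set; }
    public int Age { get; set; }
}

[FunctionName(nameof(ByteArrayToPoco))]
public static void ByteArrayToPoco(
    [KafkaTrigger("LocalBroker", "users", ConsumerGroup = "azfunc_byte_array", ValueType = typeof(byte[]))] KafkaEventData[] kafkaEvents,
    ILogger logger)
{
    foreach (var kafkaEvent in kafkaEvents)
    {
        // All deserialisation happens in the function: bytes -> UTF-8 string -> POCO.
        var json = System.Text.Encoding.UTF8.GetString((byte[])kafkaEvent.Value);
        var user = Newtonsoft.Json.JsonConvert.DeserializeObject<UserRecord>(json);
        logger.LogInformation($"Received user {user.Name}, age {user.Age}");
    }
}
```

This keeps the binding itself serialization-agnostic: the extension hands over raw bytes, and the message format (JSON, Avro, Protobuf, …) is the function author's concern.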
It is OK for me if you don't think there is value here. The more I think about it, the more I have the impression that this issue is a nice-to-have, as there is a way to get things working using KafkaEventData. This issue is about making it developer friendly.
Well, the above code does and doesn't do that. If you look at #43, we never implemented what we actually do with the byte array, so you don't actually get any event data out of the byte stream. This is the piece lacking definition; but if we're saying "if you choose byte[], you need to provide the implementation", that could prove to be difficult given the structure of bindings.
I agree, that constructor shouldn't be there. The way the KafkaEventData will be built in case `ValueType = typeof(byte[])` is the following:

```csharp
// TValue => byte[]
// ctor will set KafkaEventData.Value to IConsumeResultData.Value, which is a byte[]
new KafkaEventData(new ConsumeResultWrapper<TKey, TValue>(consumeResult));
```
I will check if I can remove that ctor.
So …
No, it will be of the type with which the KafkaListener<TKey, TValue> was created.
Got it. So you were just saying that the only case where …
Does this mean that ✅ 3 in this issue is moot, then?
Correct. It should be simple. |
Remember folks … MVP first, then we add on the "developer friendly" and nice-to-haves once developers start using it and requesting them. Hopefully by then we'll have a community of open source developers who will submit their own PRs :)
* Remove KafkaEventData(byte[]) ctor. Add to sample function an example of custom deserialisation. Add to sample function a binding to strings
* Adding logging to configProvider during converter operations
* Task 3 in #44 doesn't require any additional work; added sample handlers to demonstrate
* Adding logging to configProvider during converter operations
* Adding tests to prove pt 3 of #44
What's the status of this issue? Closed, or still being worked on?
The trigger implementation has been tested using KafkaEventData and string as parameter types.
We must implement the following scenarios:
- POCO (importance: high)
- byte[] (importance: high)
- IGenericRecord (importance: low)
- string (importance: very low)
- Support key, partition, offset, timestamp and topic as stand-alone parameters (importance: medium)
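For context, a sketch of what the high-importance POCO scenario might look like from the developer's side once implemented. This is a hypothetical shape, not the agreed design: the `UserRecord` type is invented for illustration, and the exact binding contract is what this issue is still deciding:

```csharp
// Hypothetical: binding the trigger parameter directly to a POCO array
// instead of KafkaEventData[], with deserialisation handled by the extension.
[FunctionName(nameof(PocoUser))]
public static void PocoUser(
    [KafkaTrigger("LocalBroker", "users", ConsumerGroup = "azfunc_poco")] UserRecord[] users,
    ILogger logger)
{
    foreach (var user in users)
    {
        logger.LogInformation($"Received user {user.Name}");
    }
}
```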