Proprietary Serde type #37
Conversation
* Internal `KafkaConsumer` and `KafkaProducer` work with `Array[Byte]` serdes for keys and values
* Instances for default Kafka serdes
* Conversion from Kafka serdes
* Operators for effectful and non-effectful conversion of serdes (mapping, contramapping and invariant mapping), which can be used to construct new serdes from existing ones
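The non-effectful conversion operators from the last bullet can be sketched roughly like this (a minimal self-contained model, not the actual zio-kafka types): a `Deserializer` maps covariantly over its output, while a `Serializer` contramaps over its input.

```scala
// Simplified stand-ins for illustration only.
trait Deserializer[T] { self =>
  def deserialize(data: Array[Byte]): T
  // Build a Deserializer[U] by post-processing the decoded value.
  def map[U](f: T => U): Deserializer[U] =
    data => f(self.deserialize(data))
}

trait Serializer[T] { self =>
  def serialize(t: T): Array[Byte]
  // Build a Serializer[U] by pre-processing the input value.
  def contramap[U](f: U => T): Serializer[U] =
    u => self.serialize(f(u))
}

object Example {
  val stringDes: Deserializer[String] = data => new String(data, "UTF-8")
  val stringSer: Serializer[String]   = s => s.getBytes("UTF-8")

  // Derive Int codecs from the String ones.
  val intDes: Deserializer[Int] = stringDes.map(_.toInt)
  val intSer: Serializer[Int]   = stringSer.contramap(_.toString)
}
```

A round trip through the derived codecs returns the original value: `Example.intDes.deserialize(Example.intSer.serialize(42))` gives back `42`.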
Wow that was pretty quick @svroonland! :-) Here are some thoughts:
Oh, and I'm really happy you got rid of
Another thought: we definitely want a distinction between pure and effectful serializers/deserializers. This will let us shortcut the ZIO runloop if we're just working with pure deserializers.
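The distinction could be carried in the types, along these lines (a sketch: the effect is modeled with `Try` here, whereas in zio-kafka it would be a ZIO value). A pure deserializer is a plain function, so a runloop that sees one can apply it directly and skip effect interpretation.

```scala
import scala.util.Try

// Simplified stand-ins for illustration only.
trait PureDeserializer[T] { def deserialize(data: Array[Byte]): T }
trait EffectfulDeserializer[T] { def deserialize(data: Array[Byte]): Try[T] }

object Lift {
  // Every pure deserializer lifts into an effectful one; the reverse is
  // not true, which is why the distinction is worth keeping in the types.
  def lift[T](p: PureDeserializer[T]): EffectfulDeserializer[T] =
    data => Try(p.deserialize(data))
}
```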
Yeah, I have written these codec-like things before several times. It's easy when you treat them as invariant functors (in the category of endo-functors of course (that part is a joke)) :)
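The invariant-functor view mentioned here amounts to this: a serde carries both directions, so adapting it to a new type needs a conversion each way. A minimal sketch (simplified types, not the real API):

```scala
import java.util.UUID

// Simplified stand-in: a Serde pairs both directions, and imap adapts it
// to a new type given a function each way (invariant mapping).
final case class Serde[T](serialize: T => Array[Byte], deserialize: Array[Byte] => T) {
  def imap[U](f: T => U)(g: U => T): Serde[U] =
    Serde(u => serialize(g(u)), data => f(deserialize(data)))
}

object Serdes {
  val string: Serde[String] =
    Serde(s => s.getBytes("UTF-8"), bytes => new String(bytes, "UTF-8"))

  // A UUID serde derived purely by invariant mapping over the String serde.
  val uuid: Serde[UUID] = string.imap(UUID.fromString)(_.toString)
}
```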
Indeed! By the way, if you have a different design or implementation in mind, don't hesitate to disregard or cherry-pick from this PR! Or just tell me what to change :)
Hmm, about effectful serdes: I don't think we should want arbitrary effects in serdes; maybe we should just use
Effectful serdes are definitely useful! E.g., running logging effects. Don't give up on them :-)
I see. But perhaps it's simpler to model them as non-effectful, and if you want to do something like logging, you do that in the stream consumption?
… To elaborate on that: you could get a stream of
Yeah, but that's just introducing another effect type where we could just use ZIO and attain simpler code.
Maybe I should rephrase. Specifying serdes as
As a consequence of 2, we'll be allocating less, as we're staying in ZIO and not encoding the results in an intermediate representation.
Sounds fair. One use I can think of is some sort of Avro-schema lookup serializer, which needs to connect to a schema registry or something.
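That use case is a good illustration of why an effectful deserializer pulls its weight: the decoder must perform a lookup before it can decode. A hypothetical sketch (all names invented; the effect is modeled as `Either` here, where in zio-kafka it would be something like `RIO[R, T]`):

```scala
// Invented stand-ins for the schema-lookup scenario.
final case class Schema(charset: String)

final class SchemaRegistryDeserializer(registry: Map[Int, Schema]) {
  // Convention for this sketch: the first byte of the payload carries the
  // schema id, the rest is the encoded value.
  def deserialize(data: Array[Byte]): Either[String, String] = {
    val id = data.head.toInt
    registry.get(id) match {
      case Some(schema) => Right(new String(data.drop(1), schema.charset))
      case None         => Left(s"unknown schema id: $id")
    }
  }
}
```

With a real registry the lookup is a network call, which is exactly the effect the `R` environment and ZIO value would capture.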
Well, that seems to compile :) I don't really like the

```scala
def make[R, K, V](
  settings: ConsumerSettings,
  keyDeserializer: Deserializer[R, K],
  valueDeserializer: Deserializer[R, V]
)
```

so we can do
@svroonland I actually suggest deferring the specification of a deserializer to the
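The two API shapes under discussion can be contrasted with a hypothetical sketch (simplified types and invented names, not the real zio-kafka signatures): binding deserializers when the consumer is constructed versus supplying them at the point where records are consumed.

```scala
// Invented stand-ins for illustration.
final case class RawRecord(value: Array[Byte])

// Shape 1: the value type is fixed at construction time.
final class TypedConsumer[V](records: List[RawRecord], des: Array[Byte] => V) {
  def poll(): List[V] = records.map(r => des(r.value))
}

// Shape 2: the consumer stays untyped; each call picks its deserializer,
// so one consumer can serve topics with differently typed payloads.
final class RawConsumer(records: List[RawRecord]) {
  def poll[V](des: Array[Byte] => V): List[V] = records.map(r => des(r.value))
}
```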
Updated PR. If we're happy with this implementation I can start to add some docs and some tests. |
```scala
trait Serde[-R, T] extends Deserializer[R, T] with Serializer[R, T]

object Serde {
```
Could you please add a constructor that wraps Kafka's `Serde`?
Oh, that's in Serdes. Why not move them here?
That makes sense
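A hedged sketch of what such a wrapping constructor might look like. The `KafkaSerde` trait below is a stand-in for Kafka's `org.apache.kafka.common.serialization.Serde` (the real interface differs: it exposes separate `serializer()`/`deserializer()` objects), and `fromKafkaSerde` is an invented name:

```scala
// Stand-in for the Kafka interface, simplified for this sketch.
trait KafkaSerde[T] {
  def serialize(topic: String, t: T): Array[Byte]
  def deserialize(topic: String, data: Array[Byte]): T
}

// Simplified library-side Serde carrying the topic, as Kafka serdes do.
final case class Serde[T](serialize: (String, T) => Array[Byte],
                          deserialize: (String, Array[Byte]) => T)

object Serde {
  // Wrap an existing Kafka serde so user-defined Kafka serdes can be
  // reused directly.
  def fromKafkaSerde[T](k: KafkaSerde[T]): Serde[T] =
    Serde(k.serialize, k.deserialize)
}
```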
Implements #30

* `Serde[T] extends Serializer[T] with Deserializer[T]`
* `KafkaConsumer` and `KafkaProducer` work with `Array[Byte]` serdes for keys and values
* `either` operator on `Deserializer` to explicitly handle deserialization failures
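The `either` operator from the last bullet can be sketched like this (a simplified stand-in, not the real zio-kafka `Deserializer`): failures become `Left` values instead of thrown exceptions, so the caller decides per record how to handle bad data.

```scala
import scala.util.{Failure, Success, Try}

// Simplified stand-in for illustration only.
trait Deserializer[T] { self =>
  def deserialize(data: Array[Byte]): T

  // Surface deserialization failures in the value channel.
  def either: Deserializer[Either[Throwable, T]] =
    data =>
      Try(self.deserialize(data)) match {
        case Success(t) => Right(t)
        case Failure(e) => Left(e)
      }
}
```

For example, with an Int deserializer, malformed input yields a `Left` carrying the `NumberFormatException` rather than failing the whole stream.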