Feature: Schema Registry support #964
I am not a big Kafka specialist, but your code example with
We are currently creating a new Kafka broker based on the Confluent library. We'll look into this as soon as the new broker support is finished.
Using

```python
from typing import Annotated

from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import MessageField, SerializationContext
from faststream import Depends, apply_types
from faststream.confluent import KafkaMessage


def get_avro_deserializer(
    settings: Annotated[KafkaSettings, Depends(get_kafka_settings)],
) -> AvroDeserializer:
    """
    :raises SchemaRegistryError: if the schema was not registered before
    """
    sr = SchemaRegistryClient({"url": str(settings.schema_registry_url)})
    latest_version = sr.get_latest_version(f"{settings.topic}-value")
    return AvroDeserializer(schema_registry_client=sr, schema_str=latest_version.schema.schema_str)


@apply_types
async def decode_message(
    msg: KafkaMessage,
    deserializer: Annotated[AvroDeserializer, Depends(get_avro_deserializer)],
    settings: Annotated[KafkaSettings, Depends(get_kafka_settings)],
):
    ctx = SerializationContext(settings.topic, MessageField.VALUE)
    return deserializer(msg.body, ctx)


@broker.subscriber(settings.topic, decoder=decode_message)
async def consume(msg: PydanticModel):
    ...
```
@devova can we close this issue? Is it working for you?
Sorry, there was a bug in my example. You can't decorate with @apply_types:

```python
@apply_types
async def decode_message():
    ...
```

It would only work properly for the first invocation of the function. The working example has to initiate the deserializer inside the decoder:

```python
async def decode_message(msg: KafkaMessage):
    deserializer = get_avro_deserializer(settings)
    return deserializer(msg.body, None)
```

It would be good to add documentation with examples of how to deal with different schema registries. Again, there are many registries, and coupling the router with a particular registry isn't a good idea, unless there is an abstract class first, so that the community can later add implementations.
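Since fetching the latest schema is an HTTP round-trip to the registry, re-creating the deserializer inside the decoder on every message would be wasteful. A minimal sketch of memoizing it with `functools.lru_cache` (the registry lookup is stubbed out here for illustration; the names mirror the example above but are otherwise assumptions):

```python
# Sketch: cache the deserializer so the schema registry is queried only once,
# not once per message. In real code the body of get_avro_deserializer would
# call SchemaRegistryClient.get_latest_version(); here it is a stub that counts
# how often it runs.
from functools import lru_cache

CALLS = {"count": 0}


@lru_cache(maxsize=None)
def get_avro_deserializer(topic: str):
    CALLS["count"] += 1  # stands in for the HTTP call to the registry
    return lambda body, ctx: ("decoded", topic, body)


async def decode_message(msg: bytes, topic: str = "orders"):
    deserializer = get_avro_deserializer(topic)  # cached after the first call
    return deserializer(msg, None)
```

The cache key is the topic, so each subject's deserializer is fetched once and reused for all subsequent messages on that topic.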
Opening a new issue #1297 and closing this one. |
Some Kafka libraries, such as confluent-kafka-python, https://github.com/marcosschroh/python-schema-registry-client and https://github.com/lsst-sqre/kafkit, provide various schema registry integration capabilities: an HTTP client for the schema registry, caching of schema registry requests, serialization/deserialization, and Avro, Protobuf and JSON-schema support.
It would be nice to have such capabilities built in, with a declarative-style API.
Furthermore, I suggest providing both schema-first (use model.avro files) and code-first implementations (generate and register an Avro schema from Pydantic models or dataclasses using https://github.com/marcosschroh/dataclasses-avroschema) of the schema registry integration.
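To illustrate the code-first idea, here is a simplified, stdlib-only stand-in for what dataclasses-avroschema automates (the real library derives schemas via an `AvroModel` base class and handles far more types; this sketch maps only a few primitives, and `Order` is a made-up example model):

```python
# Simplified illustration of "code-first": derive an Avro record schema from a
# plain dataclass by mapping Python field types to Avro primitive types.
import json
from dataclasses import dataclass, fields

PRIMITIVES = {int: "long", float: "double", str: "string", bool: "boolean", bytes: "bytes"}


def avro_schema(cls) -> str:
    """Build an Avro record schema (as a JSON string) from a dataclass."""
    record = {
        "type": "record",
        "name": cls.__name__,
        "fields": [{"name": f.name, "type": PRIMITIVES[f.type]} for f in fields(cls)],
    }
    return json.dumps(record)


@dataclass
class Order:
    id: int
    customer: str
    paid: bool
```

The resulting schema string is exactly what would be registered under the `{topic}-value` subject before producing messages.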
Using the default code example, I envision the following usage pattern.
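Building on the "abstract class first" suggestion from the thread, a purely hypothetical sketch of what such an abstraction could look like (every name here is made up; FastStream does not currently expose this interface):

```python
# Hypothetical interface a framework could expose so that any schema-registry
# backend can be plugged in: decoders are written against the abstraction, not
# against one vendor's client.
from abc import ABC, abstractmethod


class SchemaRegistry(ABC):
    @abstractmethod
    def latest_schema(self, subject: str) -> str:
        """Return the latest registered schema string for a subject."""


class InMemorySchemaRegistry(SchemaRegistry):
    """Toy backend for tests; a real implementation would wrap an HTTP client."""

    def __init__(self) -> None:
        self._schemas: dict[str, str] = {}

    def register(self, subject: str, schema: str) -> None:
        self._schemas[subject] = schema

    def latest_schema(self, subject: str) -> str:
        return self._schemas[subject]
```

With such a base class in place, community implementations for Confluent, Apicurio, Karapace, etc. could be added without coupling the router to any one of them.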
Describe alternatives you've considered
Each application may try to implement its own integration with the schema registry.