
Deserialization Schema error when changing from azure-schemaregistry-kafka-avro 1.0.0.beta.4 to 1.1.0.beta with auto.register.schemas = true #43

Open
mgvinuesa opened this issue Mar 14, 2023 · 2 comments


mgvinuesa commented Mar 14, 2023

Due to a migration of azure-schemaregistry-kafka-avro from 1.0.0-beta.4 to 1.1.0-beta.1, we have detected that KafkaAvroDeserializer fails because of a problem with schema deserialization. In short, the old version of the library registers the schema (auto.register.schemas=true) as a String (the schema JSON is wrapped inside a String), so when the new version of the library retrieves it, it gets that String back and fails to parse the schema.
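
For illustration only, here is a minimal standalone sketch (the schema literal and the class name are made up, not taken from the project) that reproduces the same failure with Avro alone: parsing the plain schema JSON works, parsing the same schema wrapped as a JSON-encoded String throws the SchemaParseException shown further below, and unwrapping the outer String first makes the schema parseable again.

	import org.apache.avro.Schema;
	import org.apache.avro.SchemaParseException;
	import com.fasterxml.jackson.databind.ObjectMapper;

	public class DoubleEncodedSchemaDemo {
		public static void main(String[] args) throws Exception {
			// Plain Avro schema JSON, as the 1.1.0-beta.1 deserializer expects to find it in the registry.
			String rawSchema = "{\"type\":\"record\",\"name\":\"Example\",\"namespace\":\"com.example.avro\","
					+ "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}";

			// The same schema stored as a JSON string literal (quoted and escaped),
			// which is what the 1.0.0-beta.4 version appears to have registered.
			ObjectMapper mapper = new ObjectMapper();
			String doubleEncoded = mapper.writeValueAsString(rawSchema);

			new Schema.Parser().parse(rawSchema);              // parses fine
			try {
				new Schema.Parser().parse(doubleEncoded);      // throws SchemaParseException: Illegal character in: ...
			} catch (SchemaParseException e) {
				System.out.println("Parse failed: " + e.getMessage());
			}

			// Unwrapping the outer JSON string first recovers a parseable schema.
			String unwrapped = mapper.readValue(doubleEncoded, String.class);
			new Schema.Parser().parse(unwrapped);              // parses fine again
		}
	}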

Steps to reproduce with auto.register.schemas = true
I have created a mock project in my GitHub account, https://github.com/mgvinuesa/avro-eventhub-bug-steps, with tests using the following versions (a rough sketch of the Kafka client configuration follows the list):

  1. avro-eventhub-serializer-1.0.0-beta uses the following versions:
		<azure.schema-regitry.version>1.0.0-beta.4</azure.schema-regitry.version>
		<azure.identity.version>1.2.1</azure.identity.version>
		<avro.version>1.10.1</avro.version>
  2. avro-eventhub-serializer-1.1.0-beta uses the following versions:
		<azure.schema-regitry.version>1.1.0-beta.1</azure.schema-regitry.version>
		<azure.identity.version>1.8.0</azure.identity.version>
		<avro.version>1.11.1</avro.version>
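
For context, the Kafka clients in both projects are configured roughly along these lines. This is only a sketch: the endpoint, group and credential are placeholders, the SASL/Event Hubs settings are omitted, and the schema-registry property keys are the ones I believe the library's KafkaAvroSerializer/KafkaAvroDeserializer use, so they may differ slightly between versions.

	import java.util.Properties;
	import com.azure.identity.DefaultAzureCredentialBuilder;

	public class ProducerConfigSketch {
		// Sketch of the producer-side configuration used in the tests. Endpoint, group and
		// credential are placeholders; the schema-registry property keys are assumed, not
		// copied from any specific library version. SASL/Event Hubs settings are omitted.
		static Properties producerProps() {
			Properties props = new Properties();
			props.put("bootstrap.servers", "<namespace>.servicebus.windows.net:9093");
			props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
			props.put("value.serializer", "com.microsoft.azure.schemaregistry.kafka.avro.KafkaAvroSerializer");
			props.put("schema.registry.url", "https://<namespace>.servicebus.windows.net");
			props.put("schema.group", "<schema-group>");
			props.put("auto.register.schemas", true);
			props.put("schema.registry.credential", new DefaultAzureCredentialBuilder().build());
			return props;
		}
	}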

Adding valid values for the Schema Registry (client, tenant, ...) in the test AvroSerDesOldVersionTest, you can register the schema in the portal and deserialize it properly. The new schema is then created and is valid:

[Screenshot: the registered schema shown in the Azure portal]

Now, using the second project, we do the same thing (serialize and deserialize). In this case the schema is not registered again because it already exists, but the schema deserialization fails. The error is:

org.apache.avro.SchemaParseException: Illegal character in: string":"String"}]}
	at org.apache.avro.Schema.validateName(Schema.java:1607)
	at org.apache.avro.Schema.access$400(Schema.java:92)
	at org.apache.avro.Schema$Name.<init>(Schema.java:714)
	at org.apache.avro.Schema$Names.get(Schema.java:1567)
	at org.apache.avro.Schema.parse(Schema.java:1677)
	at org.apache.avro.Schema$Parser.parse(Schema.java:1469)
	at org.apache.avro.Schema$Parser.parse(Schema.java:1457)
	at com.azure.data.schemaregistry.apacheavro.SchemaRegistrySchemaCache.lambda$getSchema$1(SchemaRegistrySchemaCache.java:101)
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:110)
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126)
	at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
	at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
	at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
	at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)
	at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:122)
	at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:113)
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126)
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
	at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:113)
	at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:191)
	at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107)
	at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
	at reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:129)
	at reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:260)
	at reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:144)
	at reactor.netty.channel.FluxReceive.onInboundComplete(FluxReceive.java:415)
	at reactor.netty.channel.ChannelOperations.onInboundComplete(ChannelOperations.java:424)
	at reactor.netty.channel.ChannelOperations.terminate(ChannelOperations.java:478)
	at reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:712)
	at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:113)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1382)
	at io.netty.handler.ssl.SslHandler.decodeNonJdkCompatible(SslHandler.java:1256)
	at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1296)
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:529)
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:468)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:833)
	Suppressed: java.lang.Exception: #block terminated with an error
		at reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:99)
		at reactor.core.publisher.Mono.block(Mono.java:1742)
		at com.azure.data.schemaregistry.apacheavro.SchemaRegistryApacheAvroSerializer.deserialize(SchemaRegistryApacheAvroSerializer.java:317)
		at com.microsoft.azure.schemaregistry.kafka.avro.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:91)
		at com.example.demo.AvroSerDesNewVersionTest.testSerDes(AvroSerDesNewVersionTest.java:55)
		at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
		at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
		at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
		at java.base/java.lang.reflect.Method.invoke(Method.java:568)
		at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:725)
		at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
		at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
		at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
		at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
		at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
		at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
		at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
		at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
		at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
		at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
		at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
		at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
		at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
		at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210)
		at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135)
		at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
		at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
		at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
		at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
		at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
		at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
		at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
		at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
		at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
		at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
		at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
		at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
		at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
		at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
		at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
		at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
		at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
		at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
		at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
		at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:95)
		at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:91)
		at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:60)
		at org.eclipse.jdt.internal.junit5.runner.JUnit5TestReference.run(JUnit5TestReference.java:98)
		at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:40)
		at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:529)
		at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:756)
		at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:452)
		at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:210)

The problem is that the received schema is a String, not JSON, as you can see in the next debug screenshot:

[Screenshot: debugger showing the schema returned as an escaped String]

You can see the escape characters that wrap the JSON inside the String, so the following line fails:

[Screenshot: the Schema.Parser.parse call in SchemaRegistrySchemaCache that fails]

Are there any release notes, or any capability to make this change backward compatible? All of our schemas were created with the old library in the past, and they are not compatible with the new one.

mgvinuesa (Author) commented:

To add more information: without auto.register.schemas the deserializer does not work either (ItemNotFound), so I imagine this is a problem with the internal Azure APIs. If I disable auto.register.schemas but keep the schema in the portal, I can see this request to retrieve it:

11:07:06.577 [reactor-http-nio-3] DEBUG reactor.netty.transport.TransportConnector - [a605dfa6] Connecting to [vialivre-dev-image.servicebus.windows.net/51.144.238.23:443].
11:07:06.611 [reactor-http-nio-3] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [a605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Registering pool release on close event for channel
11:07:06.611 [reactor-http-nio-3] DEBUG reactor.netty.resources.PooledConnectionProvider - [a605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Channel connected, now: 1 active connections, 0 inactive connections and 0 pending acquire requests.
11:07:06.715 [reactor-http-nio-3] DEBUG io.netty.handler.ssl.SslHandler - [id: 0xa605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] HANDSHAKEN: protocol:TLSv1.2 cipher suite:TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
11:07:06.715 [reactor-http-nio-3] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [a605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] onStateChange(PooledConnection{channel=[id: 0xa605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443]}, [connected])
11:07:06.715 [reactor-http-nio-3] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] onStateChange(GET{uri=null, connection=PooledConnection{channel=[id: 0xa605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443]}}, [configured])
11:07:06.715 [reactor-http-nio-3] DEBUG reactor.netty.http.client.HttpClientConnect - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Handler is being applied: {uri=https://vialivre-dev-image.servicebus.windows.net/$schemagroups/ImageSchemaRegistry/schemas/com.example.avro.Example?api-version=2020-09-01-preview, method=POST}
11:07:06.715 [reactor-http-nio-3] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] onStateChange(POST{uri=/$schemagroups/ImageSchemaRegistry/schemas/com.example.avro.Example, connection=PooledConnection{channel=[id: 0xa605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443]}}, [request_prepared])
11:07:06.715 [reactor-http-nio-3] DEBUG reactor.netty.ReactorNetty - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Added decoder [azureWriteTimeoutHandler] at the end of the user pipeline, full pipeline: [reactor.left.sslHandler, reactor.left.httpCodec, azureWriteTimeoutHandler, reactor.right.reactiveBridge, DefaultChannelPipeline$TailContext#0]
11:07:06.723 [reactor-http-nio-3] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] onStateChange(POST{uri=/$schemagroups/ImageSchemaRegistry/schemas/com.example.avro.Example, connection=PooledConnection{channel=[id: 0xa605dfa6, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443]}}, [request_sent])
11:07:06.723 [reactor-http-nio-3] DEBUG reactor.netty.ReactorNetty - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Removed handler: azureWriteTimeoutHandler, pipeline: DefaultChannelPipeline{(reactor.left.sslHandler = io.netty.handler.ssl.SslHandler), (reactor.left.httpCodec = io.netty.handler.codec.http.HttpClientCodec), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}
11:07:06.723 [reactor-http-nio-3] DEBUG reactor.netty.ReactorNetty - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Added decoder [azureResponseTimeoutHandler] at the end of the user pipeline, full pipeline: [reactor.left.sslHandler, reactor.left.httpCodec, azureResponseTimeoutHandler, reactor.right.reactiveBridge, DefaultChannelPipeline$TailContext#0]
11:07:07.160 [reactor-http-nio-3] DEBUG reactor.netty.http.client.HttpClientOperations - [a605dfa6-1, L:/192.168.1.43:59873 - R:vialivre-dev-image.servicebus.windows.net/51.144.238.23:443] Received response (auto-read:false) : RESPONSE(decodeResult: success, version: HTTP/1.1)
HTTP/1.1 200 OK
Transfer-Encoding: <filtered>
Content-Type: <filtered>
Location: <filtered>
Server: <filtered>
Serialization-Type: <filtered>
Schema-Id: <filtered>
Schema-Id-Location: <filtered>
Schema-Version: <filtered>
Schema-Versions-Location: <filtered>
Strict-Transport-Security: <filtered>
Date: <filtered>

But if I recreate the schema in the portal manually, the same request for the same schema fails.
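
To compare what each registration actually stored, the schema definition can also be fetched directly with the plain azure-data-schemaregistry client and inspected. A minimal diagnostic sketch (the namespace and schema ID are placeholders; I am assuming the synchronous 1.x client API here):

	import com.azure.data.schemaregistry.SchemaRegistryClient;
	import com.azure.data.schemaregistry.SchemaRegistryClientBuilder;
	import com.azure.data.schemaregistry.models.SchemaRegistrySchema;
	import com.azure.identity.DefaultAzureCredentialBuilder;

	public class InspectStoredSchema {
		public static void main(String[] args) {
			// Diagnostic sketch: fetch a schema by ID and check whether the stored definition
			// is raw JSON ("{...}") or a JSON-encoded string ("\"{...}\"").
			SchemaRegistryClient client = new SchemaRegistryClientBuilder()
					.fullyQualifiedNamespace("https://<namespace>.servicebus.windows.net")
					.credential(new DefaultAzureCredentialBuilder().build())
					.buildClient();

			SchemaRegistrySchema schema = client.getSchema("<schema id, e.g. from the Schema-Id header>");
			String definition = schema.getDefinition();
			System.out.println(definition.trim().startsWith("{")
					? "stored as raw JSON - the new deserializer can parse it"
					: "stored as a JSON-encoded string - parsing fails as shown above");
		}
	}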

mgvinuesa changed the title from "Deserialization Schema error when change from azure-schemaregistry-kafka-avro 1.0.0.beta to 1.1.0.beta with auto.register.schemas = true" to "Deserialization Schema error when change from azure-schemaregistry-kafka-avro 1.0.0.beta.4 to 1.1.0.beta with auto.register.schemas = true" on Mar 14, 2023
mgvinuesa (Author) commented:

The problem actually occurs when upgrading from 1.0.0-beta.4 to 1.0.0-beta.6. There is some kind of breaking change in the Azure API; it would be nice if there were some kind of configuration to allow backward compatibility.
