Upgrade process from previous versions to 2.x #1339
I'm thinking about a standalone tool/CLI, perhaps based on Kafka Streams (?), that would simply subscribe to the old Kafka topics and produce messages on new topics in the new format. I'd like to be able to support both the Streams and KafkaSQL formats if possible. Thoughts? We could probably copy/paste the protobuf code from 1.3.2.Final (renamed/namespaced) for the consumer of the old topics, then have some custom Java logic that produces new messages in the new format. Easy peasy?
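To make the topic-copy idea concrete, here is a minimal sketch in Python (the real tool would more likely be Java/Kafka Streams, and the 1.x messages are protobuf, not JSON). The topic names and the `transform` mapping are hypothetical stand-ins, not the actual 1.x/2.x message formats:

```python
import json

OLD_TOPIC = "storage-topic"      # assumed 1.x topic name
NEW_TOPIC = "kafkasql-journal"   # assumed 2.x topic name

def transform(old_msg: dict) -> dict:
    """Hypothetical 1.x -> 2.x message translation; field names are made up."""
    return {
        "artifactId": old_msg["id"],
        "content": old_msg["content"],
        "globalId": old_msg.get("globalId"),
    }

def run(bootstrap: str) -> None:
    """Consume every old-format message and re-produce it in the new format."""
    # Requires the kafka-python package (pip install kafka-python).
    from kafka import KafkaConsumer, KafkaProducer
    consumer = KafkaConsumer(OLD_TOPIC, bootstrap_servers=bootstrap,
                             value_deserializer=lambda b: json.loads(b))
    producer = KafkaProducer(bootstrap_servers=bootstrap,
                             value_serializer=lambda v: json.dumps(v).encode())
    for record in consumer:
        producer.send(NEW_TOPIC, transform(record.value))

if __name__ == "__main__":
    run("localhost:9092")
```

The only interesting part is `transform`; the consume/produce loop is boilerplate, which is why the API-based alternative below may end up simpler.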
Although this approach seems reasonable to me, I slightly prefer an API-based approach: get all of the keys from a running 1.3.2.Final registry, fetch each actual artifact, and send them to a registry running 2.0.0.Final. We can tweak this approach to preserve globalIds, I think.
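A sketch of that API-based approach, using only the standard library. The endpoint paths here are assumptions based on the v1 and v2 REST APIs (v1: `GET /api/artifacts`, `GET /api/artifacts/{id}`; v2: `POST /apis/registry/v2/groups/{group}/artifacts`), so verify them against the API docs before relying on this:

```python
import json
from urllib.request import Request, urlopen

def v2_create_url(new_base: str, group: str) -> str:
    """Build the assumed v2 artifact-creation endpoint URL."""
    return f"{new_base}/apis/registry/v2/groups/{group}/artifacts"

def migrate(old_base: str, new_base: str, group: str = "default") -> int:
    """Copy every artifact from a 1.3.x registry to a 2.x registry.

    Returns the number of artifacts copied. Note: as written this does
    not preserve globalIds; that is the part that needs extra thought.
    """
    ids = json.load(urlopen(f"{old_base}/api/artifacts"))
    for artifact_id in ids:
        content = urlopen(f"{old_base}/api/artifacts/{artifact_id}").read()
        req = Request(
            v2_create_url(new_base, group),
            data=content,
            headers={"X-Registry-ArtifactId": artifact_id},
            method="POST",
        )
        urlopen(req)
    return len(ids)
```

This only migrates the latest version of each artifact and ignores metadata and rules; a real tool would have to walk versions as well.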
@Apicurio/developers thoughts?
This is the idea I have of how this could be implemented:
The keep globalIds flag is useful for the cases where the user doesn't need the globalIds to stay identical to the old registry, or for importing data into a registry that already has some artifacts in it. If keep globalIds is set to true and there is a globalId conflict, the import operation should fail. With these three things we cover all of our use cases with the minimum amount of things to implement.
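The conflict rule described above can be sketched as a small pure function (hypothetical names, not the actual import code): with `keep_global_ids` true, any collision with an existing globalId aborts the import; with it false, incoming entries simply get fresh ids.

```python
def import_entries(existing_ids, entries, keep_global_ids):
    """entries: list of (global_id, artifact) pairs from the export file.

    Returns the list of (assigned_id, artifact) pairs, or raises on a
    globalId conflict when keep_global_ids is requested.
    """
    assigned = set(existing_ids)
    next_id = max(assigned, default=0) + 1  # fresh ids start past the max
    result = []
    for global_id, artifact in entries:
        if keep_global_ids:
            if global_id in assigned:
                raise ValueError(f"globalId conflict: {global_id}")
            new_id = global_id
        else:
            new_id = next_id
            next_id += 1
        assigned.add(new_id)
        result.append((new_id, artifact))
    return result
```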
I like this plan very much. The only part that I think requires some thought and analysis is the globalId part. Everything else is pretty straightforward, and is consistent with what I imagined we would have for import/export in our v2 API. I think the utility to upgrade from Streams 1.x to KafkaSQL 2.x certainly can use the import API, but it might make that tool slightly harder to write: it needs to produce our export file format rather than just copying the messages to another Kafka topic. That's probably OK, though. Note that I'm hoping we can use e.g.
A couple of questions about the migration of an Apicurio server with SQL storage from 1.3.x to 2.x.
Data migration is needed in both cases. In 2.x we now have an import/export API that will help with the data migration process. It's documented here: https://www.apicur.io/registry/docs/apicurio-registry/2.0.0.Final/getting-started/assembly-managing-registry-artifacts-api.html#exporting-importing-using-rest-api And for exporting the data from a 1.3.x registry we have this tool: https://github.com/Apicurio/apicurio-registry/tree/master/utils/exportV1 We are working on documenting this whole process properly...
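Once the exportV1 tool has produced a zip from the 1.3.x registry, the import side can be driven with a few lines; a minimal sketch, assuming the v2 admin import endpoint is `/apis/registry/v2/admin/import` and accepts `application/zip` (check the linked docs to confirm):

```python
from urllib.request import Request, urlopen

def admin_import_url(base: str) -> str:
    """Assumed v2 admin import endpoint, built from the registry base URL."""
    return f"{base.rstrip('/')}/apis/registry/v2/admin/import"

def import_zip(registry_base: str, zip_path: str):
    """POST an exportV1-produced .zip to a 2.x registry's admin import API."""
    with open(zip_path, "rb") as f:
        data = f.read()
    req = Request(
        admin_import_url(registry_base),
        data=data,
        headers={"Content-Type": "application/zip"},
        method="POST",
    )
    return urlopen(req)
```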
@famartinrh want to mark this as done? :) |
Now that we have the first CR of 2.x we should start thinking about an upgrade process.