api.models: add an api_version field to every Node #396

Draft
wants to merge 1 commit into main
Conversation

hardboprobot
Contributor

New field: api_version

This specifies the API version the node conforms to. Saving this information in each node makes it possible to introduce future changes in the models and to manage the transitions in a sane way:

When transitioning to a new API version that includes changes in the models, we can check which nodes are compatible with the new changes and which ones aren't, and deal with the incompatible nodes properly. For instance, we can keep handling them with old or adapted utility functions while the old version remains in a "deprecated" status before being removed completely, or translate these nodes so that they comply with the new version.

The exact format is not relevant for now. I set 0 as a pre-release version, and the first released version is supposed to start at 1. We still have to think of a good numbering scheme for this. For instance, we could keep 0 as a "test" version to be used during development of API changes, or use odd numbers for released versions and even numbers for the "test" versions between them (i.e. 0 as the test version for 1, 2 as the test version for 3, etc.).
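
A minimal sketch of how this could look on the base Node model (pydantic v2 syntax assumed; only the new field is shown, the rest of the model is elided, and the actual definition in the commit may differ):

from pydantic import BaseModel, Field

class Node(BaseModel):
    # API version this node conforms to. 0 is the pre-release/"test"
    # value; released versions are expected to start at 1.
    api_version: int = Field(default=0, description="API version this node conforms to")
    # ... existing Node fields ...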

More about API and model changes

Once an API version is released, it must be frozen and all clients must adhere to it. No matter how thorough we try to be when defining the models, we'll probably find that we need to change them to accommodate new requirements. A good approach could be the following:

  • After an API version is feature-frozen and released, all the defined models are strictly validated. No extra fields are allowed.
  • When we detect a new requirement that needs an additional field in a model, we'll need to add it without breaking the validation or the model compatibility. To accommodate these changes, we can add an extra field to the base Node model: a dict with arbitrary, unvalidated data. This will allow us to add new fields there and work with them temporarily, testing them until we're sure that's what we need (see the sketch after this list).
  • Once we're sure about the new changes, we can take them out of the extra field and define them as regular fields that will be validated by pydantic. Then we can freeze the new version and deploy it.
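
A hedged sketch of what this could look like (pydantic v2 assumed; extra="forbid" provides the strict validation of frozen versions, while the unvalidated dict is the escape hatch for fields under test):

from typing import Any, Dict

from pydantic import BaseModel, ConfigDict, Field

class Node(BaseModel):
    # Reject unknown top-level fields once a version is frozen.
    model_config = ConfigDict(extra="forbid")

    api_version: int = 0
    # Arbitrary, unvalidated data for fields that are still being tested.
    extra: Dict[str, Any] = Field(default_factory=dict)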

An example using node definitions:

Assume a node with this format in stable release 1:

{
    "api_version": 1,
    "attr1": "...",
    "attr2": "...",
    "attr3": "...",
    "extra": {}
}

At some point we detect that we won't need "attr3" anymore but that we want to add two new fields: "attr4" and "attr5", so we test them on a new test version (2):

{
    "api_version": 2,
    "attr1": "...",
    "attr2": "...",
    "attr3": "<unused>",
    "extra": {
        "attr4": "...",
        "attr5": "..."
    }
}

During this transition stage, the models are still unchanged and all applications remain compatible with stable release 1, but the clients that need to work with the new data structure will include changes to process it and handle the transition gracefully. Note that nodes defined in test version 2 are still compatible with the model definitions of stable version 1.
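
As an illustration (this helper is hypothetical and not part of this PR), a transition-aware client could read the new fields regardless of where they currently live:

def node_attr4(node: dict):
    """Return attr4 from a node dict, whether it is still under 'extra'
    (test version 2) or already a top-level field (a later stable version).
    """
    if node.get("api_version", 0) >= 3:
        return node.get("attr4")
    return node.get("extra", {}).get("attr4")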

Finally, when we're happy with all the experiments and tests and are ready to publish a new API release, we define the new model like this:

{
    "api_version": 3,
    "attr1": "...",
    "attr2": "...",
    "attr4": "...",
    "attr5": "...",
    "extra": {}
}
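
Translating any remaining test nodes into the new layout could then be done with a small helper along these lines (illustrative only, using the field names from the example above):

def migrate_node_v2_to_v3(node: dict) -> dict:
    """Convert a test-version-2 node into the stable-version-3 layout."""
    return {
        "api_version": 3,
        "attr1": node["attr1"],
        "attr2": node["attr2"],
        # attr3 is dropped; attr4 and attr5 are promoted out of 'extra'.
        "attr4": node["extra"]["attr4"],
        "attr5": node["extra"]["attr5"],
        "extra": {},
    }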

All the test nodes can be easily found and removed by searching for the specific API test version ({"api_version": 2}).
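
For instance, assuming a MongoDB backend behind the API (the database and collection names below are hypothetical), the cleanup could be a one-liner:

from pymongo import MongoClient

db = MongoClient()["kernelci"]            # hypothetical database name
db.nodes.delete_many({"api_version": 2})  # drop all test-version nodes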

Signed-off-by: Ricardo Cañuelo <ricardo.canuelo@collabora.com>
@hardboprobot added the staging-skip (Don't test automatically on staging.kernelci.org) label on Nov 6, 2023