add Model Registry doc to website #3698

Merged: 34 commits (May 21, 2024)
27f4f59
add Model Registry doc to website
tarilabs Mar 14, 2024
7c0f767
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 3, 2024
2f006e8
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 3, 2024
1d8d1e2
Update content/en/docs/components/model-registry/getting-started.md
tarilabs Apr 3, 2024
a22603f
fix to updated link
tarilabs Apr 3, 2024
6eb7ab1
update diagram to show inner/outer MLOps loops
tarilabs Apr 15, 2024
458d387
implement review feedback
tarilabs Apr 15, 2024
3bbf750
implement review req for architecture page
tarilabs Apr 15, 2024
2f567ae
implement review req for splitting getting-started
tarilabs Apr 15, 2024
3bc703b
Update content/en/docs/components/model-registry/reference/architectu…
tarilabs Apr 15, 2024
0cfe214
Update content/en/docs/components/model-registry/installation.md
tarilabs Apr 15, 2024
a1af1d9
fix wrong command in installation
tarilabs Apr 15, 2024
4768e55
update for better reproducible kserve add-on example
tarilabs Apr 17, 2024
be3ebf8
add short sentence for each phase in the ML lifecycle
tarilabs Apr 17, 2024
ba73b17
implement review feedback
tarilabs Apr 17, 2024
8ba5899
add "model registry use case" section
tarilabs Apr 17, 2024
45274b3
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
c8eb466
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
fca2120
implement review feedback on diagram
tarilabs Apr 21, 2024
017ca5a
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
c568cae
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
02214a4
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
e3c441f
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
b901113
Update content/en/docs/components/model-registry/overview.md
tarilabs Apr 21, 2024
0659ec5
implement review feedback
tarilabs Apr 21, 2024
1ab5aa4
implement review feedback
tarilabs Apr 21, 2024
3ee7e42
implement review feedback
tarilabs Apr 21, 2024
36e07d1
Update content/en/docs/components/model-registry/overview.md
tarilabs May 20, 2024
9e4c03c
implement review feedback
tarilabs May 20, 2024
d9c3cd6
fix broken link on review feedback
tarilabs May 21, 2024
84f3859
implement review feedback
tarilabs May 21, 2024
5511ebb
implement review feedback
tarilabs May 21, 2024
01490e1
implement review feedback
tarilabs May 21, 2024
f6cad0a
implement review feedback
tarilabs May 21, 2024
5 changes: 5 additions & 0 deletions content/en/docs/components/model-registry/_index.md
@@ -0,0 +1,5 @@
+++
title = "Model Registry"
description = "Documentation for Kubeflow Model Registry"
weight = 70
+++
167 changes: 167 additions & 0 deletions content/en/docs/components/model-registry/getting-started.md
@@ -0,0 +1,167 @@
+++
title = "Getting started"
description = "Getting started with Model Registry using examples"
weight = 30

+++

This guide shows how to get started with Model Registry and run a few examples using the
command line or Python clients.

At this time, there is not yet a UI for Model Registry; therefore, this documentation focuses on the backend services and APIs.

For an overview of the Model Registry logical model, see the
[Model Registry logical model](https://github.com/kubeflow/model-registry/blob/main/docs/logical_model.md).
The logical model is exposed via the Model Registry [REST API](https://editor.swagger.io/?url=https://raw.githubusercontent.com/kubeflow/model-registry/main/api/openapi/model-registry.yaml).

## Prerequisites

To follow along with the examples in this guide, you need a Kubeflow installation with Model Registry installed:

- Kubeflow [installed](/docs/started/installing-kubeflow/)
- Model Registry [installed](/docs/components/model-registry/installation/)

## Example: track Model Artifacts from a Notebook

This section walks through a step-by-step example of using Model Registry from a Notebook: installing the Python client, creating a client instance, indexing metadata, and retrieving metadata.

### Install Model Registry Python client

You can install the Model Registry Python client in a Notebook, for instance with:

```
!pip install model-registry
```

Note: depending on your Python and Notebook environment, you might need to fine-tune the versions of `ml-metadata`, `protobuf`, `grpcio`, or `tensorflow` (if used).
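For example, if your environment reports a dependency conflict, one possible workaround is to pin the conflicting packages explicitly at install time; the constraint below is purely illustrative and depends on your environment:

```
!pip install model-registry "protobuf<4"
```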

You can now create a client instance pointing to the Model Registry deployed in the previous steps.

```python
from model_registry import ModelRegistry

registry = ModelRegistry(server_address="model-registry-service.kubeflow.svc.cluster.local", port=9090, author="your name")
```

You now have a Model Registry client instance: `registry`.

### Register a Model Artifact's metadata

You can use the `register_model` method to index a model artifact and its metadata, for instance:

```python
registeredmodel_name = "mnist"
version_name = "v0.1"
rm = registry.register_model(
    registeredmodel_name,
    "https://github.com/tarilabs/demo20231212/raw/main/v1.nb20231206162408/mnist.onnx",
    model_format_name="onnx",
    model_format_version="1",
    version=version_name,
    description="lorem ipsum mnist",
    metadata={
        "accuracy": 3.14,
        "license": "apache-2.0",
    },
)
```

Refer to the Model Registry Python client pydoc documentation for additional information on indexing metadata in Model Registry.
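To explore the client API interactively from the Notebook, you can also use Python's built-in `help`, for example:

```python
# Show the pydoc for the client methods used in this guide.
help(registry.register_model)
help(registry.get_model_version)
```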

### Retrieve a given Model Artifact metadata

Continuing on the previous example, you can use the following methods to retrieve the metadata associated with a given Model Artifact:

```python
print("RegisteredModel:")
print(registry.get_registered_model(registeredmodel_name))

print("ModelVersion:")
print(registry.get_model_version(registeredmodel_name, version_name))

print("ModelArtifact:")
print(registry.get_model_artifact(registeredmodel_name, version_name))
```

## Example add-on: deploy inference endpoint using Model Registry metadata

This section walks through a step-by-step example of using Model Registry to retrieve indexed ML artifact metadata, and then using that metadata to create an inference endpoint deployment.

Note: this example uses the Model Registry Python client and the KServe Python SDK. You can analogously use the Model Registry REST API and your own add-on SDK as needed.

### Retrieve a given Model Artifact metadata

You can use the Model Registry Python client to retrieve the needed ML artifact metadata, for example:

```python
from model_registry import ModelRegistry

registry = ModelRegistry(server_address="model-registry-service.kubeflow.svc.cluster.local", port=9090, author="mmortari")

lookup_name = "mnist"
lookup_version = "v20231206163028"

print("RegisteredModel:")
registered_model = registry.get_registered_model(lookup_name)
print(registered_model)
print("ModelVersion:")
model_version = registry.get_model_version(lookup_name, lookup_version)
print(model_version)
print("ModelArtifact:")
model_artifact = registry.get_model_artifact(lookup_name, lookup_version)
print(model_artifact)

storage_uri = model_artifact.uri
model_format_name = model_artifact.model_format_name
model_format_version = model_artifact.model_format_version
```

These metadata values can then be used to create a KServe ModelMesh inference endpoint.

### Create an inference endpoint using the retrieved metadata

You can use the retrieved metadata from the previous step with the KServe Python SDK to create an inference endpoint, for example:

```python
from kubernetes import client
from kserve import KServeClient
from kserve import constants
from kserve import utils
from kserve import V1beta1InferenceService
from kserve import V1beta1InferenceServiceSpec
from kserve import V1beta1PredictorSpec
from kserve import V1beta1SKLearnSpec
from kserve import V1beta1ModelSpec
from kserve import V1beta1ModelFormat

namespace = utils.get_default_target_namespace()
name='mnist'
kserve_version='v1beta1'
api_version = constants.KSERVE_GROUP + '/' + kserve_version

isvc = V1beta1InferenceService(
    api_version=api_version,
    kind=constants.KSERVE_KIND,
    metadata=client.V1ObjectMeta(
        name=name,
        namespace=namespace,
        annotations={
            'sidecar.istio.io/inject': 'false',
            'serving.kserve.io/deploymentMode': 'ModelMesh',
        },
        labels={
            'modelregistry/registered-model-id': registered_model.id,
            'modelregistry/model-version-id': model_version.id,
        },
    ),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            model=V1beta1ModelSpec(
                storage_uri=storage_uri,
                model_format=V1beta1ModelFormat(name=model_format_name, version=model_format_version),
                protocol_version='v2',
            )
        )
    ),
)
KServe = KServeClient()
KServe.create(isvc)
```

An inference endpoint is now created using the artifact metadata retrieved from the Model Registry.
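Optionally, you can watch the newly created InferenceService until it reports a ready status, for example with the KServe Python SDK (a minimal sketch, reusing the `KServe` client and the `name`/`namespace` variables from the previous step):

```python
# Watch the InferenceService until it becomes ready (or the timeout expires).
KServe.get(name, namespace=namespace, watch=True, timeout_seconds=120)
```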

## Next steps

- Get involved:
  - Model Registry working group: https://www.kubeflow.org/docs/about/community/#kubeflow-community-calendars
  - https://github.com/kubeflow/model-registry
- Feedback: {{% alpha-status feedbacklink="https://github.com/kubeflow/model-registry" %}}

66 changes: 66 additions & 0 deletions content/en/docs/components/model-registry/installation.md
@@ -0,0 +1,66 @@
+++
title = "Installation"
description = "How to set up Model Registry"
weight = 20

+++

This section details how to set up and configure Model Registry on your Kubernetes cluster with Kubeflow.

## Prerequisites

These are the minimal requirements to install Model Registry:

- Kubernetes >= 1.27
- Kustomize >= 5.0.3 ([see more](https://github.com/kubeflow/manifests/issues/2388))

<a id="model-registry-install"></a>

## Installing Model Registry

You can skip this step if you have already installed Kubeflow >=1.9. Your Kubeflow
deployment includes Model Registry ([see tracker issue](https://github.com/kubeflow/manifests/issues/2631)).

To install Model Registry as part of Kubeflow, follow the
[Kubeflow installation guide](/docs/started/installing-kubeflow/).

If you want to install Model Registry separately from Kubeflow, or to get a later version
of Model Registry, you can use one of the following Model Registry manifests.
Remember to substitute the relevant release: for example, to use release `v0.1.2`, change `ref=main` to `ref=v0.1.2`.

The following steps show how to install Model Registry using a default Kubeflow >=1.8 installation.

```shell
kubectl apply -k "https://github.com/kubeflow/model-registry/manifests/kustomize/overlays/db?ref=main"
```
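For example, a pinned install of a hypothetical `v0.1.2` release would look like:

```shell
kubectl apply -k "https://github.com/kubeflow/model-registry/manifests/kustomize/overlays/db?ref=v0.1.2"
```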

Since the default Kubeflow installation provides an Istio mesh, also apply the Istio-specific manifests:

```shell
kubectl apply -k "https://github.com/kubeflow/model-registry/manifests/kustomize/options/istio?ref=main"
```

## Check Model Registry setup

You can check the status of the Model Registry deployment with your Kubernetes tooling, for example:

```shell
kubectl wait --for=condition=available -n kubeflow deployment/model-registry-deployment --timeout=1m
kubectl logs -n kubeflow deployment/model-registry-deployment
```

Optionally, you can also manually forward the REST API container port of Model Registry and interact with the [REST API](https://editor.swagger.io/?url=https://raw.githubusercontent.com/kubeflow/model-registry/main/api/openapi/model-registry.yaml),
for example with:
```shell
kubectl port-forward svc/model-registry-service -n kubeflow 8081:8080
# in another terminal:
curl -X 'GET' \
'http://localhost:8081/api/model_registry/v1alpha3/registered_models?pageSize=100&orderBy=ID&sortOrder=DESC' \
-H 'accept: application/json' | jq
```

If you do not receive a `2xx` response, you might be trying to consume a different version (`v1alphaX`) of the REST API than the one deployed.

## Next steps

- Run some examples following the [getting started guide](/docs/components/model-registry/getting-started/)
29 changes: 29 additions & 0 deletions content/en/docs/components/model-registry/overview.md
@@ -0,0 +1,29 @@
+++
title = "Overview"
description = "An overview for Kubeflow Model Registry"
weight = 10

+++

{{% alpha-status
feedbacklink="https://github.com/kubeflow/model-registry" %}}

## What is Model Registry?

A model registry is an important component in the life cycle of AI/ML models and an integral part of any MLOps platform and ML workflow.

A model registry provides a central catalog for ML model developers to index and manage models, versions, and ML artifact metadata.
It fills the gap between model experimentation and production activities,
providing a central interface for all stakeholders in the ML lifecycle to collaborate on ML models.

<img src="/docs/components/model-registry/images/MLloopinnerouter.png"
alt="Model Registry MLOps loop"
class="mt-3 mb-3">

DevOps engineers, data scientists, and developers need to collaborate with other users in the ML workflow to get models into production.
Data scientists need an efficient way to share model versions, artifacts, and metadata with the users that need access to those models as part of the MLOps workflow.

## Next steps

- Follow the [installation guide](/docs/components/model-registry/installation/) to set up Model Registry
- Run some examples following the [getting started guide](/docs/components/model-registry/getting-started/)
5 changes: 5 additions & 0 deletions content/en/docs/components/model-registry/reference/_index.md
@@ -0,0 +1,5 @@
+++
title = "Reference"
description = "Reference docs for Kubeflow Model Registry"
weight = 100
+++
@@ -0,0 +1,44 @@
+++
title = "Model Registry Reference"
description = "Reference documentation for the Kubeflow Model Registry"
weight = 20

+++

![Model Registry High Level Architecture](/docs/components/model-registry/reference/images/model-registry-overview.jpg)

{{% alert title="Note" color="warning" %}}
The Model Registry is a passive repository for metadata and is not meant to be a Control Plane. It does not perform any active orchestration or expose APIs to perform actions on underlying Kubernetes components.
{{% /alert %}}


Kubeflow Model Registry uses the Google community project [ML-Metadata](https://github.com/google/ml-metadata) as one of its core components. ML-Metadata provides a very extensible, generic schema, similar to a key-value store, that also allows for the creation of logical schemas which can be queried as if they were physical schemas. These logical schemas can be manipulated using the ML-Metadata Python bindings. Model Registry extends this model to provide its metadata management service capabilities.
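To illustrate the kind of logical schema ML-Metadata enables, the following is a minimal sketch using the ML-Metadata Python bindings against a throwaway in-memory store; the `Model` type name and its properties are hypothetical and only show how a logical data model is defined and queried, while Model Registry defines its own, richer logical model on top of these primitives:

```python
from ml_metadata.metadata_store import metadata_store
from ml_metadata.proto import metadata_store_pb2

# Throwaway in-memory store, purely for illustration.
config = metadata_store_pb2.ConnectionConfig()
config.fake_database.SetInParent()
store = metadata_store.MetadataStore(config)

# Define a logical "Model" artifact type with custom properties.
model_type = metadata_store_pb2.ArtifactType()
model_type.name = "Model"  # hypothetical type name
model_type.properties["version"] = metadata_store_pb2.STRING
model_type.properties["format"] = metadata_store_pb2.STRING
model_type_id = store.put_artifact_type(model_type)

# Store an artifact of that logical type, then query it back by type.
model = metadata_store_pb2.Artifact()
model.type_id = model_type_id
model.uri = "s3://my-bucket/models/mnist.onnx"
model.properties["version"].string_value = "v0.1"
model.properties["format"].string_value = "onnx"
[model_id] = store.put_artifacts([model])

print(store.get_artifacts_by_type("Model"))
```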

The Model Registry uses the ml-metadata project's C++ server as-is to store the metadata, while domain-specific Model Registry features are added as extensions (microservices). As part of these extensions, Model Registry provides:
- Python/Go extensions to support the Model Registry interaction
- an OpenAPI interface to expose the Model Registry API to the clients

## Components
- *[MLMD C++ Server](https://github.com/google/ml-metadata)*

This is the metadata server from Google's ml-metadata project. It communicates with a backend relational database that stores the actual metadata about the models, and it exposes a gRPC interface for its clients. The server provides a very flexible schema model that lets you define logical data models to fit the needs of different MLOps operations, for example metadata produced during training and experimentation, metadata about metrics, or model versioning.

- *[OpenAPI/REST Server](https://github.com/kubeflow/model-registry)*

This component exposes a higher-level REST API for the Model Registry. Whereas the MLMD server exposes a lower-level, generic API over gRPC, this REST server exposes a higher-level API that is much closer to the Model Registry domain model, with operations such as:
- Register a Model
- Version a Model
- Get a catalog of models
- Manage the deployment status of a model

The REST API server converts its requests into one or more underlying gRPC requests on the MLMD server; an example REST call is sketched below.

- *[CLI (Python client, SDK)](https://github.com/kubeflow/model-registry/tree/main/clients/python)*

The CLI, also called the Model Registry Python client/SDK, is a command line tool for interacting with Model Registry. It can be used to execute operations such as retrieving registered models, getting a model's deployment status, listing a model's versions, and so on.

The Model Registry provides logical mappings from the high-level [logical model](https://github.com/kubeflow/model-registry/blob/main/docs/logical_model.md) available through the OpenAPI/REST server to the underlying ml-metadata entities.
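As an illustrative sketch of how a client interacts with the OpenAPI/REST server, registering a model could look like the following call. It assumes the port-forward described on the installation page, and the payload field names follow the OpenAPI specification linked above; the exact shape may differ across API versions:

```shell
curl -X 'POST' \
  'http://localhost:8081/api/model_registry/v1alpha3/registered_models' \
  -H 'Content-Type: application/json' \
  -d '{"name": "mnist", "description": "MNIST classifier registered via REST"}'
```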

## See also

- Model Registry [project documentation](https://github.com/kubeflow/model-registry?tab=readme-ov-file#pre-requisites).