A plugin that integrates WatsonML with the MLflow deployment pipeline. mlflow_watsonml enables MLflow users to deploy MLflow models to WatsonML. The plugin's command-line APIs (also accessible through MLflow's Python package) make the deployment process seamless.
The plugin package is available on PyPI and can be installed with:
pip install mlflow-watsonml
Installing this package uses Python's entry-point mechanism to register the plugin in MLflow's plugin registry. The registry is invoked each time you launch an MLflow script or command-line tool.
To connect to WatsonML, set the credentials described in .env.template.
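The exact credential keys are listed in .env.template. As an illustration only, a minimal loader for a .env-style file might look like the sketch below; the file path and any variable names it loads are placeholders, not the plugin's actual keys.

```python
import os

def load_dotenv_file(path):
    """Parse a simple KEY=VALUE .env file into os.environ (sketch only)."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines and comments.
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()

# Hypothetical usage: copy .env.template to .env, fill in your
# WatsonML credentials, then load it before calling the plugin.
# load_dotenv_file(".env")
```

In practice a library such as python-dotenv does the same job; the point is that the credentials must be present in the environment before the plugin connects.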
The create command-line argument and the create_deployment Python API deploy a model built with MLflow to WatsonML.
mlflow deployments create -t watsonml -m <model-uri> --name <deployment-name> -C "software_spec_type=runtime-22.2-py3.10"
from mlflow.deployments import get_deploy_client
target_uri = 'watsonml'
plugin = get_deploy_client(target_uri)
plugin.create_deployment(
name=<deployment-name>,
model_uri=<model-uri>,
config={"software_spec_type": "runtime-22.2-py3.10"}
)
The update API can be used to modify configuration parameters, such as the number of workers or the version, of an already deployed model. WatsonML ensures the user experience stays seamless while the model is changed in a live environment.
mlflow deployments update -t watsonml --name <deployment name> -C "software_spec_type=runtime-22.1-py3.10"
plugin.update_deployment(name=<deployment name>, config={"software_spec_type": "runtime-22.1-py3.10"})
The delete API removes an existing deployment. An exception is raised if the model is not already deployed.
mlflow deployments delete -t watsonml --name <deployment name / version number>
plugin.delete_deployment(name=<deployment name>)
The list API lists the names of all models deployed on the configured WatsonML instance.
mlflow deployments list -t watsonml
plugin.list_deployments()
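As a sketch of working with the list result, assuming list_deployments() returns a list of dicts that each carry a "name" key (the actual response shape may differ; check the plugin's output), one could test whether a deployment already exists before creating it:

```python
def deployment_exists(deployments, name):
    """Return True if any deployment record matches the given name.

    `deployments` is assumed to be the value returned by
    plugin.list_deployments(): a list of dicts with a "name" key.
    """
    return any(d.get("name") == name for d in deployments)

# Hypothetical usage:
# if not deployment_exists(plugin.list_deployments(), "my-model"):
#     plugin.create_deployment(name="my-model", model_uri=...)
```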
The get API fetches the details of a deployed model. By default, it fetches all versions of the deployed model.
mlflow deployments get -t watsonml --name <deployment name>
plugin.get_deployment(name=<deployment name>)
The predict API runs predictions against the deployed model.
For the prediction inputs, DataFrame and JSON formats are supported; the Python API accepts both. When invoked via the command line, pass the path to a JSON file that contains the inputs.
mlflow deployments predict -t watsonml --name <deployment name> --input-path <input file path> --output-path <output file path>
output-path is an optional parameter; without it, the result is printed to the console.
plugin.predict(name=<deployment name>, df=<prediction input>)
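For the command-line predict call, the input file holds the prediction payload as JSON. A minimal sketch of writing such a file is shown below; the column names are placeholders, and the split-style "columns"/"data" layout is one common MLflow convention, so check the plugin's documentation for the exact schema WatsonML expects.

```python
import json

# Placeholder feature columns; replace with your model's real schema.
payload = {
    "columns": ["feature_1", "feature_2"],
    "data": [[0.5, 1.2], [0.7, 0.3]],
}

# Write the payload to the file passed via --input-path.
with open("input.json", "w") as fh:
    json.dump(payload, fh)
```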
Run the following command to get the plugin help string.
mlflow deployments help -t watsonml