Feature request: integrate with KFServing #1465
Comments
What is the status of this PR?
@hhsecond is working on a PR that introduces a plugin system for integrating deployment tools. I believe this will be easier to implement once that is complete. @pisymbol, I don't believe a PR has been opened implementing this. I've added a "help wanted" tag, as contributions are welcome. @rakelkar, let me know if you'd like to drive this. I think the first step would be to take a look at @hhsecond's PR and then write a proposal for how you would integrate KFServing with that plugin system. It would also be useful to see a proposed user workflow: what commands would the user need to run to deploy an MLflow model with KFServing?
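For illustration, a minimal sketch of what that user workflow could look like on top of MLflow's `mlflow.deployments` client API. The `"kfserving"` target name, the `config` keys, and the model and deployment names are hypothetical placeholders, not an existing plugin:

```python
# Sketch only: assumes a hypothetical KFServing deployment plugin registered
# under the target name "kfserving"; no such plugin exists yet in this thread.
import pandas as pd
from mlflow.deployments import get_deploy_client

# Point the deployments client at the (hypothetical) KFServing target.
client = get_deploy_client("kfserving")

# Deploy a previously logged or registered MLflow model to the cluster.
deployment = client.create_deployment(
    name="wine-quality",                       # hypothetical deployment name
    model_uri="models:/wine-quality/1",        # hypothetical model URI
    config={"namespace": "default"},           # hypothetical plugin-specific option
)

# Score a sample against the new deployment.
prediction = client.predict("wine-quality", pd.DataFrame([[7.4, 0.7, 0.0]]))
print(prediction)
```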
If it helps, I have made a RedisAI plugin here. Also, there are a few changes I will be making over the coming week. They aren't significant enough to break the implementation, but just a heads up. It should be ready to merge by the end of the coming week.
#2327 has been merged. Congrats @hhsecond! This, together with the KFServing Python API in kserve/kserve#218, gets everything aligned. @pakelley and I are interested in seeing this move forward and would like to know how we can contribute. @rakelkar @pisymbol @AveshCSingh, is a proposal in motion? Let us know how we can help, and thanks for these efforts. Awesome to see this functionality emerge!
Fantastic. Now that #2317 has been merged, we have the capability to add plugins for serving. I would recommend creating a separate repo to host the plugin, then linking to it from https://github.com/mlflow/mlflow/blob/master/docs/source/plugins.rst#deployment-plugins. We'll be pushing an update to document the deployment plugin interface in https://mlflow.org/docs/latest/plugins.html over the next day or so. Until then, you can refer to the docs in the master branch: https://github.com/mlflow/mlflow/blob/master/docs/source/models.rst#deployment-to-custom-targets
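As a rough illustration of what such a plugin repo might contain, here is a skeleton of MLflow's deployment plugin surface (module-level `run_local`/`target_help` hooks plus a `BaseDeploymentClient` subclass). The class name and all KFServing-specific behavior are assumptions left as placeholders, not part of any existing plugin:

```python
# Minimal sketch of an MLflow deployment plugin for a custom target.
# Only the MLflow-facing interface is shown; KFServing logic is stubbed out.
from mlflow.deployments import BaseDeploymentClient


def run_local(name, model_uri, flavor=None, config=None):
    # Module-level hook plugins expose for running a deployment locally.
    raise NotImplementedError("Local KFServing runs are not sketched here")


def target_help():
    # Help text surfaced for this deployment target.
    return "Deploy MLflow models to KFServing (hypothetical plugin sketch)"


class KFServingDeploymentClient(BaseDeploymentClient):  # hypothetical class name
    def create_deployment(self, name, model_uri, flavor=None, config=None):
        # Would build an InferenceService spec pointing at model_uri and apply it.
        return {"name": name, "flavor": flavor}

    def update_deployment(self, name, model_uri=None, flavor=None, config=None):
        # Would patch the existing InferenceService with the new model version.
        return {"name": name, "flavor": flavor}

    def delete_deployment(self, name):
        # Would delete the InferenceService with this name.
        return None

    def list_deployments(self):
        # Would list InferenceServices managed by this plugin.
        return []

    def get_deployment(self, name):
        # Would return status details for the named InferenceService.
        return {"name": name}

    def predict(self, deployment_name, df):
        # Would POST df to the InferenceService's prediction endpoint.
        raise NotImplementedError
```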
Is the KFServing integration a work in progress?
I have a Kubeflow setup. Do I have to configure MLflow with the Kubeflow cluster to use the RedisAI plugin, or is there another way?
KFServing (https://github.com/kubeflow/kfserving) is an open source Kubeflow effort for model serving on Kubernetes. The effort is aligned with MLSpec, already supports a number of frameworks, and is working on supporting more advanced inference graph scenarios. KFServing is intended to work on-prem as well as on the major cloud providers.
Opening this issue to get feedback on whether it makes sense to support KFServing as a serving target from MLflow, so that data scientists have a clean, unified interface for optimized serving across clouds.