Add databricks deployments client skeleton + example #10421
New file (`@@ -0,0 +1,15 @@`):

```python
from mlflow.deployments import get_deploy_client


def main():
    client = get_deploy_client("databricks")
    client.create_endpoint(
        name="gpt4-chat",
        config={
            # TODO: doesn't work yet
        },
    )


if __name__ == "__main__":
    main()
```
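For context on the example above: `get_deploy_client("databricks")` resolves the target name to a registered client class. The sketch below is a hypothetical toy registry, not mlflow's actual code (mlflow's real mechanism is based on plugin registration and differs in detail); `FakeDatabricksClient`, `register`, and `get_client` are invented names for illustration only.

```python
from urllib.parse import urlparse

# Toy registry (hypothetical; not mlflow's implementation) illustrating how a
# target such as "databricks" can resolve to a deployment client class.
_REGISTRY = {}


def register(scheme):
    def decorator(cls):
        _REGISTRY[scheme] = cls
        return cls
    return decorator


@register("databricks")
class FakeDatabricksClient:
    def __init__(self, target_uri):
        self.target_uri = target_uri


def get_client(target_uri):
    # A bare target name like "databricks" has no URI scheme, so fall back
    # to the full string as the lookup key.
    scheme = urlparse(target_uri).scheme or target_uri
    return _REGISTRY[scheme](target_uri)


client = get_client("databricks")
assert type(client).__name__ == "FakeDatabricksClient"
```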
New file (`@@ -0,0 +1,44 @@`):

```python
from mlflow.deployments import BaseDeploymentClient


class DatabricksDeploymentClient(BaseDeploymentClient):
    def create_deployment(self, name, model_uri, flavor=None, config=None, endpoint=None):
        raise NotImplementedError

    def update_deployment(self, name, model_uri=None, flavor=None, config=None, endpoint=None):
        raise NotImplementedError

    def delete_deployment(self, name, config=None, endpoint=None):
        raise NotImplementedError
```

Review comment (on `delete_deployment`): Curious what values would be in `config` or `endpoint` here? Is there an option to delete an endpoint referenced by name but not the entire named deployment?

Reply: I don't think we need
```python
    def list_deployments(self, endpoint=None):
        raise NotImplementedError

    def get_deployment(self, name, endpoint=None):
        raise NotImplementedError

    def predict(self, deployment_name=None, inputs=None, endpoint=None):
        raise NotImplementedError("TODO")

    def create_endpoint(self, name, config=None):
        raise NotImplementedError("TODO")

    def update_endpoint(self, endpoint, config=None):
        raise NotImplementedError("TODO")

    def delete_endpoint(self, endpoint):
        raise NotImplementedError("TODO")

    def list_endpoints(self):
        raise NotImplementedError("TODO")

    def get_endpoint(self, endpoint):
        raise NotImplementedError("TODO")
```

Review comment (on lines +23 to +36): I'll implement these later in a follow-up PR.
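Since the endpoint CRUD methods are left as TODOs, here is a hedged sketch of one plausible contract they might follow. This is a toy in-memory stand-in invented for illustration, not part of this PR and not what the Databricks-backed implementation will do; the duplicate-name check and return shapes are assumptions.

```python
class InMemoryDeploymentClient:
    """Toy stand-in (hypothetical) mirroring the skeleton's endpoint methods.

    Assumption: create/get/list/update/delete operate on named endpoint
    configs and return dicts describing the endpoint.
    """

    def __init__(self):
        self._endpoints = {}

    def create_endpoint(self, name, config=None):
        # Assumption: creating an endpoint that already exists is an error.
        if name in self._endpoints:
            raise ValueError(f"endpoint {name!r} already exists")
        self._endpoints[name] = dict(config or {})
        return {"name": name, **self._endpoints[name]}

    def update_endpoint(self, endpoint, config=None):
        self._endpoints[endpoint].update(config or {})
        return {"name": endpoint, **self._endpoints[endpoint]}

    def delete_endpoint(self, endpoint):
        self._endpoints.pop(endpoint, None)

    def list_endpoints(self):
        return [{"name": n, **c} for n, c in self._endpoints.items()]

    def get_endpoint(self, endpoint):
        return {"name": endpoint, **self._endpoints[endpoint]}


client = InMemoryDeploymentClient()
client.create_endpoint("gpt4-chat", {"model": "gpt-4"})
assert client.get_endpoint("gpt4-chat")["model"] == "gpt-4"
```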
```python
def run_local(name, model_uri, flavor=None, config=None):
    pass


def target_help():
    pass
```

Review comment (on `run_local`): Build a local serving container and validate the capacity to return inference predictions? Is that what this is? (if so, this is awesome)

Reply: I'm actually not sure what this is for. A deployment plugin must define

Reply: The `target_help` implementation as explained in the ABC is definitely out of scope for a Databricks plugin (not entirely sure what that would return even if there was an available endpoint to target?). The `run_local` might also be a "maybe nice to have in the far-off future", but simulating model serving behavior from within OSS would be rather challenging.
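The review thread above speculates that `run_local` might validate a deployment by serving predictions locally. A toy illustration of that idea follows; note the signature is deliberately simplified (a plain `predict_fn` instead of loading a model from `model_uri`), so it is a hypothetical sketch of the concept, not the plugin API.

```python
def run_local(name, predict_fn, sample_input, config=None):
    """Toy illustration (hypothetical, simplified signature): "deploy"
    locally by invoking the model's predict function once and
    sanity-checking that it returns a result."""
    result = predict_fn(sample_input)
    if result is None:
        raise RuntimeError(f"local deployment {name!r} returned no prediction")
    return {"name": name, "sample_output": result}


out = run_local("demo", lambda xs: [x * 2 for x in xs], [1, 2, 3])
assert out["sample_output"] == [2, 4, 6]
```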
Review comment (on `create_deployment`): Is the `model_uri` going to be required if we're creating a gateway route, or is gateway route creation purely going to be handled with `create_endpoint()`?

Review comment: Do we need the `flavor` designator here? If we're using `model_uri`, can it read the configured flavor information from the MLmodel file?

Reply: yes

Reply: perfect!
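On the question of reading the flavor from the MLmodel file: an MLmodel file is YAML listing the model's flavors under a top-level `flavors:` key. The sketch below uses a deliberately naive line-based parse (an assumption made to keep the example dependency-free; real code would use a YAML parser), and the sample MLmodel content is illustrative, not from this PR.

```python
MLMODEL = """\
flavors:
  python_function:
    loader_module: mlflow.pyfunc
  sklearn:
    sklearn_version: 1.3.0
"""


def list_flavors(mlmodel_text):
    # Naive parse (assumption): flavor names are the two-space-indented keys
    # directly under the top-level "flavors:" block.
    flavors, in_flavors = [], False
    for line in mlmodel_text.splitlines():
        if line.startswith("flavors:"):
            in_flavors = True
            continue
        if in_flavors:
            if (line.startswith("  ") and not line.startswith("   ")
                    and line.rstrip().endswith(":")):
                flavors.append(line.strip().rstrip(":"))
            elif line and not line.startswith(" "):
                in_flavors = False  # left the flavors block
    return flavors


assert list_flavors(MLMODEL) == ["python_function", "sklearn"]
```

This is the shape of argument behind the reviewer's question: if the flavors are recoverable from `model_uri`'s MLmodel file, an explicit `flavor` parameter may only be needed to disambiguate when a model has several.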