
Inference parameters are not passed to the predict function in MLFlow runtime #1660

Open
Okamille opened this issue Apr 3, 2024 · 0 comments · May be fixed by #1921


Okamille commented Apr 3, 2024

How to reproduce

Define an MLflow model that uses a custom `params` argument. Example from the MLflow documentation:

from mlflow.pyfunc import PythonModel


class ModelWrapper(PythonModel):
    def __init__(self):
        self.model = None

    def load_context(self, context):
        from joblib import load

        self.model = load(context.artifacts["model_path"])

    def predict(self, context, model_input, params=None):
        params = params or {"predict_method": "predict"}
        predict_method = params.get("predict_method")

        if predict_method == "predict":
            return self.model.predict(model_input)
        elif predict_method == "predict_proba":
            return self.model.predict_proba(model_input)
        elif predict_method == "predict_log_proba":
            return self.model.predict_log_proba(model_input)
        else:
            raise ValueError(f"The prediction method '{predict_method}' is not supported.")

Log that model to MLflow and serve it with MLServer. Create an inference request that sets a specific parameter: the parameter value is not taken into account.
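For illustration, a request payload like the following would be sent to the MLServer endpoint (a sketch assuming the V2 inference protocol; the input name, shape, and datatype are made up for this example). The `parameters` field is where `predict_method` is supplied, and it is this value that the MLflow runtime currently drops:

```python
import json

# Hypothetical V2 inference request for an MLServer-served MLflow model.
# The "parameters" field should reach PythonModel.predict as `params`,
# but per this issue it is currently ignored by the MLflow runtime.
payload = {
    "inputs": [
        {
            "name": "input",          # illustrative tensor name
            "shape": [2, 1],
            "datatype": "FP64",
            "data": [1.0, 2.0],
        }
    ],
    "parameters": {"predict_method": "predict_proba"},
}
body = json.dumps(payload)
```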

Proposed fix

Currently, the predict function in the MLflow runtime does not pass the params argument through to the model's prediction call. We can add it, as was done for the custom invocation endpoint.
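The shape of the fix could look like the following (a minimal sketch, not MLServer's actual code; the helper name and the set of protocol-level keys filtered out are illustrative). It forwards the request parameters to the model's predict call, which for MLflow pyfunc models accepts an optional `params` keyword:

```python
# Hypothetical helper sketching the proposed fix: forward decoded request
# parameters to the model's predict call instead of discarding them.
def predict_with_params(model, model_input, request_parameters=None):
    # Drop protocol-level keys that are not inference params
    # (the key names here are illustrative).
    params = {
        k: v
        for k, v in (request_parameters or {}).items()
        if k not in {"content_type", "headers"}
    }
    # MLflow pyfunc models expose predict(data, params=None)
    return model.predict(model_input, params=params or None)
```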

PR

idlefella added a commit to idlefella/MLServer that referenced this issue Oct 8, 2024
@idlefella idlefella linked a pull request Oct 8, 2024 that will close this issue