mlflow model cannot be loaded #1801
Hi Francisco -- Were you able to serve this model locally using MLServer before using KServe?
@ramonpzg As described in the Slack channel: I did follow the instructions in the MLServer docs, and I have all the dependency files in the model directory, as you can see in the screenshot. From the docs available to me, it seemed that no further environment configuration was necessary, especially because MLServer simply works locally and appears to set up everything it needs from the files in the model directory. As far as I know, there were no instructions on how to generate or use the environment.tar.gz file. Even when I tarball the requirements, conda, and python_env files, it doesn't find them:
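(For context, a freshly logged MLflow model directory typically contains the dependency files mentioned above alongside the serialized model; the exact artifact filename depends on the flavor, so `model.pkl` below is just a placeholder:)

```
model/
├── MLmodel            # flavor metadata read by the mlflow runtime
├── conda.yaml         # conda dependency spec
├── python_env.yaml    # python version + build dependencies
├── requirements.txt   # pip dependency spec
└── model.pkl          # serialized model artifact (flavor-dependent)
```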
Do the dependency files need to be inside these directories? ./envs/environment
So I tried tarballing that folder structure, but that was also not it:
Finally, I followed this doc: I tarballed the conda env with conda-pack, redeployed, and now I am getting this 😅
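(For anyone reproducing this, the packing step can also be done from Python via conda-pack's API. A minimal sketch, assuming the environment is named `mlflow-env`; the name and output path are placeholders, not values from this thread:)

```python
import conda_pack

# Pack the conda environment the model was trained in into a portable
# tarball that can be shipped alongside the model and unpacked at startup.
conda_pack.pack(
    name="mlflow-env",            # placeholder environment name
    output="environment.tar.gz",  # tarball name used elsewhere in this thread
    force=True,                   # overwrite an existing tarball
)
```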
Can I influence the commands that are run after unpacking the environment?
Thanks for sharing the additional info, Francisco. You mentioned MLServer works as intended locally but not at the KServe level; I'm wondering if you should ask them what might be the cause of this, as KServe is its own project. The docs do have an example of how to use a custom conda environment. Here is the link.
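(For readers following along: the custom-environment setup in the MLServer docs revolves around pointing the model at a packed tarball. A minimal sketch of that kind of `model-settings.json`, assuming the `environment_tarball` model parameter from those docs; the name, URI, and paths are placeholders:)

```json
{
  "name": "test-model",
  "implementation": "mlserver_mlflow.MLflowRuntime",
  "parameters": {
    "uri": "./model",
    "environment_tarball": "./environment.tar.gz"
  }
}
```

With a settings file like this next to the model, `mlserver start .` should load the model inside the unpacked environment rather than the server's own.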
I used the custom environments, but that didn't change anything. Still, I think the environment should be created from the dependency files that are already in the model directory, instead of me having to tarball a 250 MB conda environment.
@ramonpzg I was trying to figure out the issue connected to what @fschlz mentioned here and also in kserve/kserve#3733 ...
Additional info:
KServe deployment:
FYI, this clearly has something to do with the combination of mlflow + mlserver + pyfunc ... just for a test, I tried saving the model to mlflow both ways: deploying a model saved as a plain lightgbm model (using the KServe deployment from the previous post) works.
Output after saving just a lightgbm model:
Deploying a model saved via pyfunc does not work (i.e. …)
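(The original code snippets were lost in this export; below is a hypothetical reconstruction of the two save paths being contrasted, using a throwaway LightGBM model. All names, data, and paths are placeholders, not the code from the stripped snippets.)

```python
import lightgbm as lgb
import mlflow
import mlflow.lightgbm
import mlflow.pyfunc
import numpy as np


class BoosterWrapper(mlflow.pyfunc.PythonModel):
    """Hypothetical pyfunc wrapper around a saved LightGBM booster."""

    def load_context(self, context):
        import lightgbm as lgb
        self.model = lgb.Booster(model_file=context.artifacts["model"])

    def predict(self, context, model_input):
        return self.model.predict(model_input)


# Train a throwaway booster on random data.
X = np.random.rand(50, 3)
y = np.random.randint(0, 2, size=50)
booster = lgb.train({"objective": "binary", "verbose": -1}, lgb.Dataset(X, y))
booster.save_model("model.txt")

with mlflow.start_run():
    # Native lightgbm flavor: the variant reported above as loading fine.
    mlflow.lightgbm.log_model(booster, artifact_path="lgbm-native")

    # Generic pyfunc flavor: the variant reported above as failing to load.
    mlflow.pyfunc.log_model(
        artifact_path="lgbm-pyfunc",
        python_model=BoosterWrapper(),
        artifacts={"model": "model.txt"},
    )
```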
ok, ehm, I am deeply sorry, it seems like I was looking for the …
No problem. I am glad you figured it out @5uperpalo and @fschlz. I will go ahead and close out this issue. If anything changes, please open it again with the details of what has changed. Thanks :)
What steps did you take and what happened:
Hey guys, I am trying to use KServe on AKS.
I installed all the dependencies on AKS and am trying to deploy a test inference service.
However, the model isn't getting loaded correctly.
Locally, everything works out right.
Unfortunately, the service doesn't seem to recognize the model files I have registered.
Plus, the environment that is created doesn't seem to respect the version numbers that are set in requirements.txt.
Does anyone know what could be wrong?
What did you expect to happen:
Deploy the inference service.
What's the InferenceService yaml:
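(The yaml itself was not captured in this export. For orientation, a hypothetical minimal InferenceService for an MLflow model served through KServe's MLServer-backed runtime would look roughly like this; the name and storageUri are placeholders, not the values from this report:)

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: test-mlflow-model        # placeholder name
spec:
  predictor:
    model:
      modelFormat:
        name: mlflow
      protocolVersion: v2        # MLServer speaks the V2 inference protocol
      storageUri: "https://<account>.blob.core.windows.net/<container>/model"  # placeholder Azure Storage URI
```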
Anything else you would like to add:
These are the model files in my Storage Account (see screenshot)
Environment:
OS (from /etc/os-release): Ubuntu 22.04