We have successfully resolved the deployment issue; however, the resolution process raised an interesting observation. Despite including all packages in the deps.yml file, we encountered deployment failures indicating that the package needed to be installed, even though the Docker image built successfully and the environment was created from that image.
When loading the Azure ML Environment, we specified the conda_file parameter and provided the path to the deps.yml file. For instance:
This approach indeed resolved the problem, although it remains somewhat unclear why the dependency had to be re-declared at environment load time when it was already installed during the image creation process.
I would be thankful if anyone could shed some light here.
Operating System
Linux
Version Information
Python Version: 3.10
SDK: V2
azure-ai-ml package version: 1.8.0
Steps to reproduce
Hi,
I am following the notebook to deploy a model to an online endpoint.
While deploying using:
the following error occurs: `A required package azureml-inference-server-http is missing.`
The environment we are using is registered in AzureML workspace. Here is how it looks:
The Dockerfile and conda dependency file used to create the Docker image in ACR are as follows:
Dockerfile:
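The actual Dockerfile contents were not captured here; a plausible minimal sketch that layers the conda spec onto an Azure ML base image would look like the following. The base image tag, environment name, and conda paths are assumptions, not our actual values:

```dockerfile
# Assumed base image; the actual tag was not shown in the issue.
FROM mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest

# Copy the conda spec and build the environment from it.
COPY deps.yml /tmp/deps.yml
RUN conda env create -f /tmp/deps.yml -n inference-env && \
    conda clean -a -y

# Make the new environment the default for the inference server.
ENV PATH=/opt/miniconda/envs/inference-env/bin:$PATH
ENV AZUREML_CONDA_ENVIRONMENT_PATH=/opt/miniconda/envs/inference-env
```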
deps.yml:
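The deps.yml contents were likewise not captured; a minimal conda spec that includes the package the error complains about might look like this (the environment name and channel are illustrative):

```yaml
name: inference-env
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:
      - azureml-inference-server-http
      - azureml-defaults
```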
The deps.yml lists azureml-inference-server-http as a dependency, the Docker image builds fine, and the Azure ML environment built from the Docker image is fine.
Expected behavior
The expected behavior is that the online endpoint deploys successfully.
Actual behavior
The deployment fails with the following error:
2023-10-25T15:41:55,383013296+00:00 | gunicorn/run |
2023-10-25T15:41:55,384232095+00:00 | gunicorn/run | Entry script directory: /var/azureml-app/onlinescoring/.
2023-10-25T15:41:55,385439495+00:00 | gunicorn/run |
2023-10-25T15:41:55,386724694+00:00 | gunicorn/run | ###############################################
2023-10-25T15:41:55,387960893+00:00 | gunicorn/run | Dynamic Python Package Installation
2023-10-25T15:41:55,389318393+00:00 | gunicorn/run | ###############################################
2023-10-25T15:41:55,390611292+00:00 | gunicorn/run |
2023-10-25T15:41:55,392044492+00:00 | gunicorn/run | Dynamic Python package installation is disabled.
2023-10-25T15:41:55,393430091+00:00 | gunicorn/run |
2023-10-25T15:41:55,394692890+00:00 | gunicorn/run | ###############################################
2023-10-25T15:41:55,395941190+00:00 | gunicorn/run | Checking if the Python package azureml-inference-server-http is installed
2023-10-25T15:41:55,397200089+00:00 | gunicorn/run | ###############################################
2023-10-25T15:41:55,398420089+00:00 | gunicorn/run |
2023-10-25T15:41:55,663463169+00:00 | gunicorn/run | A required package azureml-inference-server-http is missing. Please install azureml-inference-server-http before trying again
2023-10-25T15:41:55,666521767+00:00 - gunicorn/finish 100 0
2023-10-25T15:41:55,667702367+00:00 - Exit code 100 is not normal. Killing image
Additional information
No response