
Model evaluation step randomly fails, and publishing artifacts fails. #375

Open
lafranke opened this issue Aug 3, 2021 · 2 comments

lafranke commented Aug 3, 2021

I am working on the CI step of the pipeline.

It consists of three main steps: Get Pipeline ID, Trigger ML Training Pipeline, and Publish Artifact.

The first step always succeeds.

The second step sometimes works and sometimes crashes. It fails in the evaluation step because it does not get an MSE value, and this happens randomly: without any code changes, the pipeline can suddenly start throwing this error.

So far it has also always failed at the third step, specifically at the "Determine if evaluation succeeded" step. I have traced the error to the "automobile-publish-model-artifact-template.yml" file, on the line "FOUND_MODEL=$(az ml model list -g $(RESOURCE_GROUP) --workspace-name $(WORKSPACE_NAME) --tag BuildId=$(Build.BuildId) --query '[0]')". I could not investigate further, because the pipeline often fails in the second step for long stretches at a time.

lokijota (Contributor) commented Aug 3, 2021

Hi,

I just did a repro, and the issue I found was that the code in evaluate_model.py is looking at mse, but the model, trained with lightgbm, only publishes the auc and f1 score as metrics.

Updating the code in that file from mse to auc and inverting the sign of the comparison (since a higher AUC is better, whereas a lower MSE is better) made it work. Otherwise, I had the same error you describe.
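To illustrate the change, here is a minimal sketch of the comparison logic. This is not the actual code from evaluate_model.py (which works with Azure ML Run objects); the function and dict names are assumptions for illustration only:

```python
# Hypothetical simplification of the model-comparison logic in
# evaluate_model.py. Metrics are passed as plain dicts here; the real
# script reads them from Azure ML run metrics.

def new_model_is_better(new_metrics, prod_metrics, metric="auc"):
    """Return True if the newly trained model should replace production.

    For error metrics such as 'mse', lower is better; for score metrics
    such as 'auc' or 'f1', higher is better, so the comparison sign
    must be inverted when switching from mse to auc.
    """
    higher_is_better = metric in {"auc", "f1"}
    new_value = new_metrics[metric]
    prod_value = prod_metrics[metric]
    if higher_is_better:
        return new_value > prod_value
    return new_value < prod_value
```

The key point is that switching the metric name alone is not enough; the direction of the comparison has to flip as well, or a better model would be rejected.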

Katzmann1983 (Contributor) commented

For me, this fix helped.
