Iris Example with Model Signature #46
base: master
Conversation
Signed-off-by: Shrinath Suresh <shrinath@ideas2it.com>
if self.mlmodel_file_path:
    mlmodel = Model.load(self.mlmodel_file_path)
    if not hasattr(mlmodel, "signature"):
        raise Exception("Model Signature not found")

    input_schema = mlmodel.get_input_schema()

    from mlflow.pyfunc import _enforce_schema
    _enforce_schema(data, input_schema)
@shrinath-suresh
Do we really need this change? Have you tried using mlflow.pyfunc.load_model("<model_uri>")?
As of now, the mlflow.pytorch library only allows logging the signature; it does not have a mechanism to validate it.
Is it the case that PyTorch users can log the model signature using mlflow.pytorch and validate it using mlflow.pyfunc? If that is the case, we don't need this change.
> that pytorch users can log the model signature using mlflow.pytorch and validate it using mlflow.pyfunc

I think yes, because that's the only way to enforce the schema now.
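For reference, a minimal sketch of the pyfunc route discussed above, assuming the "model" save path and sample.json used elsewhere in this example; loading through mlflow.pyfunc yields a PyFuncModel whose predict() checks the input against the logged signature before calling the PyTorch flavor.

import mlflow.pyfunc
import pandas as pd

# "model" is the local save path from this example; a tracking URI such as
# "runs:/<run_id>/model" would work the same way.
pyfunc_model = mlflow.pyfunc.load_model("model")

# predict() validates column names and dtypes against the logged signature
# and raises if they do not match, so no extra validation code is needed.
df = pd.read_json("sample.json")
print(pyfunc_model.predict(df))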
import pandas as pd


class IrisClassification(pl.LightningModule):
I think we have a similar class in other pytorch examples. Can we reuse it?
# Uncomment this block to check invalid data type enforcement
# for column in df.columns:
#     df[column] = df[column].astype("str")
#
# print("Result with invalid datatype: ", model.predict(df))
Can we uncomment this block, wrap it in try/except, and print out the error message in the except clause?

try:
    model.predict(df)
except Exception as e:
    print(e)
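Putting the two suggestions together, a sketch of what the uncommented block could look like, assuming the model and df objects from this example script:

# Cast every column to string so the input no longer matches the "double"
# columns in the logged signature, then surface the enforcement error.
invalid_df = df.copy()
for column in invalid_df.columns:
    invalid_df[column] = invalid_df[column].astype("str")

try:
    print("Result with invalid datatype: ", model.predict(invalid_df))
except Exception as e:
    print(e)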
mlflow.pytorch.save_model(trainer.get_model(), "model", signature=signature)

model = _load_pyfunc(path="model/data", validate_signature=True)
df = pd.read_json("sample.json")
Do we really need sample.json? Can we just hard-code its content in pd.DataFrame?

df = pd.DataFrame({"sepal length (cm)": ...})
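One possible shape of that hard-coded DataFrame, using the four columns from the signature below; the numeric values are only illustrative iris-like measurements:

df = pd.DataFrame(
    {
        "sepal length (cm)": [5.1, 6.2],
        "sepal width (cm)": [3.5, 2.9],
        "petal length (cm)": [1.4, 4.3],
        "petal width (cm)": [0.2, 1.3],
    }
)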
input_schema = Schema(
    [
        ColSpec("double", "sepal length (cm)"),
        ColSpec("double", "sepal width (cm)"),
        ColSpec("double", "petal length (cm)"),
        ColSpec("double", "petal width (cm)"),
    ]
)
output_schema = Schema([ColSpec("long")])
Can we infer the schema from the iris dataframe?
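If inferring is preferable, mlflow.models.signature.infer_signature could replace the hand-written schemas; a sketch assuming df holds the iris feature DataFrame and predictions holds the model's integer class outputs for it:

from mlflow.models.signature import infer_signature

# The input schema is read from df's column names and dtypes (four double
# columns); the output schema is inferred from the integer class predictions.
signature = infer_signature(df, predictions)
print(signature)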
Signed-off-by: Shrinath Suresh shrinath@ideas2it.com
What changes are proposed in this pull request?
Logs the model signature and validates it for the iris classification example.
Reuses the model signature enforcement from pyfunc.
How is this patch tested?
Existing Unit Tests
Release Notes
Is this a user-facing change?
(Details in 1-2 sentences. You can just refer to another PR with a description if this PR is part of a larger change.)
What component(s), interfaces, languages, and integrations does this PR affect?
Components
area/artifacts: Artifact stores and artifact logging
area/build: Build and test infrastructure for MLflow
area/docs: MLflow documentation pages
area/examples: Example code
area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
area/models: MLmodel format, model serialization/deserialization, flavors
area/projects: MLproject format, project running backends
area/scoring: Local serving, model deployment tools, spark UDFs
area/server-infra: MLflow server, JavaScript dev server
area/tracking: Tracking Service, tracking client APIs, autologging
Interface
area/uiux: Front-end, user experience, JavaScript, plotting
area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
area/windows: Windows support
Language
language/r: R APIs and clients
language/java: Java APIs and clients
language/new: Proposals for new client languages
Integrations
integrations/azure: Azure and Azure ML integrations
integrations/sagemaker: SageMaker integrations
integrations/databricks: Databricks integrations
How should the PR be classified in the release notes? Choose one:
rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
rn/feature - A new user-facing feature worth mentioning in the release notes
rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
rn/documentation - A user-facing documentation change worth mentioning in the release notes