As a developer, I want to integrate ONNX into MLflow, so that I can use an open format to represent deep learning models and more easily explore state-of-the-art tools.
Hi @slavakurilyak, IMO it could definitely make sense to add support for ONNX as a model flavor (see MLflow Model docs) - it looks like many of the frameworks MLflow currently interfaces with also have support for exporting to / importing from ONNX (see here).
My gut feeling is that we could add logic to optionally generate the ONNX flavor while saving an MLflow model, but I think it'll take some design work to figure out how to do this in a generic way that can be extended to support non-ONNX export formats in the future.
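To make the "generic, extensible" idea above concrete, here is a minimal sketch of one possible design: a registry of export functions keyed by flavor name that a save routine consults to optionally emit extra flavors. All names here (`EXPORTERS`, `register_exporter`, `save_model`) are hypothetical illustrations, not real MLflow APIs.

```python
# Hypothetical sketch: an MLflow-style save_model() that consults a
# registry of optional export formats (ONNX today, others later).
# None of these names are real MLflow APIs.
from typing import Callable, Dict, Tuple

# Maps a flavor name (e.g. "onnx") to a function that takes the
# in-memory model plus an output directory and writes the artifact.
EXPORTERS: Dict[str, Callable] = {}

def register_exporter(flavor: str):
    """Decorator that registers an export function under a flavor name."""
    def decorator(fn: Callable) -> Callable:
        EXPORTERS[flavor] = fn
        return fn
    return decorator

@register_exporter("onnx")
def export_onnx(model, path: str) -> str:
    # A real implementation would call a framework-specific exporter
    # here (e.g. torch.onnx.export for PyTorch models).
    return f"{path}/model.onnx"

def save_model(model, path: str, extra_flavors: Tuple[str, ...] = ()):
    """Save the native model, then optionally emit extra flavors."""
    artifacts = {"native": f"{path}/model.pkl"}
    for flavor in extra_flavors:
        artifacts[flavor] = EXPORTERS[flavor](model, path)
    return artifacts
```

A new export format would then only need to register itself, without touching the core save path:

```python
artifacts = save_model(my_model, "/tmp/model", extra_flavors=("onnx",))
# artifacts now maps both "native" and "onnx" to their saved paths
```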
I think the next step would be to concretely outline the pros and cons of adding ONNX support (i.e. what would we gain by adding it to MLflow, and what would it look like?). For example, one potential pro is that you'd be able to persist a PyTorch model with MLflow and load it back into TensorFlow, while a potential con is that this is already easy to do using the ONNX APIs directly, so MLflow might not add much value.
Once we've settled that question we can go ahead & start thinking about potential designs - thanks again for bringing this up!
Want to learn more? ONNX has an active developer community on GitHub.