FYI, I already use SageMaker to train and deploy my custom ML models.
It's not clear to me how to deploy my model to SageMaker. More specifically, how can I specify the predict function that will be called by the controller?
Additionally, it would be very useful to be able to train my ML models on SageMaker using mlflow commands.
Proposal
I'd like to integrate Sagify (https://github.com/Kenza-AI/sagify, one of my open source projects) into mlflow so that ML models can be trained and deployed on SageMaker. Please check the workflow I'd like to add to mlflow:
Proposal for Training and Deploying on SageMaker:
mlflow sagemaker init -d src: This will create all boilerplate code under the directory src, where all my ML code lives. My ML code under src already uses mlflow.
Then, I need to implement two functions, train(...) and predict(input_json). The train(...) function should call my ML training logic that already lives under src, and in predict(input_json) I need to implement the transformer from JSON to an ML-friendly data type.
mlflow sagemaker build: It will build a Docker image that contains all the code under src.
mlflow sagemaker push: It will push the built Docker image to ECR.
mlflow sagemaker local-train: It will run the training logic that lives in the Docker image. This can be used for testing before running on SageMaker.
mlflow sagemaker local-deploy: It will run the Docker image in deploy mode so that I can test the REST endpoint that calls the trained model.
mlflow sagemaker train: It will run the Docker image in train mode on SageMaker.
mlflow sagemaker deploy: It will run the Docker image in deploy mode on SageMaker.
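The two user-implemented hooks could look roughly like this. This is a minimal sketch only: the train()/predict(input_json) names and the JSON request shape follow the proposal above, not any existing Sagify or mlflow API, and the "model" is a trivial stand-in.

```python
import json

# Stand-in for a model object that train() would fit and predict() would use.
_MODEL = {"weights": [0.0, 0.0]}


def train():
    # In the real workflow this would call the ML training logic that
    # already lives under src/. Here it just sets trivial weights so the
    # sketch is runnable end to end.
    _MODEL["weights"] = [1.0, 2.0]


def predict(input_json):
    # Transform the request JSON into an ML-friendly data type (here, a
    # list of floats), score it, and return a JSON response.
    features = [float(x) for x in json.loads(input_json)["features"]]
    score = sum(w * x for w, x in zip(_MODEL["weights"], features))
    return json.dumps({"score": score})
```

After train() runs, a request like predict('{"features": [1, 2]}') returns a JSON body with the weighted sum.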
Proposal Only for Deploying on SageMaker:
mlflow sagemaker init -d src: This will create all boilerplate code under the directory src, where all my ML code lives. My ML code under src already uses mlflow.
Then, I need to implement the function predict(input_json), where I implement the transformer from JSON to an ML-friendly data type.
mlflow sagemaker build: It will build a Docker image that contains all the code under src.
mlflow sagemaker push: It will push the built Docker image to ECR.
mlflow sagemaker local-deploy --model-path=<model_path> --run-id=<run_id>: It will run the Docker image in deploy mode so that I can test the REST endpoint that calls the trained model.
mlflow sagemaker train: It will run the Docker image in train mode on SageMaker.
mlflow sagemaker deploy --model-path=<model_path> --run-id=<run_id>: It will run the Docker image in deploy mode on SageMaker.
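For the deploy-only flow, the container would first have to resolve --model-path and --run-id to a serialized model before serving predictions. A rough stdlib-only sketch of that lookup; the <model_path>/<run_id>/model.pkl layout and the load_model helper are assumptions for illustration, not how mlflow's artifact store is actually organized:

```python
import json
import os
import pickle


def load_model(model_path, run_id):
    # Hypothetical layout: <model_path>/<run_id>/model.pkl. The real
    # resolution of a run id to artifacts would go through mlflow.
    with open(os.path.join(model_path, run_id, "model.pkl"), "rb") as f:
        return pickle.load(f)


def predict(model, input_json):
    # JSON request in -> ML-friendly features -> JSON response out.
    features = json.loads(input_json)["features"]
    return json.dumps({"prediction": model["bias"] + sum(features)})
```

The deploy command would call load_model(...) once at container startup and then route each REST request through predict(...).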
Please let me know your thoughts. I'm thinking of proceeding with the Proposal Only for Deploying on SageMaker.
Deployment to SageMaker is currently done via the PyFunc model flavor, with high-level wrappers available for sklearn and TensorFlow, and Spark MLlib support coming in the next release.
If you want to deploy non-Python models, we will have similar code for other languages and would appreciate any contribution.
If you are deploying Python models and the PyFunc interface does not work for you, could you provide feedback on what it is missing?
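The PyFunc flavor mentioned above boils down to a single generic predict contract that the scoring server calls for every request. A stdlib-only sketch of that shape; the class and function names here are illustrative stand-ins, not mlflow's exact API, and the row-list input stands in for the tabular data the real server builds from the request body:

```python
import json


class PyFuncLikeModel:
    # Illustrative stand-in for a python_function-flavored model: anything
    # exposing predict(data) can be served by one generic scoring endpoint,
    # regardless of the underlying ML framework.
    def __init__(self, offset):
        self.offset = offset

    def predict(self, data):
        # data: a list of feature rows.
        return [self.offset + sum(row) for row in data]


def score_request(model, request_body):
    # What a generic scoring endpoint does: parse the request, call the
    # single predict entry point, and serialize the result.
    return json.dumps(model.predict(json.loads(request_body)))
```

With this contract, a custom transformer from JSON to model input lives inside the model's predict path rather than in deployment-specific boilerplate.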