Describe the bug
See #1187 for a similar issue.
I'm trying to deploy after fixing some script issues inside model.tar.gz and re-uploading it to S3. In the console, the endpoint points to a SageMaker-generated bucket containing an old version of the model (even though the timestamp in the S3 path is updated to the current time); it does not show the path of the model I pass in model_data, as shown below. This happens both in a SageMaker Studio notebook and locally using Python.
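For reference, this is roughly how I checked which artifact the endpoint is serving (the endpoint name below is a placeholder, not the real one); walking from the endpoint to its model shows a ModelDataUrl in the SageMaker-generated bucket rather than my model_data path:

import boto3

sm = boto3.client("sagemaker")

# Placeholder endpoint name; substitute the one created by deploy().
endpoint = sm.describe_endpoint(EndpointName="mxnet-inference-endpoint")
config = sm.describe_endpoint_config(EndpointConfigName=endpoint["EndpointConfigName"])
model = sm.describe_model(ModelName=config["ProductionVariants"][0]["ModelName"])

# Prints a path in a SageMaker-generated bucket, not s3://my-bucket/model/model.tar.gz
print(model["PrimaryContainer"]["ModelDataUrl"])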
To reproduce
- Deploy via code shown here:
import sagemaker
from sagemaker.mxnet import MXNetModel

sagemaker_session = sagemaker.Session()
role = 'arn:aws:iam::12345676890:role/service-role/AmazonSageMaker-ExecutionRole-12345676890'

serving_model = MXNetModel(model_data='s3://my-bucket/model/model.tar.gz',
                           role=role,
                           entry_point='inference.py',
                           py_version='py3',
                           framework_version='1.6.0')

predictor = serving_model.deploy(initial_instance_count=1,
                                 instance_type='ml.c5.xlarge')
- Delete the Endpoint/EndpointConfig/Model artifacts using the console/CLI (a boto3 equivalent is sketched after this list).
- Upload the new model to the location given in model_data.
- Re-run the code snippet.
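A rough boto3 equivalent of the cleanup step above (all names are placeholders for the resources created by deploy()):

import boto3

sm = boto3.client("sagemaker")

# Placeholder names; use the Endpoint/EndpointConfig/Model names shown in the console.
sm.delete_endpoint(EndpointName="mxnet-inference-endpoint")
sm.delete_endpoint_config(EndpointConfigName="mxnet-inference-endpoint-config")
sm.delete_model(ModelName="mxnet-inference-model")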
Expected behavior
I expected the endpoint to pick up (or repack) the latest version of the model. Instead, if I download the model from the S3 location shown on the endpoint, the contents are from the previous version.
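This is roughly how I verified that the served artifact is stale (the bucket and key are placeholders copied from the ModelDataUrl shown on the endpoint, not my real paths):

import tarfile
import boto3

s3 = boto3.client("s3")

# Placeholder bucket/key, copied from the endpoint's ModelDataUrl in the console.
s3.download_file("sagemaker-us-east-1-123456789012",
                 "mxnet-inference-2020-08-01-00-00-00-000/model.tar.gz",
                 "downloaded-model.tar.gz")

# The listed files (e.g. code/inference.py) still contain the old contents.
with tarfile.open("downloaded-model.tar.gz") as tar:
    print(tar.getnames())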
Screenshots or logs
System information
A description of your system. Please provide:
- SageMaker Python SDK version: 2.5.0
- Framework name (eg. PyTorch) or algorithm (eg. KMeans): MXNet
- Framework version: 1.6.0
- Python version: 3.6+
- CPU or GPU: CPU
- Custom Docker image (Y/N): N
Additional context
This worked fine with TensorFlow: the endpoint shows the S3 location I specified in model_data.
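For comparison, a sketch of the TensorFlow deployment that behaved as expected (the model path and framework version here are illustrative, not the exact ones I used):

from sagemaker.tensorflow import TensorFlowModel

tf_model = TensorFlowModel(model_data='s3://my-bucket/tf-model/model.tar.gz',
                           role=role,
                           framework_version='2.1.0')

tf_predictor = tf_model.deploy(initial_instance_count=1,
                               instance_type='ml.c5.xlarge')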