Description
Please fill out the form below.
System Information
- Framework (e.g. TensorFlow) / Algorithm (e.g. KMeans): TensorFlow
- Framework Version: 1.11
- Python Version: 3
- CPU or GPU: CPU
- Python SDK Version: latest
- Are you using a custom image: no
Describe the problem
When I deploy the model I get a message that says "contact customer support". I go to CloudWatch and see the following error repeated 100 times:
Traceback (most recent call last):
  File "/sagemaker/serve.py", line 189, in <module>
    ServiceManager().start()
  File "/sagemaker/serve.py", line 163, in start
    self._create_tfs_config()
  File "/sagemaker/serve.py", line 53, in _create_tfs_config
    raise ValueError('no SavedModel bundles found!')
In my Jupyter notebook I'm running the following to deploy:
predictor = estimator_call.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')
In the custom script I have the training and evaluation functions, but I don't have any code for serving because I assume that is handled by SageMaker, according to the documentation:
After a TensorFlow estimator has been fit, it saves a TensorFlow SavedModel in the S3 location defined by output_path. You can call deploy on a TensorFlow estimator to create a SageMaker Endpoint.
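For reference, the end-to-end flow I'm following looks roughly like the sketch below. The entry point name, role, and S3 path are placeholders; only the deploy call above is verbatim from my notebook.

```python
from sagemaker.tensorflow import TensorFlow

# Sketch of the estimator setup; entry_point, role, and the S3 path are placeholders.
estimator_call = TensorFlow(entry_point='train.py',
                            role='MySageMakerRole',
                            train_instance_count=1,
                            train_instance_type='ml.m4.xlarge',
                            framework_version='1.11',
                            py_version='py3',
                            script_mode=True)

# Train, then deploy the resulting model artifact as an endpoint.
estimator_call.fit('s3://my-bucket/training-data')
predictor = estimator_call.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')
```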
In S3 the model is saved in the right bucket. I'm using the following to specify where to save it:
model_dirr = os.environ.get('SM_MODEL_DIR')
dnn_model = tf.estimator.DNNClassifier(hidden_units=[20, 20, 20, 20], feature_columns=feature_column, n_classes=2, model_dir=model_dirr)
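Based on the "no SavedModel bundles found" message, my guess is that the serving container expects a SavedModel to be exported under SM_MODEL_DIR. If my training script needs to do that explicitly, I'd expect it to look roughly like the sketch below, but I haven't verified this is what's missing. The feature name 'inputs' and its shape are placeholders and would need to match my feature_column setup.

```python
import os
import tensorflow as tf

# Sketch only: placeholder feature name/shape, must match the feature columns used in training.
features = {
    'inputs': tf.placeholder(dtype=tf.float32, shape=[None, 4], name='inputs')
}
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(features)

# Export a SavedModel under SM_MODEL_DIR so the serving container can find a bundle.
dnn_model.export_savedmodel(os.environ.get('SM_MODEL_DIR'), serving_input_fn)
```

Is something like this required, or is the export supposed to happen automatically?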
I don't have more information and I'm not sure where to even look. Any idea what the problem is?