Describe the bug
Python version: 3.6.9
SageMaker version: 2.15.3
I have saved a Keras model trained in SageMaker to S3. Using the saved model artifacts, I am creating an endpoint for prediction.
If I pass the input as a plain list/NumPy array, the endpoint predicts correctly. But if I convert that input to a JSON string, it returns an error.
```python
import json

from sagemaker.tensorflow.model import TensorFlowModel

model = TensorFlowModel(entry_point=None,
                        model_data=s3modelartifact_path,
                        role=role,
                        framework_version='2.0.1',
                        )
predictor = model.deploy(initial_instance_count=1,
                         instance_type='ml.c5.xlarge',
                         endpoint_name=endpoint_name,
                         )
input1 = [0.000192526044, 0.000283702917, 7.8021214e-05, 0.997938097, 0.000831755286, 7.82618445e-05,
          1.55036887e-05, 0.000505700707, 7.63262287e-05]
print(predictor.predict(input1))
print('.....................')

input2 = {'instances': [0.000192526044, 0.000283702917, 7.8021214e-05, 0.997938097, 0.000831755286, 7.82618445e-05,
                        1.55036887e-05, 0.000505700707, 7.63262287e-05]}
jsonSend = json.dumps(input2)
print(jsonSend)
print('--------------------')
predictor.predict(jsonSend)
```

Output:

```
{'predictions': [[0.077790238, 0.229795516, 0.0862368569, 0.114024423, 0.111587279, 0.0515283681, 0.0591780208, 0.16133377, 0.108525455]]}
.....................
{"instances": [0.000192526044, 0.000283702917, 7.8021214e-05, 0.997938097, 0.000831755286, 7.82618445e-05, 1.55036887e-05, 0.000505700707, 7.63262287e-05]}
ModelError Traceback (most recent call last)
in
15 print (jsonSend)
16 print ('--------------------')
---> 17 predictor.predict(jsonSend)
/usr/local/lib/python3.6/dist-packages/sagemaker/tensorflow/model.py in predict(self, data, initial_args)
120 args["CustomAttributes"] = self._model_attributes
121
--> 122 return super(TensorFlowPredictor, self).predict(data, args)
123
124
/usr/local/lib/python3.6/dist-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model, target_variant)
117
118 request_args = self._create_request_args(data, initial_args, target_model, target_variant)
--> 119 response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
120 return self._handle_response(response)
121
/usr/local/lib/python3.6/dist-packages/botocore/client.py in _api_call(self, *args, **kwargs)
355 "%s() only accepts keyword arguments." % py_operation_name)
356 # The "self" in this scope is referring to the BaseClient.
--> 357 return self._make_api_call(operation_name, kwargs)
358
359         _api_call.__name__ = str(py_operation_name)
/usr/local/lib/python3.6/dist-packages/botocore/client.py in _make_api_call(self, operation_name, api_params)
674 error_code = parsed_response.get("Error", {}).get("Code")
675 error_class = self.exceptions.from_code(error_code)
--> 676 raise error_class(parsed_response, operation_name)
677 else:
678 return parsed_response
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from model with message "{ "error": "Failed to process element: 0 of 'instances' list. Error: Invalid argument: JSON Value: "{\"instances\": [0.000192526044, 0.000283702917, 7.8021214e-05, 0.997938097, 0.000831755286, 7.82618445e-05, 1.55036887e-05, 0.000505700707, 7.63262287e-05]}" Type: String is not of expected type: float" }".
```
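From the error message, it looks like the JSON string may be getting serialized a second time by the predictor's default JSON serializer, so TensorFlow Serving receives a single string element instead of a list of floats. Below is a minimal sketch of the two variants I would expect to work (an assumption on my part, not verified against 2.15.3; it reuses `predictor`, `input2`, `jsonSend`, and `endpoint_name` from the snippet above):

```python
# Sketch of what seems to avoid the double encoding (assumes the
# `predictor`, `input2`, `jsonSend`, and `endpoint_name` objects
# created above; not verified against SDK 2.15.3).

# 1. Pass the dict itself - the predictor's default JSON serializer
#    appears to call json.dumps internally, so pre-dumping the input
#    should not be needed.
print(predictor.predict(input2))

# 2. If a raw JSON string must be sent, bypass the SDK serializer and
#    call the runtime client directly with the JSON content type.
import boto3

runtime = boto3.client("sagemaker-runtime")
response = runtime.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="application/json",
    Body=jsonSend,
)
print(json.loads(response["Body"].read()))
```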
Expected behavior
The endpoint should return predictions when the input is passed as JSON, the same as when it is passed as a list.
System information
A description of your system. Please provide:
- SageMaker Python SDK version: 2.15.3
- Framework name (eg. PyTorch) or algorithm (eg. KMeans): TensorFlow
- Framework version: when creating the model I pass framework_version = '2.0.1', but running `! python -c 'import tensorflow; print(tensorflow.__version__)'` in the notebook prints '1.15.2' (see the sketch after this list)
- Python version: 3.6.9
- CPU or GPU: CPU
- Custom Docker image (Y/N): N
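Side note on the version mismatch above: the TensorFlow installed in the notebook kernel (1.15.2) is presumably independent of the serving container behind the endpoint, so to see what the endpoint actually runs I would look up its container image, roughly like this (a sketch based on the standard boto3 describe calls; `endpoint_name` is the variable from the snippet above):

```python
import boto3

sm = boto3.client("sagemaker")

# Follow endpoint -> endpoint config -> model to find the container image;
# the image tag shows the TensorFlow Serving version the endpoint actually uses.
endpoint = sm.describe_endpoint(EndpointName=endpoint_name)
config = sm.describe_endpoint_config(
    EndpointConfigName=endpoint["EndpointConfigName"]
)
model_name = config["ProductionVariants"][0]["ModelName"]
model_desc = sm.describe_model(ModelName=model_name)
print(model_desc["PrimaryContainer"]["Image"])
```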