Please fill out the form below.
System Information
- Framework (e.g. TensorFlow) / Algorithm (e.g. KMeans): TensorFlow
- Framework Version: 1.11
- Python Version: 2.7
- CPU or GPU: CPU
- Python SDK Version: 1.18.3
- Are you using a custom image: No
Describe the problem
I'm getting the following error at inference time: <type 'list'>. Valid formats: float, int, str any object that implements __iter__ or classification_pb2.ClassificationRequest
It seems to come from here: https://github.com/aws/sagemaker-tensorflow-container/blob/ba46b9262da8b22e3242a4d35220679e6b9043c2/src/tf_container/proxy_client.py#L262
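For context, the exported SignatureDef's method is tensorflow/serving/classify (see the SavedModel output below), and classification_pb2.ClassificationRequest is one of the valid formats the error lists. A rough sketch of what a classification-style request could look like (the signature_name and feature key are placeholders, not my production code):

# Sketch only: building a ClassificationRequest, one of the formats the error
# message lists as valid for a tensorflow/serving/classify signature.
import tensorflow as tf
from tensorflow_serving.apis import classification_pb2

def build_classification_request(name, input_ids, signature_name="serve"):
    request = classification_pb2.ClassificationRequest()
    request.model_spec.name = name
    request.model_spec.signature_name = signature_name
    example = tf.train.Example(features=tf.train.Features(feature={
        "input": tf.train.Feature(float_list=tf.train.FloatList(value=input_ids))
    }))
    # ClassificationRequest carries tf.train.Examples directly, so no manual
    # SerializeToString / make_tensor_proto step is needed here.
    request.input.example_list.examples.extend([example])
    return request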
Saved Model Output
The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 20)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 20)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify
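(The dump above can be regenerated from the export directory; a minimal sketch with TF 1.x, where export_dir is a placeholder for the actual export path:)

# Sketch: print the SignatureDefs of an exported SavedModel with TF 1.x.
# export_dir is a placeholder, not the actual path used on SageMaker.
from __future__ import print_function
import tensorflow as tf

def show_signatures(export_dir):
    with tf.Session(graph=tf.Graph()) as sess:
        meta_graph_def = tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        for sig_name, sig in meta_graph_def.signature_def.items():
            print(sig_name, '->', sig.method_name)
            for key, info in sig.inputs.items():
                print('  input ', key, info.dtype, info.tensor_shape)
            for key, info in sig.outputs.items():
                print('  output', key, info.dtype, info.tensor_shape)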
Serving Request
model_spec {
  name: "generic_model"
  signature_name: "serve"
}
inputs {
  key: "inputs"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 1
      }
    }
    string_val: "\n\230\361\004\n\224\361\004\n\006input\022\210\361\004\022\204\361\004\n
Code
import json
import re

import tensorflow as tf
from tensorflow_serving.apis import predict_pb2

# JSON_CONTENT_TYPE, DEFAULT_SERVING_SIGNATURE_DEF_KEY, TFIDF_VECTORIZER and
# KBEST_SELECTOR are defined/loaded elsewhere in the serving script.

INPUT_KEY = 'text'
OUTPUT_KEY = 'response'
ERROR_STRING = 'Requested unsupported ContentType, content type: '

def input_fn(serialized_input_data, content_type=JSON_CONTENT_TYPE):
    if content_type == JSON_CONTENT_TYPE:
        input_data = json.loads(serialized_input_data)
        input_features = transform(preprocess(input_data[INPUT_KEY]))
        return build_request("generic_model", input_features.tolist())  # input_features is a list of floats
    raise Exception(ERROR_STRING + content_type)

def build_request(name, input_ids, signature_name=DEFAULT_SERVING_SIGNATURE_DEF_KEY):
    # Wrap the features in a serialized tf.train.Example and send it as a PredictRequest.
    examples = [make_example(input_ids).SerializeToString()]
    request = predict_pb2.PredictRequest()
    request.model_spec.name = name
    request.model_spec.signature_name = signature_name
    request.inputs["inputs"].CopyFrom(
        tf.contrib.util.make_tensor_proto(examples))
    return request

def make_example(input_ids, feature_name="input"):
    features = {
        feature_name:
            tf.train.Feature(float_list=tf.train.FloatList(value=input_ids))
    }
    return tf.train.Example(features=tf.train.Features(feature=features))

def transform(msg):
    # TF-IDF vectorize the preprocessed text and keep only the selected features.
    msg = [preprocess(msg)]
    msg = TFIDF_VECTORIZER.transform(msg)
    return KBEST_SELECTOR.transform(msg.toarray()).flatten()

def preprocess(line):
    return re.sub("[^a-z0-9'.]+", " ", line.lower()).strip()
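For completeness, the "Serving Request" dump above is just the printed form of what input_fn returns. Something like this reproduces it locally (the payload text is illustrative, and JSON_CONTENT_TYPE is assumed to be 'application/json'):

# Quick local check of input_fn; payload text is illustrative.
payload = json.dumps({INPUT_KEY: "some example text to classify"})
request = input_fn(payload, content_type=JSON_CONTENT_TYPE)
print(type(request))  # should be a predict_pb2.PredictRequest
print(request)        # prints a text dump like the "Serving Request" section above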