
Description
I followed the translation colab notebook tutorial suggested by this repository. After exporting the model and uploading it to Google's AI Platform for online prediction, I am having trouble making requests to it. I believe the input to the translation model should be a tensor of the encoded source text, but every request fails with:

TypeError: Object of type 'EagerTensor' is not JSON serializable

Here is the relevant code:
def encode(input_str, output_str=None):
  """Input str to features dict, ready for inference."""
  inputs = encoders["inputs"].encode(input_str) + [1]  # add EOS id
  batch_inputs = tf.reshape(inputs, [1, -1, 1])  # Make it 3D.
  return {"inputs": batch_inputs}

enfr_problem = problems.problem(PROBLEM)
encoders = enfr_problem.feature_encoders(DATA_DIR)

encoded_inputs = encode("Some text")
model_output = predict_json('project_name', 'model_name', encoded_inputs, 'version_1')["outputs"]
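
predict_json here is the helper from Google's AI Platform online-prediction documentation (reproduced from memory, so treat the details as approximate); it wraps the REST call and JSON-encodes whatever is passed as instances, which is where the serialization happens:

import googleapiclient.discovery

def predict_json(project, model, instances, version=None):
  """Send a JSON prediction request to a model deployed on AI Platform."""
  service = googleapiclient.discovery.build('ml', 'v1')
  name = 'projects/{}/models/{}'.format(project, model)
  if version is not None:
    name += '/versions/{}'.format(version)
  response = service.projects().predict(
      name=name,
      body={'instances': instances}
  ).execute()
  if 'error' in response:
    raise RuntimeError(response['error'])
  return response['predictions']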
I've tried converting the tensor to numpy before sending it, but still no luck. Could someone point me in the right direction?
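
For reference, a minimal sketch of the numpy conversion I attempted (encode_as_lists is just a name for this post; it mirrors encode() above, with tf and encoders as defined there, but returns plain Python lists):

import json

def encode_as_lists(input_str):
  """Like encode(), but returns JSON-serializable nested lists."""
  inputs = encoders["inputs"].encode(input_str) + [1]  # add EOS id
  batch_inputs = tf.reshape(inputs, [1, -1, 1])  # Make it 3D.
  # EagerTensor -> numpy array -> nested Python lists.
  return {"inputs": batch_inputs.numpy().tolist()}

encoded_inputs = encode_as_lists("Some text")
json.dumps(encoded_inputs)  # sanity check: payload is JSON-serializable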