
Inference on saved Tensorflow model #45

Open
AlphaNumeric99 opened this issue May 16, 2020 · 2 comments
AlphaNumeric99 commented May 16, 2020

I converted the weights using save_model.py and got the final model.
My question is: how do I load it with Keras/TensorFlow to run inference?

import numpy as np
import tensorflow as tf

# Load the SavedModel and grab its serving signature.
model = tf.saved_model.load(str(model_dir), tags=['serve'])
model = model.signatures['serving_default']

# Prepare the input: float32, with a leading batch dimension.
resized_rgb_image = resized_rgb_image.astype(np.float32)
input_image = np.expand_dims(resized_rgb_image, axis=0)
input_tensor = tf.convert_to_tensor(input_image)
output_dict = model(input_tensor)

I get a tensorflow.python.framework.errors_impl.FailedPreconditionError:

tensorflow.python.framework.errors_impl.FailedPreconditionError:  Error while reading resource variable batch_normalization_56/moving_mean_60226 from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/batch_normalization_56/moving_mean_60226/class tensorflow::Var does not exist.
	 [[{{node StatefulPartitionedCall/model_1/batch_normalization_56/FusedBatchNormV3/ReadVariableOp}}]] [Op:__inference_signature_wrapper_8573]
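A likely cause: the line `model = model.signatures['serving_default']` rebinds the only reference to the loaded SavedModel object, so its variables can be garbage-collected before the signature is called, producing exactly this "resource variable ... does not exist" FailedPreconditionError. A minimal self-contained sketch of the fix is below; it builds and saves a tiny stand-in model (the path, shapes, and `logits` output name are illustrative, not from the original repo), then keeps the loaded object alive in its own variable:

```python
import numpy as np
import tensorflow as tf

model_dir = "/tmp/tiny_savedmodel"  # hypothetical path for illustration


class Tiny(tf.Module):
    """Stand-in model so the snippet runs without the repo's weights."""

    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.random.normal([4, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return {"logits": tf.matmul(x, self.w)}


# Save; __call__ (a tf.function with an input signature) becomes serving_default.
tf.saved_model.save(Tiny(), model_dir)

# Keep a reference to the loaded object itself; binding only the signature
# and dropping `loaded` lets its variables be garbage-collected.
loaded = tf.saved_model.load(model_dir)
infer = loaded.signatures["serving_default"]

x = tf.convert_to_tensor(np.ones((1, 4), np.float32))
output_dict = infer(x)  # dict of output tensors, keyed by "logits" here
```

As long as `loaded` stays in scope for the lifetime of `infer`, the variables referenced by the signature remain initialized.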
@alexrider1105

I also have this issue. Did you manage to resolve it?


dm0288 commented Aug 25, 2021

Any solution for this yet?
