
raise ValueError("Shapes %s and %s are incompatible" % (self, other)) ValueError: Shapes (1, 1, 576, 6) and (1, 1, 576, 273) are incompatible #10350

Open

TobyHuang328 opened this issue Nov 6, 2021 · 2 comments

@TobyHuang328
Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am using the latest TensorFlow Model Garden release and TensorFlow 2.
  • I am reporting the issue to the correct repository. (Model Garden official or research directory)
  • I checked to make sure that this issue has not already been filed.

1. The entire URL of the file you are using

https://github.com/tensorflow/models/tree/master/research/...

2. Describe the bug

I have a simple Python script to convert a .pb file into a .tflite file, but I was unable to convert the model; the error is shown below.

Traceback (most recent call last):
  File "C:\Users\godlo\Documents\Tensorflow-FreightFrenzy\scripts\pythonProject\main.py", line 7, in <module>
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\lite\python\lite.py", line 1348, in from_saved_model
    saved_model = _load(saved_model_dir, tags)
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\saved_model\load.py", line 864, in load
    result = load_internal(export_dir, tags, options)["root"]
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\saved_model\load.py", line 902, in load_internal
    loader = loader_cls(object_graph_proto, saved_model_proto, export_dir,
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\saved_model\load.py", line 165, in __init__
    self._restore_checkpoint()
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\saved_model\load.py", line 476, in _restore_checkpoint
    load_status = saver.restore(variables_path, self._checkpoint_options)
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\tracking\util.py", line 1382, in restore
    base.CheckpointPosition(
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\tracking\base.py", line 254, in restore
    restore_ops = trackable._restore_from_checkpoint_position(self)  # pylint: disable=protected-access
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\tracking\base.py", line 980, in _restore_from_checkpoint_position
    current_position.checkpoint.restore_saveables(
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\tracking\util.py", line 351, in restore_saveables
    new_restore_ops = functional_saver.MultiDeviceSaver(
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\saving\functional_saver.py", line 339, in restore
    restore_ops = restore_fn()
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\saving\functional_saver.py", line 323, in restore_fn
    restore_ops.update(saver.restore(file_prefix, options))
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\saving\functional_saver.py", line 115, in restore
    restore_ops[saveable.name] = saveable.restore(
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\training\saving\saveable_object_util.py", line 131, in restore
    return resource_variable_ops.shape_safe_assign_variable_handle(
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\ops\resource_variable_ops.py", line 309, in shape_safe_assign_variable_handle
    shape.assert_is_compatible_with(value_tensor.shape)
  File "C:\Users\godlo\anaconda3\envs\pythonProject\lib\site-packages\tensorflow\python\framework\tensor_shape.py", line 1161, in assert_is_compatible_with
    raise ValueError("Shapes %s and %s are incompatible" % (self, other))
ValueError: Shapes (1, 1, 576, 6) and (1, 1, 576, 273) are incompatible
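
For anyone hitting the same mismatch, here is a minimal diagnostic sketch (the path is illustrative) that lists the variable shapes stored inside the SavedModel's own checkpoint, so they can be compared against what the restored graph expects:

import os
import tensorflow as tf

# A SavedModel's variables shards form an ordinary TF checkpoint, so
# tf.train.list_variables can report every stored variable and its shape.
saved_model_dir = r"C:\path\to\exported-models\mobilenet_model\saved_model"  # illustrative path
ckpt_prefix = os.path.join(saved_model_dir, "variables", "variables")
for name, shape in tf.train.list_variables(ckpt_prefix):
    print(name, shape)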

3. Steps to reproduce

Create and export a .pb file with MobileNet (I use batch size 2), then run the code below; the error above is what I get.
import tensorflow as tf
print(tf.version.VERSION)
from h5py._hl import dataset  # not used in this script

# Use a raw string so the Windows backslashes are not treated as escape sequences.
saved_model_dir = r"C:\Users\godlo\Documents\Tensorflow-FreightFrenzy\workspace\training_demo\exported-models\mobilenet_model\saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Convert
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]

tflite_model = converter.convert()

with open('mobilenet_model.tflite', 'wb') as f:
    f.write(tflite_model)
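
If the conversion succeeds, a quick sanity check (just a sketch) is to load the resulting file with the TFLite Interpreter and print its input details:

import tensorflow as tf

# Load the converted model and confirm the expected input shape and dtype.
interpreter = tf.lite.Interpreter(model_path="mobilenet_model.tflite")
interpreter.allocate_tensors()
print(interpreter.get_input_details())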

4. Expected behavior

I should get a .tflite file named mobilenet_model.tflite in my Python project folder.

5. Additional context

Conversion somehow works with some other models I have exported, such as ResNet models, just not this MobileNet model for some reason.

6. System information

  • OS Platform and Distribution: Windows 10
  • TensorFlow installed from (source or binary): pip install tensorflow
  • TensorFlow version (use command below): 2.6.0
  • Python version: 3.9.7
  • CUDA/cuDNN version: Build cuda_11.2.r11.2/compiler.29618528_0
  • GPU model and memory: NVIDIA GeForce MX250 and 16GB of memory
@TobyHuang328 TobyHuang328 added models:research models that come under research directory type:bug Bug in the code labels Nov 6, 2021
@kumariko kumariko self-assigned this Nov 8, 2021

kumariko commented Nov 8, 2021

@TobyHuang328 Could you please have a look at link1, link2, and link3 and let us know if they help? Thanks!

@kumariko kumariko added the stat:awaiting response Waiting on input from the contributor label Nov 8, 2021
@TobyHuang328 (Author)
Link 1 helped me a little bit, but at the end of the day I am still stuck. Here is my situation: I have trained and exported a MobileNet model using exporter_main_v2.py, and right now I am playing around with convert_tflite and export_tflite_ssd_graph.py. I do not know whether my model is frozen, quantized, or something else. Sorry, I am quite a beginner in object detection.
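
For reference, a minimal sketch of the TF2 Object Detection API export-to-TFLite flow, assuming the standard scripts in models/research/object_detection (export_tflite_graph_tf2.py is the TF2 replacement for the TF1-era export_tflite_ssd_graph.py; paths and file names are illustrative):

# Step 1 (run from a shell): export a TFLite-friendly SavedModel from the
# training checkpoint. Flags shown are the script's core options.
#   python export_tflite_graph_tf2.py \
#       --pipeline_config_path=pipeline.config \
#       --trained_checkpoint_dir=checkpoint \
#       --output_directory=tflite_export
#
# Step 2 (Python): convert the exported SavedModel to .tflite.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("tflite_export/saved_model")
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)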

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Waiting on input from the contributor label Nov 15, 2021
@kumariko kumariko added models:research:odapi ODAPI and removed models:research models that come under research directory labels Nov 15, 2021
@kumariko kumariko assigned tombstone, jch1 and pkulzc and unassigned kumariko Nov 15, 2021