Not supported tflite opcode: SHAPE #213

Closed

sswxl opened this issue Apr 30, 2021 · 11 comments

sswxl commented Apr 30, 2021

  1. Import graph...
    Fatal: Not supported tflite opcode: SHAPE
@laolihaile

  1. Import graph...
    Fatal: Not supported tflite opcode: SHAPE

I have the same problem. Did you solve it?

sswxl (Author) commented May 8, 2021

  1. Import graph...
    Fatal: Not supported tflite opcode: SHAPE

I have the same problem. Did you solve it?

The supported TensorFlow Lite operators do not include SHAPE, so you can check whether your model uses the shape operator.
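For anyone checking their own model: a minimal sketch (not from this thread) of one way to list the operators inside a converted .tflite file. The file name is a placeholder, and `_get_ops_details()` is a private Interpreter helper that may change between TensorFlow versions; a viewer such as Netron shows the same information.

```python
import tensorflow as tf

# Enumerate the operator names used in the converted model so the
# unsupported SHAPE op can be spotted before running nncase.
interpreter = tf.lite.Interpreter(model_path="model.tflite")  # placeholder path
op_names = {op["op_name"] for op in interpreter._get_ops_details()}
print(sorted(op_names))

if "SHAPE" in op_names:
    print("This model contains the SHAPE operator, which nncase does not support.")
```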

@laolihaile

  1. Import graph...
    Fatal: Not supported tflite opcode: SHAPE

I have the same problem. Did you solve it?

The supported TensorFlow Lite operators do not include SHAPE, so you can check whether your model uses the shape operator.

Thank you. I do use a 'Reshape' layer, so how do I solve the problem?

@sunnycase sunnycase added this to the 1.0-beta1 milestone May 11, 2021
@AIWintermuteAI

@sunnycase any progress on this issue?
Since 2.x, quite a few keras.layers have the tf.shape op added, which honestly is quite annoying...
I'm porting YOLOv3 to my project and already have it working - only to discover that both the UpSampling2D and Conv2DTranspose layers in Keras use the tf.shape operator in 2.x and thus cannot be converted with nncase. I'm sure there are more use cases for this operator, which is relatively simple.

Here are links to the Keras code to illustrate what I'm talking about:
tf.keras 1.x UpSampling2D layer - no tf.shape op
https://github.com/keras-team/keras/blob/7b9c8727760b2a8d02e409efaa6ff9e0333b02e1/keras/layers/convolutional.py#L1974
tf.keras 2.x UpSampling2D layer - tf.shape op present
https://github.com/keras-team/keras/blob/069e95a5c8075555b3bd2895bdbf34ed68bb3bba/keras/layers/convolutional.py#L2618

Is adding SHAPE on the roadmap? If not, is there any guidance you can give on adding it? I'm willing to put in some work and then make a PR.

@sunnycase (Member)

@AIWintermuteAI Can you provide the tflite?

@AIWintermuteAI

@zhen8838 (Member)

@AIWintermuteAI Hi, I found something unusual in your tflite model. Can you provide a .pb model?

The resize layer's new shape is not equal to the output shape, which causes the internal error:

The output shape should be [14,14], but the tflite model shows [1,1]:
[Screenshot 2021-07-11 12:45 PM]

So it can't perform the concat operation:
[Screenshot 2021-07-11 12:42 PM]

@zhen8838 (Member)

@AIWintermuteAI Or maybe you can provide a new tflite model with constant shape information. Just use a fixed-size upsampling layer like UpSampling2D(2).
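For illustration, a minimal sketch of the kind of fixed-factor upsampling stage being suggested here (the feature-map size is made up; the point is that the upsample factor is a constant and every input dimension is static, so the converter can resolve the output size without a dynamic shape):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Placeholder feature-map size, for illustration only.
inputs = tf.keras.Input(shape=(7, 7, 256))
x = layers.UpSampling2D(size=2)(inputs)  # constant 2x nearest-neighbour upsample
model = tf.keras.Model(inputs, x)
model.summary()
```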

AIWintermuteAI commented Jul 13, 2021

@zhen8838
Hi there! I don't have a .pb model, but here is the original Keras .h5 model:
https://drive.google.com/file/d/1M-E8JTMwU5H8YS4oYGaQOScLFICi7Ocu/view?usp=sharing
Additionally, I found a way to convert models with Reshape and/or UpSampling2D layers: it turns out that if a static shape is set during tflite conversion, such as here
https://github.com/AIWintermuteAI/aXeleRate/blob/800a488f94c5d92791ee84293d1cd1d4309ba4fe/axelerate/networks/common_utils/convert.py#L194
then the SHAPE op is no longer present in the .tflite model, so that solves the problem, at least for me.
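For reference, a minimal sketch of that kind of static-shape conversion (not the exact aXeleRate code; the model path and input size are placeholders):

```python
import tensorflow as tf

# Load the trained Keras model; the path is a placeholder.
model = tf.keras.models.load_model("yolov3.h5")

# Export through a concrete function with every dimension fixed, including
# the batch dimension, so the converted graph carries only static shapes.
spec = tf.TensorSpec([1, 224, 224, 3], tf.float32)
concrete_fn = tf.function(model).get_concrete_function(spec)

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn])
tflite_model = converter.convert()

with open("model_static.tflite", "wb") as f:
    f.write(tflite_model)
```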

@zhen8838 (Member)

ok ~

@zhen8838 (Member)

@AIWintermuteAI You can join our QQ group (790699378) or our Telegram group.
