
[Bug] Got "ValueError: not enough values to unpack (expected 2, got 1)" when BatchMatMul uses two identical tensors #8925

Closed
NoobCheese opened this issue Sep 3, 2021 · 7 comments

Comments

@NoobCheese

I have a TF graph that I imported into TVM successfully using the API relay.frontend.from_tensorflow.
The error occurs here:
[code snippet]

The error log looks like this:
[error log]

The two input tensors of the BatchMatMul node are the same; both of their shapes are [1, 41, 16].

I traced the source code to here:
[source code screenshot]

I am deeply confused.
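
For reference, a minimal sketch of the kind of graph involved (this is not the real model; the node name, the transpose flag, and the build step are my own assumptions for illustration):

import tensorflow as tf
import tvm
from tvm import relay

tf.compat.v1.disable_eager_execution()

# A BatchMatMul whose two inputs are the same placeholder; transpose_b keeps the
# shapes compatible: (1, 41, 16) x (1, 16, 41) -> (1, 41, 41).
with tf.Graph().as_default() as graph:
    x = tf.compat.v1.placeholder(tf.float32, shape=(1, 41, 16), name="x")
    out = tf.matmul(x, x, transpose_b=True, name="bmm")
    graph_def = graph.as_graph_def()

# The import itself succeeds; the failure shows up later when compiling the module.
mod, params = relay.frontend.from_tensorflow(graph_def, shape={"x": (1, 41, 16)}, outputs=["bmm"])
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)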

@vinx13
Member

vinx13 commented Sep 3, 2021

Seems that only one placeholder is created. Would you mind sharing the Relay IR?

@NoobCheese
Author

> Seems that only one placeholder is created. Would you mind sharing the Relay IR?

I'm sorry, but the Relay IR is generated from an online business model, so I can't share it. It seems you are right, though: I modified the sample batch_matmul in TVM and got the same error.

The original sample code is:
x = placeholder(xxxx),
y = placeholder(xxxx),
And what I did is:
x = placeholder(xxxx),
y = x,
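
In TE terms, that change amounts to roughly the following (a sketch with illustrative shapes; the actual sample code in TVM differs):

import tvm
from tvm import te, topi

# Only one placeholder is created; the second operand just aliases it.
x = te.placeholder((1, 41, 16), name="x")
y = x
# topi.nn.batch_matmul computes x @ transpose(y), so (1, 41, 16) x (1, 41, 16) is valid.
out = topi.nn.batch_matmul(x, y)
s = te.create_schedule(out.op)
mod = tvm.build(s, [x, out], target="llvm")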

However, my model works in the online TF environment. What should I do if this case comes up when I want to use TVM?

thanks bro!

@vinx13
Member

vinx13 commented Sep 4, 2021 via email

You can print the prim func here and check if the placeholders are correctly created. https://github.com/apache/tvm/blob/main/src/relay/backend/te_compiler_cache.cc#L125

@comaniac
Contributor

comaniac commented Sep 4, 2021

I don't think we can reuse the placeholder in this way. If your model results in this IR, then it is probably due to an incorrect conversion in the Relay TF frontend, or unsupported behavior.

@NoobCheese
Author

> You can print the prim func here and check if the placeholders are correctly created. https://github.com/apache/tvm/blob/main/src/relay/backend/te_compiler_cache.cc#L125


My co-worker hacked the TF subgraph and added a placeholder to the bmm node this morning, and TVM built the Relay IR fine :)
Anyway, one thing I want to confirm: if only one placeholder feeds the bmm, i.e. a matrix is multiplied by itself, does TVM support that?
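
For reference, the workaround boils down to something like this on the Relay side (a sketch with assumed names and shapes, not the actual graph hack my co-worker applied):

import numpy as np
import tvm
from tvm import relay
from tvm.contrib import graph_executor

# Two distinct variables for batch_matmul; the same data is bound to both at runtime.
a = relay.var("a", shape=(1, 41, 16), dtype="float32")
b = relay.var("b", shape=(1, 41, 16), dtype="float32")  # the extra "placeholder"
out = relay.nn.batch_matmul(a, b)
func = relay.Function([a, b], out)

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(tvm.IRModule.from_expr(func), target="llvm")

dev = tvm.cpu()
rt = graph_executor.GraphModule(lib["default"](dev))
data = np.random.rand(1, 41, 16).astype("float32")
rt.set_input("a", data)
rt.set_input("b", data)  # feed the same tensor to both inputs
rt.run()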

Thanks bro!

@vinx13
Member

vinx13 commented Sep 8, 2021

I checked the example below; it seems Relay currently doesn't support this.

import tvm
from tvm import relay

# batch_matmul where the same tensor is passed as both operands
a = relay.Var('a', relay.TensorType((1, 8, 16)))
b = relay.nn.batch_matmul(a, a)
f = relay.Function([a], b)
with tvm.target.Target('llvm'):
    relay.build(f)

@NoobCheese
Author

> I checked the example below; it seems Relay currently doesn't support this.
>
> import tvm
> from tvm import relay
>
> # batch_matmul where the same tensor is passed as both operands
> a = relay.Var('a', relay.TensorType((1, 8, 16)))
> b = relay.nn.batch_matmul(a, a)
> f = relay.Function([a], b)
> with tvm.target.Target('llvm'):
>     relay.build(f)

Thanks, it really helps!
