
error about debug_seq2seq #9

Open
htw2012 opened this issue Apr 5, 2016 · 1 comment

Comments

htw2012 commented Apr 5, 2016

Hi, @nicolas-ivanov

I ran the training code, and it seems there is an error somewhere. I get the error below:

```
ValueError: Shape mismatch: x has 64 rows but z has 24 rows
Apply node that caused the error: Gemm{no_inplace}(Subtensor{::, int64::}.0, TensorConstant{0.20000000298}, <TensorType(float32, matrix)>, lstm_U_o_copy, TensorConstant{0.20000000298})
Toposort index: 5
Inputs types: [TensorType(float32, matrix), TensorType(float32, scalar), TensorType(float32, matrix), TensorType(float32, matrix), TensorType(float32, scalar)]
Inputs shapes: [(24, 128), (), (64, 128), (128, 128), ()]
Inputs strides: [(32768, 4), (), (512, 4), (512, 4), ()]
Inputs values: ['not shown', array(0.20000000298023224, dtype=float32), 'not shown', 'not shown', array(0.20000000298023224, dtype=float32)]
Outputs clients: [[Elemwise{Composite{(clip((i0 + i1), i2, i3) * tanh(i4))}}(TensorConstant{(1, 1) of 0.5}, Gemm{no_inplace}.0, TensorConstant{(1, 1) of 0}, TensorConstant{(1, 1) of 1}, Elemwise{Composite{((clip((i0 + i1), i2, i3) * i4) + (clip((i5 + i6), i2, i3) * tanh(i7)))}}.0)]]

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
Apply node that caused the error: forall_inplace,cpu,scan_fn}(TensorConstant{16}, InplaceDimShuffle{1,0,2}.0, IncSubtensor{InplaceSet;:int64:}.0, DeepCopyOp.0, TensorConstant{16}, lstm_U_o, lstm_U_f, lstm_U_i, lstm_U_c)
Toposort index: 36
Inputs types: [TensorType(int64, scalar), TensorType(float32, 3D), TensorType(float32, (True, False, False)), TensorType(float32, (True, False, False)), TensorType(int64, scalar), TensorType(float32, matrix), TensorType(float32, matrix), TensorType(float32, matrix), TensorType(float32, matrix)]
Inputs shapes: [(), (16, 24, 512), (1, 64, 128), (1, 64, 128), (), (128, 128), (128, 128), (128, 128), (128, 128)]
Inputs strides: [(), (2048, 32768, 4), (32768, 512, 4), (32768, 512, 4), (), (512, 4), (512, 4), (512, 4), (512, 4)]
Inputs values: [array(16), 'not shown', 'not shown', 'not shown', array(16), 'not shown', 'not shown', 'not shown', 'not shown']
Outputs clients: [[], [], [InplaceDimShuffle{0,1,2}(forall_inplace,cpu,scan_fn}.2)]]

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
```
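
As a side note, the two flags mentioned in the hints can be set through the `THEANO_FLAGS` environment variable before Theano is imported; a minimal sketch, not specific to debug_seq2seq:

```python
import os

# Flags taken from the HINT lines above; they must be set before Theano
# is imported for the first time.
os.environ["THEANO_FLAGS"] = "optimizer=fast_compile,exception_verbosity=high"

import theano  # noqa: E402  (intentionally imported after setting the flags)

# Confirm the flags were picked up.
print(theano.config.optimizer)            # fast_compile
print(theano.config.exception_verbosity)  # high
```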

How can I solve it?
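
Not a confirmed diagnosis, but the shapes in the traceback (initial states allocated for 64 rows while the scan input carries only 24 rows) look like a batch-size mismatch, for example a final training batch that is smaller than the configured batch size. A minimal sketch of one way to rule that out, using hypothetical `x_train`/`y_train` arrays and a hypothetical `batch_size` value rather than the project's actual variables:

```python
import numpy as np

batch_size = 64   # assumed; matches the 64 rows in the error
n_samples = 344   # toy value: 344 % 64 == 24, the other row count in the error

# Stand-ins for whatever sequences the training script actually feeds in.
x_train = np.zeros((n_samples, 16, 512), dtype="float32")
y_train = np.zeros((n_samples, 16, 512), dtype="float32")

# Keep only whole batches so no partial batch of 24 samples is ever produced.
n_full = (len(x_train) // batch_size) * batch_size
x_train, y_train = x_train[:n_full], y_train[:n_full]

print(len(x_train) % batch_size)  # 0 -> every batch has exactly `batch_size` rows
```

If the error disappears after trimming, the fixed batch size baked into the model's initial state is the likely culprit, and the data pipeline needs to either drop or pad the incomplete final batch; if it persists, the 24 is probably coming from some other setting.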

kaya27 commented Nov 16, 2016

How did you solve it?
