ValueError: Input dimension mis-match. (input[2].shape[0] = 2080, input[3].shape[0] = 32) #43

johnny12150 opened this issue May 8, 2021 · 0 comments

Comments

@johnny12150
Copy link

I disabled the custom GPU optimization following the instructions in README.md:

https://github.com/hidasib/GRU4Rec#executing-on-cpu

However, this triggers the following error:

  File "./models/theano/gru4rec\model\gru4rec.py", line 617, in fit
    cost = train_function(in_idx, y, len(iters), reset.reshape(len(reset), 1))
  File "A:\env\sess\lib\site-packages\theano\compile\function_module.py", line 917, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "A:\env\sess\lib\site-packages\theano\gof\link.py", line 325, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "A:\env\sess\lib\site-packages\six.py", line 702, in reraise
    raise value.with_traceback(tb)
  File "A:\env\sess\lib\site-packages\theano\compile\function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
ValueError: Input dimension mis-match. (input[2].shape[0] = 2080, input[3].shape[0] = 32)
Apply node that caused the error: Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)](TensorConstant{(1,) of 1e-24}, Elemwise{gt,no_inplace}.0, Sum{axis=[0], acc_dtype=float64}.0, Sum{axis=[1], acc_dtype=float64}.0)
Toposort index: 61
Inputs types: [TensorType(float64, (True,)), TensorType(bool, (True,)), TensorType(float64, vector), TensorType(float64, vector)]
Inputs shapes: [(1,), (1,), (2080,), (32,)]
Inputs strides: [(8,), (1,), (8,), (8,)]
Inputs values: [array([1.e-24]), array([False]), 'not shown', 'not shown']
Outputs clients: [[InplaceDimShuffle{x,0}(Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0), InplaceDimShuffle{0,x}(Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0), Elemwise{Log}[(0, 0)](Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0)]]

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Process finished with exit code 1
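For context, this is the generic error Theano raises when an elementwise node receives two non-broadcastable inputs whose first dimensions disagree (here 2080 vs. 32). The snippet below is not the GRU4Rec code, only a minimal sketch that reproduces the same class of error so the message is easier to read:

```python
# Minimal illustration (NOT the GRU4Rec model): an Elemwise op in Theano
# fails with "Input dimension mis-match" when two non-broadcastable vectors
# of different lengths are combined, like the 2080-vs-32 pair in the traceback.
import numpy as np
import theano
import theano.tensor as T

a = T.dvector('a')   # will receive a length-2080 vector
b = T.dvector('b')   # will receive a length-32 vector
out = a + b          # Elemwise add; shapes must match or be broadcastable

f = theano.function([a, b], out)
f(np.zeros(2080), np.zeros(32))
# ValueError: Input dimension mis-match. (input[0].shape[0] = 2080, input[1].shape[0] = 32)
```

In the traceback above, the two mismatched inputs come from Sum{axis=[0]} and Sum{axis=[1]} feeding the same node, and 2080 happens to equal 65 × 32, so the failure may stem from a score matrix being reduced along different axes once the custom GPU code path is disabled; this is only a guess from the reported shapes, not a confirmed diagnosis.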