
out,state = tflib.ops.FreeRunIm2LatexAttention('AttLSTM',emb_seqs,ctx,EMB_DIM,ENC_DIM,DEC_DIM,D,H,W) #19

Open
cltdevelop opened this issue May 13, 2018 · 2 comments


@cltdevelop

Hello!
Thank you for sharing your code!
Something goes wrong when I run it.
Is there a bug in attention.py? The order of the arguments in out, state = tflib.ops.FreeRunIm2LatexAttention('AttLSTM', emb_seqs, ctx, EMB_DIM, ENC_DIM, DEC_DIM, D, H, W) does not look right; shouldn't the parameter named ctx be passed in the second position?
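One way to check without guessing is to print the parameter list that ops.py actually declares and line it up with the positional arguments passed in attention.py. A minimal sketch, assuming the repo root is on PYTHONPATH and Python 3:

```python
import inspect
import tflib.ops  # assumes the im2latex repo root is importable

sig = inspect.signature(tflib.ops.FreeRunIm2LatexAttention)
params = list(sig.parameters)

# The positional arguments as written at the call site in attention.py.
args = ['AttLSTM', 'emb_seqs', 'ctx', 'EMB_DIM',
        'ENC_DIM', 'DEC_DIM', 'D', 'H', 'W']

# Show which parameter each argument actually binds to; if ctx lands on
# a slot meant for something else, the mismatch is visible immediately.
padded = args + ['<missing>'] * max(0, len(params) - len(args))
for param, arg in zip(params, padded):
    print('%-12s <- %s' % (param, arg))
```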


m--i commented Aug 19, 2019

Is that related to the error

ValueError: Dimension must be 3 but is 4 for 'transpose' (op: 'Transpose') with input shapes: [?,?,80], [4]

which occurs at

out, state = tflib.ops.FreeRunIm2LatexAttention('AttLSTM', emb_seqs, ctx, EMB_DIM, ENC_DIM, DEC_DIM, D, H, W)

in attention.py?

Is there a solution for this yet?
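For reference, the error reproduces with nothing more than a rank mismatch, which would fit the arguments being out of order: a rank-3 [batch, time, 80] tensor (presumably emb_seqs) is being handed to a transpose that expects the rank-4 ctx feature map. A minimal sketch in TF 1.x, the version this repo targets:

```python
import tensorflow as tf  # TF 1.x

# Transposing a rank-3 tensor with a length-4 permutation fails at
# graph-construction time with exactly the reported message, because
# shape inference checks that len(perm) matches the tensor's rank.
x = tf.placeholder(tf.float32, shape=[None, None, 80])
y = tf.transpose(x, [0, 3, 1, 2])
# ValueError: Dimension must be 3 but is 4 for 'transpose'
# (op: 'Transpose') with input shapes: [?,?,80], [4]
```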


m--i commented Aug 19, 2019

If you try to map the arguments passed in attention.py onto the parameters of FreeRunIm2LatexAttention(…) in ops.py, a value for output_dim is missing in attention.py.

Even in ops.py there is no explanation of this parameter.

What is the correct value to pass?
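Until the parameter is documented, one defensive option is to call the function with keyword arguments, so that a missing value such as output_dim raises a TypeError immediately instead of silently shifting every later positional argument by one slot. The keyword names below are guesses at what ops.py declares, not the confirmed signature; substitute the real names from the signature:

```python
import tflib.ops  # assumes the repo and attention.py's variables are available

# Hypothetical sketch: the keyword names are guesses. With keywords, a
# genuinely missing required argument (e.g. output_dim) fails with a
# clear TypeError rather than letting ctx and the dimension arguments
# bind to the wrong slots.
out, state = tflib.ops.FreeRunIm2LatexAttention(
    name='AttLSTM',
    ctx=ctx,                # encoder feature map from the CNN
    input_dim=EMB_DIM,      # guess
    output_dim=output_dim,  # guess: the value reported missing above
    ENC_DIM=ENC_DIM,
    DEC_DIM=DEC_DIM,
    D=D, H=H, W=W,
)
```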
