
Error when running evaluation step #139

Closed
gidim opened this issue Mar 31, 2017 · 1 comment


gidim commented Mar 31, 2017

Hi,

I'm following the NMT tutorial (medium) with my own data. The model trains fine but crashes when it reaches the eval step. Error:


InvalidArgumentError (see above for traceback): logits and labels must have the same first dimension, got logits shape [1120,69999] and labels shape [1568]
         [[Node: model/att_seq2seq/cross_entropy_sequence_loss/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits = SparseSoftmaxCrossEntropyWithLogits[T=DT_FLOAT, Tlabels=DT_INT64, _device="/job:localhost/replica:0/task:0/gpu:0"](model/att_seq2seq/cross_entropy_sequence_loss/SparseSoftmaxCrossEntropyWithLogits/Reshape, model/att_seq2seq/cross_entropy_sequence_loss/SparseSoftmaxCrossEntropyWithLogits/Reshape_1)]]
         [[Node: mean/broadcast_weights/assert_broadcastable/AssertGuard/switch_f/_282 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/cpu:0", send_device="/job:localhost/replica:0/task:0/gpu:0", send_device_incarnation=1, tensor_name="edge_1273_mean/broadcast_weights/assert_broadcastable/AssertGuard/switch_f", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/cpu:0"]()]]

69999 is my vocab size minus 1.
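For context, here is a minimal TF 1.x sketch (not taken from the seq2seq code; shapes are copied from the error message, data is hypothetical) of the shape contract the error refers to: sparse_softmax_cross_entropy_with_logits expects logits of shape [N, num_classes] and labels of shape [N], and in this case N disagrees between the flattened decoder outputs (1120) and the flattened targets (1568).

import numpy as np
import tensorflow as tf

# Placeholders with dynamic first dimension, so the mismatch only surfaces at run time,
# matching the InvalidArgumentError above rather than a graph-construction error.
logits_ph = tf.placeholder(tf.float32, [None, 69999])  # [batch*time, vocab]
labels_ph = tf.placeholder(tf.int64, [None])            # [batch*time] flattened targets

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels_ph, logits=logits_ph)

with tf.Session() as sess:
    # Feeding mismatched first dimensions reproduces the same
    # "logits and labels must have the same first dimension" error.
    sess.run(loss, feed_dict={
        logits_ph: np.zeros([1120, 69999], dtype=np.float32),
        labels_ph: np.zeros([1568], dtype=np.int64),
    })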

gidim changed the title from "Eval metrics not showing on TensorBoard" to "Error when running evaluation step" on Mar 31, 2017
dennybritz (Contributor) commented
Duplicate #103
