Creating an instance of BasicDecoder fails with AttentionMechanism without memory #511
Describe the bug
Creating a tfa.seq2seq.BasicDecoder fails when its cell is an AttentionWrapper whose AttentionMechanism was constructed without memory (i.e. the memory is meant to be provided later via setup_memory).
Code to reproduce the issue
import tensorflow as tf
import tensorflow_addons as tfa

units = 32
vocab_size = 1000
attention_mechanism = tfa.seq2seq.LuongAttention(units)
cell = tf.keras.layers.LSTMCell(units)
attention_wrapper = tfa.seq2seq.AttentionWrapper(
    cell, attention_mechanism)
vocab_proj_layer = tf.keras.layers.Dense(vocab_size)
decoder_sampler = tfa.seq2seq.sampler.TrainingSampler()
decoder = tfa.seq2seq.BasicDecoder(
    cell=attention_wrapper,
    sampler=decoder_sampler,
    output_layer=vocab_proj_layer)
Other info / logs
The cause of this error is probably the call to a private TensorFlow API.
This issue is probably related to #461
I encountered this issue when I was working on #335.
Thanks for the report!
As it is a private TensorFlow API, I would suggest implementing an alternative that works for us. What do you think?
Yes, I think that would work fine. However, it would restrict the types of RNN cells we could accept, so we should document this somewhere in the code to inform future users.
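A minimal sketch of what such an alternative check could look like, assuming the private API in question validates that the wrapped object behaves like an RNN cell (the function name and the exact set of attributes below are illustrative, not the actual tfa implementation):

```python
def assert_like_rnncell(cell_name, cell):
    """Raise TypeError unless `cell` looks like an RNN cell.

    Duck-typing check: instead of relying on TensorFlow's private
    helper, verify the attributes the decoder actually uses.
    """
    conditions = [
        hasattr(cell, "output_size"),
        hasattr(cell, "state_size"),
        hasattr(cell, "get_initial_state") or hasattr(cell, "zero_state"),
        callable(cell),
    ]
    if not all(conditions):
        raise TypeError(
            "The argument {!r} ({}) is not an RNNCell: it is missing "
            "output_size, state_size, get_initial_state/zero_state, "
            "or __call__.".format(cell_name, type(cell).__name__))
```

This accepts any object exposing the expected interface, which is exactly the restriction on accepted RNN cell types mentioned above and the reason it would need documenting.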