System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Google Colab
- TensorFlow version and how it was installed (source or binary): 2.0.0-dev20190914
- TensorFlow-Addons version and how it was installed (source or binary): 0.6.0-dev
- Python version: 3.6
- Is GPU used? (yes/no): Yes
Describe the bug
When creating a `BasicDecoder` with an `AttentionWrapper` cell whose `AttentionMechanism` was constructed without a memory, an error is raised:
ValueError: The AttentionMechanism instances passed to this AttentionWrapper should be initialized with a memory first, either by passing it to the AttentionMechanism constructor or calling attention_mechanism.setup_memory()
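Code to reproduce the issue

A minimal sketch of the failing pattern (the choice of `LuongAttention`, `TrainingSampler`, and the sizes are mine for illustration, not from the original report):

```python
import tensorflow as tf
import tensorflow_addons as tfa

units = 128

# The attention mechanism is created without a memory; the memory is
# meant to be bound later with attention_mechanism.setup_memory(...).
attention_mechanism = tfa.seq2seq.LuongAttention(units)
attention_cell = tfa.seq2seq.AttentionWrapper(
    tf.keras.layers.LSTMCell(units), attention_mechanism)

sampler = tfa.seq2seq.sampler.TrainingSampler()

# Raises the ValueError above: the constructor's assert_like_rnncell
# check probes output_size/state_size, which need an initialized memory.
decoder = tfa.seq2seq.BasicDecoder(attention_cell, sampler)
```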
The cause of this error is probably the `rnn_cell_impl.assert_like_rnncell("cell", cell)` check in `BasicDecoder`'s constructor: the assertion ends up evaluating `AttentionWrapper.output_size` or `AttentionWrapper.state_size`, which raise until a memory is set. This issue is probably related to #461. I encountered it while working on #335.
It appears `assert_like_rnncell` reimplements its own `hasattr` in terms of `getattr`: tensorflow/tensorflow@d70e8ee. This is certainly to support cells overriding `__getattribute__`, but it's unclear how common this use case is.
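For reference, the helper introduced in that commit looks roughly like this (paraphrased; the key point is that `getattr` evaluates property getters, so the `ValueError` raised by `AttentionWrapper.state_size` escapes the check instead of being treated as a missing attribute):

```python
def _hasattr(obj, attr_name):
    # getattr invokes property getters, so any exception other than
    # AttributeError (e.g. AttentionWrapper's ValueError) propagates.
    try:
        getattr(obj, attr_name)
    except AttributeError:
        return False
    else:
        return True
```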
Since it is a private TensorFlow API, I would suggest implementing an alternative that works for us. What do you think?
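A minimal sketch of what such an alternative could look like, assuming we only need to handle plain attributes and properties (the helper and its name are hypothetical, not an existing API):

```python
def assert_like_rnncell(cell_name, cell):
    # Hypothetical replacement check. Attributes are looked up on the
    # class, so property getters such as AttentionWrapper.state_size
    # are never invoked during validation.
    def _has_attr(obj, name):
        return hasattr(type(obj), name) or hasattr(obj, name)

    conditions = [
        _has_attr(cell, "output_size"),
        _has_attr(cell, "state_size"),
        callable(cell),
    ]
    if not all(conditions):
        raise TypeError(
            "The argument {!r} ({}) is not an RNNCell.".format(cell_name, cell))
```

The class-level lookup returns the property object itself rather than calling its getter, which is what would make this safe for an `AttentionWrapper` whose memory is not yet set.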
Yes, I think that would work fine. However, it would restrict the types of RNN cells we could potentially accept, so we should probably document this somewhere in our code to inform future users.
In addition, maybe we should warn users not to use our .output_size/.state_size before the memory is initialized, and double-check our own code to make sure it waits until the memory has been set.
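For context, a sketch of the intended late-binding flow, reusing the hypothetical names from the reproduction above (`encoder_outputs` stands in for the encoder's output tensor):

```python
attention_mechanism.setup_memory(encoder_outputs)  # bind the memory late
print(attention_cell.state_size)  # only safe to query from this point on
```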