@daniel-v-e I am not sure how hard it would be to implement; I've forgotten a bit how Bidirectional works. Does it concatenate the outputs at each step from two RNNs, one going forward and one backward? Does it involve "merging" the states (cells) when doing that (I don't think so, but just asking)? Because a TCN has no states, unlike a GRU or LSTM.
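For reference, a quick sanity check of the default behaviour (a sketch, assuming the Keras default `merge_mode='concat'`): the forward and backward outputs are concatenated along the feature axis at each step, and no states are merged between the two directions.

```python
import tensorflow as tf

# With the default merge_mode='concat', Bidirectional concatenates the
# forward and backward outputs along the last axis at each time step.
# No states are merged between the two directions.
inp = tf.keras.Input(shape=(10, 4))
out = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(8, return_sequences=True)
)(inp)
print(tf.keras.Model(inp, out).output_shape)  # (None, 10, 16): 8 + 8 units
```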
If it's just concatenating outputs at each step, with each TCN independent, then it's definitely doable, and it would not be so hard. I guess we can just do the following (sketched after the list):
- We need to add `go_backwards` in the constructor.
- If `go_backwards=True`, we need to flip the time dimension before the TCN layers are called.
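A minimal sketch of those two changes, using a much-simplified stand-in for the real `TCN` layer (note that `Bidirectional` also reads RNN-style attributes such as `return_sequences`, `return_state`, and `stateful`, so a complete patch would need more than this):

```python
import tensorflow as tf

class SimpleTCN(tf.keras.layers.Layer):
    """Simplified stand-in for keras-tcn's TCN, for illustration only."""

    def __init__(self, nb_filters=64, go_backwards=False, **kwargs):
        super().__init__(**kwargs)
        self.nb_filters = nb_filters
        self.go_backwards = go_backwards  # the attribute Bidirectional flips
        self.conv = tf.keras.layers.Conv1D(nb_filters, 3, padding='causal')

    def call(self, inputs):
        if self.go_backwards:
            # Flip the time dimension (axis 1) so the causal convolutions
            # effectively read the sequence back to front. Bidirectional
            # itself re-reverses the backward output so the time steps of
            # the two directions line up before concatenation.
            inputs = tf.reverse(inputs, axis=[1])
        return self.conv(inputs)

    def get_config(self):
        # Bidirectional clones the wrapped layer from its config and negates
        # go_backwards for the backward copy, so it must be serialized here.
        config = super().get_config()
        config.update({'nb_filters': self.nb_filters,
                       'go_backwards': self.go_backwards})
        return config
```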
Could this TCN implementation hypothetically be modified so that it can be wrapped by `tf.keras.layers.Bidirectional`? Currently it is not possible, as layers need the `go_backwards` attribute in order to be wrapped by the bidirectional layer; see https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional
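For context, the usage being asked about would look something like this (hypothetical; as the issue notes, it fails today precisely because `TCN` lacks `go_backwards`):

```python
import tensorflow as tf
from tcn import TCN  # keras-tcn

inputs = tf.keras.Input(shape=(None, 20))
# Desired: wrap the TCN the same way one would wrap an LSTM or GRU.
# Today this raises an error because TCN has no go_backwards attribute.
x = tf.keras.layers.Bidirectional(TCN(return_sequences=True))(inputs)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
```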