Bi-LSTM with attention tensor transformation #11

Closed
rejae opened this issue Sep 13, 2019 · 1 comment

rejae commented Sep 13, 2019

        outputs, current_state = tf.nn.bidirectional_dynamic_rnn(
            lstm_fw_cell, lstm_bw_cell, embedded_words,
            dtype=tf.float32, scope="bi-lstm" + str(idx))
        embedded_words = tf.concat(outputs, 2)

    # Split the last Bi-LSTM layer's output into its forward and backward halves
    outputs = tf.split(embedded_words, 2, -1)

Why concat the last layer's output first and then split it apart again for use? Both the concat and the split are along axis=2. Wouldn't it work to just use the outputs tuple returned by bidirectional_dynamic_rnn directly?

rejae closed this as completed Sep 13, 2019
rejae (Author) commented Sep 13, 2019

In the multi-layer case, embedded_words still has to be passed into the next layer as its input, so the (fw, bw) tuple has to be concatenated into a single tensor first. Got it now.
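For reference, a minimal self-contained sketch of the multi-layer pattern being discussed (TensorFlow 1.x). The hidden_sizes list, the placeholder shapes, and the cell construction are illustrative assumptions, not the repository's exact code:

    import tensorflow as tf  # TensorFlow 1.x API

    # Hypothetical shapes and layer sizes for illustration only
    seq_len, embed_dim = 50, 100
    hidden_sizes = [128, 128]  # one entry per stacked Bi-LSTM layer

    embedded_words = tf.placeholder(tf.float32, [None, seq_len, embed_dim])

    for idx, hidden_size in enumerate(hidden_sizes):
        lstm_fw_cell = tf.nn.rnn_cell.LSTMCell(hidden_size)
        lstm_bw_cell = tf.nn.rnn_cell.LSTMCell(hidden_size)
        outputs, current_state = tf.nn.bidirectional_dynamic_rnn(
            lstm_fw_cell, lstm_bw_cell, embedded_words,
            dtype=tf.float32, scope="bi-lstm" + str(idx))
        # The (fw, bw) tuple is concatenated into one [batch, time, 2*hidden]
        # tensor so it can serve as the input to the next Bi-LSTM layer.
        embedded_words = tf.concat(outputs, 2)

    # After the last layer, split back into the forward and backward halves
    # (e.g. so an attention layer can combine them, such as fw + bw).
    outputs = tf.split(embedded_words, 2, -1)

So inside the loop the concat is what makes stacking possible; the final split merely recovers the per-direction tensors that the raw outputs tuple would have provided for a single layer.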
