Hi there, I saw that in your attn_bi_lstm.py (and maybe other approaches too) the word embeddings are initialized randomly. I tried to use a pre-trained embedding instead, but I don't know how to set it up, and I get a "must have rank at least 3" error (sorry, I am new to TensorFlow). Thank you, much appreciated.
Hi, according to your description, I guess you want to use a pre-trained word embedding to initialize the embedding matrix. A common way is to assign the pre-trained embedding to the embedding variable:
# create the embedding matrix as a non-trainable variable
embeddings_var = tf.Variable(tf.constant(0.0, shape=[vocab_size_after_process, embedding_dim]), trainable=False)
# define an embedding placeholder to pass the pre-trained word embedding in
embedding_placeholder = tf.placeholder(tf.float32, [vocab_size_after_process, embedding_dim])
embedding_init = embeddings_var.assign(embedding_placeholder)  # an assign operation
Then you can run embedding_init in your session like this:
with tf.Session() as sess:
    sess.run(embedding_init, feed_dict={embedding_placeholder: embedding})
    # ... other code
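The `embedding` passed through the feed_dict is just a NumPy array with one row per word in your vocabulary. Here is a minimal sketch of building it from a GloVe-style text file (one word per line followed by its vector); the vocabulary, file contents, and dimensions are made up for illustration:

```python
import numpy as np
from io import StringIO

# Hypothetical vocabulary built during preprocessing: word -> integer id.
word_to_id = {"the": 0, "cat": 1, "sat": 2}
vocab_size_after_process, embedding_dim = len(word_to_id), 3

# GloVe-style text; in practice you would open the real file instead.
glove_file = StringIO(
    "the 0.1 0.2 0.3\n"
    "cat 0.4 0.5 0.6\n"
    "sat 0.7 0.8 0.9\n"
    "dog 1.0 1.1 1.2\n"
)

# Start from zeros so words missing from the file keep a zero vector.
embedding = np.zeros((vocab_size_after_process, embedding_dim), dtype=np.float32)
for line in glove_file:
    parts = line.split()
    word, vector = parts[0], np.asarray(parts[1:], dtype=np.float32)
    if word in word_to_id:  # skip pre-trained words outside your vocabulary
        embedding[word_to_id[word]] = vector

print(embedding[word_to_id["cat"]])  # the row for "cat"
```

Note that the array's shape must match the placeholder exactly: [vocab_size_after_process, embedding_dim].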
Word embedding
My trial
Note that I got an error at inputs=batch_embedded
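For what it's worth, the "must have rank at least 3" error at inputs=batch_embedded usually means the LSTM received the rank-2 batch of word ids instead of the rank-3 batch of embedded vectors. A NumPy sketch of the expected shapes (all sizes here are made up; in TensorFlow the lookup step is tf.nn.embedding_lookup):

```python
import numpy as np

vocab_size, embedding_dim = 1000, 50
batch_size, seq_len = 4, 10

# Embedding matrix: one row per word, shape [vocab_size, embedding_dim].
embedding = np.random.rand(vocab_size, embedding_dim).astype(np.float32)

# A batch of word ids is rank 2: [batch_size, seq_len].
word_ids = np.random.randint(0, vocab_size, size=(batch_size, seq_len))

# Looking each id up in the embedding matrix (what tf.nn.embedding_lookup
# does) yields the rank-3 tensor the LSTM expects:
# [batch_size, seq_len, embedding_dim].
batch_embedded = embedding[word_ids]

print(batch_embedded.shape)  # (4, 10, 50)
```

So make sure batch_embedded is the result of the embedding lookup, not the raw id tensor, before it reaches the RNN.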