
Variable Wemb/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope? #7

Closed
chaiyixuan opened this issue Jan 22, 2018 · 3 comments


@chaiyixuan
code:

```python
with tf.variable_scope(tf.get_variable_scope(), reuse=None):
    train_op = tf.train.AdamOptimizer(self.lr).minimize(loss)
```

error:

```
Variable Wemb/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
```

I don't know why.

@pochih (Owner)

pochih commented Jan 23, 2018

Are you running it with a trained model?
Also, is your TensorFlow version correct?

@chaiyixuan (Author)

chaiyixuan commented Jan 23, 2018

My TensorFlow version is 1.4. I changed the code and now it works:

```python
def build_model(self):
    with tf.variable_scope(tf.get_variable_scope()):  # add this line in model.py
        word_vectors = tf.placeholder(tf.float32, [self.batch_size, self.n_encode_lstm_step, self.dim_wordvec])
```

@pochih (Owner)

pochih commented Feb 1, 2018

Good Job!
