
Error for running with TensorFlow: ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope? #2

Open
heathervica opened this issue Jun 7, 2017 · 6 comments


@heathervica

Hello, could you please help me? I got an error when running the script. I tried to figure out how to solve it but could not. I also tried Python 2, 3.5, and 3.6, and TensorFlow 1.0 and 1.1.

The complete error was:
heather@heather-ThinkPad-P50:~/Downloads/generative-adversarial-networks-master (2)$ python3 gan-script.py
Extracting MNIST_data/train-images-idx3-ubyte.gz
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
Traceback (most recent call last):
File "gan-script.py", line 131, in
d_trainer_fake = tf.train.AdamOptimizer(0.0003).minimize(d_loss_fake, var_list=d_vars)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 325, in minimize
name=name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 446, in apply_gradients
self._create_slots([_get_variable_for(v) for v in var_list])
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/adam.py", line 122, in _create_slots
self._zeros_slot(v, "m", self._name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 766, in _zeros_slot
named_slots[_var_key(var)] = slot_creator.create_zeros_slot(var, op_name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 174, in create_zeros_slot
colocate_with_primary=colocate_with_primary)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 146, in create_slot_with_initializer
dtype)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 66, in _create_slot_var
validate_shape=validate_shape)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 1049, in get_variable
use_resource=use_resource, custom_getter=custom_getter)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 948, in get_variable
use_resource=use_resource, custom_getter=custom_getter)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 356, in get_variable
validate_shape=validate_shape, use_resource=use_resource)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 341, in _true_getter
use_resource=use_resource)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 671, in _get_single_variable
"VarScope?" % name)
ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
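
For context on why this happens: in TF 1.x, AdamOptimizer.minimize() creates its slot variables (e.g. d_w1/Adam) through tf.get_variable(). Once a variable scope is in reuse mode, tf.get_variable() may only fetch existing variables, so the slot creation fails with exactly this ValueError. The GAN script switches the scope to reuse so the discriminator can be applied twice, and the optimizers are then built inside that reused scope. A minimal sketch that reproduces the error (the scope name gan here is just for illustration), assuming TensorFlow 1.x:

import tensorflow as tf

with tf.variable_scope('gan') as scope:
    d_w1 = tf.get_variable('d_w1', shape=[2], initializer=tf.zeros_initializer())
    loss = tf.reduce_sum(tf.square(d_w1))
    scope.reuse_variables()  # the GAN script does this to call discriminator() a second time
    # minimize() now tries tf.get_variable() for the d_w1/Adam slot
    # while reuse=True, which raises the ValueError above:
    train_op = tf.train.AdamOptimizer(0.0003).minimize(loss, var_list=[d_w1])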

@fulkast

fulkast commented Jun 28, 2017

I made some necessary name-scope fixes to run the code on my machine, where I use TensorFlow 1.2.
I've submitted a PR which is here: #3

@ghost

ghost commented Oct 5, 2017

Does anybody know how to solve this problem? Otherwise this piece of code is useless.

@RanaivosonHerimanitra

RanaivosonHerimanitra commented Nov 1, 2017

Restart the kernel and run the following in place of cell number 4 of the original notebook:

sess = tf.Session()
batch_size = 50
z_dimensions = 100
x_placeholder = tf.placeholder("float", shape=[None, 28, 28, 1], name='x_placeholder')
Gz = generator(batch_size, z_dimensions)   # generated (fake) images
Dx = discriminator(x_placeholder)          # discriminator's verdict on real images

# Capture a handle to the root variable scope before any reuse flag is
# set, so the optimizers can be built inside it later.
with tf.variable_scope(tf.get_variable_scope()) as scope:
    pass
Dg = discriminator(Gz, reuse=True)         # discriminator's verdict on fake images

g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dg, labels=tf.ones_like(Dg)))
d_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dx, labels=tf.fill([batch_size, 1], 0.9)))
d_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dg, labels=tf.zeros_like(Dg)))
d_loss = d_loss_real + d_loss_fake

tvars = tf.trainable_variables()
d_vars = [var for var in tvars if 'd_' in var.name]
g_vars = [var for var in tvars if not 'g_' in var.name]

# Re-enter the captured scope so Adam can create its slot variables
# with tf.get_variable() instead of hitting the reused scope.
with tf.variable_scope(scope):
    d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)
    d_trainer_real = tf.train.AdamOptimizer(0.0001).minimize(d_loss_real, var_list=d_vars)
    g_trainer = tf.train.AdamOptimizer(0.0001).minimize(g_loss, var_list=g_vars)

@pudepiedj

The fix from RanaivosonHerimanitra solves the problem. Thank you.

@chuong98

chuong98 commented May 20, 2018

The fix from RanaivosonHerimanitra lets the code run, but does not yield the correct result.
Changing this line in his code should correct it (remove the not):

g_vars = [var for var in tvars if 'g_' in var.name]

If you also want the code to run continuously, change the plotting part to:

            plt.imshow(im.reshape([28, 28]), cmap='Greys')
            plt.show(block=False)
            plt.pause(2)
            plt.close()

@7LFB

7LFB commented Feb 17, 2020

with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
    optimizer_step = tf.train.AdamOptimizer(0.0001).minimize(loss, var_list=var)
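
For completeness, here is how that suggestion would look applied to the optimizers from the earlier comments. This is only a sketch, assuming a TensorFlow version that has tf.AUTO_REUSE (1.4 or newer) and the variable names used above:

# tf.AUTO_REUSE makes tf.get_variable() create a variable when it does
# not exist yet and reuse it when it does, so Adam's slot variables can
# be created even after the scope has been switched to reuse mode.
with tf.variable_scope(tf.get_variable_scope(), reuse=tf.AUTO_REUSE):
    d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)
    d_trainer_real = tf.train.AdamOptimizer(0.0001).minimize(d_loss_real, var_list=d_vars)
    g_trainer = tf.train.AdamOptimizer(0.0001).minimize(g_loss, var_list=g_vars)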
