Error for running with TensorFlow: ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope? #2
I made some necessary name-scope fixes to run the code on my machine, where I use TensorFlow 1.2.
Does anybody know how to solve this problem? Otherwise this piece of code is useless.
Restart the kernel and run the following in place of cell number 4 of the original notebook:
The fix from RanaivosonHerimanitra solves the problem. Thank you.
The fix from RanaivosonHerimanitra lets the code run, but does not yield correct results.
If you also want the code to run continuously, change the plt.plot part to:
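The exact plt.plot replacement was not captured above. As a rough sketch of what "running continuously" usually means here, the loop below uses matplotlib's interactive mode so each redraw no longer blocks the training loop; the random 28x28 image is a stand-in for a generated MNIST sample, not the tutorial's actual output:

```python
import matplotlib.pyplot as plt
import numpy as np

plt.ion()  # interactive mode: drawing no longer blocks the loop
fig, ax = plt.subplots()

for step in range(3):  # stand-in for the GAN training loop
    ax.clear()
    # placeholder image; in the notebook this would be a generated sample
    ax.imshow(np.random.rand(28, 28), cmap="gray")
    ax.set_title(f"step {step}")
    plt.pause(0.001)  # flush the GUI event loop and redraw

plt.ioff()
```

With this pattern the figure window updates in place each iteration instead of halting execution at every plt.show() call.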
with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
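For context, a minimal sketch of how AUTO_REUSE resolves this class of error, assuming TensorFlow 1.x semantics (written against tf.compat.v1 so it also runs under TF 2). The discriminator body and shapes here are illustrative placeholders, not the tutorial's actual network; the original ValueError typically appears when the surrounding scope is left in reuse mode at the time minimize() tries to create the Adam slot variables:

```python
import tensorflow.compat.v1 as tf  # assumption: TF >= 1.14 or TF 2.x

tf.disable_eager_execution()

def discriminator(images, scope="discriminator"):
    # AUTO_REUSE creates variables on the first call and silently
    # reuses them on later calls, so calling discriminator() twice
    # (once on real images, once on generated ones) shares d_w1
    # without flipping reuse=True on the outer scope.
    with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
        d_w1 = tf.get_variable(
            "d_w1", shape=[784, 1],
            initializer=tf.truncated_normal_initializer(stddev=0.02))
        return tf.matmul(images, d_w1)

x = tf.placeholder(tf.float32, [None, 784])
real_logits = discriminator(x)  # first call: creates d_w1
fake_logits = discriminator(x)  # second call: reuses d_w1, no ValueError
```

Because reuse stays confined to the discriminator's own scope, the optimizer can still create its d_w1/Adam slot variables in the (non-reusing) outer scope.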
Hello, could you please help me? I got an error when running the script and tried to figure out how to solve it, but could not. I also tried Python 2, 3.5, and 3.6, and TensorFlow 1 and 1.1.
The complete error was:
heather@heather-ThinkPad-P50:~/Downloads/generative-adversarial-networks-master (2)$ python3 gan-script.py
Extracting MNIST_data/train-images-idx3-ubyte.gz
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
Traceback (most recent call last):
File "gan-script.py", line 131, in
d_trainer_fake = tf.train.AdamOptimizer(0.0003).minimize(d_loss_fake, var_list=d_vars)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 325, in minimize
name=name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 446, in apply_gradients
self._create_slots([_get_variable_for(v) for v in var_list])
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/adam.py", line 122, in _create_slots
self._zeros_slot(v, "m", self._name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 766, in _zeros_slot
named_slots[_var_key(var)] = slot_creator.create_zeros_slot(var, op_name)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 174, in create_zeros_slot
colocate_with_primary=colocate_with_primary)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 146, in create_slot_with_initializer
dtype)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/slot_creator.py", line 66, in _create_slot_var
validate_shape=validate_shape)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 1049, in get_variable
use_resource=use_resource, custom_getter=custom_getter)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 948, in get_variable
use_resource=use_resource, custom_getter=custom_getter)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 356, in get_variable
validate_shape=validate_shape, use_resource=use_resource)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 341, in _true_getter
use_resource=use_resource)
File "/home/heather/anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/variable_scope.py", line 671, in _get_single_variable
"VarScope?" % name)
ValueError:
Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?