TensorFlow 2.0 - ValueError: tf.function-decorated function tried to create variables on non-first call #36574
@arjun-majumdar,
Going through the link and the related tutorial.
Hello, I have the following issue which doesn't make sense to me. The code that I have is as follows:
If I re-execute the for i in range(1, 6): block of code again, why do I get "ValueError: tf.function-decorated function tried to create variables on non-first call."? What I am not getting is that the train_step() and test_step() tf.function-annotated functions are already traced and the AutoGraphs (or tf.Graph objects) have already been created for them. Also, the parameters being provided to these functions are not changing.
Is the ValueError happening due to the line grads = tape.gradient(loss, model.trainable_variables) within the train_step() function and predictions = model(data) within the test_step() function? These are the only two lines within the two functions which create a variable; but then again, the gradients with respect to the parameters and the model's predictions will always be computed within the tf.function-annotated functions, and you cannot pass such values as parameters to the function(s).
Thanks
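A minimal sketch of the pattern that triggers this error (the model, optimizer, and data below are placeholders, not the exact notebook code):

```python
import tensorflow as tf

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function
def train_step(model, optimizer, data, labels):
    with tf.GradientTape() as tape:
        predictions = model(data, training=True)
        loss = loss_fn(labels, predictions)
    grads = tape.gradient(loss, model.trainable_variables)
    # Keras optimizers lazily create "slot" variables (e.g. Adam's m and v)
    # the first time apply_gradients() runs, i.e. while the function is traced.
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

data = tf.zeros([8, 4])
labels = tf.zeros([8], dtype=tf.int32)
for i in range(1, 6):
    model = tf.keras.Sequential([tf.keras.layers.Dense(10)])  # fresh model
    optimizer = tf.keras.optimizers.Adam()                    # fresh optimizer
    # Iteration 1 traces train_step and creates its variables: fine.
    # Later iterations retrace for the new Python objects and try to create
    # variables again -> "tried to create variables on non-first call".
    train_step(model, optimizer, data, labels)
```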
@arjun-majumdar have you seen this related issue #27120? This comment might help you. Try creating a wrapper function for your train_one_step() function and then call it separately when you train your different models.
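Something along these lines (a sketch only; build_model, loss_fn, data, and labels are placeholders for your own code):

```python
import tensorflow as tf

def make_train_one_step():
    # Return a brand-new tf.function each time, so every model/optimizer
    # pair gets its own first trace and may create its variables freely.
    @tf.function
    def train_one_step(model, optimizer, data, labels):
        with tf.GradientTape() as tape:
            loss = loss_fn(labels, model(data, training=True))
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss
    return train_one_step

for i in range(1, 6):
    model = build_model()                    # placeholder model factory
    optimizer = tf.keras.optimizers.Adam()
    train_one_step = make_train_one_step()   # fresh trace per model
    train_one_step(model, optimizer, data, labels)
```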
@nikitamaia Let me have a look and get back to you. Thanks! |
Hi @arjun-majumdar, were you able to get your code working by creating a wrapper function?
Hello @nikitamaia, the tf.function-annotated function works after being wrapped; however, the performance benefits gained from the @tf.function annotation are lost. Also, why should the generated graph be retraced if the neural network architecture isn't changing and the data types of the tensors are also not changing?
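To illustrate what I mean about retracing (a toy example, not the notebook code): even two models with identical architectures are distinct Python objects, and tf.function keys its trace cache on object identity:

```python
import tensorflow as tf

@tf.function
def forward(model, x):
    print("tracing")   # Python side effect: runs only while a new trace is made
    return model(x)

x = tf.zeros([1, 4])
m1 = tf.keras.Sequential([tf.keras.layers.Dense(10)])
m2 = tf.keras.Sequential([tf.keras.layers.Dense(10)])  # same architecture
m1(x); m2(x)           # build both eagerly so no variables are created in-graph

forward(m1, x)         # prints "tracing": first trace
forward(m1, x)         # silent: the existing trace is reused
forward(m2, x)         # prints "tracing" again: m2 is a different Python object
```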
Sorry for the late response here. Wanted to provide a quick update that this does seem to be a bug. I know the workaround of wrapping the function is not ideal, but I can update this thread when there's progress on this. |
@nikitamaia Thanks for the reply. I'll be happy to receive an update if a fix/solution is found for this bug.
The Better Performance With tf.function guide has now been updated to provide more detail about this error and about using tf.Variables with multiple Keras models or optimizers. Closing this issue now since there is a workaround provided in the docs.
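Paraphrasing the pattern from the guide (a sketch, not the docs' exact code): a tf.function may create tf.Variables, but only on its first call, so guard the creation:

```python
import tensorflow as tf

class Counter(tf.Module):
    def __init__(self):
        self.count = None

    @tf.function
    def __call__(self):
        if self.count is None:           # executes only during the first trace
            self.count = tf.Variable(0)
        return self.count.assign_add(1)

c = Counter()
print(c())  # 1 -- the variable is created on the first call
print(c())  # 2 -- the trace is reused; no new variables, no ValueError
```

For multiple models, the guide's advice amounts to the wrapper approach above: give each model/optimizer pair its own tf.function.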
@nikitamaia I'm afraid that isn't a workaround; it's just the rule, and it doesn't meet the user's need.
Hello, I have code (for the MNIST dataset) in which I am doing the following steps:
I do the following steps iteratively 'n' times.
The code can be found in:
https://github.com/arjun-majumdar/tensorflow_codes/blob/master/Recreating_Error.ipynb
For retraining a pruned model, I use GradientTape along with a mask. Now, the first time the model is trained using the train_one_step() and test_step() functions, which are @tf.function-annotated, things work fine. But when I try to use them again (in cell 76 of the Jupyter notebook), it gives me the error: ValueError: tf.function-decorated function tried to create variables on non-first call.
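For reference, the masked retraining step looks roughly like this (a sketch; masks here is assumed to be a list of 0/1 tensors matching model.trainable_variables, and loss_fn is a placeholder):

```python
import tensorflow as tf

@tf.function
def train_one_step(model, optimizer, masks, data, labels):
    with tf.GradientTape() as tape:
        loss = loss_fn(labels, model(data, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    # Zero out the gradients of pruned weights so they remain pruned.
    masked_grads = [g * m for g, m in zip(grads, masks)]
    optimizer.apply_gradients(zip(masked_grads, model.trainable_variables))
    return loss
```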
The only way of avoiding this ValueError is by re-running the train_one_step() and test_step() @tf.function-annotated functions!
Why is this happening?
Thanks!