ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval. #12521
Comments
Which version are you on? I am hitting this on some generic math manipulations in tf 2.0. I think it is a weird error message.
@cottrell same issue
@Utsav-Patel, @luozhouyang, @cottrell Does your code have any weights that were defined but left unused? That may be the reason for this error. My guess is that since an unused weight does not contribute to the loss, its gradient with respect to the loss cannot be computed, so the gradient is None. This is more difficult to identify if your layer inherits from another layer: calling the super constructor may add weights that you never use, in which case don't call super(). I've coded an example to show this in action (tf version 1.13.1, keras 2.2.4). Comment out the line that uses self.kernelB inside call() to get the error: if that line is commented out, self.kernelB is never used, and Keras gives you the error.
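The commenter's example was not preserved here, so below is a minimal sketch of the pattern being described (the class name `TwoKernelLayer` and all variable names are mine, not the original code): a custom layer that creates two weights in `build()` but only has to use one of them in `call()`.

```python
import numpy as np
from keras import backend as K
from keras.layers import Dense, Layer
from keras.models import Sequential

class TwoKernelLayer(Layer):
    def __init__(self, units, **kwargs):
        self.units = units
        super(TwoKernelLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Both weights are created here, whether or not call() uses them.
        self.kernelA = self.add_weight(name='kernelA',
                                       shape=(input_shape[-1], self.units),
                                       initializer='uniform', trainable=True)
        self.kernelB = self.add_weight(name='kernelB',
                                       shape=(self.units, self.units),
                                       initializer='uniform', trainable=True)
        super(TwoKernelLayer, self).build(input_shape)

    def call(self, x):
        out = K.dot(x, self.kernelA)
        # Comment out the next line and kernelB is never used, so its
        # gradient is None and fit() raises the ValueError above.
        out = K.dot(out, self.kernelB)
        return out

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

model = Sequential([TwoKernelLayer(8, input_shape=(4,)), Dense(1)])
model.compile(optimizer='sgd', loss='mse')
model.fit(np.random.rand(16, 4), np.random.rand(16, 1), epochs=1, verbose=0)
```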
@abaxi I could reproduce this error with another example similar to yours. You can find the example here: https://stackoverflow.com/a/58533503/3924118. Just remove the usage of one of the weights to trigger the error.
@Utsav-Patel This error arises when some of the weights in your model are not used, so their gradients cannot be computed and the model is reported as not differentiable. Make sure you use all the weights in the model to overcome this error.
How do you ensure all the weights in the model are used?
In my case I just used the leftover weights by multiplying them by 0, so that all weights are covered. This solved the issue.
Thank you, Sir.
Can you please suggest a way to check for unused weights?
I was using only a part of the hidden-node weights to calculate the output. So, after getting this error, I multiplied the remaining hidden-node weights by zero, so that all the weights are covered.
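Applied to the `TwoKernelLayer` sketch earlier in this thread, that zero-multiplier workaround could look like the following (my illustration, not the commenter's actual code):

```python
    def call(self, x):
        out = K.dot(x, self.kernelA)
        # Touch the otherwise-unused kernelB with a zero multiplier: its
        # gradient becomes zero instead of None, so training proceeds.
        return out + 0.0 * K.sum(self.kernelB)
```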
I asked my question on StackOverflow. Link.

I tried to make a custom layer using Keras. I only want to implement the following 2 lines of code in the `call` function, and they should be trainable:

```python
AV = K.dot(A, Vin)
Vout = K.dot(AV, W)
```

The dimensions of `A`, `Vin` and `W` are `(n, n)`, `(?, n, c)` and `(c, f)` respectively. I would like to train my network on the MNIST or CIFAR-10 dataset. Sharky said in her answer that it depends on the dataset and the data shapes, but I don't understand exactly what the problem is here. Please, someone help me overcome this problem. Thank you.
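For context, here is one way the described layer could be structured so that `W` is the only trainable weight and it is always consumed in `call()`. This is a hedged reconstruction under my own assumptions, not the poster's code: the class name `GraphDot`, the `filters` argument, and treating `A` as a fixed constant are all mine.

```python
import tensorflow as tf
from keras import backend as K
from keras.layers import Layer

class GraphDot(Layer):
    """Computes Vout = (A @ Vin) @ W for a fixed (n, n) matrix A."""
    def __init__(self, A, filters, **kwargs):
        self.A = K.constant(A)    # fixed mixing matrix, deliberately not a weight
        self.filters = filters    # f, the number of output channels
        super(GraphDot, self).__init__(**kwargs)

    def build(self, input_shape):
        # input_shape is (batch, n, c); W has shape (c, f). If W were created
        # here but never used in call(), its gradient would be None and Keras
        # would raise the ValueError this issue is about.
        self.W = self.add_weight(name='W',
                                 shape=(input_shape[-1], self.filters),
                                 initializer='glorot_uniform', trainable=True)
        super(GraphDot, self).build(input_shape)

    def call(self, Vin):
        # einsum keeps the batch axis first: (n, n) x (?, n, c) -> (?, n, c)
        AV = tf.einsum('ij,bjc->bic', self.A, Vin)
        # (?, n, c) x (c, f) -> (?, n, f); this step consumes W
        return K.dot(AV, self.W)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[1], self.filters)
```

The `(?, n, f)` output can then be flattened and fed into a Dense softmax classifier for MNIST-style training.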