Thanks for sharing the code! But I still have a question.
In a PAC layer, a kernel is computed from the guide input, and that kernel is then applied to the target input to produce the output tensor. So when the network does backpropagation, does the relationship between the kernel weights and the guide input change? Or are the kernel weights updated only through the gradient?
I would be very grateful if you could give me some help. Thanks!
The backward pass of the layer provides gradients on all three: a) the weights (just like with a regular conv2d), b) the input features, and c) the guide features. What we call the kernel can be considered an intermediate variable that depends only on the guide features.
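A minimal sketch of this idea, assuming a simplified stand-in for the actual PAC layer: here the adaptive kernel is a fixed Gaussian function of the guide features (a hypothetical choice for illustration, not the pacnet implementation), and it modulates a regular convolution. Autograd then produces gradients for all three leaves, while the kernel itself carries no learnable parameters:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 3, 5, 5, requires_grad=True)      # input (target) features
guide = torch.randn(1, 3, 5, 5, requires_grad=True)  # guide features
w = torch.randn(4, 3, 1, 1, requires_grad=True)      # conv weights

# Adaptive kernel: an intermediate variable computed only from the guide
kernel = torch.exp(-0.5 * (guide ** 2).sum(dim=1, keepdim=True))

# Modulate a regular convolution by the guide-dependent kernel
out = kernel * F.conv2d(x, w)
out.sum().backward()

# Gradients reach weights, input features, and guide features;
# the kernel is recomputed from the guide on every forward pass
print(x.grad is not None, guide.grad is not None, w.grad is not None)
# → True True True
```

So the kernel-to-guide relationship is a fixed function: only `w` (and whatever produces the guide upstream) is updated by the optimizer, and the kernel simply changes because the guide changes.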