
A question about updating the weight of the kernel #29

Closed
OmigaXm opened this issue Mar 18, 2022 · 2 comments

@OmigaXm

OmigaXm commented Mar 18, 2022

Thanks for sharing the code! But I still have a question.

In a PAC layer, a kernel is first computed from the guide input, and that kernel is then applied together with the target input to produce the output tensor. So during backpropagation, does the relationship between the kernel's weights and the guide input change? Or are the kernel's weights updated only through the gradient?

I would be very grateful if you could help me with this. Thanks!

@suhangpro
Contributor

The backward pass of the layer will provide gradients on all three: a) weights (just like with regular conv2d), b) input features, and c) guide features. What we call kernel can be considered as an intermediate variable that depends only on guide features.
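
To make that concrete, here is a minimal PyTorch sketch (not the repository's PacConv2d, just a simplified pixel-adaptive filtering step with an assumed fixed Gaussian kernel over guide differences). It illustrates that the adaptive kernel is only an intermediate value computed from the guide, and that a single backward pass produces gradients for the learned weight, the input features, and the guide features:

```python
import torch
import torch.nn.functional as F

# Toy sizes: batch, channels, spatial height/width, kernel size.
B, C, H, W = 1, 4, 8, 8
k = 3
pad = k // 2

x = torch.randn(B, C, H, W, requires_grad=True)       # input (target) features
guide = torch.randn(B, C, H, W, requires_grad=True)   # guide features
weight = torch.randn(C, C, k, k, requires_grad=True)  # learned conv weight

# Gather k*k neighborhoods of the input and the guide: (B, C, k*k, H*W).
x_patches = F.unfold(x, k, padding=pad).view(B, C, k * k, H * W)
g_patches = F.unfold(guide, k, padding=pad).view(B, C, k * k, H * W)
g_center = guide.view(B, C, 1, H * W)

# Adaptive kernel computed from the guide alone (assumed fixed Gaussian form,
# no learnable parameters of its own) -- an intermediate variable.
K = torch.exp(-0.5 * (g_patches - g_center) ** 2)

# Pixel-adaptive convolution: modulate each neighbor by K, then apply the
# shared learned weight as in a regular convolution.
modulated = (K * x_patches).view(B, C * k * k, H * W)
out = torch.einsum('oi,bip->bop', weight.view(C, C * k * k), modulated)
out = out.view(B, C, H, W)

# One backward pass yields gradients on all three quantities.
out.sum().backward()
print(weight.grad.shape, x.grad.shape, guide.grad.shape)
```

In this sketch the guide-to-kernel mapping is a fixed function with no parameters of its own, so nothing in it gets "updated"; the learned weight receives its gradient as usual, and the guide also receives a gradient that can flow back to whatever produced it.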

@OmigaXm
Author

OmigaXm commented Apr 22, 2022

Oh, I see. Thanks!

OmigaXm closed this as completed May 4, 2022