
bug in 2nd order? #26

Open
Interesting6 opened this issue Jun 13, 2019 · 4 comments

Comments


Interesting6 commented Jun 13, 2019

I see that in your code the inner loop just calls

self.net(x_spt[i], fast_weights, bn_training=True)

However, the torch.autograd.grad() method has the following parameter:

create_graph (bool, optional) – If True, graph of the derivative will be constructed, allowing to compute higher order derivative products. Default: False.

Does that mean your code only computes the 1st-order derivative?
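For concreteness, here is a minimal sketch of the kind of inner-loop step I mean (my own illustration, not this repo's code; `inner_update`, `net`, and `loss_fn` are hypothetical names):

```python
import torch

# Hypothetical MAML inner-loop step; the names here are placeholders.
def inner_update(net, loss_fn, x_spt, y_spt, lr, second_order=True):
    loss = loss_fn(net(x_spt), y_spt)
    params = list(net.parameters())
    # create_graph=True builds the graph of the derivative itself, so the
    # outer (meta) loss can backpropagate through this update and pick up
    # the 2nd-order terms of full MAML. With create_graph=False the grads
    # are detached, and only the 1st-order path survives (FOMAML).
    grads = torch.autograd.grad(loss, params, create_graph=second_order)
    fast_weights = [p - lr * g for p, g in zip(params, grads)]
    return fast_weights
```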

Thank you!

Interesting6 changed the title from "bug2nd order?" to "bug in 2nd order?" on Jun 13, 2019
@jayzhan211

self.net is not using torch.autograd.grad()

@jingjingjing-666

Same confusion here.

@Vampire-Vx

I think it is actually a 1st-order approximation.
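To spell out what the first-order approximation drops (my own sketch in standard MAML notation, not anything quoted from this repo): with one inner step $\theta' = \theta - \alpha \nabla_\theta L_{spt}(\theta)$, the chain rule gives

```latex
\nabla_\theta L_{qry}(\theta')
  = \left( I - \alpha \nabla^2_\theta L_{spt}(\theta) \right)
    \nabla_{\theta'} L_{qry}(\theta')
```

First-order MAML (FOMAML) drops the Hessian term and uses $\nabla_{\theta'} L_{qry}(\theta')$ directly, which is exactly what you get when the inner-loop gradients are taken with create_graph=False.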

@iamxiaoyubei

+1
