
For higher versions of PyTorch, do you mean that all backward functions need "retain_graph=True, create_graph=True"? #6

Open
wangxiao5791509 opened this issue Feb 18, 2019 · 4 comments

Comments

@wangxiao5791509

“If you use our code based on a high-level version of PyTorch for other tasks, please ensure the "retain_graph=True, create_graph=True" in the backward function. ”

For higher versions of PyTorch, do you mean that all the backward functions need to be set this way?
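For context, the flags in question control whether PyTorch keeps the autograd graph after a backward pass. A minimal sketch (not the repo's actual training loop, just an illustration of what the flags do) is the following second-derivative example, where `create_graph=True` builds a graph of the gradient itself and `retain_graph=True` allows backpropagating through the same graph again:

```python
import torch

# Leaf variable we will differentiate twice: y = x^3
x = torch.tensor([2.0], requires_grad=True)
y = (x ** 3).sum()

# First backward pass. create_graph=True records the gradient computation
# itself so it can be differentiated again; retain_graph=True keeps the
# original graph alive for a second backward pass.
y.backward(retain_graph=True, create_graph=True)
first_grad = x.grad.clone()   # dy/dx = 3x^2 = 12.0, still attached to the graph

# Second backward pass through the gradient: d2y/dx2 = 6x = 12.0
x.grad = None                 # clear accumulated gradient before reusing x.grad
first_grad.backward()
second_grad = x.grad
```

Without `create_graph=True`, the second `backward()` call would fail because `first_grad` would be a plain tensor with no graph attached, which is likely why newer PyTorch versions raise an error if the code omits these flags.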

@Tomingz

Tomingz commented Feb 27, 2019

Hello, may I ask how you set this up? I am using PyTorch 0.4.

@wangxiao5791509
Author

@Tomingz I am not sure; the author has not responded yet. I think all the backward functions would need to be modified. It is more straightforward to train and test this code with PyTorch 0.2 directly.

@Tomingz

Tomingz commented Feb 28, 2019

OK, thank you. I will test with PyTorch 0.2.

@ghost

ghost commented May 22, 2019

Where is the backward function?
I cannot find `retain_graph=True, create_graph=True` anywhere in the code.
