
Usage of redner training in other network #50

Closed
YuanxunLu opened this issue Jul 20, 2019 · 3 comments

Comments

@YuanxunLu

YuanxunLu commented Jul 20, 2019

Thanks for this wonderful work!
I ran into trouble when I tried to use redner as a differentiable rendering module inside my full network. Given an image I, I train a network X to predict the vertices corresponding to it. X includes an encoder that learns features of I from which the vertices are constructed.
So I can get

Pred_vertices = X(I)

Next, I want to use redner to learn textures and lighting by rendering these vertices onto the 2D image plane.
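For context, my forward pass looks roughly like the sketch below. It is only a sketch: the camera, material, and light values are placeholders, the pyredner calls follow the single-triangle tutorial (so keyword names may differ across redner versions), and faces and diffuse stand for my fixed triangle topology and a learnable albedo.

import torch
import pyredner

pyredner.set_use_gpu(torch.cuda.is_available())

def render_predicted_mesh(X, I, faces, diffuse):
    # X: encoder network, I: input image tensor,
    # faces: (F, 3) int32 triangle indices on pyredner.get_device(),
    # diffuse: (3,) learnable diffuse reflectance with requires_grad=True.
    # The predicted vertices must be float32 and live on pyredner.get_device().
    pred_vertices = X(I).view(-1, 3).contiguous()   # (num_vertices, 3)

    # Placeholder camera; the real parameters come from my data.
    cam = pyredner.Camera(position = torch.tensor([0.0, 0.0, -5.0]),
                          look_at = torch.tensor([0.0, 0.0, 0.0]),
                          up = torch.tensor([0.0, 1.0, 0.0]),
                          fov = torch.tensor([45.0]),
                          clip_near = 1e-2,
                          resolution = (256, 256))

    materials = [pyredner.Material(diffuse_reflectance = diffuse)]

    # Shape 0: the mesh predicted by the network.
    shape_mesh = pyredner.Shape(vertices = pred_vertices, indices = faces,
                                uvs = None, normals = None, material_id = 0)

    # Shape 1: a placeholder quad used as an area light.
    light_vertices = torch.tensor([[-1.0, -1.0, -7.0], [ 1.0, -1.0, -7.0],
                                   [-1.0,  1.0, -7.0], [ 1.0,  1.0, -7.0]],
                                  device = pyredner.get_device())
    light_indices = torch.tensor([[0, 1, 2], [1, 3, 2]], dtype = torch.int32,
                                 device = pyredner.get_device())
    shape_light = pyredner.Shape(vertices = light_vertices, indices = light_indices,
                                 uvs = None, normals = None, material_id = 0)
    light = pyredner.AreaLight(shape_id = 1,
                               intensity = torch.tensor([20.0, 20.0, 20.0]))

    scene = pyredner.Scene(cam, [shape_mesh, shape_light], materials, [light])
    scene_args = pyredner.RenderFunction.serialize_scene(scene = scene,
                                                         num_samples = 4,
                                                         max_bounces = 1)
    render = pyredner.RenderFunction.apply
    rendered = render(0, *scene_args)    # (256, 256, 3) image, differentiable
    return rendered, pred_vertices

# rendered, pred_vertices = render_predicted_mesh(X, I, faces, diffuse)
# train_loss = (rendered - target_image).pow(2).mean()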
My original assumption was that redner computes gradients with respect to the vertices, textures, and lighting from the loss between images, and that the vertex gradients can then be backpropagated into my encoder X. However, I got an error like:

File "/home/yuanxun/anaconda3/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 827, in runfile
execfile(filename, namespace)

File "/home/yuanxun/anaconda3/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)

File "/media/yuanxun/E/My Experiment/train.py", line 177, in
train_loss.backward()

File "/home/yuanxun/anaconda3/lib/python3.6/site-packages/torch/tensor.py", line 107, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)

File "/home/yuanxun/anaconda3/lib/python3.6/site-packages/torch/autograd/init.py", line 93, in backward
allow_unreachable=True) # allow_unreachable flag

RuntimeError: Function RenderFunctionBackward returned an invalid gradient at index 5 - got [1, 3] but expected shape compatible with [1, 53215, 3]

I originally thought I could use redner as just another part of my whole network, but it turned out not to be that simple.
I guess the problem lies in the gradient backpropagation between redner and X. I think I need to write a torch.autograd.Function wrapper to fetch the vertex gradients from RenderFunction.backward() in render_pytorch.py and return them to my network X. But I ran into difficulties here: I really don't know how to obtain the gradients from redner. Could you tell me how to get the gradients computed by redner?
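In case it helps, this is the kind of check I am attempting after the forward pass above. It is only a sketch, and it assumes that redner's render op behaves like an ordinary autograd node, so that loss.backward() alone should populate all upstream gradients:

# redner's render op is itself a torch.autograd.Function, so my hope is that
# no extra wrapper is needed and loss.backward() alone reaches the encoder.
pred_vertices.retain_grad()                 # keep grads of this non-leaf tensor
train_loss.backward()

print(pred_vertices.grad.shape)             # should match pred_vertices.shape
print(next(X.parameters()).grad is None)    # False once grads reach the encoder

# Or, without touching .grad:
# d_vertices, = torch.autograd.grad(train_loss, pred_vertices, retain_graph=True)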
Thanks!

@BachiLi
Owner

BachiLi commented Jul 27, 2019

I don't see a fundamental reason why this wouldn't work. If you can create a minimal example for me to debug, that will make it much easier.

@BachiLi
Owner

BachiLi commented Aug 28, 2019

See the code at #58 for an example of using network-generated vertices during optimization.
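In rough outline, that pattern looks something like the sketch below. It is not the code from #58, just the general shape of such a loop: build_scene_args is a hypothetical helper that rebuilds and serializes the scene around the newly predicted vertices, and X, diffuse, and loader come from the surrounding training code.

import torch
import pyredner

render = pyredner.RenderFunction.apply
# Optimize the encoder weights and the learnable material together.
optimizer = torch.optim.Adam(list(X.parameters()) + [diffuse], lr = 1e-4)

for it, (I, target_image) in enumerate(loader):
    optimizer.zero_grad()
    pred_vertices = X(I).view(-1, 3).contiguous()
    # Hypothetical helper: rebuild the scene (camera, materials, lights)
    # around the new vertices and serialize it for the renderer.
    scene_args = build_scene_args(pred_vertices)
    img = render(it, *scene_args)             # vary the seed per iteration
    loss = (img - target_image).pow(2).mean()
    loss.backward()                           # gradients reach X through redner
    optimizer.step()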

@YuanxunLu
Author

I believe that could solve my problem. Thanks for your attention. Closing this issue.
