I am trying to use only the Encoding models for my SDF model, where the SDF network itself is written in PyTorch. I need to calculate the gradients of the SDF network, which go as input to the Color network. However, when calling loss.backward() I get: RuntimeError: trying to differentiate twice a function that was marked with @once_differentiable. Any ideas how to solve it?
tiny-cuda-nn doesn't support second order derivatives, which means that the surface normal of the SDF network (which is the first derivative) is not allowed to appear anywhere in the loss. Or, alternatively, if it does appear, it must be computed in terms of finite differences such that everything boils down to just first-order derivatives.
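A minimal sketch of the finite-differences approach: central differences along each axis need only forward evaluations of the SDF, so backpropagating through the resulting normals requires just first-order derivatives of the tcnn encoding. The callable `sdf_fn` is a placeholder for your tcnn encoding plus SDF head; the function name and `eps` value are illustrative, not from the original thread.

```python
import torch
import torch.nn.functional as F

def finite_difference_normals(sdf_fn, x, eps=1e-3):
    """Approximate SDF surface normals with central differences.

    sdf_fn: callable mapping (N, 3) points to (N, 1) signed distances
            (e.g. a tcnn encoding followed by a small PyTorch MLP).
    x:      (N, 3) query points.
    eps:    finite-difference step size (tune to your scene scale).

    Because the normal is built from plain forward passes of sdf_fn,
    loss.backward() only ever needs first derivatives of sdf_fn,
    which avoids the @once_differentiable error.
    """
    # One row per axis-aligned offset: eps * e_x, eps * e_y, eps * e_z
    offsets = eps * torch.eye(3, device=x.device, dtype=x.dtype)
    # Central difference per axis: (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
    grads = torch.stack(
        [
            (sdf_fn(x + offsets[i]) - sdf_fn(x - offsets[i])).squeeze(-1)
            / (2.0 * eps)
            for i in range(3)
        ],
        dim=-1,
    )  # (N, 3)
    # Normalize to unit length so the result can be used as a surface normal
    return F.normalize(grads, dim=-1)
```

These normals can then be fed to the Color network (and used in the loss) in place of autograd gradients of the SDF.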