I wanted to get the gradient with respect to the input `xyz`, but the function returned `None`, and setting `requires_grad=True` / calling `retain_grad()` does not work. Is it because the network input concatenates `x` and `xyz`?
```python
def patch_network_forward(self, input,):
    xyz = input[:, -original_coordinate_size:]
    ...
    for layer in range(0, self.num_layers - 1):
        lin = getattr(self, "patch_lin" + str(layer))
        if layer in self.latent_in:
            x = torch.cat([x, input], 1)
        elif layer != 0 and self.xyz_in_all:
            x = torch.cat([x, xyz], 1)
        x = lin(x)
        if layer < self.num_layers - 2:
            if (
                self.norm_layers is not None
                and layer in self.norm_layers
                and not self.weight_norm
            ):
                bn = getattr(self, "patch_bn" + str(layer))
                x = bn(x)
            # x = self.softplus(x)
            x = self.relu(x)
            # x = self.elu(x)
            if self.dropout is not None and layer in self.dropout:
                x = F.dropout(x, p=self.dropout_prob, training=self.training)
```
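The likely cause: `xyz` is a slice of `input`, so it is a *non-leaf* tensor inside the autograd graph, and PyTorch only populates `.grad` on leaf tensors by default. Below is a minimal standalone sketch (not the network above; `inp`, `out` are illustrative names) showing two ways to get the gradient of a slice: `retain_grad()` on the view, or `torch.autograd.grad` directly.

```python
import torch

# `inp` is a leaf tensor; `xyz` is a non-leaf view created by slicing it.
inp = torch.randn(4, 8, requires_grad=True)
xyz = inp[:, -3:]
xyz.retain_grad()  # opt in to having .grad populated on this non-leaf tensor

# Toy forward pass that, like the snippet above, concatenates inp and xyz.
out = (torch.cat([inp, xyz], dim=1) ** 2).sum()
out.backward()

print(inp.grad.shape)  # gradient w.r.t. the whole input
print(xyz.grad.shape)  # gradient w.r.t. the slice, thanks to retain_grad()

# Alternative: torch.autograd.grad computes gradients w.r.t. any tensor
# that participates in the graph, without relying on .grad being set.
out2 = (xyz ** 3).sum()
(g,) = torch.autograd.grad(out2, xyz)
```

Note that setting `requires_grad=True` *after* slicing does not help: the slice was already recorded as a non-leaf operation, so either mark the full `input` as requiring grad before the forward pass, or call `retain_grad()` on the slice.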