This is not an issue, just a question. Using the softplus activation is significantly slower than using, for instance, ReLU. Is there any technical (theoretical or practical) reason why you chose softplus over other more efficient activations?
Thank you!
Hi,
The theoretical reason we use softplus rather than ReLU is that we enforce the Eikonal regularization (equation 11 in the paper), so we need a differentiable activation (ReLU is not differentiable at 0).
Note that for the renderer network we do use ReLU, since we don't have that restriction there.
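To make the restriction concrete, here is a minimal sketch (in PyTorch, not this repo's actual code) of why the Eikonal term needs a smooth activation: the loss penalizes the deviation of the gradient norm of the SDF from 1, and backpropagating that loss requires differentiating *through* the input gradient, i.e. second derivatives of the network. The network sizes and the `beta` value below are illustrative choices, not necessarily the ones used here.

```python
import torch
import torch.nn as nn

# Toy SDF network. Softplus is a smooth approximation of ReLU, so the
# network is twice differentiable everywhere; ReLU's second derivative
# is zero almost everywhere and undefined at 0.
sdf = nn.Sequential(
    nn.Linear(3, 64),
    nn.Softplus(beta=100),  # illustrative beta; a sharp but smooth ReLU-like curve
    nn.Linear(64, 1),
)

# Sample 3D points and compute the SDF gradient w.r.t. the input.
x = torch.randn(128, 3, requires_grad=True)
d = sdf(x)
(grad_x,) = torch.autograd.grad(d.sum(), x, create_graph=True)

# Eikonal regularization: push ||grad_x f(x)|| toward 1 (eq. 11 in the paper).
eikonal_loss = ((grad_x.norm(dim=-1) - 1.0) ** 2).mean()

# Backprop through the gradient itself -- this is the second-order step
# that motivates a differentiable activation.
eikonal_loss.backward()
```

With ReLU in place of softplus, the graph still runs, but the activation's second derivative is zero almost everywhere, so the Eikonal penalty provides degenerate or ill-defined training signal through the activation.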