Add `final` to `getParameters()` to prevent overriding in subclasses
We might need to add a check to ensure this method is not invoked multiple times, but a warning in the comments may be enough, since the check would slightly affect performance.
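If a runtime check is preferred over a comment, it can be as cheap as a boolean flag on the container. Below is a minimal Python sketch of that idea (the class and method names are hypothetical illustrations, not the actual `nn` API):

```python
class Container:
    """Hypothetical container sketching a one-shot guard on a
    getParameters()-style method."""

    def __init__(self):
        self._flattened = False

    def get_parameters(self):
        # Flattening twice would re-allocate storage and invalidate any
        # views handed out by the first call, so fail loudly on a repeat.
        if self._flattened:
            raise RuntimeError("get_parameters() should be called only once")
        self._flattened = True
        # ... perform the actual flattening here ...
        return [], []
```

The flag test costs a single branch per call, so the performance concern mostly applies if the guard were pushed into a hotter code path.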
Enrich the comments with:
This function returns two tensors: one for the flattened learnable parameters (`flatParameters`) and another for the gradients of the energy with respect to the learnable parameters (`flatGradParameters`).
Custom modules should not override this function. They should instead override `parameters(...)`, which is, in turn, called by the present function.
This function will go over all the weights and gradWeights and turn them into views into a single tensor (one for the weights and one for the gradWeights). Since the storage of every weight and gradWeight is changed, this function should be called only once on a given network.
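The flattening described above can be sketched outside of Torch. The following is a minimal numpy analogy (the `TinyModule` class and `flatten_parameters` helper are illustrative, not part of `nn`): each module's storage is replaced by a view into one contiguous buffer, so writes through the flat tensor are visible in every module, and repeating the operation would invalidate earlier views.

```python
import numpy as np

class TinyModule:
    """Hypothetical stand-in for an nn module holding a weight tensor
    and its gradient."""
    def __init__(self, n):
        self.weight = np.random.randn(n)
        self.gradWeight = np.zeros(n)

def flatten_parameters(modules):
    """Copy every weight/gradWeight into one contiguous buffer and
    re-point the modules at views (slices) of that buffer."""
    total = sum(m.weight.size for m in modules)
    flat_params = np.empty(total)
    flat_grads = np.zeros(total)
    offset = 0
    for m in modules:
        n = m.weight.size
        flat_params[offset:offset + n] = m.weight
        # numpy slices are views, so the module now shares storage
        # with the flat buffers.
        m.weight = flat_params[offset:offset + n]
        m.gradWeight = flat_grads[offset:offset + n]
        offset += n
    return flat_params, flat_grads

mods = [TinyModule(3), TinyModule(2)]
params, grads = flatten_parameters(mods)
params[:] = 0.0        # writing through the flat view...
print(mods[0].weight)  # ...is visible in every module
```

This is why calling such a function twice on the same network is unsafe: a second flattening allocates fresh storage, and anything still holding the first `flat_params` no longer aliases the modules' weights.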