Some activation functions could be improved by implementing them with Tensor operations instead of the dedicated per-element evaluator methods they use now.
This is especially true for the sigmoid and linear activation functions, whose forward and backward steps can both be expressed with existing tensor operations (ReLU would additionally require two new tensor operations); see the sketch below.
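As a rough illustration (not the project's actual API), here is a minimal NumPy sketch of how the sigmoid forward and backward steps reduce to plain elementwise tensor operations; the function names and signatures are hypothetical:

```python
import numpy as np

def sigmoid_forward(x):
    # Forward pass expressed purely as elementwise tensor operations.
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_backward(y, grad_out):
    # Backward pass reuses the forward output y = sigmoid(x):
    # dL/dx = dL/dy * y * (1 - y), again just tensor operations.
    return grad_out * y * (1.0 - y)

# The linear activation is even simpler: forward is the identity and
# backward just passes grad_out through. ReLU, by contrast, would need
# something like an elementwise max(x, 0) plus a mask/step operation,
# i.e. the two new tensor operations mentioned above.
```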
This will likely make the activation functions use more auxiliary space, so some kind of basic memory manager might help, holding working buffers that are shared by activation functions and losses; a rough sketch follows.
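Below is one possible shape such a memory manager could take, again only a hypothetical sketch in NumPy: a small pool that hands out preallocated scratch buffers keyed by shape and dtype, so activations and losses reuse the same working space instead of allocating on every call. The names `WorkspacePool` and `sigmoid_forward_inplace` are made up for illustration:

```python
import numpy as np

class WorkspacePool:
    """Hands out preallocated scratch buffers so activation functions
    and losses can share auxiliary space instead of allocating per call."""

    def __init__(self):
        self._buffers = {}

    def get(self, shape, dtype=np.float64):
        key = (tuple(shape), np.dtype(dtype))
        buf = self._buffers.get(key)
        if buf is None:
            buf = np.empty(shape, dtype=dtype)
            self._buffers[key] = buf
        return buf

# Usage: a sigmoid forward that writes into a shared workspace buffer.
def sigmoid_forward_inplace(x, pool):
    out = pool.get(x.shape, x.dtype)
    np.negative(x, out=out)      # out = -x
    np.exp(out, out=out)         # out = exp(-x)
    out += 1.0                   # out = 1 + exp(-x)
    np.reciprocal(out, out=out)  # out = 1 / (1 + exp(-x))
    return out

pool = WorkspacePool()
y = sigmoid_forward_inplace(np.random.randn(4, 8), pool)
```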