Activation function improvements #8

Open
Jaswar opened this issue Feb 13, 2023 · 0 comments
Labels: enhancement (An improvement to existing features), new feature (Implementation of a new feature)

Comments

@Jaswar (Owner) commented Feb 13, 2023

Some activation functions could be implemented more simply by composing Tensor operations, instead of relying on dedicated methods in the form of evaluators.
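As an illustration of the idea for sigmoid, here is a minimal sketch assuming a hypothetical element-wise Tensor API (`std::vector<float>` stands in for the library's actual Tensor type, and the helper names are illustrative, not the existing API):

```cpp
// Illustrative only: std::vector<float> stands in for the library's Tensor
// type, and the element-wise helpers below stand in for generic tensor ops.
#include <cmath>
#include <cstddef>
#include <vector>

using Tensor = std::vector<float>;

Tensor tensorNeg(const Tensor& x) {
    Tensor out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) out[i] = -x[i];
    return out;
}

Tensor tensorExp(const Tensor& x) {
    Tensor out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) out[i] = std::exp(x[i]);
    return out;
}

Tensor addScalar(const Tensor& x, float c) {
    Tensor out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) out[i] = x[i] + c;
    return out;
}

Tensor reciprocal(const Tensor& x) {
    Tensor out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) out[i] = 1.0f / x[i];
    return out;
}

// Sigmoid forward as a composition of generic ops: 1 / (1 + exp(-x)),
// with no dedicated evaluator needed.
Tensor sigmoidForward(const Tensor& x) {
    return reciprocal(addScalar(tensorExp(tensorNeg(x)), 1.0f));
}
```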

This is especially true for the sigmoid and linear activation functions, where both the forward and backward steps can be defined with existing tensor operations (ReLU would require two new tensor operations to be defined).
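A similarly hedged sketch of the backward steps, continuing the stand-in `Tensor` alias from the sketch above (again, the names are illustrative, not the existing API); the ReLU comment only marks the two missing operations:

```cpp
// Illustrative only, continuing the stand-in Tensor above. The backward step
// of sigmoid needs only element-wise multiply and "scalar minus tensor".
Tensor multiply(const Tensor& a, const Tensor& b) {
    Tensor out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) out[i] = a[i] * b[i];
    return out;
}

Tensor scalarMinus(float c, const Tensor& x) {
    Tensor out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) out[i] = c - x[i];
    return out;
}

// d(sigmoid)/dx = s * (1 - s), where s is the forward output.
Tensor sigmoidBackward(const Tensor& s, const Tensor& gradOut) {
    return multiply(gradOut, multiply(s, scalarMinus(1.0f, s)));
}

// Linear activation: forward is the identity, backward passes the incoming
// gradient through unchanged.
Tensor linearForward(const Tensor& x) { return x; }
Tensor linearBackward(const Tensor& gradOut) { return gradOut; }

// ReLU, by contrast, would need two new tensor ops: an element-wise
// max(x, 0) for the forward step and an "x > 0" mask (multiplied with the
// incoming gradient) for the backward step.
```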

This will likely cause the activation functions to use more auxiliary space, so some kind of basic memory manager may be needed (one that owns working spaces shared by activation functions and losses).
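A minimal sketch of what such a manager could look like, assuming a simple key-based pool of scratch buffers (class name and interface are hypothetical):

```cpp
// Illustrative only: a minimal shared workspace pool. Activations and losses
// would request scratch buffers by key instead of allocating their own
// auxiliary tensors on every forward/backward call.
#include <cstddef>
#include <string>
#include <unordered_map>
#include <vector>

class WorkspaceManager {
public:
    // Returns a scratch buffer with room for at least `size` elements,
    // reusing (and growing) a previously allocated buffer when possible.
    std::vector<float>& acquire(const std::string& key, std::size_t size) {
        std::vector<float>& buffer = workspaces_[key];
        if (buffer.size() < size) {
            buffer.resize(size);
        }
        return buffer;
    }

private:
    std::unordered_map<std::string, std::vector<float>> workspaces_;
};
```

With something like this, a sigmoid evaluator could call `manager.acquire("activation_scratch", x.size())` instead of allocating a temporary on every call, and losses could share the same buffers.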

@Jaswar added the enhancement and new feature labels on Feb 13, 2023
@Jaswar mentioned this issue (2 tasks) on Feb 13, 2023