
How can I apply activation to each factor matrix? #556

Closed

SCIKings opened this issue May 6, 2024 · 2 comments

SCIKings commented May 6, 2024

I want to apply an activation layer to each factor matrix, reconstruct the tensor from the activated factors, and then fit that reconstruction to the original tensor. How can I do this with tensorly? For example, with tl.decomposition.partial_tucker: w is approximated by f(w1)*f(w2)*f(w3), where f() denotes the activation function. I would be grateful for any help. A sketch of what I mean is shown below.
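
For concreteness, here is a minimal sketch of one way to do this (it is not a TensorLy feature): switch TensorLy to the PyTorch backend, treat the core and factor matrices as learnable parameters, apply the activation to each factor before reconstruction, and minimize the reconstruction error by gradient descent. The sigmoid activation, the tensor shape, the rank, and the use of a full `tucker` decomposition (rather than `partial_tucker`) for the warm start are all illustrative assumptions.

```python
import torch
import tensorly as tl
from tensorly.decomposition import tucker

tl.set_backend('pytorch')

# Toy target tensor, activation f, and rank -- all placeholder assumptions.
w = torch.randn(8, 8, 8)
f = torch.sigmoid
rank = [4, 4, 4]

# Warm-start the core and factors from a plain Tucker decomposition.
core, factors = tucker(w, rank=rank)
core = torch.nn.Parameter(core.detach().clone())
factors = [torch.nn.Parameter(U.detach().clone()) for U in factors]

opt = torch.optim.Adam([core] + factors, lr=1e-2)
for step in range(1000):
    opt.zero_grad()
    # Reconstruct from the *activated* factors: w_hat = core x_1 f(U1) x_2 f(U2) x_3 f(U3)
    w_hat = tl.tucker_to_tensor((core, [f(U) for U in factors]))
    loss = torch.norm(w_hat - w) ** 2
    loss.backward()
    opt.step()
```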

yngvem (Contributor) commented May 6, 2024

Closing as this is not an issue with TensorLy.

yngvem closed this as not planned on May 6, 2024
JeanKossaifi (Member) commented

@SCIKings TensorLy-Torch provides PyTorch-based layers and factorizations on top of TensorLy. You can have a look at the hooks we provide (e.g. tensor dropout) and write a similar hook for the activations you are interested in.
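
TensorLy-Torch's tensor hooks (such as tensor dropout) transform a factorized tensor on the fly each time it is reconstructed; an activation hook is not part of the library. Below is a minimal standalone sketch of that idea as a plain PyTorch module, where the module name `ActivatedTucker`, the sigmoid activation, and the shapes are all hypothetical, not tltorch API.

```python
import torch
import torch.nn as nn
import tensorly as tl

tl.set_backend('pytorch')

class ActivatedTucker(nn.Module):
    """Hypothetical module: stores a Tucker-factorized tensor and applies
    an activation to every factor matrix on each reconstruction, mirroring
    how TensorLy-Torch hooks transform a factorized tensor on the fly."""

    def __init__(self, shape, rank, activation=torch.sigmoid):
        super().__init__()
        self.activation = activation
        self.core = nn.Parameter(torch.randn(*rank) * 0.1)
        self.factors = nn.ParameterList(
            [nn.Parameter(torch.randn(dim, r) * 0.1)
             for dim, r in zip(shape, rank)]
        )

    def forward(self):
        # Reconstruct from activated factors: core x_n f(U_n).
        activated = [self.activation(U) for U in self.factors]
        return tl.tucker_to_tensor((self.core, activated))

# Usage: fit the activated reconstruction to a target tensor w.
w = torch.randn(8, 8, 8)
model = ActivatedTucker(w.shape, rank=(4, 4, 4))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(1000):
    opt.zero_grad()
    loss = torch.norm(model() - w) ** 2
    loss.backward()
    opt.step()
```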
