
sign tensor operator for copying sign of one tensor onto another #522

Closed
wbrickner opened this issue Jul 24, 2023 · 6 comments · Fixed by #1446
Assignees: antimora
Labels: feature (The feature request), good first issue (Good for newcomers)

Comments

@wbrickner
Contributor

wbrickner commented Jul 24, 2023

I need to take tensor a and apply the sign of each of its elements to the corresponding elements of tensor b.

There seems to be no autodiff-compatible way to do this, although all backends support it AFAIK.

Thank you

@antimora antimora added the "feature" label Jul 24, 2023
@antimora antimora changed the title from "How to copy sign of one tensor onto another?" to "abs and sign tensor operators for copying sign of one tensor onto another" Jul 24, 2023
@antimora
Collaborator

We are crucially missing two tensor operations for this: sign() and abs(). If we had them, then you could do this:

// Get the sign of tensor_a
let sign_a = tensor_a.sign();

// Make a new tensor that is the absolute value of tensor_b, multiplied by the sign of tensor_a
let output_tensor = tensor_b.abs() * sign_a;

For the missing ops there are workarounds using the existing ops, which I can provide tomorrow after verifying they work on my computer. Stay tuned.

I'll leave this issue open and convert it to a feature request for abs and sign.

@wbrickner
Contributor Author

For now I have done the (very silly):

let sign = t.clone() / (t.clone() * t.clone()).sqrt();

although this is not guaranteed to yield elements in {-1, +1}.
What is the blocker for adding these operations to Tensor?
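
A minimal sketch (plain f32 values rather than burn tensors, added purely for illustration) of where that workaround breaks down:

fn naive_sign(t: f32) -> f32 {
    // Mirrors t.clone() / (t.clone() * t.clone()).sqrt() element-wise.
    t / (t * t).sqrt()
}

fn main() {
    assert!(naive_sign(0.0).is_nan()); // 0.0 / 0.0 -> NaN, not 0 or ±1
    assert_eq!(naive_sign(1e30), 0.0); // t * t overflows to inf, so the ratio collapses to 0
    assert!(naive_sign(1e-30).is_infinite()); // t * t underflows to 0, so the ratio blows up
}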

Also, I only care about the tch backend, so is there some way to specialize my code and use tensor ops from tch directly, while maintaining compatibility with burn's autodiff / ML system?

@nathanielsimard
Member

There is no example of how to add code specialized for a backend, but it is possible. The best way to do it is to create another trait MyBackend: Backend + MyAdditionalFunctions, where you implement the trait MyAdditionalFunctions for the backends you want to use. To support autodiff, you would need to implement the function for two backends: TchBackend and ADBackendDecorator<TchBackend>. For now, I would not recommend doing that, since it would probably be faster to add the ops to burn, and the lack of documentation/examples could slow you down.

The sign function can simply be the following:

let sign = t.ones_like().mask_fill(t.lower_elem(0.0), -1.0);
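
A minimal sketch of the extension-trait pattern described above, assuming burn's Tensor and Backend types as used elsewhere in this thread; the trait names and the blanket impl are illustrative only, not part of burn's API:

use burn::tensor::{backend::Backend, Tensor};

// Hypothetical extension trait. A real backend-specific version would drop the
// blanket impl below and instead implement this separately for TchBackend and
// ADBackendDecorator<TchBackend>, calling into tch there.
pub trait MyAdditionalFunctions: Backend + Sized {
    fn sign_tensor<const D: usize>(tensor: Tensor<Self, D>) -> Tensor<Self, D>;
}

// The combined trait mentioned above: any backend that also has the extra ops.
pub trait MyBackend: Backend + MyAdditionalFunctions {}
impl<B: Backend + MyAdditionalFunctions> MyBackend for B {}

// Portable fallback built from the mask_fill workaround above.
impl<B: Backend> MyAdditionalFunctions for B {
    fn sign_tensor<const D: usize>(tensor: Tensor<Self, D>) -> Tensor<Self, D> {
        tensor.clone().ones_like().mask_fill(tensor.lower_elem(0.0), -1.0)
    }
}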

@antimora
Collaborator

I guess, with those ops still missing, you could do this:

let sign = tensor.ones_like().mask_fill(tensor.lower_elem(0.0), -1.0);

let output_tensor = tensor.powf(2.0).sqrt().mul(sign);

Note: I haven't tried it yet myself.
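
Putting the two workarounds together for the original question, here is a hedged sketch; copysign_like is a hypothetical helper name, and the clone is only there so the tensor isn't used after being moved:

use burn::tensor::{backend::Backend, Tensor};

// Give `b` its own magnitude but the sign of `a`, using only existing burn ops.
fn copysign_like<B: Backend, const D: usize>(a: Tensor<B, D>, b: Tensor<B, D>) -> Tensor<B, D> {
    // Workaround for sign(a): ones everywhere, flipped to -1 where a < 0.
    let sign_a = a.clone().ones_like().mask_fill(a.lower_elem(0.0), -1.0);
    // Workaround for abs(b): sqrt(b^2).
    let abs_b = b.powf(2.0).sqrt();
    abs_b.mul(sign_a)
}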

@antimora antimora changed the title from "abs and sign tensor operators for copying sign of one tensor onto another" to "sign tensor operators for copying sign of one tensor onto another" Jul 24, 2023
@antimora
Collaborator

Linking a related issue here: #506

@antimora antimora changed the title from "sign tensor operators for copying sign of one tensor onto another" to "sign tensor operator for copying sign of one tensor onto another" Jul 24, 2023
@antimora antimora added the "good first issue" label Nov 20, 2023
@antimora antimora self-assigned this Mar 9, 2024
@antimora
Collaborator

Quoting the earlier suggestion:

let sign = t.ones_like().mask_fill(t.lower_elem(0.0), -1.0);

This does not account for the 0.0 case, which should map to 0.0 according to PyTorch's implementation.
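
A hedged sketch of a sign workaround that also maps 0.0 to 0.0, assuming zeros_like and greater_elem exist alongside the ones_like / lower_elem used earlier in this thread:

// Start from zeros, set +1 where t > 0 and -1 where t < 0; exact zeros stay 0.0.
let sign = t.clone().zeros_like()
    .mask_fill(t.clone().greater_elem(0.0), 1.0)
    .mask_fill(t.lower_elem(0.0), -1.0);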
