
Add argmin and min method in AST #3961

Merged
6 commits merged into OpenMined:master on Aug 9, 2020
Conversation

@marload (Member) commented Aug 8, 2020

Description

I added `min` and `argmin`.
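For context on what `argmin` over an additively shared tensor involves: each party holds only a random-looking share of every value, and the true values (and hence the minimum's index) are only recoverable when the shares are combined. The sketch below is a minimal pure-Python illustration of additive secret sharing — the `share` and `reconstruct` helpers are hypothetical and are not PySyft's API:

```python
import random

RING = 2**32  # modulus of the additive-sharing ring (illustrative choice)

def share(value, n_parties=3):
    """Split an integer into n additive shares that sum to value mod RING."""
    shares = [random.randrange(RING) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % RING)
    return shares

def reconstruct(shares):
    """Recombine additive shares into the original value."""
    return sum(shares) % RING

secret = [3, 1, 2]
shared = [share(v) for v in secret]        # each element split across parties
values = [reconstruct(s) for s in shared]  # only possible with all shares
assert values == secret
print(values.index(min(values)))           # argmin of the reconstructed vector -> 1
```

In the real protocol the comparison is done *on the shares* via secure comparison primitives, so no party ever sees the plaintext values; the reconstruction here is purely to show what the sharing preserves.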

Affected Dependencies

List any dependencies that are required for this change.

How has this been tested?

  • Describe the tests that you ran to verify your changes.
  • Provide instructions so we can reproduce.
  • List any relevant details for your test configuration.

Checklist

@marload requested a review from a team as a code owner on August 8, 2020 10:56
@codecov (codecov bot) commented Aug 8, 2020

Codecov Report

Merging #3961 into master will increase coverage by 0.01%.
The diff coverage is 98.48%.


@@            Coverage Diff             @@
##           master    #3961      +/-   ##
==========================================
+ Coverage   94.85%   94.87%   +0.01%     
==========================================
  Files         202      202              
  Lines       20735    20801      +66     
==========================================
+ Hits        19669    19734      +65     
- Misses       1066     1067       +1     
Impacted Files Coverage Δ
...orks/torch/tensors/interpreters/additive_shared.py 92.38% <85.71%> (-0.08%) ⬇️
...frameworks/torch/tensors/interpreters/precision.py 97.10% <100.00%> (+0.02%) ⬆️
test/torch/tensors/test_additive_shared.py 100.00% <100.00%> (ø)

(docstring excerpt from the diff)
    keepdim: keep the dimension of the tensor when dim is not None
    algorithm: method to compute the minimum
    Returns:
        the max of the tensor self
Member
This should be the minimum, right?

Member Author

Oh, it's my mistake. Thank you!


t = torch.tensor([3, 1.0, 2])
x = t.fix_prec().share(*args, **kwargs)
print(x.argmin)
Member

Remove this print

Member Author

Done!

t = torch.tensor([[1, 2.0, 4], [3, 1.0, 2.0]])
x = t.fix_prec().share(*args, **kwargs)
ids = x.argmin(dim=1).get().float_prec()
assert (ids.long() == torch.argmin(t, dim=1)).all()
Member

Q: Could you also add a test with `one_hot` and `keepdim`?

Member Author

sure!

Member Author

I added it 😄
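The merged test itself isn't shown in this thread, but the semantics being requested — row-wise `argmin` with `one_hot` and `keepdim` options — can be sketched in plain Python. The `argmin_rows` helper below is hypothetical (it mimics `dim=1` behavior, not PySyft's actual implementation):

```python
def argmin_rows(matrix, one_hot=False, keepdim=False):
    """Row-wise argmin over a 2D list, mimicking dim=1 reduction semantics.

    one_hot=True returns a 0/1 mask marking each row's minimum;
    keepdim=True keeps the reduced dimension as a length-1 list.
    """
    out = []
    for row in matrix:
        idx = row.index(min(row))  # index of the first minimum in the row
        if one_hot:
            out.append([1 if j == idx else 0 for j in range(len(row))])
        else:
            out.append([idx] if keepdim else idx)
    return out

t = [[1, 2.0, 4], [3, 1.0, 2.0]]
assert argmin_rows(t) == [0, 1]
assert argmin_rows(t, keepdim=True) == [[0], [1]]
assert argmin_rows(t, one_hot=True) == [[1, 0, 0], [0, 1, 0]]
```

A test along these lines would exercise both requested options against known plaintext answers, analogous to the `torch.argmin` comparison in the earlier snippet.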

@marload requested a review from gmuraru on August 9, 2020 13:57
@gmuraru (Member) commented Aug 9, 2020

LGTM!

@gmuraru merged commit 90a6f69 into OpenMined:master on Aug 9, 2020
@marload (Member Author) commented Aug 9, 2020

Thank you for your review! 😄

@marload deleted the feature/shared_tensor_min branch on August 9, 2020 14:25