
Fix and simplify pytorch backend: array, einsum, from_numpy #1550

Merged
merged 1 commit into geomstats:master from torch on May 31, 2022

Conversation

luisfpereira (Collaborator) commented:

This PR mostly changes array in the torch backend, avoiding unnecessary conversions to numpy.
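For illustration, a sketch of what avoiding the numpy round trip can look like (the branches and names here are assumptions, not the actual diff):

```python
import torch

def array(val, dtype=None):
    if torch.is_tensor(val):
        # already a tensor: at most a dtype cast, never a trip through numpy
        return val if dtype is None else val.to(dtype)
    if isinstance(val, (list, tuple)) and all(torch.is_tensor(v) for v in val):
        # stack tensors directly; converting to numpy first would force a
        # device-to-host copy for tensors living on GPU
        out = torch.stack(val)
        return out if dtype is None else out.to(dtype)
    return torch.tensor(val, dtype=dtype)
```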

Additionally, a dtype argument has been added to from_numpy to cast when required. It still needs to be extended to the other backends for compatibility, but it will play a role when scipy is used with torch or tensorflow (to ensure precision is kept).
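A minimal sketch of the idea, assuming the new signature is from_numpy(x, dtype=None) (the actual diff may differ):

```python
import torch

def from_numpy(x, dtype=None):
    """Convert a numpy array to a torch tensor, optionally casting its dtype.

    scipy returns float64 numpy arrays by default, so the optional cast lets
    callers keep the backend's working precision.
    """
    tensor = torch.from_numpy(x)
    if dtype is not None and tensor.dtype != dtype:
        tensor = tensor.to(dtype)
    return tensor
```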

einsum is also simplified. @ninamiolane, the tests are not failing with this simplification, but I'm not sure if I'm missing something (I was not able to understand why einsum had such a complex wrapper). If you have an example that justifies the wrapper, please let me know so I can revert the changes there.
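In essence the wrapper can reduce to a direct delegation; a sketch, assuming a recent torch version (the actual simplified code may keep extra argument handling):

```python
import torch

def einsum(equation, *inputs):
    # recent torch.einsum natively supports multiple operands and ellipses,
    # so no pre-processing of the equation string is needed
    return torch.einsum(equation, *inputs)
```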

It closes #1537 and starts paving the path to better performance on GPU.

codecov bot commented May 31, 2022

Codecov Report

Merging #1550 (7324258) into master (f284cc3) will decrease coverage by 0.77%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #1550      +/-   ##
==========================================
- Coverage   90.73%   89.96%   -0.76%     
==========================================
  Files         102      102              
  Lines       10414    10288     -126     
==========================================
- Hits         9448     9255     -193     
- Misses        966     1033      +67     
Flag       Coverage Δ
autograd   ?
numpy      87.15% <ø> (?)
pytorch    80.97% <100.00%> (-0.08%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                              Coverage Δ
geomstats/_backend/pytorch/__init__.py      95.60% <100.00%> (+0.40%) ⬆️
geomstats/information_geometry/gamma.py     55.04% <0.00%> (-35.90%) ⬇️
geomstats/geometry/hypersphere.py           80.91% <0.00%> (-7.11%) ⬇️
geomstats/learning/geodesic_regression.py   84.42% <0.00%> (-5.19%) ⬇️
geomstats/geometry/grassmannian.py          93.69% <0.00%> (-1.05%) ⬇️
geomstats/learning/frechet_mean.py          96.00% <0.00%> (-0.88%) ⬇️
geomstats/geometry/special_euclidean.py     88.68% <0.00%> (-0.24%) ⬇️
geomstats/_backend/__init__.py              79.37% <0.00%> (ø)
geomstats/_backend/autograd/linalg.py
geomstats/_backend/autograd/__init__.py
... and 11 more


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f284cc3...7324258.

ninamiolane (Collaborator) left a comment:

Thank you for this great simplification! I can't remember exactly why einsum was this complicated; it could have been to allow einsums:

  • of the type ("ijk,jkl->ijl", a, b), i.e. with tensors of order 3 and more
  • of the type ("ijk,jkl,mjkl->mijl", a, b, c), i.e. with 3 tensors or more
  • with ellipsis, i.e. of the type ("...ijk,...jkl,...mjkl->...mijl", a, b, c)
  • a combination of the above,

and it could be that previous versions of pytorch were not allowing these but new ones do? (A quick check is sketched below.)
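For reference, a quick sanity check (my own sketch, not part of the PR) that a recent torch.einsum handles all of these patterns natively:

```python
import torch

a = torch.rand(2, 3, 4)     # order-3 tensor
b = torch.rand(3, 4, 5)     # order-3 tensor
c = torch.rand(6, 3, 4, 5)  # order-4 tensor

# tensors of order 3: contract over j and k
out1 = torch.einsum("ijk,jkl->ijl", a, b)            # shape (2, 3, 5)

# three operands in a single call
out2 = torch.einsum("ijk,jkl,mjkl->mijl", a, b, c)   # shape (6, 2, 3, 5)

# ellipsis for leading batch dimensions
x = torch.rand(7, 3, 4)
y = torch.rand(7, 4, 5)
out3 = torch.einsum("...jk,...kl->...jl", x, y)      # shape (7, 3, 5)
```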

In any case, if tests do not fail we can keep it like this... and remember where to look if we encounter problems.

Thanks!

@ninamiolane ninamiolane merged commit 750d466 into geomstats:master May 31, 2022
@luisfpereira luisfpereira deleted the torch branch May 31, 2022 15:39
Development

Successfully merging this pull request may close these issues.

belongs throws error when checking double precision tensors