Features/torch proxy #856

Merged

merged 6 commits into master from features/torch_proxy on Aug 23, 2021

Conversation

@ClaudiaComito (Contributor) commented Aug 23, 2021

Description

I propose to implement a DNDarray method that returns a 1-byte torch.Tensor strided to the original DNDarray's global shape. At @ben-bou's suggestion, we have been using this "proxy" tensor for sanitation, as it allows us to easily bypass many of the instance checks. It is also quite practical for assessing the shape of the output DNDarray of any operation. See getitem/setitem or manipulations.tile() for examples.
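
For context, here is a minimal sketch of the idea, assuming the proxy is built by viewing a single int8 element with zero strides; the helper name `torch_proxy` and the exact construction are illustrative assumptions, not necessarily the code added in this PR:

```python
import torch

def torch_proxy(gshape):
    # One int8 element (a single byte of storage) viewed with zero strides,
    # so the result advertises the full global shape without allocating it.
    return torch.ones((1,), dtype=torch.int8).as_strided(gshape, [0] * len(gshape))

# The proxy can be handed to torch's own shape machinery, e.g. to predict the
# broadcast shape of an elementwise operation without touching real data:
a = torch_proxy((4, 1, 3))
b = torch_proxy((5, 3))
print((a + b).shape)  # torch.Size([4, 5, 3])
```

Because the proxy participates in torch's normal broadcasting and shape checks, sanitation and output-shape inference can be done without allocating or communicating the distributed data.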

Issue/s resolved: None. Implemented with an eye on long-term maintenance, minimizing repetition, and simplifying code.

Changes proposed:

  • Implement DNDarray.__torch_proxy__()
  • Implement tests
  • Replace "proxy" definitions in dndarray and manipulations with DNDarray.__torch_proxy__() calls
  • Enforce DNDarray.size returning int even if DNDarray.ndim is 0 (see the sketch below)
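
As an illustration of the last point (a hedged sketch, not necessarily the exact change in this PR): for a 0-dimensional DNDarray the global shape is the empty tuple, and size should still come back as a plain Python int:

```python
import torch

gshape = ()  # global shape of a 0-dimensional array
# torch.prod over an empty tensor returns the multiplicative identity, 1;
# wrapping it in int() guarantees a plain Python int rather than a torch scalar.
size = int(torch.prod(torch.tensor(gshape, dtype=torch.int64)))
print(size, type(size))  # 1 <class 'int'>
```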

Type of change

  • New feature (non-breaking change which adds functionality)

Due Diligence

  • All split configurations tested
  • Multiple dtypes tested in relevant functions
  • Documentation updated (if needed)
  • Updated changelog.md under the title "Pending Additions"

Does this change modify the behaviour of other functions? If so, which?

No.

@ClaudiaComito ClaudiaComito added this to In progress in Internal overhaul `ht.array()` --> `DNDarray()` via automation Aug 23, 2021
@ClaudiaComito ClaudiaComito added this to the 1.2.x milestone Aug 23, 2021
@codecov (bot) commented Aug 23, 2021

Codecov Report

Merging #856 (be8979d) into master (e2f75c3) will increase coverage by 0.05%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #856      +/-   ##
==========================================
+ Coverage   95.40%   95.45%   +0.05%     
==========================================
  Files          64       64              
  Lines        9380     9382       +2     
==========================================
+ Hits         8949     8956       +7     
+ Misses        431      426       -5     
Flag   Coverage Δ
gpu    94.62% <100.00%> (+0.05%) ⬆️
unit   91.22% <100.00%> (+0.05%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files               Coverage Δ
heat/core/dndarray.py        97.31% <100.00%> (+<0.01%) ⬆️
heat/core/manipulations.py   98.94% <100.00%> (+0.37%) ⬆️

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update e2f75c3...be8979d.

@ClaudiaComito (Contributor, Author) commented:
rerun tests

Internal overhaul `ht.array()` --> `DNDarray()` automation moved this from In progress to Reviewer approved Aug 23, 2021
@coquelin77 coquelin77 merged commit cef003d into master Aug 23, 2021
Internal overhaul `ht.array()` --> `DNDarray()` automation moved this from Reviewer approved to Done Aug 23, 2021
@coquelin77 coquelin77 deleted the features/torch_proxy branch August 23, 2021 14:13