
Features/176 flatten #501

Merged: 22 commits merged into master on Apr 24, 2020

Conversation

@mtar (Collaborator) commented Mar 12, 2020

Description


implementation of flatten(). It returns a new flattened tensor.
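
As a usage sketch only (the call below mirrors the examples further down in this thread; the exact return value is an assumption, not taken from the merged diff):

import torch
import heat as ht

x = torch.arange(18).reshape(3, 6)
x0 = ht.array(x, split=0)
y = ht.flatten(x0)
# expected: a new 1-D DNDarray holding 0, 1, ..., 17 in row-major order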

Issue/s resolved: #176

Changes proposed:

  • add flatten function

Type of change


  • New feature (non-breaking change which adds functionality)

Due Diligence

  • All split configurations tested
  • Multiple dtypes tested in relevant functions
  • Documentation updated (if needed)
  • Updated changelog.md under the title "Pending Additions"

Does this change modify the behaviour of other functions? If so, which?

no

@codecov (bot) commented Mar 12, 2020

Codecov Report

Merging #501 into master will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master     #501   +/-   ##
=======================================
  Coverage   96.39%   96.39%           
=======================================
  Files          75       75           
  Lines       14849    14887   +38     
=======================================
+ Hits        14313    14351   +38     
  Misses        536      536           
Impacted Files                            Coverage    Δ
heat/core/dndarray.py                     96.76%      <100.00%> (+<0.01%) ⬆️
heat/core/manipulations.py                99.30%      <100.00%> (+0.01%) ⬆️
heat/core/tests/test_dndarray.py          99.31%      <100.00%> (+<0.01%) ⬆️
heat/core/tests/test_manipulations.py     99.32%      <100.00%> (+0.01%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@coquelin77 (Member) left a comment

this does flatten the array but i think that it might also destroy the global ordering. can you look into this?

@mtar (Collaborator, Author) commented Mar 23, 2020

> this does flatten the array but i think that it might also destroy the global ordering. can you look into this?

What do you mean?

@coquelin77 (Member) commented

>> this does flatten the array but i think that it might also destroy the global ordering. can you look into this?
>
> What do you mean?

import torch
import heat as ht

x = torch.arange(18).reshape(3, 6)
x1 = ht.array(x, split=1)
print(x1)
print(ht.flatten(x1))
[0] tensor([[ 0,  1,  2],
[0]         [ 6,  7,  8],
[0]         [12, 13, 14]])
[1] tensor([[ 3,  4,  5],
[1]         [ 9, 10, 11],
[1]         [15, 16, 17]])

[0] tensor([ 0,  2,  7,  3,  5, 10,  1,  6,  8])
[1] tensor([ 4,  9, 11, 12, 13, 14, 15, 16, 17])

also there should be an inline function for flatten so one can call x.flatten()
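
A minimal sketch of such an inline method, assuming it simply delegates to the module-level function (the local import is only one hypothetical way to sidestep a circular dependency; the merged code may differ):

def flatten(self):
    """Return a copy of the array collapsed into one dimension, enabling x.flatten()."""
    from . import manipulations  # hypothetical local import inside heat/core/dndarray.py
    return manipulations.flatten(self)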

@mtar (Collaborator, Author) commented Mar 24, 2020

It's a problem of the ht.resplit function then, isn't it?

x = torch.arange(18).reshape(3, 6)
x1 = ht.array(x, split=1)
x0 = ht.resplit(x1, 0)
print(ht.MPI_WORLD.rank, x0)
0 tensor([[ 0,  2,  7,  3,  5, 10],
        [ 1,  6,  8,  4,  9, 11]])
1 tensor([[12, 13, 14, 15, 16, 17]])
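
For contrast, the output a correct resplit would give for the same two-process run (the values follow from the torch tensor above; shown only to make the reordering visible):

0 tensor([[ 0,  1,  2,  3,  4,  5],
          [ 6,  7,  8,  9, 10, 11]])
1 tensor([[12, 13, 14, 15, 16, 17]])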

@coquelin77 (Member) commented

#425 is already created. the resplit problem is known and is being worked on, but it is more complex than it first appears.

that being said, your algorithm works for split=0. i do not have an ETA on when the resplit fix will be up. in the meantime, a raise/warning can be implemented together with a comment pointing at that issue, so the raise/warning can be removed once the issue is solved.

@coquelin77 (Member) commented

resplit fix is in the works. i'm going to say that we pause this PR until that is done. I have a working resplit algo (not touching split=None yet though) that i am finishing up. hopefully done in the coming days

# The resplit function scrambles the tensor when switching axes, see issue 425
if a.split > 0:
    a = resplit(a, 0)
    warnings.warn("The flattened tensor may have a wrong order for split axes > 0", UserWarning)
A reviewer (Member) commented on the snippet above:
Warning should be removed after resplit is working.

(review thread on heat/core/manipulations.py, marked resolved)
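
Assuming the guard above survives into the merged version unchanged, its user-visible effect would be roughly the following (a sketch, not verified against the final diff):

import warnings
import torch
import heat as ht

x1 = ht.array(torch.arange(18).reshape(3, 6), split=1)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    y = ht.flatten(x1)  # resplits to axis 0 internally and emits the UserWarning
print(caught[0].message)
# The flattened tensor may have a wrong order for split axes > 0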
coquelin77 previously approved these changes on Apr 21, 2020
@coquelin77 merged commit bf88e5b into master on Apr 24, 2020
@coquelin77 deleted the features/176-flatten branch on April 24, 2020 13:16
Merging this pull request may close the issue: Implement flatten()