
Add local_shape to DistributedArray #59

Closed · 2 tasks
mrava87 opened this issue Aug 8, 2023 · 0 comments · Fixed by #61

mrava87 commented Aug 8, 2023

Motivation

The DistributedArray class currently requires users to provide the global shape of the array (via global_shape); however, it does not allow users to choose how the array is split between ranks: local_shape is computed by the local_split method to ensure the best load balancing across ranks. There are, however, scenarios where users may want to choose local_shape themselves, for example when they know the array will be re-interpreted as an nd-array by some operator, so the split must take into account that dimension zero should be split across ranks whilst the other dimensions should stay whole on each rank.

As such, I suggest adding local_shape as a new input parameter of DistributedArray.
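
For illustration, a minimal sketch of the intended usage (local_shape is the proposed argument and does not exist yet; the global_shape argument and the pylops_mpi import path are assumed from the current codebase):

```python
# Hedged sketch of the proposed API: DistributedArray and global_shape already
# exist in pylops_mpi, while local_shape is the new argument requested in this
# issue, so this is illustrative only.
from mpi4py import MPI
from pylops_mpi import DistributedArray

comm = MPI.COMM_WORLD
size = comm.Get_size()

# The flat array will later be re-interpreted as a (size * nx_local, ny) nd-array:
# dimension 0 is split across ranks and dimension 1 must stay whole on each rank,
# so every rank explicitly asks for nx_local * ny contiguous samples instead of
# relying on the default local_split load balancing.
nx_local, ny = 10, 5
arr = DistributedArray(global_shape=(size * nx_local * ny,),
                       local_shape=(nx_local * ny,))
```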

Tasks

  • Add local_shape=None as a new input parameter of DistributedArray. If provided, enforce the requested splitting; if not provided, default to the current behaviour.
  • Add a test and an example in plot_distributed_array.py