
[autoparallel]add bcast matmul strategies #1605

Merged

Conversation

YuliangLiu0306
Contributor

In the previous PR #1600, strategies for operators like operator.add were created by the BcastOpHandler. However, torch.matmul was not covered in that PR because of its extra matrix dimensions.
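For context, a small example of why matmul differs from elementwise broadcast ops: `torch.matmul` treats the last two dimensions as matrix dimensions and only broadcasts the leading batch dimensions, so the handler has to reason about them separately.

```python
import torch

# Elementwise ops like operator.add broadcast across all dimensions,
# but torch.matmul keeps the last two dims as matrix dims and only
# broadcasts the leading (batch) dimensions.
a = torch.rand(4, 1, 8, 16)   # batch dims (4, 1), matrix dims (8, 16)
b = torch.rand(2, 16, 32)     # batch dim  (2,),   matrix dims (16, 32)

out = torch.matmul(a, b)      # batch dims broadcast to (4, 2)
print(out.shape)              # torch.Size([4, 2, 8, 32])
```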

In this PR, BcastOpHandler creates strategies for the broadcast matmul operator and analyses the memory cost, computation cost, communication cost, and resharding costs of those strategies.
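As a rough illustration of what "enumerate strategies and cost them" means, here is a minimal sketch, not the actual BcastOpHandler API: the strategy names, device-mesh layout, and cost formulas below are assumptions for illustration only.

```python
# Hypothetical sketch: enumerate a few sharding strategies for a batched
# matmul on a 2D device mesh and estimate per-device costs.
from dataclasses import dataclass

@dataclass
class Strategy:
    name: str            # sharding spec, e.g. "S0R x RS1 -> S0S1"
    compute_cost: float  # FLOPs per device
    memory_cost: float   # output elements held per device
    comm_cost: float     # elements communicated (e.g. by an all-reduce)

def enumerate_bcast_matmul_strategies(batch, m, k, n, mesh=(2, 2)):
    d0, d1 = mesh
    flops = 2 * batch * m * k * n
    out_elems = batch * m * n
    return [
        # shard the two output matrix dims across the mesh; no reduction needed
        Strategy("S0R x RS1 -> S0S1", flops / (d0 * d1), out_elems / (d0 * d1), 0),
        # shard the contracted dim k; partial sums need an all-reduce
        Strategy("RS0 x S0R -> RR (all-reduce)", flops / d0, out_elems, out_elems),
        # shard the broadcast batch dim; embarrassingly parallel
        Strategy("Sb,R x R,R -> Sb,R", flops / d0, out_elems / d0, 0),
    ]

for s in enumerate_bcast_matmul_strategies(batch=8, m=1024, k=512, n=1024):
    print(f"{s.name:30s} compute={s.compute_cost:.2e} "
          f"mem={s.memory_cost:.2e} comm={s.comm_cost:.2e}")
```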

@FrankLeeeee FrankLeeeee merged commit 47b11c4 into hpcaitech:main Sep 20, 2022