
[autoparallel] apply repeat block to reduce solving time #2912

Conversation

@YuliangLiu0306 (Contributor) commented Feb 27, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

  • Build an alias set in StrategyConstructor for all repeating blocks.
  • Remove graph analysis from the auto parallel workflow.
  • Use the alias set to prune the solving space (see the sketch after this list).
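
The pruning idea can be illustrated with a minimal, self-contained sketch. It assumes a simplified node representation; the names `Node`, `build_alias_sets`, and `prune_with_alias_sets` are hypothetical and do not correspond to the actual `StrategiesConstructor`/solver API touched by this PR. Structurally identical nodes across repeated blocks are grouped into alias sets, the solver runs once per group, and the chosen strategy is broadcast to every alias.

```python
# Minimal, self-contained sketch of the alias-set idea described above.
# All names (Node, build_alias_sets, prune_with_alias_sets) are illustrative
# and are NOT the actual Colossal-AI StrategiesConstructor/solver API.
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass(frozen=True)
class Node:
    """A stand-in for a traced graph node inside one repeating block."""
    name: str                # unique node name, e.g. "block0_linear"
    op: str                  # operator type, e.g. "linear"
    shape: Tuple[int, ...]   # tensor shape handled by the node
    block_index: int         # which repeated block this node belongs to
    local_index: int         # position of the node inside its block


def build_alias_sets(nodes: List[Node]) -> List[List[Node]]:
    """Group structurally identical nodes from different repeating blocks.

    Nodes that share (op, shape, local_index) occupy the same position in
    identical blocks, so they can share one sharding strategy.
    """
    groups: Dict[Tuple, List[Node]] = defaultdict(list)
    for node in nodes:
        groups[(node.op, node.shape, node.local_index)].append(node)
    return list(groups.values())


def prune_with_alias_sets(
    alias_sets: List[List[Node]],
    solve_one: Callable[[Node], int],
) -> Dict[str, int]:
    """Pick a strategy index only for the first node of each alias set,
    then broadcast that choice to the remaining aliases.

    `solve_one` is a placeholder for whatever per-node strategy selection
    the real solver performs; here it just returns an integer index.
    """
    decisions: Dict[str, int] = {}
    for group in alias_sets:
        representative = group[0]
        chosen = solve_one(representative)   # expensive call, done once per group
        for node in group:                   # cheap broadcast to all repeats
            decisions[node.name] = chosen
    return decisions


if __name__ == "__main__":
    # Two identical blocks -> the solver only has to look at one of them.
    nodes = [
        Node("block0_linear", "linear", (1024, 1024), block_index=0, local_index=0),
        Node("block0_gelu", "gelu", (1024,), block_index=0, local_index=1),
        Node("block1_linear", "linear", (1024, 1024), block_index=1, local_index=0),
        Node("block1_gelu", "gelu", (1024,), block_index=1, local_index=1),
    ]
    alias_sets = build_alias_sets(nodes)
    # Trivial stand-in for the real strategy search.
    decisions = prune_with_alias_sets(alias_sets, solve_one=lambda n: {"linear": 2, "gelu": 0}[n.op])
    print(decisions)  # both blocks receive the same strategy indices
```

In a graph with N identical blocks, grouping like this cuts the number of per-node strategy decisions the solver must make by roughly a factor of N, which is where the reduction in solving time comes from.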

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/tables/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🥰 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@github-actions commented:

The code coverage for the changed files is 16%.

Complete report:
Name                                                                                 Stmts   Miss  Cover
--------------------------------------------------------------------------------------------------------
colossalai/auto_parallel/tensor_shard/initialize.py                                    118     91    23%
colossalai/auto_parallel/tensor_shard/solver/solver.py                                 275    252     8%
colossalai/auto_parallel/tensor_shard/solver/strategies_constructor.py                 116     97    16%
tests/test_auto_parallel/test_tensor_shard/test_gpt/test_solver_with_gpt_module.py      65     45    31%
tests/test_auto_parallel/test_tensor_shard/test_node_handler/utils.py                  122    105    14%
tests/test_auto_parallel/test_tensor_shard/test_solver_with_resnet_v2.py                54     41    24%
--------------------------------------------------------------------------------------------------------
TOTAL                                                                                  750    631    16%

@YuliangLiu0306 merged commit 197d0bf into hpcaitech:main on Feb 28, 2023
Labels
auto-parallel (related to the auto-parallel feature)
Development

Successfully merging this pull request may close these issues.

[FEATURE]: build alias set for repeat blocks and use it to reduce solving time