
[feature] Add clip_grad_norm for hybrid_parallel_plugin #4837

Merged: 6 commits into hpcaitech:main on Oct 12, 2023

Conversation

@littsk (Contributor) commented Sep 28, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Add gradient clipping by norm (clip_grad_norm) support to the hybrid_parallel_plugin.
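For context, the core of gradient clipping under parallelism is computing a single global gradient norm across all ranks before rescaling the local gradient shards. The sketch below illustrates that general technique only; the helper name `clip_grad_norm_` and the `process_group` argument are illustrative assumptions, not the actual interface added by this PR.

```python
import torch
import torch.distributed as dist

# Minimal sketch of grad-norm clipping when each rank holds only a shard
# of the gradients (as in TP/ZeRO). Illustrative only; see the plugin and
# optimizer code in this PR for the real interface.
def clip_grad_norm_(parameters, max_norm, process_group=None):
    grads = [p.grad for p in parameters if p.grad is not None]
    # Local contribution to the squared 2-norm over all gradients.
    local_sq = torch.zeros(1, device=grads[0].device) if grads else torch.zeros(1)
    for g in grads:
        local_sq += g.detach().float().norm(2) ** 2
    # Sum squared norms across ranks so every rank sees the global norm.
    if dist.is_initialized():
        dist.all_reduce(local_sq, op=dist.ReduceOp.SUM, group=process_group)
    total_norm = local_sq.sqrt().item()
    # Rescale local gradient shards in place if the global norm is too large.
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1.0:
        for g in grads:
            g.detach().mul_(clip_coef)
    return total_norm
```

A full hybrid-parallel implementation additionally has to avoid double-counting gradients that are replicated (rather than sharded) across tensor-parallel ranks and to aggregate across pipeline stages, which is what makes this non-trivial compared to plain torch.nn.utils.clip_grad_norm_.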

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🥰 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@github-actions bot commented Oct 8, 2023

The code coverage for the changed files is 84%.

Click me to view the complete report
Name                                                                                 Stmts   Miss  Cover
--------------------------------------------------------------------------------------------------------
colossalai/amp/naive_amp/mixed_precision_optimizer.py                                  132     19    86%
colossalai/booster/plugin/hybrid_parallel_plugin.py                                    338     37    89%
colossalai/zero/low_level/_utils.py                                                     94     59    37%
colossalai/zero/low_level/bookkeeping/gradient_store.py                                 39      1    97%
colossalai/zero/low_level/low_level_optim.py                                           374     32    91%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_amp_optimizer.py       100     16    84%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_naive_optimizer.py      90     19    79%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_zero_optimizer.py       94     15    84%
--------------------------------------------------------------------------------------------------------
TOTAL                                                                                 1261    198    84%

@github-actions bot commented

The code coverage for the changed files is 84%.

Click me to view the complete report
Name                                                                                 Stmts   Miss  Cover
--------------------------------------------------------------------------------------------------------
colossalai/amp/naive_amp/mixed_precision_optimizer.py                                  112     23    79%
colossalai/booster/plugin/hybrid_parallel_plugin.py                                    366     41    89%
colossalai/zero/low_level/_utils.py                                                     94     59    37%
colossalai/zero/low_level/bookkeeping/gradient_store.py                                 39      1    97%
colossalai/zero/low_level/low_level_optim.py                                           361     30    92%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_amp_optimizer.py       100     16    84%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_naive_optimizer.py      90     19    79%
tests/test_shardformer/test_hybrid_parallel_grad_clip_norm/test_zero_optimizer.py       94     15    84%
--------------------------------------------------------------------------------------------------------
TOTAL                                                                                 1256    204    84%

@Fridge003 changed the title from "[feature] Add clip_grad_norm for hibrid_parallel_plugin" to "[feature] Add clip_grad_norm for hybrid_parallel_plugin" on Oct 11, 2023
@Fridge003 merged commit 83b52c5 into hpcaitech:main on Oct 12, 2023 (36 of 37 checks passed)
flybird11111 pushed a commit to flybird11111/ColossalAI that referenced this pull request on Oct 18, 2023:
* Add clip_grad_norm for hibrid_parallel_plugin

* polish code

* add unittests

* Move tp to a higher-level optimizer interface.

* bug fix

* polish code
Labels: none yet
Projects: none yet
Linked issues: none yet
4 participants