
[utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer #1442

Merged
4 commits merged into main on Aug 11, 2022

Conversation

ver217 (Member) commented Aug 11, 2022

from colossalai.utils.common import clip_grad_norm
total_norm = clip_grad_norm(params, 1.0, norm_type=2.0)

Gradients may live on either CUDA or CPU, and may be either fp16 or fp32.
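
For context, a minimal sketch of where this call sits in a training step. The toy model, data, and optimizer below are hypothetical and purely illustrative; only the clip_grad_norm import and its signature come from this PR:

import torch
from colossalai.utils.common import clip_grad_norm

# Hypothetical toy model and optimizer, for illustration only.
model = torch.nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

optimizer.zero_grad()
loss = model(torch.randn(8, 16)).sum()
loss.backward()

# Clip all parameter grads to a max L2 norm of 1.0; the return value
# is the total norm computed before clipping. Per this PR, the grads
# may live on CUDA or CPU and may be fp16 or fp32.
total_norm = clip_grad_norm(list(model.parameters()), 1.0, norm_type=2.0)
optimizer.step()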

CsRic and others added 4 commits August 10, 2022 15:01
* remade clip_grad_norm_fp32() with the new ColoTensor API

* test world size

* remove old function

* preserve old function

* simplify program, speedup

* delete useless import

Co-authored-by: csric <mkkt_bkkt@mail.ustc.edu.cn>
ver217 changed the title from "Impl clip_grad_norm for ColoTensor and ZeroOptimizer" to "[utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer" on Aug 11, 2022
feifeibear merged commit 821c617 into main on Aug 11, 2022
ver217 deleted the feature/clip-grad branch on August 12, 2022 03:40
densechen commented:

How can I use this function under Lightning? It seems that the on_after_backward() hook does not work...
