
auto parallel bf16 #49079

Merged
merged 26 commits into PaddlePaddle:develop on Dec 29, 2022

Conversation

@xu98bin (Contributor) commented Dec 14, 2022

PR types

New features

PR changes

Others

Describe

Add a Mixed Precision BF16 pass (O1 level) for Auto Parallel.

Usage

    import paddle

    dist_strategy = paddle.distributed.fleet.auto.Strategy()
    dist_strategy.amp = True
    dist_strategy.enable_bf16 = True
    dist_strategy.custom_bf16_list = ['gelu']
    dist_strategy.custom_fp32_list = ['layer_norm', 'softmax']
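
For context, below is a minimal sketch of how such a strategy is typically consumed by the auto parallel `Engine`. The model, dataset, and optimizer are placeholders, and the `Engine`/`fit` calls are assumed from the Paddle auto parallel API of this period rather than taken from this PR:

    import numpy as np
    import paddle
    from paddle.distributed.fleet import auto

    # Strategy configured as in the Usage snippet above (mirrors the PR description).
    dist_strategy = auto.Strategy()
    dist_strategy.amp = True
    dist_strategy.enable_bf16 = True
    dist_strategy.custom_bf16_list = ['gelu']
    dist_strategy.custom_fp32_list = ['layer_norm', 'softmax']

    # Placeholder network, loss, and optimizer; any paddle.nn.Layer works here.
    model = paddle.nn.Sequential(
        paddle.nn.Linear(1024, 4096),
        paddle.nn.GELU(),
        paddle.nn.Linear(4096, 1024),
    )
    loss = paddle.nn.MSELoss()
    opt = paddle.optimizer.AdamW(parameters=model.parameters())

    # Placeholder dataset of random (input, label) pairs.
    class RandomDataset(paddle.io.Dataset):
        def __len__(self):
            return 64

        def __getitem__(self, idx):
            x = np.random.random([1024]).astype("float32")
            return x, x

    # The Engine builds the distributed program and applies the enabled passes,
    # including the BF16 O1 pass added here, before training starts.
    engine = auto.Engine(model, loss, opt, strategy=dist_strategy)
    engine.fit(RandomDataset(), epochs=1, batch_size=8)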

@paddle-bot (bot) commented Dec 14, 2022

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first; see the Paddle CI Manual for details.

@zhaoyinglia (Contributor) left a comment

LGTM

@JZ-LIANG (Contributor) left a comment

LGTM

@JZ-LIANG merged commit 418edae into PaddlePaddle:develop on Dec 29, 2022