
BF16_Optimizer: add support for bf16 grad acc #4713

Merged (3 commits) on Dec 8, 2023

Commits on Nov 21, 2023

  1. bf16_optimizer: add support for bf16 grad acc

    The default gradient accumulation data type is fp32. Adding the
    following to the DeepSpeed JSON config file:

        "data_types": {"grad_accum_dtype": "bf16"}

    makes gradient accumulation be performed in bf16.
    nelyahu committed Nov 21, 2023 (commit f2f278f)
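
For context, a minimal usage sketch of this option is shown below. Only the "data_types" entry comes from the commit message above; the toy model, the batch-size values, and the "bf16" section are illustrative assumptions, not part of this PR.

    import torch
    import deepspeed

    # Toy model for illustration only.
    model = torch.nn.Linear(16, 16)

    # Minimal sketch of a DeepSpeed config using the new option.
    ds_config = {
        "train_batch_size": 32,              # illustrative value
        "gradient_accumulation_steps": 4,    # illustrative value
        "bf16": {"enabled": True},           # run training in bf16
        # New in this PR: accumulate gradients in bf16 instead of
        # the default fp32.
        "data_types": {"grad_accum_dtype": "bf16"},
    }

    engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )
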

Commits on Dec 4, 2023

  1. Commit 805e209

Commits on Dec 7, 2023

  1. Commit 5b71c7d