
add allreduce test #8

Closed
jeffra opened this issue Feb 4, 2020 · 0 comments · Fixed by #7
Labels
enhancement New feature or request

Comments

@jeffra
Collaborator

jeffra commented Feb 4, 2020

No description provided.
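
The title asks for an all-reduce test but gives no details. As a rough, hypothetical sketch only (not the test actually merged in #7), a minimal correctness check could look like the following, assuming torch.distributed with the gloo backend and a hard-coded single-node rendezvous:

import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp


def _worker(rank, world_size):
    # Rendezvous settings are assumptions for a single-node test run.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # Each rank contributes a tensor filled with its own rank id.
    tensor = torch.full((4,), float(rank))
    dist.all_reduce(tensor, op=dist.ReduceOp.SUM)

    # After a SUM all-reduce, every rank should hold 0 + 1 + ... + (world_size - 1).
    expected = float(sum(range(world_size)))
    assert torch.allclose(tensor, torch.full((4,), expected))

    dist.destroy_process_group()


def test_allreduce(world_size=2):
    # Spawn one process per rank; each runs the same all-reduce check.
    mp.spawn(_worker, args=(world_size,), nprocs=world_size)


if __name__ == "__main__":
    test_allreduce()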

@jeffra jeffra linked a pull request Feb 4, 2020 that will close this issue
@jeffra jeffra added the enhancement New feature or request label Feb 4, 2020
@jeffra jeffra closed this as completed in #7 Feb 4, 2020
jithunnair-amd referenced this issue in jithunnair-amd/DeepSpeed Apr 19, 2021
* Add hiprand and rocrand include paths for transformers extension

* Add patched HIP CG headers to enable transformer extension
rraminen added a commit to rraminen/DeepSpeed that referenced this issue Jun 23, 2021
liamcli pushed a commit to determined-ai/DeepSpeed that referenced this issue Sep 27, 2021
* fixing buffers in transformer kernel when gelu-checkpoint is enabled

* fixing the test issue for other memory optimization flags

* fixing a bug for when attn_dropout_checkpoint is enabled

Co-authored-by: Reza Yazdani <44502768+RezaYazdaniAminabadi@users.noreply.github.com>
pengwa pushed a commit to pengwa/DeepSpeed that referenced this issue Oct 14, 2022
pengwa pushed a commit to pengwa/DeepSpeed that referenced this issue Oct 14, 2022
sywangyi pushed a commit to sywangyi/DeepSpeed that referenced this issue Mar 30, 2023
2 participants