Fix backward compatibility of torch.amp.custom_fwd for PyTorch < 2.4#7920

Merged
tohtana merged 4 commits into deepspeedai:master from tohtana:tohtata/add-faullback-pt23
Mar 25, 2026

Conversation

@tohtana
Collaborator

@tohtana tohtana commented Mar 24, 2026

`torch.amp.custom_fwd` was introduced in PyTorch 2.4, so installing DeepSpeed from source with an older PyTorch fails because `setup.py` triggers an import of the function.
This PR adds a fallback to `torch.cuda.amp.custom_fwd` for PyTorch < 2.4.
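The fallback described above can be sketched as a version-guarded import. This is a minimal illustration of the pattern, not necessarily the exact code merged in this PR; the `custom_fwd` name and the `hasattr` check are assumptions for the example.

```python
# Hedged sketch: select an AMP custom_fwd decorator that works on both
# PyTorch >= 2.4 (torch.amp.custom_fwd, which takes a device_type) and
# older releases (torch.cuda.amp.custom_fwd). Names are illustrative.
from functools import partial

try:
    import torch

    if hasattr(torch.amp, "custom_fwd"):
        # PyTorch >= 2.4: device-agnostic API; bind device_type for CUDA.
        custom_fwd = partial(torch.amp.custom_fwd, device_type="cuda")
    else:
        # PyTorch < 2.4: fall back to the CUDA-specific decorator.
        custom_fwd = torch.cuda.amp.custom_fwd
except ImportError:
    # torch not installed (e.g. docs build); nothing to select.
    custom_fwd = None
```

Guarding the selection at import time means `setup.py` (and any other early import path) never touches the 2.4-only symbol on older PyTorch installs.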

Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
@tohtana tohtana requested a review from tjruwase as a code owner March 24, 2026 13:31

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 52f3d04267


Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
@tohtana tohtana requested a review from loadams as a code owner March 25, 2026 13:07
Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
Collaborator

@PKUWZP PKUWZP left a comment


Thanks for submitting the PR, LGTM.

@tohtana tohtana enabled auto-merge (squash) March 25, 2026 21:21
@tohtana tohtana merged commit 138f20d into deepspeedai:master Mar 25, 2026
5 checks passed
nathon-lee pushed a commit to nathon-lee/DeepSpeed_woo that referenced this pull request Mar 27, 2026
…eepspeedai#7920)

`torch.amp.custom_fwd` was introduced in PyTorch 2.4, so installing
DeepSpeed from source with an older PyTorch fails because `setup.py`
triggers an import of the function.
This PR adds a fallback to `torch.cuda.amp.custom_fwd` for PyTorch <
2.4.

---------

Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
nathon-lee pushed a commit to nathon-lee/DeepSpeed_woo that referenced this pull request Mar 28, 2026
…eepspeedai#7920)

`torch.amp.custom_fwd` was introduced in PyTorch 2.4, so installing
DeepSpeed from source with an older PyTorch fails because `setup.py`
triggers an import of the function.
This PR adds a fallback to `torch.cuda.amp.custom_fwd` for PyTorch <
2.4.

---------

Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
Signed-off-by: nathon-lee <leejianwoo@gmail.com>


2 participants