Support NVFP4 fp32 bias #3525

Merged

andrewor14 merged 1 commit into main from nvfp4-fp32-bias on Dec 23, 2025
Conversation

@andrewor14 (Contributor)

**Summary:** Today we hit this error with fp32 inputs + bias:

```
RuntimeError: Bias is not supported when module weight is in fp32
(out_dtype=Float32). Please use bfloat16 or float16 weights,
or remove the bias from the linear layer.
```

This is thrown by `NVFP4DynamicActivationNVFP4WeightConfig`, but it's trying to guard against this underlying `_scaled_mm` error:

```
RuntimeError: Bias is not supported when out_dtype is set to Float32
```

This commit works around these errors by adding the bias separately in this case, similar to what float8 does.

**Test Plan:**

```
pytest test/prototype/mx_formats/test_inference_workflow.py -k test_inference_workflow_nvfp4
```
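To illustrate the workaround pattern, here is a minimal, dependency-free sketch (not the actual torchao implementation): `scaled_mm` is a hypothetical stand-in for a fused kernel like `torch._scaled_mm` that rejects a fused bias when the output dtype is fp32, and `linear_with_fp32_bias_workaround` shows the approach of dropping the bias from the fused call and adding it afterwards.

```python
def scaled_mm(a, b, bias=None, out_dtype="float32"):
    """Toy fused matmul: refuses a fused bias when out_dtype is float32,
    mimicking the _scaled_mm limitation this PR works around."""
    if bias is not None and out_dtype == "float32":
        raise RuntimeError("Bias is not supported when out_dtype is set to Float32")
    # Plain matmul over nested lists (stand-in for the quantized kernel)
    out = [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]
    if bias is not None:
        out = [[v + c for v, c in zip(row, bias)] for row in out]
    return out


def linear_with_fp32_bias_workaround(a, b, bias, out_dtype="float32"):
    # Work around the kernel limitation: skip the fused bias when the
    # output is fp32, then add the (unquantized) bias separately,
    # similar to the float8 path described above.
    add_bias_separately = bias is not None and out_dtype == "float32"
    out = scaled_mm(a, b, bias=None if add_bias_separately else bias, out_dtype=out_dtype)
    if add_bias_separately:
        out = [[v + c for v, c in zip(row, bias)] for row in out]
    return out


a = [[1.0, 2.0]]
b = [[3.0], [4.0]]
print(linear_with_fp32_bias_workaround(a, b, bias=[0.5]))  # [[11.5]]
```

The key design point is that the bias stays in high precision and is applied outside the fused kernel, so the fp32-output restriction never triggers.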

@pytorch-bot

pytorch-bot bot commented Dec 22, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3525

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit ece0c5e with merge base 428bbcf:

BROKEN TRUNK - The following job failed but was already present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Dec 22, 2025
@andrewor14 andrewor14 requested review from drisspg and vkuzo December 22, 2025 16:15
@andrewor14 andrewor14 added the topic: improvement Use this tag if this PR is an improvement (doesn't fit into any of the other categories) label Dec 22, 2025
```python
# since bias is not quantized
should_add_bias_separately = (scale_result is not None) and (bias is not None)
#
# (2) RuntimeError: Bias is not supported when out_dtype is set to Float32
```
Contributor:

Okay this is what I thought would happen

Contributor (author):

Yeah, but this only happens if `per_tensor_scale=None` (by default it is not), so users generally won't run into the `_scaled_mm` error. Either way, this PR fixes that case.

Contributor:

Can you just create a note somewhere on the conversions / casting path for these paths?

Contributor (author):

Added a comment.

@andrewor14 andrewor14 merged commit 962ea18 into main Dec 23, 2025
20 of 21 checks passed

Labels

- CLA Signed (This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.)
- topic: improvement (Use this tag if this PR is an improvement that doesn't fit into any of the other categories.)
