
Conversation

@digantdesai
Contributor

Summary: Add fp16 Linear

Differential Revision: D53333693
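
As a rough illustration of what fp16 `Linear` coverage means at the op level (a sketch in plain eager PyTorch; the actual kernel and registration changes in the diff are not shown here):

```python
import torch

# Build a Linear whose weights and activations are fp16. With an fp16
# Linear kernel available, this op can stay in half precision end to end
# instead of being upcast to fp32. (Running this in eager mode requires a
# PyTorch backend with fp16 GEMM support.)
linear = torch.nn.Linear(64, 128, dtype=torch.float16)
x = torch.randn(1, 3, 64, dtype=torch.float16)
y = linear(x)
assert y.dtype == torch.float16  # no silent upcast to fp32
```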
Summary:
FYI - there are hardcoded `float` casts in rmsnorm, which force a bunch of nodes in the graph to fp32.

```
aten_embedding_default: "f16[1, 3, 64]" = executorch_exir_dialects_edge__ops_aten_embedding_default(arg11_1, arg55_1);  arg11_1 = arg55_1 = None
aten_slice_copy_tensor: "f16[3, 4]" = executorch_exir_dialects_edge__ops_aten_slice_copy_Tensor(arg48_1, 0, 0, 3);  arg48_1 = None
aten_slice_copy_tensor_1: "f16[3, 4]" = executorch_exir_dialects_edge__ops_aten_slice_copy_Tensor(arg49_1, 0, 0, 3);  arg49_1 = None
aten__to_copy_default: "f32[1, 3, 64]" = executorch_exir_dialects_edge__ops_aten__to_copy_default(aten_embedding_default, dtype = torch.float32)
```

Copied from: https://www.internalfb.com/code/fbsource/%5B7e45e7bcd969%5D/xplat/executorch/examples/models/llama2/model.py?lines=78

Differential Revision: D53596500
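
For context, here is a minimal sketch of the pattern behind that `_to_copy`, modeled on the llama-style RMSNorm linked above (illustrative only, not the exact ExecuTorch source):

```python
import torch


class RMSNorm(torch.nn.Module):
    """Llama-style RMSNorm; dim and eps are illustrative."""

    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = torch.nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The hardcoded fp32 upcast: x.float() becomes an
        # aten._to_copy(..., dtype=torch.float32) in the exported graph,
        # so an fp16 input (e.g. the embedding output above) and everything
        # downstream of this cast end up running in fp32.
        h = x.float()
        norm = h * torch.rsqrt(h.pow(2).mean(-1, keepdim=True) + self.eps)
        return norm.type_as(x) * self.weight
```

Computing the normalization in `x.dtype` instead would keep the graph in fp16, at the cost of doing the reduction in half precision.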
@pytorch-bot

pytorch-bot bot commented Feb 9, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/1902

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (5 Unrelated Failures)

As of commit 829d304 with merge base 0ea44aa:

FLAKY - The following jobs failed but were likely due to flakiness present on trunk:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the `CLA Signed` label on Feb 9, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D53596500

@facebook-github-bot
Contributor

This pull request has been merged in eb50c46.
