
Conversation

michaelmaitland
Contributor

Summary:
GELU accepts an `approximate` argument, which is either `"none"` (the default) or `"tanh"`.

When the `approximate` kwarg is present, decompose the op.

An existing test, `test_aten_gelu_out`, already verifies that the op is supported.

Differential Revision: D75454999
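
For reference, here is a minimal sketch of what the tanh decomposition computes. This is not the ExecuTorch pass itself; the function name and the sanity check against `torch.nn.functional.gelu` are illustrative assumptions:

```python
import math

import torch

def gelu_tanh_decomposed(x: torch.Tensor) -> torch.Tensor:
    # tanh approximation of GELU:
    #   0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    k = math.sqrt(2.0 / math.pi)
    inner = k * (x + 0.044715 * torch.pow(x, 3.0))
    return 0.5 * x * (1.0 + torch.tanh(inner))

# Sanity check against the reference op with approximate="tanh".
x = torch.randn(8)
torch.testing.assert_close(
    gelu_tanh_decomposed(x),
    torch.nn.functional.gelu(x, approximate="tanh"),
)
```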

@michaelmaitland michaelmaitland requested a review from tarun292 as a code owner May 30, 2025 01:36

pytorch-bot bot commented May 30, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/11246

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit c9eb4e9 with merge base 1bc36c7 (image):
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) May 30, 2025
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D75454999

michaelmaitland added the release notes: none label (do not include this in the release notes) May 30, 2025
michaelmaitland force-pushed the export-D75454999 branch 2 times, most recently from 605e1fb to ac8662a, June 2, 2025 15:26
michaelmaitland force-pushed the export-D75454999 branch 2 times, most recently from faff995 to 0465717, June 2, 2025 19:07
michaelmaitland pushed a commit to michaelmaitland/executorch that referenced this pull request Jun 2, 2025
Summary:
Pull Request resolved: pytorch#11246

GELU accepts an `approximate` argument which is either `none` by default, or `tanh`

When the `approximate` kwarg is present, decompose the op.

We already have an existing test in test_aten_gelu_out to make sure the op is supported.

Reviewed By: zonglinpeng, hsharma35

Differential Revision: D75454999
facebook-github-bot merged commit b5567be into pytorch:main Jun 2, 2025
96 of 98 checks passed