Conversation

mergennachin
Contributor

Summary:
Keep llama_transformer.py looking like the stock implementation, so that it can be reused everywhere.

Do a module swap.

Differential Revision: D56048640
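For readers unfamiliar with the approach, a "module swap" replaces instances of one `nn.Module` subclass with another inside an existing model, leaving the stock model definition untouched. A minimal sketch of the idea (the class names `StockSDPA` and `CustomSDPA` are illustrative placeholders, not the actual ExecuTorch classes):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StockSDPA(nn.Module):
    """Placeholder for the stock attention module in the transformer."""
    def forward(self, q, k, v):
        return F.scaled_dot_product_attention(q, k, v)

class CustomSDPA(nn.Module):
    """Placeholder for an export-friendly replacement module."""
    def forward(self, q, k, v):
        # A custom kernel or exportable op would go here.
        return F.scaled_dot_product_attention(q, k, v)

def swap_modules(model: nn.Module, old_cls, new_cls) -> nn.Module:
    """Recursively replace every instance of old_cls with a new_cls()."""
    for name, child in model.named_children():
        if isinstance(child, old_cls):
            setattr(model, name, new_cls())
        else:
            swap_modules(child, old_cls, new_cls)
    return model
```

This keeps the swap logic in the export path rather than forking the transformer definition itself.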

pytorch-bot bot commented Apr 12, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/3007

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 3 Unrelated Failures

As of commit fbd4d36 with merge base 6acc86f:

NEW FAILURE - The following job has failed:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Apr 12, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D56048640


mergennachin added a commit to mergennachin/executorch-1 that referenced this pull request Apr 12, 2024
Summary:
Pull Request resolved: pytorch#3007

Keep llama_transformer.py looking like the stock implementation, so that it can be reused everywhere.

Do a module swap.

Differential Revision: D56048640
Summary:
This is a no-op

Pull Request resolved: pytorch#3005

Test Plan:
CI

Run with

`python -m examples.models.llama2.export_llama -c stories110M.pt -p params.json -kv --use_sdpa_with_kv_cache -X`

and with

`python -m examples.models.llama2.export_llama -c stories110M.pt -p params.json -kv -X`

Make sure both commands work.

Differential Revision: D56048177

Pulled By: mergennachin
Summary:
Pull Request resolved: pytorch#3007

Keep llama_transformer.py looking like the stock implementation, so that it can be reused everywhere.

Do a module swap.

Differential Revision: D56048640

@facebook-github-bot
Contributor

This pull request has been merged in 74eb8b3.

cccclai pushed a commit to cccclai/executorch-1 that referenced this pull request Apr 16, 2024
Summary:
Pull Request resolved: pytorch#3007

Keep llama_transformer.py looking like the stock implementation, so that it can be reused everywhere.

Do a module swap.

Reviewed By: cccclai

Differential Revision: D56048640

fbshipit-source-id: 76de1b09b7f5d79422bb3b32bc830a9a7ecd935c
(cherry picked from commit 74eb8b3)
guangy10 pushed a commit that referenced this pull request Apr 17, 2024
Summary:
Pull Request resolved: #3007

Keep llama_transformer.py looking like the stock implementation, so that it can be reused everywhere.

Do a module swap.

Reviewed By: cccclai

Differential Revision: D56048640

fbshipit-source-id: 76de1b09b7f5d79422bb3b32bc830a9a7ecd935c
(cherry picked from commit 74eb8b3)

Co-authored-by: Mergen Nachin <mnachin@meta.com>
This was referenced Apr 25, 2024

Labels

ciflow/trunk, CLA Signed, fb-exported, Merged


3 participants