
Conversation

@pytorchbot (Collaborator)
The LlamaModel.get_example_inputs() in model.py doesn't accept a use_kv_cache parameter. The method internally checks self.use_kv_cache instead.

(cherry picked from commit bd071e9)
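
To make the described behavior concrete, here is a minimal, hypothetical sketch of the call pattern: the shape of the example inputs is driven by the instance's `self.use_kv_cache` flag rather than by an argument to `get_example_inputs()`. The method body, tensor shapes, and return tuple below are illustrative assumptions, not the actual executorch `model.py` source.

```python
# Minimal illustrative sketch of the behavior described above; the method
# body and tensor shapes are assumptions for demonstration, not the exact
# executorch model.py implementation.
import torch


class LlamaModel:
    def __init__(self, use_kv_cache: bool = False):
        # Whether example inputs include a KV-cache position tensor is
        # decided by this instance attribute, not by a call-site argument.
        self.use_kv_cache = use_kv_cache

    def get_example_inputs(self):
        # Note: no use_kv_cache parameter here; the method consults
        # self.use_kv_cache internally.
        tokens = torch.tensor([[1, 2, 3]], dtype=torch.long)
        if self.use_kv_cache:
            input_pos = torch.tensor([0], dtype=torch.long)
            return (tokens, input_pos)
        return (tokens,)


# Callers configure the flag on the model instance and then call the
# method with no arguments:
model = LlamaModel(use_kv_cache=True)
example_inputs = model.get_example_inputs()
```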
pytorch-bot bot commented Jan 25, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16862

Note: Links to docs will display an error until the docs builds have been completed.

❌ 9 New Failures

As of commit a189d5b with merge base 7492d0d:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla bot added the CLA Signed label on Jan 25, 2026
@mergennachin (Contributor)

Since #16819 is merged, we also need the follow-up fix.

@rascani merged commit f35288e into release/1.1 on Jan 26, 2026
157 of 167 checks passed
@rascani deleted the cherry-pick-16746-by-pytorch_bot_bot_ branch on January 26, 2026 16:19
