Fix llama_v2_7b_16h for torch.jit.trace #2121

Open
thiagocrepaldi wants to merge 1 commit into pytorch:main from thiagocrepaldi:thiagofc/fix-llama_v2_7b_16h

Conversation

@thiagocrepaldi
Contributor

Original error: Attention using SDPA can not be traced with torch.jit.trace when no attention_mask is provided. To solve this issue, please either load your model with the argument attn_implementation="eager" or pass an attention_mask input when tracing the model.

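The error above offers two workarounds: load the model with `attn_implementation="eager"`, or supply an explicit `attention_mask` when tracing. A minimal sketch (not part of this PR) of the second workaround, using an illustrative toy module rather than the benchmark's Llama model, with assumed tensor shapes:

```python
import torch
import torch.nn.functional as F

class TinyAttention(torch.nn.Module):
    """Illustrative stand-in for an SDPA-based attention layer."""
    def forward(self, q, k, v, attention_mask):
        # With an explicit attn_mask, the SDPA call is traceable;
        # without one, tracing a mask-dependent branch can fail.
        return F.scaled_dot_product_attention(q, k, v, attn_mask=attention_mask)

# Assumed shapes: (batch, heads, seq_len, head_dim)
q = torch.randn(1, 2, 4, 8)
k = torch.randn(1, 2, 4, 8)
v = torch.randn(1, 2, 4, 8)
# Boolean mask: True means the position participates in attention.
mask = torch.ones(1, 1, 4, 4, dtype=torch.bool)

# Passing the mask as an example input lets torch.jit.trace record the call.
traced = torch.jit.trace(TinyAttention(), (q, k, v, mask))
out = traced(q, k, v, mask)
```

The alternative workaround would instead pass `attn_implementation="eager"` to the model's `from_pretrained` call, avoiding the SDPA path entirely at some performance cost.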
@thiagocrepaldi
Contributor Author

@huydhn do you think that change addresses #117752?

