check if position_ids exists before using it #29306
Conversation
Hi @ArthurZucker and @younesbelkada, I think we should check whether `position_ids` is `None` before we use it.
```diff
@@ -1274,7 +1274,11 @@ def prepare_inputs_for_generation(

         # TODO @gante we should only keep a `cache_position` in generate, and do +=1.
         # same goes for position ids. Could also help with continued generation.
-        cache_position = torch.arange(past_length, past_length + position_ids.shape[-1], device=position_ids.device)
+        cache_position = (
```
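For context, here is a minimal self-contained sketch of the failure this hunk guards against (the toy variables below are illustrative, not the actual `transformers` code): when no `attention_mask` reaches the model, `position_ids` is never created, and the old line dereferences `None`.

```python
import torch

# generate() builds position_ids from the attention mask, but a direct
# caller may pass neither, leaving position_ids as None.
input_ids = torch.tensor([[1, 2, 3]])
attention_mask = None
past_length = 0

position_ids = None
if attention_mask is not None:
    position_ids = attention_mask.long().cumsum(-1) - 1

# The pre-PR line then fails:
cache_position = torch.arange(
    past_length, past_length + position_ids.shape[-1], device=position_ids.device
)
# AttributeError: 'NoneType' object has no attribute 'shape'
```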
The attention mask is always passed to the model, so the `position_ids` are always created before this!
If so, should we make `attention_mask` a required parameter? For example:

```python
def prepare_inputs_for_generation(
    self, input_ids, attention_mask, past_key_values=None, inputs_embeds=None, **kwargs
):
```

Because we cannot control user behavior, I think we should either avoid this error in the code or emit a warning.
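A sketch of the warning alternative floated here (hypothetical; the PR did not end up taking this route):

```python
import warnings

def prepare_inputs_for_generation(self, input_ids, attention_mask=None, **kwargs):
    # Hypothetical defensive variant: keep attention_mask optional,
    # but warn that position_ids cannot be derived without it.
    if attention_mask is None:
        warnings.warn(
            "attention_mask was not provided; position_ids cannot be derived "
            "and cache positions will fall back to input_ids."
        )
    ...
```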
I have not seen any issue so far. If `past_key_values` is not `None`, the attention mask is created as well and is always passed to the model by the `generate` function. Let's just check whether `position_ids` exists, or use `input_ids` instead for device placement and shape, since it always exists.
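That suggestion could look roughly like this (an illustrative sketch, not the exact merged code): `input_ids` always exists, so it can supply the shape and device when `position_ids` is missing.

```python
import torch

def cache_positions(past_length, input_ids, position_ids=None):
    # Fall back to input_ids for shape and device when position_ids was
    # never created; both carry one entry per token in the current step.
    ref = position_ids if position_ids is not None else input_ids
    return torch.arange(past_length, past_length + ref.shape[-1], device=ref.device)

ids = torch.tensor([[5, 6, 7, 8]])
pos = torch.arange(ids.shape[-1]).unsqueeze(0)
print(cache_positions(0, ids))       # tensor([0, 1, 2, 3]), no position_ids needed
print(cache_positions(4, ids, pos))  # tensor([4, 5, 6, 7])
```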
Great idea! I have fixed it according to your comments :)

The CI is weird; it passes locally for me. Could you please re-run the CI? Thanks!
LGTM, let's have a second look from @gante.

Feel free to merge from main and re-run the CI.
@jiqing-feng I hope you don't mind, I took the liberty of fixing the failing test case 🤗 TL;DR: when …
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Thanks for your fix!