TypeError: forward() got an unexpected keyword argument 'position_ids' #19
Comments
Hi @GZHU-DVL, thank you for your interest in our work. Please make sure that you followed the documented environment setup process and that you are using the correct versions of the libraries. If the issue still exists, please provide the script and command that you are running so we can understand the issue. I hope this helps. Thanks
The versions of the libraries are as follows: The commands are as follows:
The problem was solved after I changed the version of transformers.
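Before reinstalling anything, it can help to confirm which transformers version the environment actually resolves. A minimal sketch (standard library only, Python 3.8+; not part of the project's code):

```python
# Print the installed transformers version so it can be compared against
# the version pinned in the repo's environment/requirements files.
from importlib.metadata import version, PackageNotFoundError

try:
    print("transformers", version("transformers"))
except PackageNotFoundError:
    # The package is not installed in this environment at all.
    print("transformers is not installed in this environment")
```

If the printed version does not match the one the repo pins, reinstall the pinned version before debugging further.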
Hi!
The root cause can be seen in this issue: huggingface/transformers#24130 |
Actually, I was wrong. The problem is with the flash-attention monkey patch. The specific commit with this fix is this one: lm-sys/FastChat@daa9c11. But after that, you also need to add the padding_mask kwarg:

    # ...video_chatgpt/train/llama_flash_attn_monkey_patch.py
    ...
    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        position_ids: Optional[torch.Tensor] = None,
        past_key_value: Optional[Tuple[torch.Tensor]] = None,
        output_attentions: bool = False,
        use_cache: bool = False,
        padding_mask: Optional[torch.LongTensor] = None,
    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
        if output_attentions:
            ...
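To see why widening the signature fixes the error: newer transformers releases call the attention module's forward with extra keyword arguments such as position_ids, and a patched forward written against an older API rejects them. A minimal, self-contained sketch of that mechanism (the class and function names here are illustrative, not the project's actual code):

```python
class OldPatchedForward:
    # Patched forward written against an older transformers API:
    # it does not accept position_ids, so newer callers crash with
    # "TypeError: forward() got an unexpected keyword argument 'position_ids'".
    def forward(self, hidden_states, attention_mask=None):
        return hidden_states


class FixedPatchedForward:
    # Same forward, but it now accepts the extra kwargs that newer
    # transformers versions pass through (position_ids, padding_mask).
    def forward(self, hidden_states, attention_mask=None,
                position_ids=None, padding_mask=None):
        return hidden_states


def call_like_newer_transformers(module):
    # Simulates how a newer transformers release invokes the module.
    return module.forward([1, 2, 3], attention_mask=None, position_ids=[0, 1, 2])


try:
    call_like_newer_transformers(OldPatchedForward())
    old_ok = True
except TypeError:
    old_ok = False  # the old signature rejects position_ids

new_ok = call_like_newer_transformers(FixedPatchedForward()) == [1, 2, 3]
```

The same reasoning applies to padding_mask, which is why the monkey patch needs both keyword arguments after the FastChat fix.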
Following the tutorial, I can run this project, but execution reports an error when it reaches this point.