[Usage] TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position' #1218
Comments
I encountered the same error while running "eval_mode".
I encountered the same error too. Any ideas?
Got this error as well. Haven't been able to fix it yet; tracking this issue.
I had the same issue. Fixed it by pinning transformers to the version specified in pyproject.toml, i.e. transformers==4.37.2.
Problem solved. Thanks!
This happens because transformers 4.38.0 added the static cache, so you have to use any version below that.
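As a quick sanity check, a small sketch (not part of LLaVA; the 4.37.2 pin and the 4.38.0 cutoff come from this thread, and the version parsing is deliberately naive, handling plain "x.y.z" releases only):

```python
import importlib.metadata

PINNED = "4.37.2"        # version LLaVA's pyproject.toml specifies (per this thread)
FIRST_BROKEN = "4.38.0"  # static cache / cache_position landed here

def as_tuple(version: str) -> tuple:
    # Naive parse: handles plain "x.y.z" release strings only.
    return tuple(int(part) for part in version.split(".")[:3])

def transformers_too_new(installed: str) -> bool:
    # Anything at or above 4.38.0 passes cache_position into forward()
    # and triggers the TypeError in this issue.
    return as_tuple(installed) >= as_tuple(FIRST_BROKEN)

if __name__ == "__main__":
    try:
        installed = importlib.metadata.version("transformers")
    except importlib.metadata.PackageNotFoundError:
        installed = None
    if installed and transformers_too_new(installed):
        print(f"transformers {installed} is too new; try: pip install transformers=={PINNED}")
```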
#1218 (comment)
TY <3
thank you!!!!!! |
Newer models require transformers>4.39. Is there a way to actually fix this?
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
Hey, adding cache_position=None to the forward method also works. Check here
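The mechanics of that fix can be sketched with a toy example (the classes below are stand-ins, not the real LLaVA code): generate() in transformers >= 4.38 passes cache_position into forward(), so a forward() that does not declare it raises the TypeError above, while one that accepts cache_position=None does not.

```python
# Toy stand-ins for LlavaLlamaForCausalLM (hypothetical names).

class OldModel:
    # forward() as in LLaVA before the fix: no cache_position parameter.
    def forward(self, input_ids, attention_mask=None):
        return "ok"

class PatchedModel:
    # The suggested fix: declare cache_position=None so the kwarg
    # passed by newer transformers is silently accepted and ignored.
    def forward(self, input_ids, attention_mask=None, cache_position=None):
        return "ok"

# transformers >= 4.38 effectively calls: model.forward(..., cache_position=...)
try:
    OldModel().forward([1, 2, 3], cache_position=None)
except TypeError as err:
    print("old model:", err)  # unexpected keyword argument 'cache_position'

print("patched model:", PatchedModel().forward([1, 2, 3], cache_position=None))
```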
Describe the issue
Issue:
The model worker shows this error when I chat with it.
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
Log:
2024-03-04 08:41:32 | ERROR | stderr | Exception in thread Thread-3 (generate):
2024-03-04 08:41:32 | ERROR | stderr | Traceback (most recent call last):
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
2024-03-04 08:41:32 | ERROR | stderr | self.run()
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 953, in run
2024-03-04 08:41:32 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr | return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/LLaVA/llava/model/language_model/llava_llama.py", line 138, in generate
2024-03-04 08:41:32 | ERROR | stderr | return super().generate(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr | return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 1592, in generate
2024-03-04 08:41:32 | ERROR | stderr | return self.sample(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 2696, in sample
2024-03-04 08:41:32 | ERROR | stderr | outputs = self(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
2024-03-04 08:41:32 | ERROR | stderr | return self._call_impl(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
2024-03-04 08:41:32 | ERROR | stderr | return forward_call(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
2024-03-04 08:41:32 | ERROR | stderr | output = old_forward(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'