
[Usage] TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position' #1218

Open
alvinxjm opened this issue Mar 4, 2024 · 16 comments

Comments

@alvinxjm

alvinxjm commented Mar 4, 2024

Describe the issue

Issue:
Model Worker is showing this error when i chat with it.
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'

Log:

2024-03-04 08:41:32 | ERROR | stderr | Exception in thread Thread-3 (generate):
2024-03-04 08:41:32 | ERROR | stderr | Traceback (most recent call last):
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
2024-03-04 08:41:32 | ERROR | stderr | self.run()
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/threading.py", line 953, in run
2024-03-04 08:41:32 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr | return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/LLaVA/llava/model/language_model/llava_llama.py", line 138, in generate
2024-03-04 08:41:32 | ERROR | stderr | return super().generate(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
2024-03-04 08:41:32 | ERROR | stderr | return func(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 1592, in generate
2024-03-04 08:41:32 | ERROR | stderr | return self.sample(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 2696, in sample
2024-03-04 08:41:32 | ERROR | stderr | outputs = self(
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
2024-03-04 08:41:32 | ERROR | stderr | return self._call_impl(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
2024-03-04 08:41:32 | ERROR | stderr | return forward_call(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | File "/home/ec2-user/anaconda3/envs/llava/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
2024-03-04 08:41:32 | ERROR | stderr | output = old_forward(*args, **kwargs)
2024-03-04 08:41:32 | ERROR | stderr | TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'

Screenshots: [three screenshots attached in the original issue]

@londee

londee commented Mar 4, 2024

I encountered the same error while running "eval_model".

eval_model(args)
You are using a model of type llava to instantiate a model of type llava_llama. This is not supported for all configurations of models and can yield errors.
Loading checkpoint shards: 100%|██████████| 2/2 [00:04<00:00, 2.27s/it]
/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:410: UserWarning: do_sample is set to False. However, temperature is set to 0 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset temperature.
warnings.warn(
Traceback (most recent call last):
File "", line 1, in
File "/home/humaodi/code/LLaVA/llava/eval/run_llava.py", line 115, in eval_model
output_ids = model.generate(
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/humaodi/code/LLaVA/llava/model/language_model/llava_llama.py", line 137, in generate
return super().generate(
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 1544, in generate
return self.greedy_search(
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/utils.py", line 2404, in greedy_search
outputs = self(
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/anaconda3/envs/llava/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'

@scp92

scp92 commented Mar 4, 2024

I encountered the same error too. Any ideas?

@zfreeman32

Got this error as well. Haven't been able to fix it yet. Tracking this issue.

@shashwat14

I had the same issue. Fixed it by ensuring the transformers version is the same as the one specified in pyproject.toml, i.e. transformers==4.37.2.
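In practice that means running pip install transformers==4.37.2 in the environment that serves the model worker. A minimal sanity check afterwards, assuming a standard Python environment:

# confirm the running environment actually picked up the pinned release
import transformers
print(transformers.__version__)  # expected: 4.37.2, matching LLaVA's pyproject.toml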

@FanHengbo

I had the same issue. Fixed it by ensuring the transformers version is the same as the one specified in pyproject.toml, i.e. transformers==4.37.2.

Problem solved. Thanks!

@lixiaoxiangzhi

Problem solved. Thanks!

@aliencaocao

This is because transformers 4.38.0 added the static cache, which is what passes cache_position into forward(). So you have to use any version below 4.38.0.
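A quick way to check whether the installed release is past that threshold, assuming the packaging module is available (it ships as a dependency of transformers):

# gate on the 4.38.0 release that introduced the static cache / cache_position path
import transformers
from packaging import version

installed = version.parse(transformers.__version__)
if installed >= version.parse("4.38.0"):
    print(f"transformers {installed}: generate() passes cache_position, expect the TypeError with unpatched LLaVA")
else:
    print(f"transformers {installed}: predates the change, the pinned 4.37.2 setup works")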

@lixiaoxiangzhi

lixiaoxiangzhi commented Mar 6, 2024 via email

@YFCYFC

YFCYFC commented Mar 11, 2024

#1218 (comment)
Thank you so much, my problem was solved by downgrading to transformers==4.37.2. The transformers package interface changes frequently, which confuses me so much that I have to spend too much time debugging these meaningless bugs.

@RandomInternetPreson

I had the same issue. Fixed it by ensuring the transformers version is the same as the one specified in pyproject.toml, i.e. transformers==4.37.2.

TY <3

@wyxscir

wyxscir commented Mar 23, 2024

I had the same issue. Fixed it by ensuring the transformers version is the same as the one specified in pyproject.toml, i.e. transformers==4.37.2.

thank you!!!!!!

@segalinc

New models use transformers>4.39. Is there a way to actually fix this?

@lixiaoxiangzhi

lixiaoxiangzhi commented Apr 23, 2024 via email

@foundwant

TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
My transformers version is 4.37.2, and I still have this problem.

@baichuanzhou

Hey, adding cache_position=None to the forward method also works. Check here
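A minimal sketch of that workaround, shown on a toy class rather than the real LlavaLlamaForCausalLM (the actual method in llava/model/language_model/llava_llama.py takes more parameters than listed here):

# toy stand-in: only the signature matters, the added parameter absorbs the
# cache_position keyword that transformers >= 4.38 passes during generation
class ToyLlavaForCausalLM:
    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        past_key_values=None,
        inputs_embeds=None,
        images=None,
        cache_position=None,  # added: accept and ignore the new keyword
    ):
        return {"input_ids": input_ids, "images": images}

# the call that used to raise TypeError now succeeds
print(ToyLlavaForCausalLM().forward(input_ids=[1, 2, 3], cache_position=[0, 1, 2]))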

@lixiaoxiangzhi

lixiaoxiangzhi commented May 16, 2024 via email
