
LlavaNextProcessor.__init__() got an unexpected keyword argument 'image_token' #31465

Closed

M-Fannilla opened this issue Jun 18, 2024 · 5 comments

@M-Fannilla
M-Fannilla commented Jun 18, 2024

System Info

transformers 4.41.2
Python 3.10.12
Nvidia A40 / Runpod.io

When working with LlavaNextProcessor I now get errors that were not there before.

Error:
LlavaNextProcessor.__init__() got an unexpected keyword argument 'image_token'

This did not happen previously (~a week ago).

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Reproduction

import torch
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration

model_name = "llava-hf/llava-v1.6-vicuna-7b-hf"
processor = LlavaNextProcessor.from_pretrained(
    model_name, padding_side='left'
)
model = LlavaNextForConditionalGeneration.from_pretrained(
    model_name, torch_dtype=torch.float16, low_cpu_mem_usage=True
).to("cuda").eval()

Expected behavior

I would expect both objects to initialize without errors.

@zucchini-nlp
Member

@M-Fannilla hi, I just updated the llava weights on the Hub, which caused the error. I will revert the changes soon.
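
(For reference, a minimal sketch of how to shield a pipeline from Hub checkpoint updates like this one: from_pretrained accepts a revision argument that pins a specific repo commit. The SHA below is a placeholder, not a real commit.)

from transformers import LlavaNextProcessor

model_name = "llava-hf/llava-v1.6-vicuna-7b-hf"
# Hypothetical: pin the processor to a known-good Hub commit so later
# config changes on the repo cannot break an older transformers install.
processor = LlavaNextProcessor.from_pretrained(
    model_name,
    revision="<known-good-commit-sha>",  # placeholder SHA
    padding_side='left',
)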

@M-Fannilla
Author

@zucchini-nlp Great, thanks!

@zucchini-nlp
Member

@M-Fannilla should be working now, closing the issue as resolved!

@M-Fannilla
Author

M-Fannilla commented Jun 18, 2024

There is a new issue:

Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/usr/local/lib/python3.10/dist-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE

I did not have this one before.

@zucchini-nlp
Member

@M-Fannilla this one isn't related to the model checkpoint, but rather to your installed packages. It seems flash-attn wasn't installed properly or has a dependency conflict; an undefined-symbol error like this usually means the flash-attn binary was built against a different torch version than the one installed. Try uninstalling flash-attn and loading the model again.

If it doesn't work feel free to open a new issue :)
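
(A minimal diagnostic sketch, assuming the standard pip-installed flash-attn package: the import below reproduces the undefined-symbol failure in isolation. If it fails, reinstalling flash-attn against the current torch, e.g. pip uninstall flash-attn followed by pip install flash-attn --no-build-isolation, is the usual fix.)

# Check whether flash-attn can be imported against the installed torch.
# An "undefined symbol" ImportError here means the flash-attn binary was
# compiled for a different torch version and needs to be reinstalled.
import torch

print("torch:", torch.__version__)
try:
    import flash_attn
    print("flash-attn OK:", flash_attn.__version__)
except ImportError as err:
    print("flash-attn is broken; reinstall it. Details:", err)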
