
qwen2_5_vl processor padding side is wrong. #36100

Closed · habaohaba opened this issue Feb 8, 2025 · 4 comments

@habaohaba commented Feb 8, 2025

System Info

[Screenshots of the environment info and the right-padding error output omitted.]

The padding side should be left, as it is for Qwen2-VL. With right padding and flash attention 2, batched generation appends new tokens after the pad tokens, producing corrupted outputs.

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Run conditional generation with qwen2_5_vl using flash attention 2.
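A minimal reproduction sketch, assuming the Qwen/Qwen2.5-VL-7B-Instruct checkpoint and a machine with flash-attn installed (the model id, prompts, and generation settings are illustrative assumptions, not from the report):

```python
# Hypothetical reproduction sketch; model id and prompts are assumptions.
import torch
from transformers import AutoProcessor, Qwen2_5_VLForConditionalGeneration

model_id = "Qwen/Qwen2.5-VL-7B-Instruct"
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
)
processor = AutoProcessor.from_pretrained(model_id)
print(processor.tokenizer.padding_side)  # reported to be "right" by default

# Prompts of different lengths force padding in the batch.
texts = ["Describe a cat.", "Write one sentence about the weather today."]
inputs = processor(text=texts, padding=True, return_tensors="pt").to(model.device)

# With right padding, generated tokens land after the pad tokens,
# so the shorter prompt's output comes back corrupted.
out = model.generate(**inputs, max_new_tokens=32)
print(processor.batch_decode(out, skip_special_tokens=True))
```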

Expected behavior

[Screenshot of the expected behavior (left padding, as in Qwen2-VL) omitted.]

habaohaba added the bug label Feb 8, 2025
@echoht commented Feb 10, 2025

Hi, how do I install transformers 4.49.0.dev0?

@zucchini-nlp (Member) commented Feb 10, 2025

You can set the padding side when loading the processor with AutoProcessor.from_pretrained(model_id, padding_side="left"). We usually recommend making sure the padding side is correct before generating :)

We might need to update the model docs if the example code for batch generation fails with the same error. Feel free to open a PR if so
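A minimal sketch of the workaround described above (the model id is an assumption; any Qwen2.5-VL checkpoint should behave the same way):

```python
from transformers import AutoProcessor

model_id = "Qwen/Qwen2.5-VL-7B-Instruct"  # assumed checkpoint for illustration

# padding_side passed here is forwarded to the underlying tokenizer.
processor = AutoProcessor.from_pretrained(model_id, padding_side="left")
assert processor.tokenizer.padding_side == "left"

# Equivalently, flip it on an already-loaded processor:
processor.tokenizer.padding_side = "left"
```

With left padding, every sequence in the batch ends at the same position, so generate appends new tokens directly after the real prompt tokens, which is what batched generation with flash attention 2 expects.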

@guoshiqiufeng commented

> Hi, how do I install transformers 4.49.0.dev0?

pip install git+https://github.com/huggingface/transformers.git


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.
