
Phi3Model ImportError #16

Open
zimenglan-sysu-512 opened this issue Jul 8, 2024 · 2 comments

Comments

@zimenglan-sysu-512

When running the script, I hit this problem: ImportError: cannot import name 'Phi3Model' from 'transformers'

@mmaaz60
Member

mmaaz60 commented Jul 8, 2024

Hi @zimenglan-sysu-512,

Thank you for your interest in our work. Which transformers version are you using? Please make sure that you are using transformers==4.41.0. Please let me know if it fixes the issue.
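For reference, one way to surface the version mismatch early is a small startup check against the pin suggested above. This is a minimal sketch (the helper name is illustrative, not part of the repo); the 4.41.0 pin comes from the reply above:

```python
from importlib.metadata import PackageNotFoundError, version

REQUIRED = "4.41.0"  # version suggested in the reply above

def transformers_version_hint(installed=None, required=REQUIRED):
    """Return "ok" or an actionable hint when the transformers version differs."""
    if installed is None:
        try:
            installed = version("transformers")
        except PackageNotFoundError:
            return f"pip install transformers=={required}"
    if installed != required:
        return f"found {installed}; pin with: pip install transformers=={required}"
    return "ok"

print(transformers_version_hint("4.40.0"))
```

Calling it with no argument reads the installed version; passing a string makes the check easy to unit-test.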

@zimenglan-sysu-512
Author

Got it. I also needed to update accelerate to 0.27.0, but I still hit a problem: [rank6]: RuntimeError: FlashAttention only supports Ampere GPUs or newer.
Hi @mmaaz60, can you show me how to disable FlashAttention?
Thanks.
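For what it's worth, recent Hugging Face transformers versions let you pick the attention backend via the attn_implementation argument of from_pretrained, so falling back to "eager" on pre-Ampere GPUs avoids this error. A minimal sketch (the helper name and the commented loading call are illustrative, not taken from this repo's scripts):

```python
def pick_attn_implementation(compute_capability_major):
    """Choose an attention backend; FlashAttention-2 needs Ampere (SM 8.x) or newer."""
    return "flash_attention_2" if compute_capability_major >= 8 else "eager"

# With PyTorch available, the major compute capability comes from:
#   major, _ = torch.cuda.get_device_capability()
# and the result can be passed to the Hugging Face loader, e.g.:
#   AutoModelForCausalLM.from_pretrained(
#       model_path,  # whatever checkpoint the script loads
#       attn_implementation=pick_attn_implementation(major),
#   )
print(pick_attn_implementation(7))
```

A V100 reports major capability 7, so the helper selects "eager" there, while an A100 (8) or newer keeps FlashAttention-2.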
