Lift upper version limit of transformers for habana #1895
Comments
We still need the
Thank you for the update. We can wait a bit longer. I prefer to avoid installing from git. Although a
Hi, do you have an ETA for the new releases of optimum and optimum-habana? We are facing the issue that Habana's vLLM fork wants
optimum 1.20.0 and optimum-habana 1.12.0 were released over the weekend. The new version requires
Feature request
optimum currently limits transformers to >= 4.38.0, < 4.39.0. @regisss bumped the upper version limit in PR #1851 a month ago. Is there any technical reason to limit the upper version to < 4.39? Other dependencies allow for more recent versions; for example, neuronx allows < 4.42.0, see #1881.
Motivation
We would like to use newer versions of transformers and tokenizers in InstructLab. The upper version limit for optimum makes this harder for us, and we need optimum-habana for Intel Gaudi support.
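To make the effect of the pin concrete, here is a minimal sketch comparing the current upper bound against a relaxed one. The release list is illustrative (not an exhaustive list of transformers releases), and version parsing is simplified to numeric dotted versions to keep the example dependency-free.

```python
# Sketch: which transformers releases a version pin admits.
# Simplified parsing: handles plain numeric dotted versions only.

def parse(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

CURRENT_UPPER = parse("4.39.0")  # optimum's current pin: < 4.39.0
RELAXED_UPPER = parse("4.42.0")  # e.g. the bound neuronx allows

# Illustrative candidate releases, not a complete list.
releases = ["4.38.2", "4.39.3", "4.40.2", "4.41.2"]

allowed_now = [v for v in releases if parse(v) < CURRENT_UPPER]
allowed_relaxed = [v for v in releases if parse(v) < RELAXED_UPPER]

print(allowed_now)      # ['4.38.2']
print(allowed_relaxed)  # ['4.38.2', '4.39.3', '4.40.2', '4.41.2']
```

In a real resolver this check is done with PEP 440 specifiers (e.g. via the `packaging` library), but the outcome is the same: the current pin excludes every 4.39+ release that newer downstream projects want.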
Your contribution
I can create a PR; it's a trivial one-line change.
Testing is less trivial. I have access to an 8-way Gaudi 2 system, but it is currently busy. I can do some testing in about two weeks, after I have updated the system from 1.15.1 to 1.16.0.
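For illustration, the one-line change would look roughly like the following. The variable name `INSTALL_REQUIRES` and the relaxed upper bound `< 4.43.0` are assumptions for this sketch; the actual names and bounds in optimum-habana's setup.py may differ.

```python
# Hypothetical sketch of the one-line change in setup.py.
# Variable name and the relaxed bound are illustrative assumptions.
INSTALL_REQUIRES = [
    # before: "transformers >= 4.38.0, < 4.39.0"
    "transformers >= 4.38.0, < 4.43.0",  # proposed relaxed upper bound
]
```

The upper bound would still need to track whatever transformers version optimum-habana has actually been validated against.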