
Lift upper version limit of transformers for habana #1895

Closed
tiran opened this issue Jun 6, 2024 · 4 comments

Comments


tiran commented Jun 6, 2024

Feature request

optimum currently limits transformers to >= 4.38.0, < 4.39.0. @regisss bumped the upper version limit in PR #1851 a month ago. Is there any technical reason to keep the upper bound at < 4.39? Other dependencies allow more recent versions; for example, neuronx allows < 4.42.0, see #1881.

Motivation

We would like to use newer versions of transformers and tokenizers in InstructLab. The upper version limit for optimum makes this harder on us. We need optimum-habana for Intel Gaudi support.

Your contribution

I can create a PR; it's a trivial one-line change.

Testing is less trivial. I have access to an 8-way Gaudi 2 system, but it is currently busy. I can do some testing in about two weeks, after I have updated the system from 1.15.1 to 1.16.0.

regisss commented Jun 6, 2024

We still need the >= 4.38.0, < 4.39.0 dependency on Transformers for Optimum Habana because the latest stable release of Optimum Habana (i.e. v1.11.1) doesn't work with more recent versions of Transformers.
huggingface/optimum-habana#1027 was merged a few days ago to add compatibility with Transformers v4.40, and Optimum Habana v1.12 will be released soon. You can also install the library from source if you want to benefit from this change now:

pip install git+https://github.com/huggingface/optimum-habana.git

tiran commented Jun 6, 2024

Thank you for the update. We can wait a bit longer for the release.

I prefer to avoid installing from git. Although a requirements.txt can contain a git URL, PyPI does not permit packages whose dependencies use direct URL references.
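For illustration, this is the kind of requirement line in question (a sketch, not taken from the thread): it works for local installs and in a requirements.txt, but a distribution uploaded to PyPI cannot declare a dependency in this direct-reference form.

```
# requirements.txt — fine for local/CI installs, but PyPI rejects uploads
# whose package metadata declares a dependency via a direct URL reference:
optimum-habana @ git+https://github.com/huggingface/optimum-habana.git
```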

tiran commented Jun 20, 2024

Hi,

Do you have an ETA for the new releases of optimum and optimum-habana? We are facing the issue that Habana's vLLM fork wants transformers >= 4.40, which conflicts with optimum-habana 1.11.1's >= 4.38.0, < 4.39.0.
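The conflict above can be checked mechanically. A minimal sketch using the `packaging` library (the library pip itself builds on; the version lists below are illustrative, not from the thread):

```python
# Sketch: show that no transformers version can satisfy both constraints.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

optimum_habana = SpecifierSet(">=4.38.0,<4.39.0")  # optimum-habana 1.11.1
vllm_fork = SpecifierSet(">=4.40")                  # Habana's vLLM fork

# Combine both specifier sets; a version must satisfy every clause.
combined = optimum_habana & vllm_fork

# Check a few candidate transformers versions against the combined set.
candidates = ["4.38.2", "4.40.0", "4.40.2"]
satisfying = [v for v in candidates if Version(v) in combined]
print(satisfying)  # → [] : no version satisfies both requirements
```

Because the two ranges are disjoint (< 4.39.0 vs. >= 4.40), the combined specifier set is unsatisfiable, which is exactly why pip's resolver fails on this pair.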

tiran commented Jun 24, 2024

optimum 1.20.0 and optimum-habana 1.12.0 were released over the weekend. The new version requires transformers >= 4.40.0, < 4.41.0.

@tiran tiran closed this as completed Jun 24, 2024