```
Successfully built transformers
Installing collected packages: transformers
  Attempting uninstall: transformers
    Found existing installation: transformers 4.37.0
    Uninstalling transformers-4.37.0:
      Successfully uninstalled transformers-4.37.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
aqlm 1.0.1 requires transformers==4.37.0, but you have transformers 4.38.0.dev0 which is incompatible.
Successfully installed transformers-4.38.0.dev0
```
If you change `==` to `>=`, it would allow later versions with updates (in this case, the update was for AQLM compatibility!)
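The conflict can be reproduced with the third-party `packaging` library, whose version specifiers are the same ones pip's resolver applies; the `SpecifierSet` strings below stand in for the `transformers==4.37.0` requirement reported in the error above (a minimal sketch, not part of the issue):

```python
from packaging.specifiers import SpecifierSet

installed = "4.38.0.dev0"  # the version pip just installed

# aqlm's hard pin: only exactly 4.37.0 satisfies it
print(SpecifierSet("==4.37.0").contains(installed, prereleases=True))   # False -> conflict

# the suggested relaxed pin: 4.37.0 and anything later satisfies it
print(SpecifierSet(">=4.37.0").contains(installed, prereleases=True))   # True -> resolved
```

Note `prereleases=True`: `4.38.0.dev0` is a dev release, which specifiers exclude by default unless explicitly allowed.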
lmmx changed the title from *Do not hard pin the transformers version* to *Do not hard pin the transformers dependency version* on Feb 17, 2024

lmmx added a commit to lmmx/AQLM that referenced this issue on Feb 17, 2024
We're waiting for a proper transformers 4.38.0 release to revamp the dispatch code.
Once it's released, we'll update the dependencies and model checkpoints on Hugging Face.
P.S. here's what the new dispatch will look like: Colab Example. No `trust_remote_code=True` and no custom `modeling_*` files needed.
Hard pinning the transformers version is causing a dependency conflict in pip
AQLM/inference_lib/setup.cfg (line 35, at 48c44b1)
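A sketch of the suggested relaxation. The `==4.37.0` pin itself is confirmed by the pip error above, but the surrounding `setup.cfg` layout here is an assumption, not the file's actual contents:

```diff
 [options]
 install_requires =
-    transformers==4.37.0
+    transformers>=4.37.0
```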