please update compat bounds #175
Any plans for this?
EDIT: I am wrong -- it's an issue on Julia 1.10, not Transformers.jl. I don't fully understand it, but it has to do with CUDA / GPUCompiler. I believe this makes Transformers.jl uninstallable now. I'm not sure how that's possible, but I've tried from a fresh environment and it fails during resolution. Clearly, some dependency has moved on :/ Example
I think this is again an issue with the compat specification of Transformers.jl. On Julia 1.10, only GPUCompiler 0.22.0 and upwards are available (to my understanding, because of this line). However, these newer versions of GPUCompiler are only compatible with newer versions of CUDA.jl, but Transformers.jl restricts the CUDA version (explicitly, and implicitly via the Flux compat entry).
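For illustration, a `[compat]` section shaped like the one below can produce exactly this kind of unsatisfiable graph. The version numbers here are hypothetical placeholders, not the actual Transformers.jl entries:

```toml
[compat]
# Suppose Julia 1.10 only admits GPUCompiler >= 0.22, and GPUCompiler
# 0.22 in turn requires a newer CUDA.jl. Then an explicit upper bound
# on CUDA (plus an implicit one via Flux's own compat) leaves the
# resolver with no version of every package that satisfies all bounds.
CUDA = "~3.13"              # hypothetical restrictive bound
Flux = "0.13"               # indirectly constrains CUDA as well
GPUCompiler = "0.21, 0.22"
```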
I spent a bit of time today trying to chase down the compat issues, and wow, what an absolute nightmare.
Oh, this is great: the package doesn't seem to import NNlibCUDA.jl anywhere, so perhaps we can simply drop the compat entry. Experimenting.
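One quick way to confirm that a dependency is unused before dropping its compat entry is to search the source tree for references to it. This is a generic check, not a command from the thread:

```shell
# From the package root: search the source tree for any reference to
# NNlibCUDA. If nothing turns up, the dependency (and its compat
# entry) can likely be removed from Project.toml.
grep -rn "NNlibCUDA" src/ || echo "no usages found"
```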
Opened #185, which should address some of this. Testing is still ongoing.
There are a lot of unmerged CompatHelper PRs in this repo.
@chengchingwen could you update the compat bounds, in particular for the latest Flux and NNlib versions?
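For reference, compat entries can also be bumped from the Pkg REPL instead of hand-editing Project.toml (the `compat` REPL command is available since Julia 1.8; the version ranges below are illustrative placeholders, not vetted bounds):

```julia
# In the Pkg REPL (press ']' at the julia> prompt), with the package
# environment active. Ranges shown are examples only.
pkg> compat Flux 0.13, 0.14
pkg> compat NNlib 0.8, 0.9
pkg> resolve   # verify the new bounds yield a satisfiable dependency graph
```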