Support Adapt.jl v4 #2374
Conversation
@maleadt, can you help me understand the error here? https://github.com/FluxML/Flux.jl/actions/runs/7714124765/job/21025646354?pr=2374#step:6:41
That's because AMDGPU.jl doesn't yet support LLVM 16, which is Julia nightly's current version.
This is a dupe of #2362, FYI.
Ah, glad it's being worked on. Sorry for the dupe.
@ToucheSir, I just spoke to @christiangnrd. We're abandoning #2362 in favor of this PR. Can you run workflows here?
I did not say that, and I don't have any authority to make those decisions. What I meant is that the other PR doesn't update the Metal compat, so if you remove the Adapt fix here, this PR could (not would) get merged for that compat change and then allow the tests on the original PR to pass.
Apologies for mis-phrasing. The changes in #2362 are already in this PR, along with the Metal compat update. So this PR should (hopefully) pass.
I guess opening a new PR worked out because the Metal.jl compat bump was also required, thanks. Could you bump the package version quickly so that we can tag immediately after merging?
@ToucheSir done. Thank you!
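In case it helps anyone following along, the compat and version changes discussed above are ordinary Project.toml edits. Here is a minimal sketch of making them from the Julia REPL, assuming the current directory is a Flux.jl checkout; the version bounds shown are illustrative, not the exact ones in this PR:

```julia
# Minimal sketch, assuming the current directory is a Flux.jl checkout.
# The version bounds below are illustrative, not the exact ones in this PR.
using Pkg
Pkg.activate(".")              # work on the package's own Project.toml
Pkg.compat("Adapt", "3.3, 4")  # widen the Adapt.jl compat to allow v4
Pkg.compat("Metal", "0.5, 1")  # the Metal.jl compat update mentioned above
# The package's own `version` field (needed for tagging a release) has no
# Pkg API; it is bumped by editing the `version = "..."` line in Project.toml.
```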
cc: @maleadt. Hopefully this should just work: I didn't find any reference to `Adapt.eltype` or `Adapt.ndims` in the repo with `grep`.
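For anyone who wants to reproduce that check without leaving Julia, here is a minimal sketch; the function name and the default search root are illustrative, not part of the PR:

```julia
# Hypothetical helper mirroring the `grep` check described above: scan a
# source tree for references to the removed Adapt.eltype / Adapt.ndims.
function find_removed_adapt_uses(root::AbstractString = "src")
    hits = String[]
    for (dir, _, files) in walkdir(root)
        for file in files
            endswith(file, ".jl") || continue
            path = joinpath(dir, file)
            for (lineno, line) in enumerate(eachline(path))
                if occursin("Adapt.eltype", line) || occursin("Adapt.ndims", line)
                    push!(hits, "$path:$lineno: $line")
                end
            end
        end
    end
    return hits  # an empty vector means no references were found
end
```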