I wonder whether Optim.jl supports efficient optimisations on the GPU. For me this is essential because each function evaluation is quite expensive and I have a big design vector (length ~10^5) that should stay on the GPU throughout the optimisation to avoid unnecessary host/device communication.
Here is a minimal example of a simple optimisation that does not seem to work:
using Optim
using CUDA  # provides `cu`

function test(x)
    return sum(x.^2)
end

function ∇test!(gradient, x)
    gradient[:] = (2 .* x)[:]
end

# This works:
result = optimize(test, ∇test!, [1., 2.])

# This does not:
result = optimize(test, ∇test!, cu([1., 2.]))
I think you will likely get a better answer if you can ask a slightly more precise question given that GPU evaluations are supported and other people have worked with them in the past (e.g. #946). Is your goal to use CuArray with L-BFGS?
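As a minimal sketch of the CuArray + L-BFGS direction mentioned above (assuming CUDA.jl is installed and that the chosen Optim.jl solver's internal operations are CuArray-compatible, which may depend on the package version), the objective and gradient can be written with broadcast-only operations so no scalar indexing happens on the device:

```julia
using Optim, CUDA

# sum(abs2, x) avoids allocating the temporary array that x.^2 would create
test(x) = sum(abs2, x)

# In-place, broadcast-only gradient: no scalar indexing, so it works on CuArrays
function ∇test!(g, x)
    g .= 2 .* x
    return g
end

x0 = cu([1.0, 2.0])

# Hypothetical usage: whether this runs end-to-end on the GPU depends on the
# Optim.jl version's internals (line search, history storage) accepting CuArrays
result = optimize(test, ∇test!, x0, LBFGS())
```

Writing the gradient with `g .= 2 .* x` rather than `gradient[:] = (2 .* x)[:]` also avoids two intermediate allocations per evaluation, which matters for a design vector of length ~10^5.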