[BUG] mlx crashes with msg - uncaught exception of type std::invalid_argument: [Scatter::eval_gpu] Does not support int64 #1076
Comments
If you are just asking for a catchable exception then #1077 should close this. We would like to eventually allow int64 and other 8-byte types to work with scatter, but that is more involved.
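Once the exception is catchable from Python, callers can guard the GPU path themselves. A minimal sketch of that pattern, using a hypothetical stand-in for the kernel dispatch (not mlx's actual API) and assuming the binding layer translates `std::invalid_argument` into Python's `ValueError`:

```python
def scatter_eval_gpu(dtype):
    # Hypothetical stand-in for the GPU scatter kernel: after #1077 an
    # unsupported dtype raises instead of aborting the whole process.
    if dtype in ("int64", "uint64"):
        raise ValueError("[Scatter::eval_gpu] Does not support " + dtype)
    return "dispatched"

try:
    result = scatter_eval_gpu("int64")
except ValueError as err:
    # The caller can now recover, e.g. by retrying on the CPU stream.
    result = f"caught: {err}"

print(result)
```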
Thank you Awni. Some observations: I first tried

```python
zeros = mx.zeros(shape, dtype=values.dtype)
zeros = zeros.at[indices].add(values)
```

and then tried to route around the crash like this, but it does not work, as `add` does not take a `device` argument:

```python
if zeros.dtype in [mx.int64, mx.uint64] and mx.default_device().type == mx.DeviceType.gpu:
    device = mx.Device(type=mx.DeviceType.cpu)
    zeros = zeros.at[indices].add(values, device=device)
else:
    zeros = zeros.at[indices].add(values)
```

It would be helpful if mlx could fall back to the CPU for scatter ops that are not supported on the GPU, or allow a `device` kwarg for all scatter ops. Additional ops which are impacted by this bug:
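As a reference for what such a CPU fallback needs to compute, scatter-add with duplicate-index accumulation can be sketched with NumPy's `np.add.at` (illustrative shapes and values, not taken from the issue):

```python
import numpy as np

# Emulate zeros.at[indices].add(values) on the CPU with NumPy.
# np.add.at is an unbuffered scatter-add: repeated indices accumulate
# instead of overwriting, which is the behavior the mlx op provides.
values = np.array([1, 2, 3], dtype=np.int64)
indices = np.array([0, 0, 2])
zeros = np.zeros(4, dtype=values.dtype)
np.add.at(zeros, indices, values)
print(zeros)  # -> [3 0 3 0]
```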
I improved the message in #1077. The problem is with the values.
We prefer not to silently route to the CPU for ops without a GPU back-end. You can do this in the API by switching the default stream to the CPU before calling scatter when the dtype is int64/uint64.
Just a few. FFT and some of the LAPACK ops (QR / Inverse). Metal support for FFT is coming soon in #981.
You can use a context manager (most free ops also take a `stream` argument directly):

```python
v = mx.array([1, 2, 3])
u = mx.array([1, 2])
idx = mx.array([0, 1])
with mx.stream(mx.cpu):
    out = v.at[idx].add(u)
```
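For reference, the accumulation semantics of `v.at[idx].add(u)` can be sketched in plain Python (a hypothetical helper for illustration, not part of mlx):

```python
def scatter_add(target, indices, updates):
    # Reference semantics of target.at[indices].add(updates):
    # each update is added at its index, and duplicate indices
    # accumulate rather than overwrite one another.
    out = list(target)
    for i, u in zip(indices, updates):
        out[i] += u
    return out

print(scatter_add([1, 2, 3], [0, 1], [1, 2]))  # -> [2, 4, 3]
```

Plain indexed assignment (`v[idx] += u`) would not accumulate duplicates, which is why the dedicated scatter op exists.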
Thank you @awni for the fix.
Describe the bug
MLX crashes with an uncaught C++ exception (`std::invalid_argument: [Scatter::eval_gpu] Does not support int64`) when a scatter op receives int64 data on the GPU.
Expected behavior
MLX should not crash; it should raise a catchable exception or error.