```
      7 import jax
      8 import jax.numpy as jnp
----> 9 import klujax
     10 from natsort import natsorted
     12 from ..saxtypes import SDense, SType, scoo, sdense

File ~/miniconda3/lib/python3.11/site-packages/klujax.py:164
    155     return ShapedArray(b.shape, b.dtype)
    158 # XLA Implementations =================================================================
    161 @xla_register_cpu(solve_f64, klujax_cpp.solve_f64)
    162 @xla_register_cpu(solve_c128, klujax_cpp.solve_c128)
    163 @xla_register_cpu(coo_mul_vec_f64, klujax_cpp.coo_mul_vec_f64)
--> 164 @xla_register_cpu(coo_mul_vec_c128, klujax_cpp.coo_mul_vec_c128)
    165 def coo_vec_operation_xla(primitive_name, c, Ai, Aj, Ax, b):
    166     Ax_shape = c.get_shape(Ax)
    167     Ai_shape = c.get_shape(Ai)

File ~/miniconda3/lib/python3.11/site-packages/klujax.py:64, in xla_register_cpu.<locals>.decorator(fun)
     59 def decorator(fun):
     60     xla_client.register_custom_call_target(
     61         name,
     62         cpp_fun(),
     63     )
---> 64     xla.backend_specific_translations["cpu"][primitive] = partial(fun, name)
     65     return fun

AttributeError: module 'jax.interpreters.xla' has no attribute 'backend_specific_translations'
```
@flaport
We can pin both jax and jaxlib to 0.4.23
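The traceback shows klujax registering its CPU custom calls through `jax.interpreters.xla.backend_specific_translations`, an internal attribute that newer JAX releases no longer expose, which is why the import fails. One way to apply the suggested pin is via a requirements file (a sketch of the workaround, not taken from this thread; the comment text is illustrative):

```text
# requirements.txt — pin jax/jaxlib to the last release that still
# exposes jax.interpreters.xla.backend_specific_translations,
# which klujax's XLA registration path relies on
jax==0.4.23
jaxlib==0.4.23
```

Installing with `pip install -r requirements.txt` (or `pip install "jax==0.4.23" "jaxlib==0.4.23"` directly) should restore the attribute until klujax migrates to the newer registration API.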