
Change Log

These are the release notes for JAX.

jax 0.1.66 (unreleased)

jaxlib 0.1.46 (unreleased)

jax 0.1.65 (April 30, 2020)

jaxlib 0.1.45 (April 21, 2020)

  • Fixes segfault: google#2755
  • Plumb is_stable option on Sort HLO through to Python.

jax 0.1.64 (April 21, 2020)

  • GitHub commits.
  • New features:
    • Add syntactic sugar for functional indexed updates #2684.
    • Add jax.numpy.linalg.multi_dot #2726.
    • Add jax.numpy.unique #2760.
    • Add jax.numpy.rint #2724.
    • Add more primitive rules for jax.experimental.jet.
  • Bug fixes:
    • Fix logaddexp and logaddexp2 differentiation at zero #2107.
    • Improve memory usage in reverse-mode autodiff without jit #2719.
  • Better errors:
    • Improves error message for reverse-mode differentiation of lax.while_loop #2129.
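The new jax.numpy.linalg.multi_dot added above mirrors NumPy's multi_dot: it multiplies a chain of matrices while choosing the association order that minimizes cost. A minimal sketch (shapes chosen for illustration):

```python
import jax.numpy as jnp

# Chain where the association order matters a lot:
# (10x100) @ (100x5) @ (5x50) is far cheaper as (A @ B) @ C.
A = jnp.ones((10, 100))
B = jnp.ones((100, 5))
C = jnp.ones((5, 50))

# multi_dot picks the cheapest multiplication order automatically.
result = jnp.linalg.multi_dot([A, B, C])
print(result.shape)  # (10, 50)
```

The result matches plain chained matmul; only the evaluation order differs.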

jaxlib 0.1.44 (April 16, 2020)

  • Fixes a bug where if multiple GPUs of different models were present, JAX would only compile programs suitable for the first GPU.
  • Bugfix for batch_group_count convolutions.
  • Added precompiled SASS for more GPU versions to avoid startup PTX compilation hang.

jax 0.1.63 (April 12, 2020)

  • GitHub commits.
  • Added jax.custom_jvp and jax.custom_vjp from #2026; see the tutorial notebook. Deprecated jax.custom_transforms and removed it from the docs (though it still works).
  • Add scipy.sparse.linalg.cg #2566.
  • Changed how Tracers are printed to show more useful information for debugging #2591.
  • Made jax.numpy.isclose handle nan and inf correctly #2501.
  • Added several new rules for jax.experimental.jet #2537.
  • Fixed jax.experimental.stax.BatchNorm when scale/center isn't provided.
  • Fix some missing cases of broadcasting in jax.numpy.einsum #2512.
  • Implement jax.numpy.cumsum and jax.numpy.cumprod in terms of a parallel prefix scan #2596 and make reduce_prod differentiable to arbitrary order #2597.
  • Add batch_group_count to conv_general_dilated #2635.
  • Add docstring for test_util.check_grads #2656.
  • Add callback_transform #2665.
  • Implement rollaxis, convolve/correlate 1d & 2d, copysign, trunc, roots, and quantile/percentile interpolation options.
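The jax.custom_jvp decorator added in this release lets a function carry its own differentiation rule. A minimal sketch, using the numerically delicate log1pexp as the running example (this mirrors the pattern in the tutorial notebook):

```python
import jax
import jax.numpy as jnp

# log(1 + exp(x)): the naive derivative exp(x) / (1 + exp(x)) can
# overflow for large x, so we attach a stable custom JVP rule.
@jax.custom_jvp
def log1pexp(x):
    return jnp.log(1. + jnp.exp(x))

@log1pexp.defjvp
def log1pexp_jvp(primals, tangents):
    x, = primals
    t, = tangents
    ans = log1pexp(x)
    # The derivative is sigmoid(x), written in an overflow-safe form.
    return ans, t / (1. + jnp.exp(-x))

grad_at_3 = jax.grad(log1pexp)(3.0)
print(grad_at_3)  # sigmoid(3.0) ~= 0.95257
```

Unlike the deprecated jax.custom_transforms, rules defined this way compose with all transformations, including vmap and jit.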

jaxlib 0.1.43 (March 31, 2020)

  • Fixed a performance regression for Resnet-50 on GPU.

jax 0.1.62 (March 21, 2020)

  • GitHub commits.
  • JAX has dropped support for Python 3.5. Please upgrade to Python 3.6 or newer.
  • Removed the internal function lax._safe_mul, which implemented the convention 0. * nan == 0. This change means that some programs, when differentiated, will produce nans where they previously produced correct values, though it ensures nans rather than silently incorrect results are produced for other programs. See #2447 and #1052 for details.
  • Added an all_gather parallel convenience function.
  • More type annotations in core code.
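The lax._safe_mul removal above is easiest to see with the well-known jnp.where pitfall: the derivative of the unselected branch is still multiplied by the (zero) select mask, and 0 * nan now propagates nan instead of being silenced. A minimal sketch:

```python
import jax
import jax.numpy as jnp

def f(x):
    # For x < 0 the sqrt branch is never selected, but reverse-mode
    # autodiff still multiplies its nan derivative by the zero mask.
    return jnp.where(x < 0., 0., jnp.sqrt(x))

# Previously _safe_mul turned 0. * nan into 0.; now the nan surfaces.
g = jax.grad(f)(-1.0)
print(g)  # nan
```

The standard workaround is to make the unselected branch safe before differentiating, e.g. jnp.sqrt(jnp.where(x < 0., 1., x)).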

jaxlib 0.1.42 (March 19, 2020)

  • jaxlib 0.1.41 broke cloud TPU support due to an API incompatibility. This release fixes it again.
  • JAX has dropped support for Python 3.5. Please upgrade to Python 3.6 or newer.

jax 0.1.61 (March 17, 2020)

  • GitHub commits.
  • Fixes Python 3.5 support. This will be the last JAX or jaxlib release that supports Python 3.5.

jax 0.1.60 (March 17, 2020)

  • GitHub commits.
  • New features:
    • jax.pmap has a static_broadcasted_argnums argument which allows the user to specify arguments that should be treated as compile-time constants and broadcast to all devices. It works analogously to static_argnums in jax.jit.
    • Improved error messages for when tracers are mistakenly saved in global state.
    • Added jax.nn.one_hot utility function.
    • Added jax.experimental.jet for exponentially faster higher-order automatic differentiation.
    • Added more sanity checking to arguments of jax.lax.broadcast_in_dim.
  • The minimum jaxlib version is now 0.1.41.
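The jax.nn.one_hot utility added above converts integer class labels into one-hot vectors, a common preprocessing step for classification losses. A minimal sketch:

```python
import jax.numpy as jnp
from jax import nn

labels = jnp.array([0, 2, 1])

# Encode three labels against 3 classes; row i has a 1. at labels[i].
onehot = nn.one_hot(labels, 3)
# rows: [1, 0, 0], [0, 0, 1], [0, 1, 0]
print(onehot)
```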

jaxlib 0.1.40 (March 4, 2020)

  • Adds experimental support in jaxlib for the TensorFlow profiler, which allows tracing of CPU and GPU computations from TensorBoard.
  • Includes prototype support for multihost GPU computations that communicate via NCCL.
  • Improves performance of NCCL collectives on GPU.
  • Adds TopK, CustomCallWithoutLayout, CustomCallWithLayout, IGammaGradA and RandomGamma implementations.
  • Supports device assignments known at XLA compilation time.

jax 0.1.59 (February 11, 2020)

  • GitHub commits.
  • Breaking changes
    • The minimum jaxlib version is now 0.1.38.
    • Simplified Jaxpr by removing Jaxpr.freevars and Jaxpr.bound_subjaxprs. The call primitives (xla_call, xla_pmap, sharded_call, and remat_call) get a new parameter call_jaxpr with a fully closed (no constvars) jaxpr. Also added a new field call_primitive to primitives.
  • New features:
    • Reverse-mode automatic differentiation (e.g. grad) of lax.cond, making it now differentiable in both modes (google#2091)
    • JAX now supports DLPack, which allows sharing CPU and GPU arrays in a zero-copy way with other libraries, such as PyTorch.
    • JAX GPU DeviceArrays now support __cuda_array_interface__, which is another zero-copy protocol for sharing GPU arrays with other libraries such as CuPy and Numba.
    • JAX CPU device buffers now implement the Python buffer protocol, which allows zero-copy buffer sharing between JAX and NumPy.
    • Added JAX_SKIP_SLOW_TESTS environment variable to skip tests known to be slow.
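The buffer-protocol support mentioned above means a JAX CPU array can be viewed directly by NumPy. A minimal sketch (whether the copy is actually elided depends on the backend and device placement):

```python
import jax.numpy as jnp
import numpy as np

x = jnp.arange(4.0)   # a JAX array (DeviceArray)

# On CPU, np.asarray can share the underlying buffer with JAX
# via the Python buffer protocol rather than copying.
y = np.asarray(x)
print(np.array_equal(y, np.arange(4.0)))  # True
```

DLPack and __cuda_array_interface__ play the analogous role for exchanging GPU arrays with libraries such as PyTorch, CuPy, and Numba.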

jaxlib 0.1.39 (February 11, 2020)

  • Updates XLA.

jaxlib 0.1.38 (January 29, 2020)

  • CUDA 9.0 is no longer supported.
  • CUDA 10.2 wheels are now built by default.

jax 0.1.58 (January 28, 2020)

  • GitHub commits.
  • Breaking changes
    • JAX has dropped Python 2 support, because Python 2 reached its end of life on January 1, 2020. Please update to Python 3.5 or newer.
  • New features

    • Forward-mode automatic differentiation (jvp) of while loop (google#1980)
    • New NumPy and SciPy functions:
      • jax.numpy.fft.fft2
      • jax.numpy.fft.ifft2
      • jax.numpy.fft.rfft
      • jax.numpy.fft.irfft
      • jax.numpy.fft.rfft2
      • jax.numpy.fft.irfft2
      • jax.numpy.fft.rfftn
      • jax.numpy.fft.irfftn
      • jax.numpy.fft.fftfreq
      • jax.numpy.fft.rfftfreq
      • jax.numpy.linalg.matrix_rank
      • jax.numpy.linalg.matrix_power
      • jax.scipy.special.betainc
    • Batched Cholesky decomposition on GPU now uses a more efficient batched kernel.
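The real-input FFT functions listed above follow NumPy's conventions; rfft keeps only the non-redundant half of the spectrum, and irfft inverts it. A minimal round-trip sketch:

```python
import jax.numpy as jnp

x = jnp.array([0., 1., 2., 3.])

# Real-input FFT: returns len(x)//2 + 1 complex bins.
spectrum = jnp.fft.rfft(x)

# Invert; pass n explicitly since the output length is ambiguous
# from the half-spectrum alone.
roundtrip = jnp.fft.irfft(spectrum, n=len(x))
print(jnp.allclose(roundtrip, x, atol=1e-5))  # True
```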

  • Notable bug fixes:

    • With the Python 3 upgrade, JAX no longer depends on fastcache, which should help with installation.