Type promotion semantics
========================

JAX's type promotion rules (i.e., the result of :func:`jax.numpy.promote_types` for each pair of types) are given by the following table, where, for example

  • "b1" means np.bool_,
  • "s2" means np.int16,
  • "u4" means np.uint32,
  • "bf" means np.bfloat16,
  • "f2" means np.float16, and
  • "c8" means np.complex128.

==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==
..  b1  u1  u2  u4  u8  i1  i2  i4  i8  bf  f2  f4  f8  c4  c8
==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==
b1  b1  u1  u2  u4  u8  i1  i2  i4  i8  bf  f2  f4  f8  c4  c8
u1  u1  u1  u2  u4  u8  i2  i2  i4  i8  bf  f2  f4  f8  c4  c8
u2  u2  u2  u2  u4  u8  i4  i4  i4  i8  bf  f2  f4  f8  c4  c8
u4  u4  u4  u4  u4  u8  i8  i8  i8  i8  bf  f2  f4  f8  c4  c8
u8  u8  u8  u8  u8  u8  f8  f8  f8  f8  bf  f2  f4  f8  c4  c8
i1  i1  i2  i4  i8  f8  i1  i2  i4  i8  bf  f2  f4  f8  c4  c8
i2  i2  i2  i4  i8  f8  i2  i2  i4  i8  bf  f2  f4  f8  c4  c8
i4  i4  i4  i4  i8  f8  i4  i4  i4  i8  bf  f2  f4  f8  c4  c8
i8  i8  i8  i8  i8  f8  i8  i8  i8  i8  bf  f2  f4  f8  c4  c8
bf  bf  bf  bf  bf  bf  bf  bf  bf  bf  bf  f4  f4  f8  c4  c8
f2  f2  f2  f2  f2  f2  f2  f2  f2  f2  f4  f2  f4  f8  c4  c8
f4  f4  f4  f4  f4  f4  f4  f4  f4  f4  f4  f4  f4  f8  c4  c8
f8  f8  f8  f8  f8  f8  f8  f8  f8  f8  f8  f8  f8  f8  c8  c8
c4  c4  c4  c4  c4  c4  c4  c4  c4  c4  c4  c4  c4  c8  c4  c8
c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8  c8
==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==  ==
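
Individual cells of this table can be checked directly with :func:`jax.numpy.promote_types`. A minimal sketch (the dtypes shown in the comments are the values the table predicts)::

    import jax.numpy as jnp

    # Row "u1", column "i1": int16 is the smallest signed integer type
    # that can represent every uint8 and every int8 value.
    print(jnp.promote_types(jnp.uint8, jnp.int8))    # int16

    # Row "u8", column "i8": no integer type can hold both uint64 and
    # int64, so the result falls through to float64.
    print(jnp.promote_types(jnp.uint64, jnp.int64))  # float64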

JAX's type promotion rules differ from those of NumPy, as given by :func:`numpy.promote_types`, in several of the cells of the table above. There are two key differences:

  • when promoting an integer or boolean type against a floating-point or complex type, JAX always prefers the floating-point or complex type (demonstrated in the first example after this list).

    Accelerator devices, such as GPUs and TPUs, either pay a significant performance penalty to use 64-bit floating point types (GPUs) or do not support 64-bit floating point types at all (TPUs). Classic NumPy's promotion rules are too willing to overpromote to 64-bit types, which is problematic for a system designed to run on accelerators.

    JAX uses floating point promotion rules that are more suited to modern accelerator devices and are less aggressive about promoting floating point types. The promotion rules used by JAX for floating-point types are similar to those used by PyTorch.

  • JAX supports bfloat16, a non-standard 16-bit floating-point type (jax.numpy.bfloat16) that is useful for neural network training. The only notable promotion behavior is with respect to IEEE-754 float16, with which bfloat16 promotes to float32 (see the second example after this list).
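
To make the first difference concrete, here is a minimal sketch comparing :func:`jax.numpy.promote_types` against :func:`numpy.promote_types` for the same pair of types::

    import numpy as np
    import jax.numpy as jnp

    # JAX keeps the width of the floating-point operand:
    print(jnp.promote_types(jnp.int64, jnp.float16))  # float16

    # Classic NumPy widens to a 64-bit float for the same pair:
    print(np.promote_types(np.int64, np.float16))     # float64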
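The bfloat16 behavior can be checked the same way (a short sketch)::

    import jax.numpy as jnp

    # bfloat16 trades precision for range relative to float16; float32
    # represents every value of both types exactly, so it is the result:
    print(jnp.promote_types(jnp.bfloat16, jnp.float16))  # float32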