@t-kalinowski commented Sep 27, 2025

New/Updated Activations, Losses, Optimizers

  • activation_sparse_sigmoid() — LoRA-related sparse activation.
  • loss_categorical_generalized_cross_entropy() — generalized CE loss wrapper.
  • optimizer_muon() — new Muon optimizer (with AdamW fallback, LoRA exclusions, Newton–Schulz
    controls).
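Muon's distinguishing step is orthogonalizing each 2-D gradient update with a few Newton–Schulz iterations rather than an SVD. A minimal NumPy sketch of that iteration (the quintic coefficients below come from the reference Muon implementation; the defaults actually exposed by `optimizer_muon()`'s Newton–Schulz controls may differ):

```python
import numpy as np

def newton_schulz_orthogonalize(g, steps=5, eps=1e-7):
    """Approximately map g to the nearest (semi-)orthogonal matrix,
    using the quintic Newton-Schulz iteration popularized by Muon."""
    # Coefficients from the reference Muon implementation (assumption:
    # optimizer_muon() may expose different defaults).
    a, b, c = 3.4445, -4.7750, 2.0315
    x = g / (np.linalg.norm(g) + eps)  # Frobenius-normalize so the iteration converges
    transposed = g.shape[0] > g.shape[1]
    if transposed:                     # iterate in the wide orientation
        x = x.T
    for _ in range(steps):
        xxt = x @ x.T
        x = a * x + (b * xxt + c * xxt @ xxt) @ x
    return x.T if transposed else x
```

In Muon, the orthogonalized matrix replaces the raw momentum update for 2-D weight matrices; other parameters (and, per this PR, excluded LoRA weights) fall back to AdamW.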

New Config Functions

  • Added getters/setters for training limits:
    • config_max_epochs(), config_set_max_epochs()
    • config_max_steps_per_epoch(), config_set_max_steps_per_epoch()
    • A shared config_training_limits Rd page documents both caps and their environment-variable
      overrides.
  • Added config_is_nnx_enabled() to surface the JAX NNX feature flag.
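Paired getters/setters with an environment-variable override typically follow a fixed precedence: explicit setting, then the env var, then a default. A minimal Python sketch of that pattern (the `KERAS_MAX_EPOCHS` name and the default are hypothetical illustrations, not the package's actual variable or value):

```python
import os

_config = {"max_epochs": None}

def config_set_max_epochs(n):
    # An explicit setter call wins over any environment override.
    _config["max_epochs"] = int(n)

def config_max_epochs(default=1000):
    # Precedence sketch: explicit setting > env var > default.
    if _config["max_epochs"] is not None:
        return _config["max_epochs"]
    env = os.environ.get("KERAS_MAX_EPOCHS")  # hypothetical variable name
    return int(env) if env is not None else default
```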

Layers & Random Augmentations

  • layer_random_elastic_transform() — new preprocessing layer for random elastic deformations.
  • layer_conv_1d/2d/3d_transpose() gain an output_padding argument.
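An elastic transform warps an image with smoothed random displacement fields. A toy NumPy sketch of the idea (the real layer presumably uses Gaussian smoothing and proper interpolation; this simplified version uses a box blur and nearest-neighbor sampling):

```python
import numpy as np

def random_elastic_transform(image, alpha=8.0, sigma=2, seed=0):
    """Toy elastic deformation of a 2-D image: random per-pixel
    displacements, smoothed, then applied by index remapping."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    dx = rng.uniform(-1.0, 1.0, (h, w))
    dy = rng.uniform(-1.0, 1.0, (h, w))

    def smooth(f, passes):
        # Crude separable box blur standing in for a Gaussian filter.
        for _ in range(passes):
            f = (np.roll(f, 1, 0) + f + np.roll(f, -1, 0)) / 3.0
            f = (np.roll(f, 1, 1) + f + np.roll(f, -1, 1)) / 3.0
        return f

    dx = smooth(dx, sigma) * alpha
    dy = smooth(dy, sigma) * alpha
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    ys = np.clip(np.round(ys + dy), 0, h - 1).astype(int)
    xs = np.clip(np.round(xs + dx), 0, w - 1).astype(int)
    return image[ys, xs]  # nearest-neighbor resampling
```

Because the sampling is nearest-neighbor, every output pixel is some input pixel; the larger `alpha` is, the stronger the distortion.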

Expanded Ops Coverage

  • Numerical/FFT/window helpers: op_angle(), op_bartlett(), op_blackman(), op_cbrt(), op_corrcoef(),
    op_deg2rad(), op_hamming(), op_hanning(), op_heaviside(), op_kaiser().
  • Neural/statistical helpers: op_layer_normalization(), op_sparse_sigmoid().
  • Complex utilities: op_view_as_complex(), op_view_as_real().
  • Image augmentation: op_image_elastic_transform().
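Most of these helpers mirror well-known NumPy (or, for the view_as_* pair, PyTorch) functions, which makes their semantics easy to check. A quick sketch of the likely behavior, assuming the op_* wrappers follow those definitions:

```python
import numpy as np

# op_angle(): phase of a complex number, in radians.
print(np.angle(1j))                          # pi / 2

# op_bartlett(), op_hamming(), ...: window functions for signal processing.
print(np.bartlett(5))                        # [0.  0.5 1.  0.5 0. ]

# op_heaviside(x, h0): step function with a configurable value at 0.
print(np.heaviside([-1.0, 0.0, 2.0], 0.5))   # [0.  0.5 1. ]

# op_view_as_complex() / op_view_as_real(): reinterpret a trailing axis of
# size 2 as (real, imaginary) parts, mirroring torch.view_as_complex.
pairs = np.array([[1.0, 2.0], [3.0, -1.0]])
z = pairs[..., 0] + 1j * pairs[..., 1]
print(z)                                     # [1.+2.j 3.-1.j]
```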

New S3 Methods

  • Added Arg() S3 methods for both keras.src.backend.Tensor and
    keras.src.backend.common.keras_tensor.KerasTensor, so base R's Arg() generic works on backend
    tensors.

@t-kalinowski merged commit f1e3ff8 into main Sep 29, 2025
8 checks passed