
0.1.17 — 2022-12-29

Features

  • Add transformer modules
  • Add exact, slice, and memory-efficient (xformers) self-attention computations
  • Add transformer modules to the Decoder modules
  • Add common transformer MLP activation functions: star-relu, geglu, and approximate-gelu (sketched after this list)
  • Add the Linformer self-attention mechanism (sketched after this list)
  • Add support for model initialization from a YAML file in MultiTaskUnet
  • Add a new cross-attention long-skip module; enabled with long_skip='cross-attn' (sketched after this list)
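
StarReLU, GEGLU, and the approximate GELU follow published formulations (StarReLU from the MetaFormer paper, GEGLU from the GLU-variant family, and a sigmoid-based GELU approximation). Below is a minimal plain-PyTorch sketch of each, not the library's own module code; the learnable StarReLU initialization and the choice of the sigmoid (rather than tanh) GELU approximation are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StarReLU(nn.Module):
    """s * relu(x)**2 + b with learnable scalar scale and bias."""

    def __init__(self) -> None:
        super().__init__()
        self.scale = nn.Parameter(torch.ones(1))  # init value is an assumption
        self.bias = nn.Parameter(torch.zeros(1))  # init value is an assumption

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * F.relu(x) ** 2 + self.bias


class GEGLU(nn.Module):
    """Gated GELU: split the last dim in two and gate one half with GELU.
    The preceding linear layer must produce 2x the desired output width."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x, gate = x.chunk(2, dim=-1)
        return x * F.gelu(gate)


class ApproximateGELU(nn.Module):
    """Sigmoid approximation of GELU: x * sigmoid(1.702 * x)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(1.702 * x)
```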
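Linformer cuts self-attention cost from O(N²) to O(N·k) by projecting the keys and values along the sequence dimension down to a fixed length k. The sketch below illustrates the mechanism rather than the library's API; the class name, the default k, and the fixed seq_len are assumptions:

```python
import torch
import torch.nn as nn


class LinformerSelfAttention(nn.Module):
    """Self-attention with K/V compressed from N tokens down to k tokens."""

    def __init__(self, dim: int, seq_len: int, k: int = 256, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        # Linformer's low-rank projections (E and F) over the sequence axis.
        self.proj_k = nn.Parameter(torch.randn(seq_len, k) / seq_len**0.5)
        self.proj_v = nn.Parameter(torch.randn(seq_len, k) / seq_len**0.5)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape  # N must equal the seq_len given at init
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Compress K and V along the sequence axis: (B, N, C) -> (B, k, C).
        k = torch.einsum("bnc,nk->bkc", k, self.proj_k)
        v = torch.einsum("bnc,nk->bkc", v, self.proj_v)
        # Split heads: (B, heads, tokens, head_dim).
        q = q.view(B, N, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(B, -1, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(B, -1, self.num_heads, self.head_dim).transpose(1, 2)
        attn = (q @ k.transpose(-2, -1)) * self.head_dim**-0.5
        out = attn.softmax(dim=-1) @ v  # (B, heads, N, head_dim)
        return self.out(out.transpose(1, 2).reshape(B, N, C))
```

The trade-off is that the projection matrices fix the input sequence length, so variable-size inputs need padding or per-resolution projections.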
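The cross-attention long-skip replaces concatenation or summation with attention: decoder features act as queries over the encoder skip features. The module below is a conceptual sketch, not the released implementation, and assumes the two feature maps already share channel count and spatial size:

```python
import torch
import torch.nn as nn


class CrossAttentionSkip(nn.Module):
    """Merge a long skip by attending from decoder tokens to skip tokens."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, dec: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        # Flatten (B, C, H, W) feature maps into token sequences (B, H*W, C).
        B, C, H, W = dec.shape
        q = dec.flatten(2).transpose(1, 2)
        kv = skip.flatten(2).transpose(1, 2)
        out, _ = self.attn(self.norm(q), kv, kv)  # queries: decoder; K/V: skip
        # Residual connection, then restore the spatial layout.
        return (q + out).transpose(1, 2).view(B, C, H, W)
```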

Refactor

  • Add more verbose error messages for the abstract wrapper modules in modules.base_modules
  • Add more verbose error catching for xformers.ops.memory_efficient_attention (a minimal sketch of the idea follows this list)
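
The actual wrapper isn't reproduced in these notes; the following is a minimal sketch of the idea, with a hypothetical helper name, turning an opaque xformers kernel failure into an actionable message:

```python
import torch

try:
    from xformers.ops import memory_efficient_attention
except ImportError as err:
    raise ImportError(
        "`xformers` is required for memory-efficient attention. "
        "Install it with: pip install xformers"
    ) from err


def memeff_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Hypothetical wrapper that re-raises xformers failures with context."""
    try:
        return memory_efficient_attention(q, k, v)
    except (ValueError, RuntimeError, NotImplementedError) as err:
        raise RuntimeError(
            "xformers.ops.memory_efficient_attention failed for shapes "
            f"q={tuple(q.shape)}, k={tuple(k.shape)}, v={tuple(v.shape)}. "
            "Check that the tensors live on CUDA, share the same dtype, "
            "and that the head dim is supported by an available kernel."
        ) from err
```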