Releases: transformerlab/transformerlab-app

0.19.1

19 Jun 18:10

Highlights

  • Support for schedulers when using diffusion models
  • Support for minP as a generation parameter
  • Several fixes for Docker support; we now use a single Dockerfile for all platforms
  • Training jobs now use the first unused split in a dataset if there is no "train" split
  • Upgraded the MLX LoRA trainer to better support Qwen models
  • Fixed embedding dataset creation using the Synthesizer Docs plugin
  • Models can now run on an inference engine even if their architecture is not listed as supported
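For context, minP (min-p) sampling keeps only those tokens whose probability is at least `min_p` times the probability of the most likely token, so the candidate pool shrinks when the model is confident and widens when it is not. Below is a minimal illustrative sketch of that filtering step; the function name and signature are hypothetical and this is not Transformer Lab's actual implementation:

```python
import math

def min_p_filter(logits, min_p=0.05):
    """Return indices of tokens whose probability is at least
    min_p times the probability of the top token (minP sampling)."""
    # Numerically stable softmax over the raw logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Threshold scales with the most likely token's probability
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]

print(min_p_filter([3.0, 2.0, 0.0], min_p=0.2))  # → [0, 1]
```

With a peaked distribution the threshold is high and only near-top tokens survive; with a flat distribution many tokens pass, which is the behavior that distinguishes minP from a fixed top-p cutoff.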

Details

Full Changelog: v0.19.0...v0.19.1

0.19.0

12 Jun 13:58

This release introduces support for diffusion models, including inference, training, and improved management of image datasets.

What's Changed

Full Changelog: v0.18.0...v0.19.0

0.18.0

02 Jun 18:29

The most exciting new feature is support for multimodal model training. This includes a VLM trainer and the ability to work with vision datasets.

Notable bug fixes:

  • Canceling downloads is much more responsive
  • Fixed a LlamaIndex plugin setup error that caused issues for some RAG users
  • Removed the limit on the number of samples used for training with the SFT Llama Factory plugin
  • Eval and Generate jobs no longer show as COMPLETE when they were STOPPED

What's Changed

  • v1 of the Recipes Modal (hidden in DEV_MODE) by @aliasaria in #495
  • Preview Image Data in Datasets Tab by @deep1401 in #470
  • Changed the visuals in Download Progress Bar by @Sourenm in #491
  • Mark plugin as deprecated if supported_hardware_architectures is an empty list by @deep1401 in #494
  • Add MCP Server support in Tools tab within Interact tab by @deep1401 in #468
  • Recipe navigates to Notes page by @aahaanmaini in #509

Full Changelog: v0.17.1...v0.18.0

0.17.1

23 May 19:04

What's Changed

Full Changelog: v0.17.0...v0.17.1

0.17.0

23 May 18:07
  • AMD GPU Support!
  • New Model Zoo groups models to make it easier to navigate
  • Export screen updated to match style of other job pages
  • Dataset fixes: Better viewing of ChatML datasets, fixed issues using some of our gallery datasets

What's Changed

Full Changelog: v0.16.4...v0.17.0

0.16.4

16 May 18:56

What's Changed

  • Github build upset about outputs in publish.yml by @dadmobile in #475

Full Changelog: v0.16.3...v0.16.4

0.16.1

12 May 19:48

What's Changed

Full Changelog: v0.16.0...v0.16.1

0.16.0

12 May 17:38
  • Major update to CUDA and torch to add support for NVIDIA 50-series GPUs
  • New MLX PPO Trainer plugin!
  • Ability to run 4-bit bitsandbytes models in FastChat

Bug fixes

  • Viewing local and generated datasets no longer throws an error
  • Ability to manually set the fallback context length used by MLX
  • Names of task templates now update correctly

What's Changed

Full Changelog: v0.15.2...v0.16.0

0.15.2

07 May 12:59

Highlights

This build fixes an installer bug that was caused by the latest API update.

What's Changed

Full Changelog: v0.15.1...v0.15.2

0.15.1

02 May 15:28

This is a patch release with a number of bug fixes:

  • The web app will auto-connect to an API running on the same machine
  • The installer no longer halts if local_server.log is not accessible/writable
  • The console automatically opens to show the error if the inference server fails to start correctly
  • The console logger scrolls much faster
  • Plugins no longer always report as unsupported for non-GPU/Mac users
  • If you upload docs before installing the RAG plugin, it will index the docs for you before running your query
  • You can now edit the setup script for plugins inside Transformer Lab (useful for debugging or making custom modifications to a plugin's venv)

Details

  • Open console logger upon a failed inference server run by @deep1401 in #415
  • Fix supported arch message for local plugins by @deep1401 in #417
  • Fix error caused by deleting plugins by @deep1401 in #419
  • If you are on the webapp, correctly find the local server we should first try to connect to by @aliasaria in #421
  • Don't hang or crash if local_server.log is not writeable by @dadmobile in #423
  • Reduce console scroll delay from 100ms to 10ms by @dadmobile in #428

Full Changelog: v0.15.0...v0.15.1