# Wagtail AI 2.0.0
Our first major version increase for Wagtail AI is out! For this release we've focused on improving the developer experience, making the user interface more resilient, and adding support for loads more LLMs.
## Added
- New setting for specifying AI backends, with a new default backend using `llm` (Tomasz Knapik) - see the configuration sketch after this list
- Support for many different LLMs such as GPT-4, local models, Mistral and Claude using `llm` plugins (Tomasz Knapik)
- Customisable text splitting backends (Tomasz Knapik)
- More complete documentation (Tomasz Knapik)
- Custom prompts can now be managed through Wagtail admin (Ben Morse)
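As a rough sketch of the new backend setting (the class path and config keys below reflect the default `llm` backend as we understand it; check the configuration documentation for the exact options your version supports):

```python
# settings.py -- minimal sketch of the new AI backend setting.
# Class path and config keys are illustrative, not a definitive reference.
WAGTAIL_AI = {
    "BACKENDS": {
        "default": {
            # The default backend is built on the `llm` library.
            "CLASS": "wagtail_ai.ai.llm.LLMBackend",
            "CONFIG": {
                # Any model exposed by `llm`, or by an installed `llm` plugin
                # (for example a Claude or Mistral plugin), can be referenced here.
                "MODEL_ID": "gpt-3.5-turbo",
            },
        },
    },
}
```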
## Changed
- Removed Langchain dependency. Text splitting is now customisable (see the sketch after this list) and defaults to a vendorised version of Langchain's text splitter. (Tomasz Knapik)
- Various developer experience improvements. (Tomasz Knapik, Dan Braghis)
- Minimum supported versions increased to Wagtail 5.2, Django 4.2 and Python 3.11 (Dan Braghis)
- Improved how prompts are passed to the admin (Ian Meigh)
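Text splitting can be customised per backend. The key names and dotted paths below are a sketch only (the `TEXT_SPLITTING`, `SPLITTER_CLASS` and `SPLITTER_LENGTH_CALCULATOR_CLASS` keys, and the `myproject.ai.*` classes, are assumptions for illustration); verify them against the text splitting documentation before use:

```python
# settings.py -- sketch of pointing a backend at a custom text splitter.
# Key names and dotted paths are assumptions; the classes under
# "myproject.ai" are hypothetical placeholders for your own implementations.
WAGTAIL_AI = {
    "BACKENDS": {
        "default": {
            "CLASS": "wagtail_ai.ai.llm.LLMBackend",
            "CONFIG": {"MODEL_ID": "gpt-3.5-turbo"},
            "TEXT_SPLITTING": {
                # Defaults to a vendorised copy of Langchain's text splitter
                # if not overridden.
                "SPLITTER_CLASS": "myproject.ai.CustomTextSplitter",
                "SPLITTER_LENGTH_CALCULATOR_CLASS": "myproject.ai.CustomLengthCalculator",
            },
        },
    },
}
```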
## Upgrade Considerations

### Prompts managed in Wagtail admin
The `WAGTAIL_AI_PROMPTS` setting is no longer used. Prompts are now managed through the Wagtail admin under Settings -> Prompts. Any custom prompts should be migrated to this new model; the `WAGTAIL_AI_PROMPTS` setting can then be removed.
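In practice, recreate each custom prompt in the admin first, then delete the old setting from your project. A sketch of what to remove (the contents of the list depend entirely on how your project defined it):

```python
# settings.py (before the upgrade)
# Recreate these prompts under Settings -> Prompts in the Wagtail admin,
# then delete the whole setting -- it is ignored from 2.0.0 onwards.
WAGTAIL_AI_PROMPTS = [
    # ... your existing prompt definitions ...
]
```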
## New Contributors/Thanks
- @tm-kn - AI backends/text splitting restructure
- @zerolab - support with developer tooling
- @Morsey187 - frontend refinements and admin prompt management
- @ianmeigh - improvements to admin integration