0.5

@simonw simonw released this 12 Jul 14:22

LLM now supports additional language models, thanks to a new plugins mechanism for installing them.

Plugins are available for 19 models in addition to the default OpenAI ones:

  • llm-gpt4all adds support for 17 models that you can download and run on your own device, including Vicuna, Falcon and WizardLM.
  • llm-mpt30b adds support for the MPT-30B model, a 19GB download.
  • llm-palm adds support for Google's PaLM 2 via the Google API.

A comprehensive new tutorial, Writing a plugin to support a new model, describes in detail how to add support for new models by building plugins.
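The tutorial covers the full plugin API; its core is a register_models() hook implementation. Here is a minimal sketch of that shape - the EchoModel class and its model_id are hypothetical, and the exact Model subclass interface (including the execute() signature) is the tutorial's subject, so treat this as an illustration rather than working plugin code:

```python
import llm


class EchoModel(llm.Model):
    # Hypothetical model that echoes the prompt back,
    # illustrating the overall shape of a model plugin.
    model_id = "echo"

    def execute(self, prompt, stream, response, conversation):
        # Yield chunks of text; LLM assembles them into the response.
        yield prompt.prompt


@llm.hookimpl
def register_models(register):
    # Called by LLM to collect the models this plugin provides.
    register(EchoModel())
```

Packaging this in a module with an `llm` entry point, then running `llm install -e .`, would make `llm -m echo "hello"` resolve to the new model.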

New features

  • Python API documentation for using LLM models, including models from plugins, directly from Python. #75
  • Messages are now logged to the database by default - the llm init-db command is no longer needed and has been removed. You can toggle logging off with llm logs off and back on with llm logs on, and llm logs status shows the current state of the log database. If logging is turned off, passing --log to the llm prompt command will cause that prompt to be logged anyway. #98
  • New database schema for logged messages, with conversations and responses tables. If you have previously used the old logs table it will continue to exist but will no longer be written to. #91
  • New -o/--option name value syntax for setting options for models, such as temperature. Available options differ for different models. #63
  • llm models list --options command for viewing all available model options. #82
  • llm "prompt" --save template option for saving a prompt directly to a template. #55
  • Prompt templates can now specify default values for parameters. Thanks, Chris Mungall. #57
  • llm openai models command to list all available OpenAI models from their API. #70
  • llm models default MODEL_ID to set a different model as the default to be used when llm is run without the -m/--model option. #31
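Taken together, the new features look like this on the command line - the commands come straight from the notes above, while the prompt text, option values and model ID are illustrative (running them requires an installed, configured llm):

```shell
# Logging is on by default; toggle it and check status:
llm logs off
llm logs status
llm logs on

# Set a model option with the new -o/--option syntax:
llm "Ten fun names for a pet pelican" -o temperature 1.5

# Save a prompt directly to a template:
llm "Summarize the following text" --save summarize

# List the available OpenAI models, then pick a new default:
llm openai models
llm models default gpt-3.5-turbo
```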

Smaller improvements

  • llm -s is now a shortcut for llm --system. #69
  • llm -m 4-32k alias for gpt-4-32k.
  • llm install -e directory command for installing a plugin from a local directory.
  • The LLM_USER_PATH environment variable now controls the location of the directory in which LLM stores its data. This replaces the old LLM_KEYS_PATH, LLM_LOG_PATH and LLM_TEMPLATES_PATH variables. #76
  • Documentation covering Utility functions for plugins.
  • Documentation site now uses Plausible for analytics. #79
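The smaller improvements are also easiest to see as commands - the system prompt, plugin directory and data path below are placeholders:

```shell
# -s is now shorthand for --system:
llm -s "Reply like a pirate" "Hello there"

# Install a plugin from a local checkout:
llm install -e ./llm-my-plugin

# Store keys, logs and templates in a custom location:
export LLM_USER_PATH=/path/to/llm-data
```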