0.5
LLM now supports additional language models, thanks to a new plugins mechanism for installing extra models.
Plugins are available for 19 models in addition to the default OpenAI ones:
- llm-gpt4all adds support for 17 models that you can download and run on your own device, including Vicuna, Falcon and WizardLM.
- llm-mpt30b adds support for the MPT-30B model, a 19GB download.
- llm-palm adds support for Google's PaLM 2 via the Google API.
A comprehensive tutorial, Writing a plugin to support a new model, describes in detail how to add new models by building plugins.
New features
- Python API documentation for using LLM models, including models from plugins, directly from Python. #75
- Messages are now logged to the database by default - no need to run the `llm init-db` command any more, which has been removed. Instead, you can toggle this behavior off using `llm logs off` or turn it on again using `llm logs on`. The `llm logs status` command shows the current status of the log database. If logging is turned off, passing `--log` to the `llm prompt` command will cause that prompt to be logged anyway. #98
- New database schema for logged messages, with `conversations` and `responses` tables. If you have previously used the old `logs` table it will continue to exist but will no longer be written to. #91
- New `-o/--option name value` syntax for setting options for models, such as temperature. Available options differ for different models. #63
- `llm models list --options` command for viewing all available model options. #82
- `llm "prompt" --save template` option for saving a prompt directly to a template. #55
- Prompt templates can now specify default values for parameters. Thanks, Chris Mungall. #57
- `llm openai models` command to list all available OpenAI models from their API. #70
- `llm models default MODEL_ID` to set a different model as the default to be used when `llm` is run without the `-m/--model` option. #31
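The new options syntax, template saving and default-model commands can be combined at the shell like this. This is an illustrative sketch: the prompt text, template name and temperature value are made up, and the commands are guarded so they only execute when the `llm` CLI is actually installed.

```shell
# Illustrative sketch of the new 0.5 CLI features; prompt text, template
# name and temperature value are examples, not part of the release notes.
# Guarded so the block is safe to run even where llm is not installed.
if command -v llm >/dev/null 2>&1; then
    llm "Ten names for cheesecakes" -o temperature 0.9   # set a model option
    llm 'Summarize: $input' --save summarize             # save a prompt as a template
    llm models default gpt-3.5-turbo                     # change the default model
    LLM_CLI_FOUND=1
else
    echo "llm CLI not installed; the commands above are illustrative"
    LLM_CLI_FOUND=0
fi
```

Note the single quotes around the template prompt, which stop the shell from expanding `$input` before LLM sees it.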
Smaller improvements
- `llm -s` is now a shortcut for `llm --system`. #69
- `llm -m 4-32k` alias for `gpt-4-32k`.
- `llm install -e directory` command for installing a plugin from a local directory.
- The `LLM_USER_PATH` environment variable now controls the location of the directory in which LLM stores its data. This replaces the old `LLM_KEYS_PATH` and `LLM_LOG_PATH` and `LLM_TEMPLATES_PATH` variables. #76
- Documentation covering Utility functions for plugins.
- Documentation site now uses Plausible for analytics. #79
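The `LLM_USER_PATH` variable mentioned above replaces three separate path variables with one. A minimal sketch of setting it in a shell profile, where the directory path is an illustrative choice, not a default mandated by LLM:

```shell
# Point LLM at a single data directory for keys, logs and templates.
# The path below is illustrative; pick any writable directory.
export LLM_USER_PATH="$HOME/.local/share/llm-demo"
mkdir -p "$LLM_USER_PATH"
echo "LLM data directory: $LLM_USER_PATH"
```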