Releases: acon96/home-llm
v0.3.2
v0.3.1
Adds basic area support in prompting, fixes broken requirements, fixes an issue with formatted tools, and fixes the custom API not registering properly on startup
v0.3
NOTE: This is a breaking change and will require you to re-configure any models you have set up.
Adds support for Home Assistant LLM APIs, improved model prompting and tool formatting options, and automatic detection of GGUF quantization levels on HuggingFace
v0.2.17
Disable native llama.cpp wheel optimizations, add Command R prompt format
v0.2.16
Fix for missing huggingface_hub package preventing startup
v0.2.13
Add support for Llama 3, build llama.cpp wheels that are compatible with non-AVX systems, fix an error with exposing script entities, fix multiple small Ollama backend issues, and add basic multi-language support
v0.2.12
Fix cover ICL examples, allow setting the number of ICL examples, add min P and typical P sampler options, recommend models during setup, add JSON mode for the Ollama backend, fix missing default options
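The Ollama JSON mode mentioned above corresponds to Ollama's documented `format: "json"` option on `/api/generate`, which constrains the model to emit valid JSON so responses can be parsed reliably. A minimal sketch of the request body follows; the model name, prompt, and host URL are placeholders, not values taken from this project.

```python
import json

def build_json_mode_request(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's POST /api/generate with JSON mode on."""
    return {
        "model": model,
        "prompt": prompt,
        "format": "json",   # ask Ollama to emit strictly valid JSON
        "stream": False,    # return one complete response object
    }

# Placeholder model/prompt for illustration only.
payload = build_json_mode_request("llama3", "Turn on the kitchen light.")
body = json.dumps(payload)

# Sending it requires a running Ollama server, e.g.:
# import urllib.request
# req = urllib.request.Request("http://localhost:11434/api/generate",
#                              data=body.encode(), method="POST")
# print(urllib.request.urlopen(req).read().decode())
```

Keeping `stream` off returns a single response object, which is simpler to parse when the output must be a complete JSON document.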
v0.2.11
Add prompt caching, expose llama.cpp runtime settings, build llama-cpp-python wheels using GitHub actions, and install wheels directly from GitHub
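The llama.cpp runtime settings exposed in this release map onto keyword arguments of llama-cpp-python's `Llama` constructor. A hedged sketch of typical settings is below; the GGUF path and the specific values are placeholders, and the integration's own option names may differ. Instantiating `Llama` needs a real model file, so only the arguments are shown.

```python
# Common llama-cpp-python runtime settings (values are illustrative only).
llama_kwargs = {
    "model_path": "/path/to/model.gguf",  # placeholder GGUF model path
    "n_ctx": 2048,        # context window size in tokens
    "n_threads": 4,       # CPU threads used for generation
    "n_batch": 512,       # prompt tokens processed per batch
    "n_gpu_layers": 0,    # layers offloaded to the GPU (0 = CPU only)
}

# With a model on disk:
# from llama_cpp import Llama
# llm = Llama(**llama_kwargs)
```

Larger `n_ctx` and `n_batch` values trade memory for longer prompts and faster prompt ingestion, which is why exposing them per-install (rather than hardcoding) matters on mixed hardware.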