A0 LocalAI adds LocalAI as an OpenAI-compatible LLM provider in Agent Zero.
It is a lightweight provider-definition plugin. It does not run LocalAI itself and does not add background services. Instead, it exposes a dedicated LocalAI provider option that can be configured with your own LocalAI/OpenAI-compatible endpoint.
The plugin contributes provider definitions through `conf/model_providers.yaml`:

- Runtime plugin name: `a0_localai`
- Provider id: `localai`
- Display name: LocalAI
- LiteLLM provider: `openai`
- Model listing endpoint: `/models`
- Chat models: supported
- Embedding models: supported
Both chat and embedding provider sections mirror Agent Zero's generic "Other OpenAI compatible" provider shape, but are labeled specifically for LocalAI.
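The YAML itself is not reproduced here, but a quick way to see exactly what the plugin contributes is to load and dump the file. This is a minimal sketch, assuming PyYAML is available and the plugin sits in the user plugins path from the installation step below; adjust the path to your setup.

```python
# Inspect the provider definitions the plugin contributes by dumping its
# conf/model_providers.yaml. Path assumes the default user plugins location.
import yaml  # PyYAML

path = "/a0/usr/plugins/a0_localai/conf/model_providers.yaml"
with open(path) as f:
    providers = yaml.safe_load(f)

# Expect chat and embedding sections shaped like the generic
# "Other OpenAI compatible" provider, labeled for LocalAI.
print(yaml.safe_dump(providers, sort_keys=False))
```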
No API base is hardcoded in `conf/model_providers.yaml`.
Configure the LocalAI endpoint per model in Agent Zero's Model Configuration UI, the same way you would configure another OpenAI-compatible endpoint.
Common examples:

- `http://localhost:8080/v1`
- `http://host.docker.internal:8080/v1`
- `http://your-localai-host:8080/v1`
Use the API base that matches your LocalAI deployment.
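If you want to confirm an API base before saving it in the Model Configuration UI, a minimal check, assuming the `requests` package and a standard OpenAI-style response, is to call the model listing endpoint:

```python
# Sanity-check a LocalAI API base by listing its models through the
# OpenAI-compatible /models endpoint. Not part of the plugin; adjust the
# base URL and key to your deployment (LocalAI may not require a real key).
import requests

api_base = "http://localhost:8080/v1"
api_key = "sk-anything"

resp = requests.get(
    f"{api_base}/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=10,
)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model.get("id"))
```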
Install through the Agent Zero Plugin Hub when available, or copy this plugin folder into Agent Zero's user plugins directory:
`/a0/usr/plugins/a0_localai`
Then enable A0 LocalAI from the Plugins UI if it is not already enabled.
- This plugin is intentionally small: it only contributes model provider metadata.
- It uses LiteLLM's `openai` provider compatibility mode via `custom_llm_provider: openai`, so non-registry LocalAI model ids can be used (see the sketch below).
- You must run and secure your LocalAI server separately.
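For context, here is a rough sketch of the LiteLLM call shape this compatibility mode enables. It is not the plugin's own code; the model id, endpoint, and key are placeholders for your deployment.

```python
# LiteLLM's openai compatibility mode: custom_llm_provider="openai" sends a
# model id that is not in LiteLLM's registry to an OpenAI-compatible endpoint
# as-is. Values below are placeholders.
import litellm

response = litellm.completion(
    model="my-localai-model",              # any model id your LocalAI serves
    custom_llm_provider="openai",
    api_base="http://localhost:8080/v1",
    api_key="sk-anything",                 # LocalAI may not require a real key
    messages=[{"role": "user", "content": "Hello from Agent Zero"}],
)
print(response.choices[0].message.content)
```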