
0.1.193

@pchalasani pchalasani released this 12 Feb 22:07
· 269 commits to main since this release

Support ollama OpenAI API compatibility: the ollama LLM server now mimics the OpenAI API, so any code that previously worked with OpenAI LLMs will work with a simple change of `api_base`.
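As a minimal sketch of what this compatibility means (independent of Langroid), the same JSON request body that works against OpenAI's `/v1/chat/completions` endpoint can be sent to ollama's local endpoint instead. This assumes a default local ollama install serving on port 11434 with a model such as `mistral` already pulled; only the base URL changes.

```python
import json
from urllib import request

# ollama's OpenAI-compatible base URL (default local port).
# For OpenAI itself this would be https://api.openai.com/v1
API_BASE = "http://localhost:11434/v1"

def chat(prompt: str, model: str = "mistral") -> dict:
    """Send an OpenAI-style chat-completion request to the local ollama server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = request.Request(
        f"{API_BASE}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Requires `ollama serve` running with the model pulled, e.g.:
# reply = chat("Hello!")
# print(reply["choices"][0]["message"]["content"])
```

The response JSON follows the OpenAI schema (`choices[0].message.content`), which is why existing OpenAI client code keeps working unchanged.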

Langroid takes care of setting the `api_base` behind the scenes when you specify the local LLM via `chat_model = "ollama/mistral"`, e.g.:

```python
import langroid.language_models as lm
import langroid as lr

llm_config = lm.OpenAIGPTConfig(
    chat_model="ollama/mistral:7b-instruct-v0.2-q8_0",
    chat_context_length=16_000,  # adjust based on model
)
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=llm_config))
...
```

See more in this tutorial on Local LLM Setup with Langroid.