🤖 @andrao/llm-client


This repo provides a single interface for interacting with LLMs from Anthropic, OpenAI, and Together.ai, as well as Ollama for local models.

Primary exports

| Function | Description |
| --- | --- |
| `runChatCompletion` | Interoperable chat completion function |
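
A minimal usage sketch, assuming `runChatCompletion` takes a provider/model selector and an OpenAI-style message array; the option names here are assumptions for illustration, not the package's documented signature:

```ts
import { runChatCompletion } from '@andrao/llm-client';

// Hypothetical options shape: `provider`, `model`, and `messages` are assumed
// names — check the exported types for the real signature.
const response = await runChatCompletion({
    provider: 'anthropic',
    model: 'claude-3-5-sonnet-latest',
    messages: [
        { role: 'system', content: 'You are a concise assistant.' },
        { role: 'user', content: 'Summarize this repo in one sentence.' },
    ],
});

console.log(response);
```

The point of the single function is that swapping `provider` (and `model`) is the only change needed to move a call between Anthropic, OpenAI, Together.ai, and Ollama.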

Secondary exports

| Function | Description |
| --- | --- |
| `getAnthropicClient` | Lazy-init an Anthropic SDK client |
| `getOllamaClient` | Lazy-init an Ollama client via the OpenAI SDK |
| `getOpenAIClient` | Lazy-init an OpenAI SDK client |
| `getTogetherClient` | Lazy-init a Together.ai client via the OpenAI SDK |
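
A sketch of reaching for a provider client directly, assuming the getter reads its API key from the environment (e.g. `OPENAI_API_KEY`) and returns the underlying SDK instance; the environment-variable detail is an assumption, not documented here:

```ts
import { getOpenAIClient } from '@andrao/llm-client';

// Lazy init: the SDK client is only constructed on first call, so importing
// the package does not require credentials for every provider.
const openai = getOpenAIClient(); // assumed to read OPENAI_API_KEY

const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(completion.choices[0]?.message.content);
```

Because `getOllamaClient` and `getTogetherClient` expose the same OpenAI SDK surface pointed at their respective endpoints, the call above works unchanged against a local Ollama model or a Together.ai model.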