Cisharpai is a unified .NET client library for interacting with multiple LLM providers. It exposes shared interfaces so you can switch providers (OpenAI, Azure OpenAI, Azure AI Inference, Anthropic, Cohere) with minimal code changes.
- One shared `IChatCompletionClient` and `IEmbeddingClient` interface
- Unified request/response models across all providers
- Provider-specific packages: `Cisharpai.OpenAi`, `Cisharpai.Azure`, `Cisharpai.Anthropic`, `Cisharpai.Cohere`
- Built-in HTTP resilience for retries and timeouts
- Feature Collection pattern for optional capabilities: JSON output, tool calling, grounded chat (RAG), image embeddings, multimodal embeddings
- No exceptions for API errors -- consistent `IsSuccess`/`ErrorMessage` error handling
- Full debug support with `RawRequestJson`/`RawResponseJson`
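The error-handling model above can be sketched as follows. This is an illustrative example built only from the member names listed in the features (`IsSuccess`, `ErrorMessage`, `RawResponseJson`); the exact response shape may differ:

```csharp
var response = await client.GetChatCompletionAsync(request);

if (response.IsSuccess)
{
    Console.WriteLine(response.Content);
}
else
{
    // API errors do not throw; inspect the error message instead.
    Console.WriteLine($"Request failed: {response.ErrorMessage}");

    // The raw payload is available for debugging.
    Console.WriteLine(response.RawResponseJson);
}
```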
- Add references to the core library and one provider package:
  - `Cisharpai`
  - `Cisharpai.OpenAi`, `Cisharpai.Azure`, `Cisharpai.Anthropic`, or `Cisharpai.Cohere`
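Assuming the packages are published to NuGet under the IDs above (adjust to your feed if not), the references can be added with the `dotnet` CLI:

```shell
dotnet add package Cisharpai
dotnet add package Cisharpai.OpenAi   # or Cisharpai.Azure / Cisharpai.Anthropic / Cisharpai.Cohere
```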
- Register and call the client:
```csharp
using Cisharpai;
using Cisharpai.Models;
using Cisharpai.OpenAi;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();
services.AddOpenAiClient(options =>
{
    options.ApiKey = "YOUR_API_KEY";
});

var provider = services.BuildServiceProvider();
var client = provider.GetRequiredService<IChatCompletionClient>();

var request = new ChatCompletionRequest(
    Messages: [new LlmMessage(LlmRole.User, "Say hello in one sentence.")],
    Model: "gpt-4.1-nano",
    Temperature: 0.2,
    MaxTokens: 100);

var response = await client.GetChatCompletionAsync(request);
Console.WriteLine(response.Content);
```

Start here:
- Getting Started
- Provider Feature Matrix -- see what each provider supports
- OpenAI Quickstart
- Embeddings -- text, image, and multimodal embeddings across providers
- JSON Output -- JSON Mode and Structured Outputs
- Tool Calling -- function calling across providers
- Grounded Chat (RAG) -- document grounding with citations
- Feature Extensions -- Feature Collection pattern
- Interactive console demo: src/Cisharp.Console/ -- covers all providers and features
The project includes a PowerShell build script that handles versioning, building, testing, and NuGet packaging.
Prerequisites:
- .NET 8 SDK and .NET 10 SDK
- PowerShell 7+ (`pwsh`)
Run the build:

```shell
pwsh scripts/build.ps1
```

This will:
- Restore dotnet tools (including GitVersion)
- Calculate the version using GitVersion (ContinuousDeployment mode)
- Restore NuGet packages
- Build the solution in Release configuration
- Run unit tests on both net8.0 and net10.0
- Pack NuGet packages into `artifacts/NuGet/`
```shell
# Skip tests
pwsh scripts/build.ps1 -skiptest

# Build and publish to nuget.org
pwsh scripts/build.ps1 -nugetApiKey "YOUR_KEY" -nugetPublish $true
```

| Output | Location |
|---|---|
| NuGet packages (`.nupkg` + `.snupkg`) | `artifacts/NuGet/` |
| Test results (`.trx`) | `artifacts/TestResults/` |
Integration tests require environment variables to be set. Create a `.env` file in the repository root or set them in your environment.
| Variable | Format | Example |
|---|---|---|
| `OPENAI_TEST_API_KEY` | OpenAI API key | `sk-proj-...` |
| `ANTHROPIC_TEST_API_KEY` | Anthropic API key | `sk-ant-...` |
| `AZURE_OPENAI_TEST_ENDPOINT` | Azure OpenAI endpoint URL (no trailing slash) | `https://myresource.openai.azure.com` |
| `AZURE_OPENAI_TEST_API_KEY` | Azure OpenAI API key | `abc123...` |
| `AZURE_OPENAI_TEST_DEPLOYMENTS` | Comma-separated deployment names | `gpt-4o,gpt-4o-mini` |
| `AZURE_OPENAI_TEST_EMBEDDING_DEPLOYMENT` | Single embedding deployment name | `text-embedding-ada-002` |
| `AZURE_INFERENCE_TEST_ENDPOINT` | Azure AI Inference endpoint URL | `https://mymodel.eastus.models.ai.azure.com` |
| `AZURE_INFERENCE_TEST_API_KEY` | Azure AI Inference API key | `abc123...` |
| `AZURE_INFERENCE_TEST_MODELS` | Comma-separated model IDs for Azure AI Inference | `Phi-3-mini-4k-instruct` |
| `AZURE_INFERENCE_TEST_EMBEDDING_ENDPOINT` | Azure AI Inference embedding endpoint URL | `https://myembedding.eastus.models.ai.azure.com` |
| `AZURE_INFERENCE_TEST_EMBEDDING_KEY` | Azure AI Inference embedding API key | `abc123...` |
| `AZURE_INFERENCE_TEST_EMBEDDING_MODEL` | Model ID for Azure AI Inference embedding | `Cohere-embed-v3-english` |
| `COHERE_TEST_API_KEY` | Cohere API key | `...` |
Example `.env` file:

```
# OpenAI
OPENAI_TEST_API_KEY=sk-proj-your-key-here

# Azure OpenAI
AZURE_OPENAI_TEST_ENDPOINT=https://myresource.openai.azure.com
AZURE_OPENAI_TEST_API_KEY=your-azure-openai-key
AZURE_OPENAI_TEST_DEPLOYMENTS=gpt-4o,gpt-4o-mini
AZURE_OPENAI_TEST_EMBEDDING_DEPLOYMENT=text-embedding-ada-002

# Azure AI Inference
AZURE_INFERENCE_TEST_ENDPOINT=https://mymodel.eastus.models.ai.azure.com
AZURE_INFERENCE_TEST_API_KEY=your-azure-inference-key
AZURE_INFERENCE_TEST_MODELS=Phi-3-mini-4k-instruct
AZURE_INFERENCE_TEST_EMBEDDING_ENDPOINT=https://myembedding.eastus.models.ai.azure.com
AZURE_INFERENCE_TEST_EMBEDDING_KEY=your-azure-inference-embedding-key
AZURE_INFERENCE_TEST_EMBEDDING_MODEL=Cohere-embed-v3-english

# Anthropic
ANTHROPIC_TEST_API_KEY=sk-ant-your-key-here

# Cohere
COHERE_TEST_API_KEY=your-cohere-key
```
Note: The `.env` file parser supports both quoted and unquoted values, and lines starting with `#` are treated as comments.
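As a rough illustration of the parsing behavior described in the note (quoted or unquoted values, `#` comment lines), a minimal parser might look like the sketch below. This is not the library's actual implementation, just equivalent logic:

```csharp
static Dictionary<string, string> ParseDotEnv(IEnumerable<string> lines)
{
    var values = new Dictionary<string, string>();
    foreach (var raw in lines)
    {
        var line = raw.Trim();
        // Skip blank lines and comment lines.
        if (line.Length == 0 || line.StartsWith('#')) continue;

        var eq = line.IndexOf('=');
        if (eq <= 0) continue;

        var key = line[..eq].Trim();
        var value = line[(eq + 1)..].Trim();

        // Strip matching surrounding quotes, if present.
        if (value.Length >= 2 &&
            (value[0] == '"' || value[0] == '\'') &&
            value[^1] == value[0])
        {
            value = value[1..^1];
        }

        values[key] = value;
    }
    return values;
}
```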
See the repository license file for terms.