Modern .NET SDK for direct Gonka Network inference with OpenAI-compatible chat completions and ECDSA request signing.

- Built from a maintained OpenAPI description, because Gonka does not currently publish a direct SDK OpenAPI document.
- Signs each request with secp256k1 ECDSA and sends the Gonka requester headers required by provider endpoints.
- Targets current .NET practices: nullability, trimming, NativeAOT awareness, and source-generated serialization.
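The request signing described above can be sketched conceptually with the BCL's `ECDsa` type. This is only an illustration, not the SDK's actual implementation: the header name, canonical payload, hashing scheme, and signature encoding below are all assumptions.

```csharp
using System;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;

// Conceptual sketch: hash the request body, sign the digest with a
// secp256k1 ECDSA key, and attach the signature as a request header.
// The header name and encodings here are illustrative assumptions.
using var key = ECDsa.Create(ECCurve.CreateFromFriendlyName("secP256k1"));

string body = """{"model":"...","messages":[]}""";
byte[] digest = SHA256.HashData(Encoding.UTF8.GetBytes(body));
byte[] signature = key.SignHash(digest);

using var request = new HttpRequestMessage(HttpMethod.Post, "https://host/v1/chat/completions");
request.Content = new StringContent(body, Encoding.UTF8, "application/json");
request.Headers.Add("X-Gonka-Signature", Convert.ToBase64String(signature)); // assumed header name
```

The SDK applies this layer for you; you never sign requests by hand when using `GonkaClient`.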
```csharp
using Gonka;

using var client = await GonkaClient.CreateFromEnvironmentAsync();
var response = await client.CreateChatCompletionAsync(
    new CreateChatCompletionRequest
    {
        Model = "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
        Messages =
        [
            new ChatCompletionMessage
            {
                Role = ChatCompletionMessageRole.User,
                Content = "Hello, Gonka!",
            },
        ],
    });
```

Set `GONKA_PRIVATE_KEY` plus either `GONKA_ENDPOINTS` (`https://host/v1;gonka1provider...`) or `GONKA_SOURCE_URL`. `GONKA_ADDRESS` can override the derived requester address.
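The environment variables can be exported like this; all values below are placeholders you must replace with your own:

```shell
# Placeholder values only: substitute your own key and endpoints.
export GONKA_PRIVATE_KEY="<hex-encoded secp256k1 private key>"

# Either a semicolon-delimited endpoint/provider pair...
export GONKA_ENDPOINTS="https://host/v1;gonka1provider..."
# ...or a discovery URL instead:
# export GONKA_SOURCE_URL="<discovery url>"

# Optional: override the requester address derived from the key.
# export GONKA_ADDRESS="<gonka address>"
```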
Basic example showing how to create a client and make a request.
```csharp
// Direct Gonka requests are signed with your Gonka private key.
// Provide either GONKA_ENDPOINTS (`https://host/v1;gonka1provider...`) or GONKA_SOURCE_URL for endpoint discovery.
using var client = await GetAuthenticatedClientAsync().ConfigureAwait(false);
var model = Environment.GetEnvironmentVariable("GONKA_CHAT_MODEL") is { Length: > 0 } modelValue
    ? modelValue
    : "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8";
var response = await client.CreateChatCompletionAsync(
    new CreateChatCompletionRequest
    {
        Model = model,
        Messages =
        [
            new ChatCompletionMessage
            {
                Role = ChatCompletionMessageRole.User,
                Content = "Hello, Gonka!",
            },
        ],
    }).ConfigureAwait(false);
```

Streaming example showing how to read server-sent chat completion chunks.
```csharp
// Streaming requests use the same direct Gonka signing layer as non-streaming requests.
using var client = await GetAuthenticatedClientAsync().ConfigureAwait(false);
var model = Environment.GetEnvironmentVariable("GONKA_CHAT_MODEL") is { Length: > 0 } modelValue
    ? modelValue
    : "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8";
var chunks = new List<CreateChatCompletionResponse>();
await foreach (var chunk in client.CreateChatCompletionStreamingAsync(
    new CreateChatCompletionRequest
    {
        Model = model,
        Messages =
        [
            new ChatCompletionMessage
            {
                Role = ChatCompletionMessageRole.User,
                Content = "Write one short sentence about Gonka.",
            },
        ],
    }))
{
    chunks.Add(chunk);
}
```

MEAI example showing how to use Gonka through the standard IChatClient abstraction.
```csharp
// The Gonka client implements Microsoft.Extensions.AI.IChatClient for shared chat workflows.
using var client = await GetAuthenticatedClientAsync().ConfigureAwait(false);
Microsoft.Extensions.AI.IChatClient chatClient = client;
var model = Environment.GetEnvironmentVariable("GONKA_CHAT_MODEL") is { Length: > 0 } modelValue
    ? modelValue
    : "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8";
var response = await chatClient.GetResponseAsync(
    [new Microsoft.Extensions.AI.ChatMessage(Microsoft.Extensions.AI.ChatRole.User, "Write one short sentence about Gonka.")],
    new Microsoft.Extensions.AI.ChatOptions { ModelId = model }).ConfigureAwait(false);
```

- Open an issue in tryAGI/Gonka.
- Use GitHub Discussions for design questions and usage help.
- Join the tryAGI Discord for broader discussion across SDKs.
This project is supported by JetBrains through the Open Source Support Program.
