This project demonstrates integration with the GitHub AI model inference endpoint using the Azure.AI.OpenAI SDK.
## Prerequisites

- .NET 8.0 SDK
- A GitHub Personal Access Token with `models:read` permissions
## Setup

Set your GitHub token as an environment variable.

PowerShell:

```powershell
$Env:GITHUB_TOKEN="<your-github-token-goes-here>"
```

Command Prompt:

```cmd
set GITHUB_TOKEN=<your-github-token-goes-here>
```

Bash:

```bash
export GITHUB_TOKEN="<your-github-token-goes-here>"
```
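Inside the application, the token can be read back with the standard `System.Environment` API. A minimal sketch (the variable name `GITHUB_TOKEN` matches the setup above; the early-exit behavior is illustrative, not necessarily what this project does):

```csharp
using System;

// Read the token set in the previous step; fail fast if it is missing.
string? token = Environment.GetEnvironmentVariable("GITHUB_TOKEN");
if (string.IsNullOrWhiteSpace(token))
{
    Console.Error.WriteLine("GITHUB_TOKEN is not set. See the setup instructions above.");
    Environment.Exit(1);
}
```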
## Running the project

Restore dependencies:

```shell
dotnet restore
```

Then run the project:

```shell
dotnet run
```

The program will demonstrate several examples:
- Basic chat completion
- Multi-turn conversation
- Streaming output
- Tool usage with a flight information example
Each example will be clearly marked in the console output.
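The basic chat completion example might look roughly like the sketch below. It assumes the Azure.AI.OpenAI 1.0.0-beta.9 surface (`OpenAIClient`, `ChatCompletionsOptions` with a `DeploymentName` property, and the `ChatRequestSystemMessage`/`ChatRequestUserMessage` types); type names shifted between beta releases, so treat this as illustrative rather than the project's actual code:

```csharp
using System;
using Azure;
using Azure.AI.OpenAI;

// Authenticate against the GitHub Models endpoint using the PAT as the API key.
var client = new OpenAIClient(
    new Uri("https://models.inference.ai.azure.com"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!));

var options = new ChatCompletionsOptions
{
    DeploymentName = "gpt-4o", // model name on the GitHub Models endpoint
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("What is the capital of France?"),
    },
};

Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);
Console.WriteLine(response.Value.Choices[0].Message.Content);
```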
## Features

- Basic chat completion with the GPT-4o model
- Multi-turn conversation support
- Streaming responses for better user experience
- Function calling with a flight information example
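The streaming example can be sketched as follows. It again assumes the beta.9 streaming API (`GetChatCompletionsStreamingAsync` yielding `StreamingChatCompletionsUpdate` items with a `ContentUpdate` string); verify the member names against the installed SDK version:

```csharp
using System;
using Azure;
using Azure.AI.OpenAI;

var client = new OpenAIClient(
    new Uri("https://models.inference.ai.azure.com"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!));

var options = new ChatCompletionsOptions
{
    DeploymentName = "gpt-4o",
    Messages = { new ChatRequestUserMessage("Write a haiku about the sea.") },
};

// Print tokens as they arrive instead of waiting for the full response.
await foreach (StreamingChatCompletionsUpdate update
    in await client.GetChatCompletionsStreamingAsync(options))
{
    if (!string.IsNullOrEmpty(update.ContentUpdate))
    {
        Console.Write(update.ContentUpdate);
    }
}
Console.WriteLine();
```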
## Notes

- Make sure your GitHub token has the required `models:read` permissions, or requests will return unauthorized errors.
- This code uses the Azure.AI.OpenAI SDK version 1.0.0-beta.9, not the OpenAI SDK.
- The endpoint used for accessing GitHub AI models is `https://models.inference.ai.azure.com`.