The GitHub Chat Agent is a Streamlit-based application that allows users to interact with a GitHub repository in a conversational manner. It leverages Semantic Kernel and Azure OpenAI to provide intelligent responses and assist with repository queries. Additionally, it supports creating GitHub issues directly from the app.
- Chat with GitHub Repositories: Query and retrieve information from a GitHub repository in a conversational format.
- Create GitHub Issues: Easily create issues in a repository with a user-friendly form.
- Streamlit Interface: Interactive and responsive UI for seamless user experience.
- Azure OpenAI Integration: Powered by Azure OpenAI for intelligent and context-aware responses.
git-chat-agent/
├── .devcontainer/        # Dev container configuration
│   ├── Dockerfile        # Dockerfile for the development container
│   └── devcontainer.json # Dev container configuration
├── samples/              # Concept samples
├── tutorial/             # Notebook walking step by step through building the kernel with plugins
├── app.py                # Main Streamlit application
├── git_plugin.py         # Plugin for interacting with GitHub repositories
├── requirements.txt      # Python dependencies
├── .env.example          # Example environment variables
└── .env                  # Environment variables (not included in the repo)
- Query repository details in a conversational format.
- View chat history and interact with the assistant.
- Fill out a form to create issues in the repository.
- Add labels and descriptions for better issue tracking.
The `git_plugin.py` file provides functionality to interact with GitHub repositories programmatically. It includes features such as:
- Fetching repository details (e.g., branches, commits, pull requests).
- Searching for files or content within the repository.
- Creating and managing GitHub issues.
- Authenticating with GitHub using a Personal Access Token (PAT).
This plugin is a core component of the application, enabling seamless integration with GitHub.
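For orientation, here is a minimal sketch of what such a plugin could look like, using the GitHub REST API through `requests` and Semantic Kernel's `@kernel_function` decorator. The class and function names below are illustrative only; refer to `git_plugin.py` for the actual implementation.

```python
import os

import requests
from semantic_kernel.functions import kernel_function


class GitHubPlugin:
    """Illustrative plugin exposing a few GitHub operations to the kernel."""

    def __init__(self, owner: str, repo: str) -> None:
        self.base_url = f"https://api.github.com/repos/{owner}/{repo}"
        self.headers = {
            "Authorization": f"Bearer {os.environ['GITHUB_PAT']}",
            "Accept": "application/vnd.github+json",
        }

    @kernel_function(description="List the open issues in the repository.")
    def list_open_issues(self) -> str:
        resp = requests.get(
            f"{self.base_url}/issues", headers=self.headers, params={"state": "open"}
        )
        resp.raise_for_status()
        return "\n".join(f"#{i['number']}: {i['title']}" for i in resp.json())

    @kernel_function(description="Create a GitHub issue with a title, body, and labels.")
    def create_issue(self, title: str, body: str = "", labels: str = "") -> str:
        payload = {
            "title": title,
            "body": body,
            "labels": [label.strip() for label in labels.split(",") if label.strip()],
        }
        resp = requests.post(f"{self.base_url}/issues", headers=self.headers, json=payload)
        resp.raise_for_status()
        return f"Created issue #{resp.json()['number']}"
```

Functions decorated with `@kernel_function` can then be registered on the kernel (e.g., `kernel.add_plugin(GitHubPlugin("owner", "repo"), plugin_name="GitHub")`) so the agent can call them during a conversation.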
The notebook provides an interactive environment to explore and experiment with the Semantic Kernel framework. It demonstrates how to use Semantic Kernel to build intelligent, context-aware chat agents and plugins for interacting with GitHub repositories.
- Introduction to Semantic Kernel:
- Overview of key concepts like Kernel, Plugins, Agents, Chat Completion, and Vector Stores.
- Chat Completion Service:
- Demonstrates how to create and interact with a chat completion service using Azure OpenAI.
- Chat Completion Agent:
- Explains the difference between a chat completion service and an agent, and how to create an agent for context-aware interactions.
- GitHub Plugin:
- Shows how to create a plugin to interact with the GitHub API for tasks like retrieving repository details, managing issues, and analyzing commits.
- Practical Examples:
- Includes examples of querying user profiles, describing repositories, listing issues, and analyzing commits.
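To give a flavor of the tutorial, the sketch below shows how a kernel with an Azure OpenAI chat completion service can be wrapped in a `ChatCompletionAgent`. Exact constructor and invocation signatures vary between `semantic-kernel` releases, and the agent name and instructions here are placeholders, so treat this as an outline rather than the notebook's exact code.

```python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

# Build a kernel and register the Azure OpenAI chat completion service.
kernel = Kernel()
kernel.add_service(
    AzureChatCompletion(
        deployment_name=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"],
        endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    )
)

# Wrap the kernel in an agent so multi-turn conversations keep their context.
agent = ChatCompletionAgent(
    kernel=kernel,
    name="GitHubAssistant",
    instructions="Answer questions about the configured GitHub repository.",
)


async def main() -> None:
    # Ask the agent a single question and print its reply.
    response = await agent.get_response(messages="Describe this repository.")
    print(response)


asyncio.run(main())
```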
This repository includes a `.devcontainer` configuration for Visual Studio Code. To use it:
- Open the repository in VS Code.
- Install the Dev Containers extension.
- Reopen the project in the container.
git clone https://github.com/hannapureddy_microsoft/git-agent.git
cd git-chat-agent
Create a `.env` file in the root directory by copying the `.env.example` file:
cp .env.example .env
Update the values in the `.env` file with your own credentials:
GLOBAL_LLM_SERVICE="AzureOpenAI"
AZURE_OPENAI_ENDPOINT=""
AZURE_OPENAI_API_KEY="YOUR_API_KEY"
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="gpt-4.1"
AZURE_OPENAI_API_VERSION="2024-12-01-preview"
GITHUB_PAT="YOUR_GITHUB_PERSONAL_ACCESS_TOKEN"
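The app is expected to read these values from the environment at startup; if `python-dotenv` is installed, a minimal loading sketch looks like this:

```python
import os

from dotenv import load_dotenv

# Load variables from the .env file in the working directory into os.environ.
load_dotenv()

endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]
api_key = os.environ["AZURE_OPENAI_API_KEY"]
deployment = os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"]
github_pat = os.environ["GITHUB_PAT"]
```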
Follow these steps to create an Azure OpenAI resource and obtain the endpoint and keys for your model deployment:
- Log in to the Azure Portal.
- Search for Azure OpenAI in the search bar and select Azure OpenAI.
- Click Create to start creating a new resource.
- Fill in the required details:
- Subscription: Select your Azure subscription.
- Resource Group: Create a new resource group or select an existing one.
- Region: Choose a supported region (e.g., East US, West Europe).
- Name: Provide a unique name for your OpenAI resource.
- Click Review + Create and then Create.
- Navigate to your newly created Azure OpenAI resource.
- Go to the Model Deployments section in the left-hand menu.
- Click Create to deploy a new model.
- Select a model (e.g., GPT-4.1) from the list.
- Provide a Deployment Name (e.g., `gpt-4.1`) and configure the model settings as needed.
- Click Deploy to start the deployment process.
- Once the deployment is complete, go to the Keys and Endpoint section in your Azure OpenAI resource.
- Copy the Endpoint URL and API Key. These will be used in your `.env` file.
Below is an example of the Keys and Endpoint section in the Azure portal:

Install the required Python packages:
pip install -r requirements.txt
Start the Streamlit app:
streamlit run app.py
docker build -f .devcontainer/Dockerfile -t git-chat-agent .
docker run -p 8501:8501 --env-file .env git-chat-agent
Access the app at http://localhost:8501.