Agentic Observability is a Streamlit-based application that integrates with JIRA and Datadog to fetch data, process it using OpenAI's API (or an alternative such as Meta AI's API), and provide actionable insights via a chatbot interface.
- Install Ollama: Follow the Ollama installation guide for Linux. Typically, you can use the following command:

  ```bash
  curl -fsSL https://ollama.ai/install.sh | sh
  ```

- Verify Installation: Check that Ollama is installed correctly:

  ```bash
  ollama --version
  ```

- Pull the `phi3` Model: The `phi3` model is lightweight and suitable for local testing. Pull it using:

  ```bash
  ollama pull phi3
  ```

- Run Ollama: Start the Ollama service:

  ```bash
  ollama serve
  ```

- Verify Ollama Service: Ensure the service is running on port `11434`:

  ```bash
  curl http://localhost:11434
  ```
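If you want to script this check instead of using `curl`, a small helper can test whether the Ollama port accepts TCP connections. This is a minimal sketch (the host and port mirror the defaults used in this guide; `is_port_open` is a hypothetical helper, not part of the project):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 11434 is Ollama's default port, as used throughout this guide.
    status = "reachable" if is_port_open("localhost", 11434) else "not reachable"
    print(f"Ollama service is {status}")
```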
- Clone the Repository:

  ```bash
  git clone https://github.com/your-username/agentic-ai-observability.git
  cd agentic-ai-observability
  ```

- Set Up a Virtual Environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install Dependencies:

  ```bash
  pip install --upgrade pip
  pip install -r requirements.txt
  ```
- Set Up Environment Variables: Create a `.env` file in the root directory and add the following:

  ```env
  OPENAI_API_KEY=your-openai-api-key
  JIRA_BASE_URL=https://your-jira-instance.atlassian.net
  JIRA_EMAIL=your-email@example.com
  JIRA_API_TOKEN=your-jira-api-token
  DATADOG_API_KEY=your-datadog-api-key
  DATADOG_APP_KEY=your-datadog-app-key
  OLLAMA_MODEL_NAME=phi3
  OLLAMA_HOST=http://localhost:11434
  ```
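The application presumably reads these variables at startup. A hedged sketch of how they might be loaded and validated (the variable names match the `.env` above; `load_settings` is a hypothetical helper, not the app's actual code):

```python
import os

# Variables that have no sensible default and must be supplied by the user.
REQUIRED_VARS = [
    "OPENAI_API_KEY",
    "JIRA_BASE_URL",
    "JIRA_EMAIL",
    "JIRA_API_TOKEN",
    "DATADOG_API_KEY",
    "DATADOG_APP_KEY",
]

def load_settings(env=None):
    """Collect required settings, failing fast if any are missing."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    settings = {name: env[name] for name in REQUIRED_VARS}
    # Ollama settings fall back to the defaults used in this guide.
    settings["OLLAMA_MODEL_NAME"] = env.get("OLLAMA_MODEL_NAME", "phi3")
    settings["OLLAMA_HOST"] = env.get("OLLAMA_HOST", "http://localhost:11434")
    return settings
```

Failing fast on missing keys gives a clearer error than a mid-request authentication failure later on.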
- Run the Application:

  ```bash
  streamlit run ui/streamlit_chat.py
  ```

- Access the Application: Open your browser and navigate to:

  ```
  http://localhost:8502
  ```
- Prompt test examples:
  - Show unauthorized SSH login from Datadog.
  - Show high-priority Jira tickets.
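Prompts like these have to be routed to the right data source before any API is called. As a purely hypothetical illustration (not the app's actual logic, which may instead ask the LLM to decide), a keyword-based router might look like:

```python
def route_prompt(prompt: str) -> str:
    """Pick a data source for a user prompt via simple keyword matching.

    Hypothetical illustration only; the real application may route differently.
    """
    text = prompt.lower()
    # Security/log-style prompts go to Datadog.
    if "datadog" in text or "log" in text or "ssh" in text:
        return "datadog"
    # Ticket-style prompts go to JIRA.
    if "jira" in text or "ticket" in text:
        return "jira"
    # Anything else falls through to the LLM alone.
    return "llm"
```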
Important Note:
Once you have tested the application with fake data, you can configure it to use real APIs for JIRA and Datadog. To test with real APIs, you need to comment out the code for fake data and uncomment the code for real API calls in the relevant source files.
- Update the `.env` File: Ensure your `.env` file contains valid API keys and credentials for JIRA and Datadog.

- Run the Application:

  ```bash
  streamlit run ui/streamlit_chat.py
  ```

- Verify the APIs:
  - Ensure that your JIRA and Datadog credentials are valid.
  - Check that the application fetches real data from the APIs.
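A quick way to sanity-check credentials is to build the request headers each API expects: JIRA Cloud uses HTTP Basic auth with `email:api_token`, while Datadog uses `DD-API-KEY` and `DD-APPLICATION-KEY` headers. A minimal sketch (the helper names are illustrative; all values are placeholders):

```python
import base64

def jira_auth_header(email: str, api_token: str) -> dict:
    """Build the Basic auth header JIRA Cloud expects (base64 of email:api_token)."""
    token = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def datadog_headers(api_key: str, app_key: str) -> dict:
    """Build the key headers the Datadog API expects."""
    return {"DD-API-KEY": api_key, "DD-APPLICATION-KEY": app_key}
```

Sending a lightweight authenticated request (for example, to JIRA's `myself` endpoint) with these headers confirms the credentials before the chatbot depends on them.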
- Ensure Docker and Docker Compose Are Installed:
  - Install Docker.
  - Install Docker Compose.

- Clone the Repository:

  ```bash
  git clone https://github.com/your-username/agentic-ai-observability.git
  cd agentic-ai-observability
  ```

- Set Up the `.env` File: Create a `.env` file in the root directory with the following:

  ```env
  OPENAI_API_KEY=your-openai-api-key
  JIRA_BASE_URL=https://your-jira-instance.atlassian.net
  JIRA_EMAIL=your-email@example.com
  JIRA_API_TOKEN=your-jira-api-token
  DATADOG_API_KEY=your-datadog-api-key
  DATADOG_APP_KEY=your-datadog-app-key
  OLLAMA_MODEL_NAME=phi3
  OLLAMA_HOST=http://ollama:11434
  ```

- Build and Start the Services:

  ```bash
  docker-compose up --build
  ```
- Verify the Services:
  - Check that the `chat-ui` service is running on port `8501`.
  - Check that the `ollama` service is running on port `11434`.

- Access the Application: Open your browser and navigate to:

  ```
  http://localhost:8502
  ```
- Error: Port Already in Use:
  - Stop any process using the conflicting port (on Windows):

    ```bash
    netstat -ano | findstr :8501  # Replace 8501 with the conflicting port
    taskkill /PID <PID> /F        # Replace <PID> with the process ID
    ```

  - Alternatively, change the mapped port in the `docker-compose.yml` file.
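When remapping the port in `docker-compose.yml`, a small helper can ask the OS for a currently free port on any platform (a sketch; `find_free_port` is an illustrative helper, not part of the project):

```python
import socket

def find_free_port() -> int:
    """Ask the OS for an ephemeral TCP port that is currently free."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))  # port 0 tells the OS to pick a free port
        return s.getsockname()[1]

if __name__ == "__main__":
    print(f"Free port to map in docker-compose.yml: {find_free_port()}")
```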
- Error: Failed to Connect to Ollama:
  - Ensure the `ollama` service is running:

    ```bash
    docker-compose up -d ollama
    ```

  - Verify the `phi3` model is available:

    ```bash
    docker exec -it <ollama-container-id> ollama list
    ```

    If not, download it:

    ```bash
    docker exec -it <ollama-container-id> ollama pull phi3
    ```
- Error: File Not Found:
  - Ensure the `data` directory and its files (`sample_jira_tickets.json`, `sample_datadog_logs.json`) are present in the project.
- Jodionísio Muachifi - GitHub Profile
This project is licensed under the MIT License. See the LICENSE file for details.
