9 changes: 4 additions & 5 deletions .devcontainer/devcontainer.json
Original file line number Diff line number Diff line change
@@ -1,9 +1,8 @@
{
"name": "python-agentframework-demos",
"build": {
"dockerfile": "Dockerfile",
"context": ".."
},
"dockerComposeFile": "docker-compose.yml",
"service": "app",
"workspaceFolder": "/workspaces/python-agentframework-demos",
"features": {
"ghcr.io/azure/azure-dev/azd:latest": {}
},
@@ -22,4 +21,4 @@
}
},
"remoteUser": "vscode"
}
}
17 changes: 17 additions & 0 deletions .devcontainer/docker-compose.yml
@@ -0,0 +1,17 @@
services:
app:
build:
context: ..
dockerfile: .devcontainer/Dockerfile
volumes:
- ..:/workspaces/python-agentframework-demos:cached
command: sleep infinity
environment:
- OTEL_EXPORTER_OTLP_ENDPOINT=http://aspire-dashboard:18889

aspire-dashboard:
image: mcr.microsoft.com/dotnet/aspire-dashboard:latest
ports:
- "18888:18888"
environment:
- DASHBOARD__FRONTEND__AUTHMODE=Unsecured
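Inside the compose network, the app reaches the dashboard by service name on the container-side OTLP port (18889), while a local `docker run` setup publishes it on `localhost:4317`. A small illustrative helper (hypothetical, not part of the repo) showing how the two endpoint forms differ:

```python
from urllib.parse import urlparse


def describe_otlp_endpoint(endpoint: str) -> str:
    """Classify an OTLP endpoint as compose-internal or host-local (illustrative only)."""
    host = urlparse(endpoint).hostname
    if host == "aspire-dashboard":
        return "compose network (dev container setup)"
    return "host network (local docker run setup)"


print(describe_otlp_endpoint("http://aspire-dashboard:18889"))  # compose network (dev container setup)
print(describe_otlp_endpoint("http://localhost:4317"))  # host network (local docker run setup)
```

Either form works as `OTEL_EXPORTER_OTLP_ENDPOINT`; the exporter only cares that the host and port resolve to the dashboard's OTLP listener.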
1 change: 1 addition & 0 deletions .env.sample
@@ -9,3 +9,4 @@ OPENAI_MODEL=gpt-3.5-turbo
# Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
GITHUB_MODEL=gpt-5-mini
GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
2 changes: 1 addition & 1 deletion .github/prompts/review_pr_comments.prompt.md
@@ -133,4 +133,4 @@ The thread ID starts with `PRRT_` and can be found in the GraphQL query response
Note: This skill can be removed once the GitHub MCP server has added built-in support for replying to PR review comments and resolving threads.
See:
https://github.com/github/github-mcp-server/issues/1323
https://github.com/github/github-mcp-server/issues/1768
https://github.com/github/github-mcp-server/issues/1768
58 changes: 58 additions & 0 deletions README.md
@@ -173,6 +173,64 @@ You can run the examples in this repository by executing the scripts in the `exa
| [agent_mcp_local.py](examples/agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
| [openai_tool_calling.py](examples/openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
| [workflow_basic.py](examples/workflow_basic.py) | A workflow-based agent. |
| [agent_otel_aspire.py](examples/agent_otel_aspire.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |

## Using the Aspire Dashboard for telemetry

The [agent_otel_aspire.py](examples/agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).

### In GitHub Codespaces / Dev Containers

The Aspire Dashboard runs automatically as a service alongside the dev container. No extra setup is needed.

1. The `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable is already set by the dev container.

2. Run the example:

```sh
uv run agent_otel_aspire.py
```

3. Open the dashboard at <http://localhost:18888> and explore:

* **Traces**: See the full span tree — agent invocation → chat completion → tool execution
* **Metrics**: View token usage and operation duration histograms
* **Structured Logs**: Browse conversation messages (system, user, assistant, tool)
* **GenAI visualizer**: Select a chat completion span to see the rendered conversation

### Local environment (without Dev Containers)

If you're running locally without Dev Containers, you need to start the Aspire Dashboard manually:

1. Start the Aspire Dashboard:

```sh
docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard \
-e DASHBOARD__FRONTEND__AUTHMODE=Unsecured \
mcr.microsoft.com/dotnet/aspire-dashboard:latest
```

2. Add the OTLP endpoint to your `.env` file:

```sh
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```

3. Run the example:

```sh
uv run agent_otel_aspire.py
```

4. Open the dashboard at <http://localhost:18888> and explore.

5. When done, stop the dashboard:

```sh
docker stop aspire-dashboard
```

For the full Python + Aspire guide, see [Use the Aspire dashboard with Python apps](https://aspire.dev/dashboard/standalone-for-python/).

## Resources

101 changes: 101 additions & 0 deletions examples/agent_otel_aspire.py
@@ -0,0 +1,101 @@
import asyncio
import logging
import os
import random
from datetime import datetime, timezone
from typing import Annotated

from agent_framework import ChatAgent
from agent_framework.observability import configure_otel_providers
from agent_framework.openai import OpenAIChatClient
from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
from dotenv import load_dotenv
from pydantic import Field
from rich import print
from rich.logging import RichHandler

# Setup logging
handler = RichHandler(show_path=False, rich_tracebacks=True, show_level=False)
logging.basicConfig(level=logging.WARNING, handlers=[handler], force=True, format="%(message)s")
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Configure OpenTelemetry export to the Aspire Dashboard (if endpoint is set)
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
if otlp_endpoint:
os.environ.setdefault("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")
os.environ.setdefault("OTEL_SERVICE_NAME", "agent-framework-demo")
configure_otel_providers(enable_sensitive_data=True)
logger.info(f"OpenTelemetry export enabled — sending to {otlp_endpoint}")
else:
logger.info(
"Set OTEL_EXPORTER_OTLP_ENDPOINT in .env to export telemetry to the Aspire Dashboard. "
"Use http://aspire-dashboard:18889 in Codespaces/Dev Containers or http://localhost:4317 locally."
)

# Configure OpenAI client based on environment
load_dotenv(override=True)
API_HOST = os.getenv("API_HOST", "github")

async_credential = None
if API_HOST == "azure":
async_credential = DefaultAzureCredential()
token_provider = get_bearer_token_provider(async_credential, "https://cognitiveservices.azure.com/.default")
client = OpenAIChatClient(
base_url=f"{os.environ['AZURE_OPENAI_ENDPOINT']}/openai/v1/",
api_key=token_provider,
model_id=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
)
elif API_HOST == "github":
client = OpenAIChatClient(
base_url="https://models.github.ai/inference",
api_key=os.environ["GITHUB_TOKEN"],
model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
)
else:
client = OpenAIChatClient(
api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
)


def get_weather(
city: Annotated[str, Field(description="City name, spelled out fully")],
) -> dict:
"""Returns weather data for a given city, a dictionary with temperature and description."""
logger.info(f"Getting weather for {city}")
weather_options = [
{"temperature": 72, "description": "Sunny"},
{"temperature": 60, "description": "Rainy"},
{"temperature": 55, "description": "Cloudy"},
{"temperature": 45, "description": "Windy"},
]
return random.choice(weather_options)


def get_current_time(
timezone_name: Annotated[str, Field(description="Timezone name, e.g. 'US/Eastern', 'Asia/Tokyo', 'UTC'")],
) -> str:
"""Returns the current date and time in UTC (timezone_name is for display context only)."""
logger.info(f"Getting current time for {timezone_name}")
now = datetime.now(timezone.utc)
return f"The current time in {timezone_name} is approximately {now.strftime('%Y-%m-%d %H:%M:%S')} UTC"


agent = ChatAgent(
name="weather-time-agent",
chat_client=client,
instructions="You are a helpful assistant that can look up weather and time information.",
tools=[get_weather, get_current_time],
)


async def main():
response = await agent.run("What's the weather in Seattle and what time is it in Tokyo?")
print(response.text)

if async_credential:
await async_credential.close()


if __name__ == "__main__":
asyncio.run(main())
58 changes: 58 additions & 0 deletions examples/spanish/README.md
@@ -174,6 +174,64 @@ You can run the examples in this repository by executing the scripts in the di
| [agent_mcp_local.py](agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
| [openai_tool_calling.py](openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
| [workflow_basic.py](workflow_basic.py) | Uses Agent Framework to create a workflow-based agent. |
| [agent_otel_aspire.py](agent_otel_aspire.py) | An agent with OpenTelemetry traces, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |

## Using the Aspire Dashboard for telemetry

The [agent_otel_aspire.py](agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).

### In GitHub Codespaces / Dev Containers

The Aspire Dashboard runs automatically as a service alongside the dev container. No extra setup is needed.

1. The `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable is already set by the dev container.

2. Run the example:

```sh
uv run agent_otel_aspire.py
```

3. Open the dashboard at <http://localhost:18888> and explore:

* **Traces**: See the full span tree (agent invocation → chat completion → tool execution)
* **Metrics**: View token usage and operation duration histograms
* **Structured Logs**: Browse the conversation messages (system, user, assistant, tool)
* **GenAI visualizer**: Select a chat completion span to see the rendered conversation

### Local environment (without Dev Containers)

If you're running locally without Dev Containers, you need to start the Aspire Dashboard manually:

1. Start the Aspire Dashboard:

```sh
docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard \
-e DASHBOARD__FRONTEND__AUTHMODE=Unsecured \
mcr.microsoft.com/dotnet/aspire-dashboard:latest
```

2. Add the OTLP endpoint to your `.env` file:

```sh
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```

3. Run the example:

```sh
uv run agent_otel_aspire.py
```

4. Open the dashboard at <http://localhost:18888> and explore.

5. When done, stop the dashboard:

```sh
docker stop aspire-dashboard
```

For the full Python + Aspire guide, see [Use the Aspire dashboard with Python apps](https://aspire.dev/dashboard/standalone-for-python/).

## Resources

103 changes: 103 additions & 0 deletions examples/spanish/agent_otel_aspire.py
@@ -0,0 +1,103 @@
import asyncio
import logging
import os
import random
from datetime import datetime, timezone
from typing import Annotated

from agent_framework import ChatAgent
from agent_framework.observability import configure_otel_providers
from agent_framework.openai import OpenAIChatClient
from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
from dotenv import load_dotenv
from pydantic import Field
from rich import print
from rich.logging import RichHandler

# Set up logging
handler = RichHandler(show_path=False, rich_tracebacks=True, show_level=False)
logging.basicConfig(level=logging.WARNING, handlers=[handler], force=True, format="%(message)s")
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Configure OpenTelemetry export to the Aspire Dashboard (if the endpoint is set)
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
if otlp_endpoint:
os.environ.setdefault("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")
os.environ.setdefault("OTEL_SERVICE_NAME", "agent-framework-demo")
configure_otel_providers(enable_sensitive_data=True)
logger.info(f"OpenTelemetry export enabled, sending to {otlp_endpoint}")
else:
logger.info(
"Set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 in .env"
" to export telemetry to the Aspire Dashboard"
)

# Configure the client to use Azure OpenAI, GitHub Models, or OpenAI
load_dotenv(override=True)
API_HOST = os.getenv("API_HOST", "github")

async_credential = None
if API_HOST == "azure":
async_credential = DefaultAzureCredential()
token_provider = get_bearer_token_provider(async_credential, "https://cognitiveservices.azure.com/.default")
client = OpenAIChatClient(
base_url=f"{os.environ['AZURE_OPENAI_ENDPOINT']}/openai/v1/",
api_key=token_provider,
model_id=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
)
elif API_HOST == "github":
client = OpenAIChatClient(
base_url="https://models.github.ai/inference",
api_key=os.environ["GITHUB_TOKEN"],
model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
)
else:
client = OpenAIChatClient(
api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
)


def get_weather(
city: Annotated[str, Field(description="City name, spelled out fully")],
) -> dict:
"""Returns weather data for a given city: temperature and description."""
logger.info(f"Getting weather for {city}")
weather_options = [
{"temperature": 22, "description": "Sunny"},
{"temperature": 15, "description": "Rainy"},
{"temperature": 13, "description": "Cloudy"},
{"temperature": 7, "description": "Windy"},
]
return random.choice(weather_options)


def get_current_time(
timezone_name: Annotated[
str, Field(description="Timezone name, e.g. 'US/Eastern', 'America/Mexico_City', 'UTC'")
],
) -> str:
"""Returns the current date and time in UTC (timezone_name is for display context only)."""
logger.info(f"Getting current time for {timezone_name}")
now = datetime.now(timezone.utc)
return f"The current time in {timezone_name} is approximately {now.strftime('%Y-%m-%d %H:%M:%S')} UTC"


agent = ChatAgent(
name="weather-time-agent",
chat_client=client,
instructions="You are a helpful assistant that can look up weather and time information.",
tools=[get_weather, get_current_time],
)


async def main():
response = await agent.run("What's the weather in Mexico City and what time is it in Buenos Aires?")
print(response.text)

if async_credential:
await async_credential.close()


if __name__ == "__main__":
asyncio.run(main())
1 change: 1 addition & 0 deletions pyproject.toml
@@ -13,6 +13,7 @@ dependencies = [
"aiohttp",
"faker",
"fastmcp",
"opentelemetry-exporter-otlp-proto-grpc",
"agent-framework-core @ git+https://github.com/microsoft/agent-framework.git@98cd72839e4057d661a58092a3b013993264d834#subdirectory=python/packages/core",
"agent-framework-devui @ git+https://github.com/microsoft/agent-framework.git@98cd72839e4057d661a58092a3b013993264d834#subdirectory=python/packages/devui",
]