Smoke Test: OTEL Backends
Overall Result: FAIL
Run: https://github.com/github/gh-aw/actions/runs/25918800907
Executive Summary
Local OTEL emission is partially working — a span for run 25918800907 was written to the local JSONL mirror, but 2 OTLP export errors were recorded, indicating the push to remote backends failed. Grafana Tempo shows zero spans across all service names in a 2-hour query window, consistent with push failures. Sentry project gh-aw exists and MCP access works, but no events-query tool is available to confirm span visibility.
Checklist
Evidence
Step 1: Local OTEL Emission — send_status = inconclusive
| Check | Result |
|---|---|
| OTEL_EXPORTER_OTLP_ENDPOINT | ✅ set |
| OTEL_EXPORTER_OTLP_HEADERS | ✅ set |
| GH_AW_OTLP_ENDPOINTS | ✅ set |
| OTEL_SERVICE_NAME | ✅ gh-aw |
| COPILOT_OTEL_FILE_EXPORTER_PATH | ⚠️ not set |
| /tmp/gh-aw/otel.jsonl | ✅ exists, 1 span: gh-aw.agent.setup |
| github.run_id in span | ✅ 25918800907 confirmed |
| otlp-export-errors.count | ❌ 2 |
The local JSONL mirror contains a span tagged with the current run ID. However, /tmp/gh-aw/otlp-export-errors.count = 2, meaning two OTLP push batches failed. COPILOT_OTEL_FILE_EXPORTER_PATH is unset (file exporter not configured).
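The emit-side evidence above can be re-checked by hand. The sketch below is a hypothetical helper, assuming the JSONL mirror holds one JSON span object per line with attributes flattened into an `attributes` mapping, and that the error counter file contains a bare integer; the actual file layouts may differ.

```python
import json
from pathlib import Path

MIRROR = Path("/tmp/gh-aw/otel.jsonl")
ERROR_COUNTER = Path("/tmp/gh-aw/otlp-export-errors.count")

def spans_for_run(run_id: str, mirror: Path = MIRROR) -> list[dict]:
    """Return spans whose github.run_id attribute matches this run."""
    spans = []
    for line in mirror.read_text().splitlines():
        span = json.loads(line)
        if span.get("attributes", {}).get("github.run_id") == run_id:
            spans.append(span)
    return spans

def export_error_count(counter: Path = ERROR_COUNTER) -> int:
    """Read the push-failure counter; 0 means every OTLP batch succeeded."""
    return int(counter.read_text().strip()) if counter.exists() else 0
```

With the files observed in this run, `spans_for_run("25918800907")` would return the single `gh-aw.agent.setup` span and `export_error_count()` would return 2.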
Step 2: Sentry — sentry_status = inconclusive
| Check | Result |
|---|---|
| MCP connection | ✅ working |
| Organization | ✅ github at (github.sentry.io/redacted) |
| Project gh-aw | ✅ found |
| Current-run spans visible | ❓ no span-query tool available |
| Recent gh-aw spans visible | ❓ no span-query tool available |
The Sentry MCP exposes find_organizations, find_projects, whoami, etc., but no trace/span search tool. The project exists. Whether spans arrived cannot be confirmed or denied — this is an observability gap in the query tooling.
Step 3: Grafana — grafana_status = fail
| Check | Result |
|---|---|
| MCP connection | ✅ working |
| Datasource | ✅ grafanacloud-traces (Tempo, uid: grafanacloud-traces) |
| Current-run spans (30-min window) | ❌ 0 traces |
| Recent gh-aw spans (2-hour window) | ❌ 0 traces |
| All service names in Tempo (2-hour) | ❌ {} — Tempo completely empty |
Query used: {resource.service.name="gh-aw"} via tempo_traceql-search. Both 30-minute and 2-hour windows returned {"traces":[]}. tempo_get-attribute-values for resource.service.name returned {}, confirming no spans of any service are in the Tempo backend for the last 2 hours. This is consistent with the 2 OTLP push failures.
Blockers
❌ OTLP push failures (emit-side)
/tmp/gh-aw/otlp-export-errors.count = 2. The spans were written locally but failed to reach the configured OTLP endpoint (OTEL_EXPORTER_OTLP_ENDPOINT). Check:
- The endpoint is reachable from the runner (firewall/network)
- The headers in OTEL_EXPORTER_OTLP_HEADERS are valid and unexpired
- The GH_AW_OTLP_ENDPOINTS routing logic is correct
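One way to sanity-check the headers bullet: the OpenTelemetry spec defines OTEL_EXPORTER_OTLP_HEADERS as a comma-separated list of `key=value` pairs whose values may be percent-encoded, and a malformed or expired credential here is a common cause of rejected export batches. A minimal parser sketch:

```python
from urllib.parse import unquote

def parse_otlp_headers(raw: str) -> dict[str, str]:
    """Decode an OTEL_EXPORTER_OTLP_HEADERS value into a header dict."""
    headers = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue  # tolerate trailing commas
        key, _, value = pair.partition("=")
        headers[key.strip()] = unquote(value.strip())
    return headers
```

Running this over the runner's actual (redacted) header value would show immediately whether the pairs parse cleanly before blaming network reachability.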
❌ Grafana Tempo completely empty (read-side)
No span data at all in the grafanacloud-traces Tempo datasource. This is the downstream effect of the OTLP push failures. Once push errors are resolved, Tempo should start receiving data.
⚠️ Sentry span-query tooling gap
The Sentry MCP server has no search_events, search_traces, or equivalent span-search tool. Sentry verification is blocked. A search_issues or get_sentry_resource tool for span/event queries would enable proper verification.
Run URL: https://github.com/github/gh-aw/actions/runs/25918800907
Generated by 🧪 Smoke OTEL Backends for issue #32351