
Opsmeter Integration Examples


The provider changes; the Opsmeter payload stays the same.

Working examples for sending telemetry to POST /v1/ingest/llm-request in:

  • .NET (examples/dotnet)
  • Node.js (examples/node)
  • Python (examples/python)

This repo is optimized for teams implementing LLM cost tracking, OpenAI usage monitoring, Anthropic usage telemetry, and AI inference cost control with a consistent request schema.

Opsmeter product site: https://opsmeter.io
Opsmeter API base: https://api.opsmeter.io
Provider + catalog model names: https://opsmeter.io/docs/catalog
Official SDK package identities for opsmeter.io: see "Official package identity (opsmeter.io)" below.

Current provider support in examples: OpenAI and Anthropic only.


What this repo solves

  • No-proxy telemetry: keep your provider call path untouched, send attribution metadata after each LLM call.
  • Retry-safe ingestion: reuse externalRequestId on retries to prevent duplicate request rows.
  • Cost attribution dimensions: keep endpointTag and promptVersion consistent for feature-level and version-level analysis.

Documentation paths

  • Direct ingest (current production path): Quickstart + payload contract + language examples under examples/.
  • SDK auto-instrumentation (preview path): moved to dedicated SDK repositories.
  • Direct ingest docs stay valid; SDK docs are additive and do not replace existing integration docs.

Official package identity (opsmeter.io)


Quickstart (60s)

  1. Clone and set your API key:
git clone https://github.com/opsmeter-io/opsmeter.io-integration-examples.git
cd opsmeter.io-integration-examples
export OPSMETER_API_KEY="<YOUR_WORKSPACE_PRIMARY_API_KEY>"
export OPSMETER_API_BASE_URL="https://api.opsmeter.io"
  2. Run one stack (Node shown below):
# Provider/model names: https://opsmeter.io/docs/catalog
node examples/node/index.mjs --provider openai --model gpt-4o-mini --retry
  3. Expected output:
Business call completed.
Telemetry dispatched (non-blocking).
Ingest response: 200 ok=true planTier=Free warnings=0
Retry with same externalRequestId sent.
  4. Verify in product:
  • Dashboard request count increases.
  • endpointTag and promptVersion appear in Top Endpoints / Prompt Versions.

--retry uses the same externalRequestId to demonstrate retry-safe behavior.
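The retry contract can be sketched as follows. This is an illustrative helper, not part of any Opsmeter SDK: `post` is a stand-in for whatever performs the HTTP POST to /v1/ingest/llm-request. The key point is that `externalRequestId` is generated once, outside the retry loop, so every attempt carries the same idempotency key:

```python
import uuid

def ingest_with_retry(post, payload, max_attempts=3):
    """Send telemetry, reusing the same externalRequestId on every retry.

    `post` is any callable that performs the HTTP POST and raises on
    failure (a hypothetical seam, so the retry logic stays testable).
    """
    for _ in range(max_attempts):
        try:
            return post(payload)  # same payload -> same externalRequestId
        except Exception:
            continue  # transient failure; retry with the SAME payload
    return None  # telemetry failure is swallowed, never raised to the caller

# The idempotency key is created once per business request, not per attempt:
payload = {
    "externalRequestId": f"ext_{uuid.uuid4().hex}",
    "provider": "openai",
    "model": "gpt-4o-mini",
    "status": "success",
}
```

Because the ingest endpoint deduplicates on `externalRequestId`, a retried send updates or no-ops instead of creating a duplicate request row.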

Payload contract (shared)

Canonical ingest endpoint: https://api.opsmeter.io/v1/ingest/llm-request

All examples send this same shape:

{
  "externalRequestId": "ext_123abc",
  "provider": "openai",
  "model": "gpt-4o-mini",
  "promptVersion": "summary_v3",
  "endpointTag": "checkout.ai_summary",
  "inputTokens": 120,
  "outputTokens": 45,
  "totalTokens": 165,
  "latencyMs": 820,
  "status": "success",
  "errorCode": null,
  "userId": null,
  "dataMode": "real",
  "environment": "prod"
}

provider and model values should be selected from the catalog: https://opsmeter.io/docs/catalog
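A minimal sketch of assembling and sending this payload from Python, using only the standard library. The `Authorization: Bearer` header scheme is an assumption for illustration; check the Opsmeter API docs for the actual auth mechanism:

```python
import json
import os
import urllib.request

def build_payload(provider, model, input_tokens, output_tokens, latency_ms,
                  external_request_id, *, prompt_version=None, endpoint_tag=None,
                  status="success", error_code=None, user_id=None,
                  data_mode="real", environment="prod"):
    """Assemble the shared ingest payload shape shown above."""
    return {
        "externalRequestId": external_request_id,
        "provider": provider,
        "model": model,
        "promptVersion": prompt_version,
        "endpointTag": endpoint_tag,
        "inputTokens": input_tokens,
        "outputTokens": output_tokens,
        "totalTokens": input_tokens + output_tokens,
        "latencyMs": latency_ms,
        "status": status,
        "errorCode": error_code,
        "userId": user_id,
        "dataMode": data_mode,
        "environment": environment,
    }

def post_ingest(payload):
    """POST to the canonical ingest endpoint (auth header is an assumption)."""
    base = os.environ.get("OPSMETER_API_BASE_URL", "https://api.opsmeter.io")
    req = urllib.request.Request(
        base + "/v1/ingest/llm-request",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + os.environ["OPSMETER_API_KEY"],
        },
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status
```

The same `build_payload` shape works for any supported provider; only the token counts and provider/model strings change.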

Allowed values

| Field | Allowed | Notes |
| --- | --- | --- |
| provider | openai, anthropic | Current support in this repo/examples |
| status | success, error | Required by API validation |
| dataMode | real, test, demo | Default recommendation: real |
| environment | prod, staging, dev | Use the real deployment environment |

Recommended combinations

| Use case | dataMode | environment |
| --- | --- | --- |
| Production traffic | real | prod |
| QA/test traffic | test | staging or dev |
| Seed/demo flows | demo | dev |

If you do not label these fields correctly, dashboard analytics can mix operational and non-production signals.
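A pre-send check along these lines (an illustrative helper, not an official validator) catches mislabeled enum fields before they pollute the dashboard:

```python
# Enum fields accepted by the ingest API, per the tables above.
ALLOWED = {
    "provider": {"openai", "anthropic"},
    "status": {"success", "error"},
    "dataMode": {"real", "test", "demo"},
    "environment": {"prod", "staging", "dev"},
}

def label_errors(payload):
    """Return a list of problems with the payload's enum fields (empty = OK)."""
    return [
        f"{field}={payload.get(field)!r} not in {sorted(allowed)}"
        for field, allowed in ALLOWED.items()
        if payload.get(field) not in allowed
    ]
```

Run it in CI or at startup so a typo like `provider="opnai"` fails fast instead of producing unknown-model attribution.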

Architecture

flowchart LR
  A["LLM call"] --> B["Map usage + latency"]
  B --> C["Build Opsmeter payload"]
  C --> D["POST /v1/ingest/llm-request"]
  D --> E["Dashboard / Budgets / Alerts"]

Quick visual

(Screenshot: quickstart example run)

Examples

Launch-ready wedges

These are lighter assets optimized for GitHub discovery, founder distribution, and quick proof-of-value sharing.

Important framing:

  • The payload contract stays provider-agnostic.

  • These folders are provider-specific entry points for faster evaluation.

  • The generic direct-ingest examples under examples/node, examples/python, and examples/dotnet remain the canonical shared pattern.

  • OpenAI cost tracker example (Node)

    • Real OpenAI call + usage extraction + direct-ingest telemetry
    • Best first asset for JavaScript-heavy product teams evaluating AI cost tracking with minimal setup
  • Anthropic cost tracker example (Node)

    • Real Anthropic Messages API call + usage extraction + direct-ingest telemetry
    • Keeps the same Opsmeter payload shape while changing only the provider call
  • OpenAI cost tracker example (Python)

    • Same wedge for Python-heavy AI backends and internal tools
    • Useful when the evaluation owner is closer to ML or backend workflows
  • Anthropic cost tracker example (Python)

    • Same provider-specific entry point for Python teams using Claude
    • Makes the generic telemetry pattern obvious across providers

Example modes

SDK preview

Preview SDK contracts and reference implementations are maintained in dedicated repositories:

n8n templates

These templates cover Opsmeter n8n integration with workspace-status branching, budget-warning automation, and telemetry-paused handling.

Path: ./n8n

  • workspace-status-check.json: polls GET /v1/diagnostics/workspace-status and branches by plan/budget booleans.
  • budget-warning-to-slack.json: scheduled budget status check with Slack notification path.
  • openai-to-opsmeter-ingest.json: provider call + usage mapping + ingest + 402 plan-limit branch.
  • Import and setup guide: n8n/README.md

Common mistakes

  • Provider/model typo (example: wrong provider string), causing unknown model attribution.
  • Generating a new externalRequestId for retries (breaks idempotent behavior).
  • Blocking the request path with long telemetry timeouts.
  • Treating telemetry failure as business failure (it should be swallowed/logged).
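The last two points can be sketched as a fire-and-forget dispatcher. This is illustrative: `send` stands in for whatever performs the HTTP POST to /v1/ingest/llm-request:

```python
import threading

def dispatch_telemetry(send, payload, timeout_s=2.0):
    """Fire-and-forget telemetry: never block or fail the business path.

    `send` is the hypothetical function that POSTs the payload. Any
    exception it raises is swallowed (log it in real code); the bounded
    join keeps even the worst case to a short, predictable wait.
    """
    def _run():
        try:
            send(payload)
        except Exception:
            pass  # log here in real code; never re-raise into the request path

    t = threading.Thread(target=_run, daemon=True)
    t.start()
    t.join(timeout_s)  # drop the join entirely for pure fire-and-forget
```

The business response returns on time whether the ingest call succeeds, times out, or throws.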

CI / Quality gates

  • Node lint + tests
  • Python lint + tests
  • Dotnet build + tests
  • Smoke script runs all three examples in dry-run mode

See .github/workflows/ci.yml and scripts/smoke.sh.

Product linking text (for docs/pricing/landing)

Use this exact label when linking from the main product:

Integration examples (60-second quickstart)

Target URL:

https://github.com/opsmeter-io/opsmeter.io-integration-examples

Release

Current bootstrap release target: v0.1.0 (see CHANGELOG).

SEO and discoverability notes

Primary terms covered in this repository:

  • Opsmeter integration examples
  • LLM cost tracking integration
  • OpenAI usage monitoring
  • Anthropic telemetry integration
  • AI inference cost control
