feat: move to middleware #31
Pull Request Overview
Introduces a middleware system for Axiom AI telemetry that integrates with Vercel's AI SDK middleware architecture. This allows users to either continue using the existing wrapper classes (`wrapAISDKModel`) or adopt the new middleware approach with `wrapLanguageModel` and Axiom's middleware functions.
- Adds comprehensive middleware implementation with separate V1/V2 versions and a unified auto-detecting variant
- Maintains backwards compatibility by refactoring wrapper classes to use middleware internally (see the sketch after this list)
- Includes extensive test coverage for the new middleware functionality
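A hypothetical sketch of that backwards-compatibility pattern, assuming an AI SDK v5 / provider-spec v2 setup; `AxiomAIMiddleware` below is a declared stand-in for the unified middleware factory, not the actual Axiom implementation:

```ts
import { wrapLanguageModel } from 'ai';
import type { LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider';

// Stand-in for the unified, auto-detecting middleware factory described in this PR.
declare function AxiomAIMiddleware(options: { model: LanguageModelV2 }): LanguageModelV2Middleware;

// The legacy wrapper keeps its public signature, but internally it now just
// applies the middleware via the AI SDK's own wrapLanguageModel helper.
export function wrapAISDKModel(model: LanguageModelV2): LanguageModelV2 {
  return wrapLanguageModel({
    model,
    middleware: AxiomAIMiddleware({ model }),
  });
}
```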
Reviewed Changes
Copilot reviewed 28 out of 29 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| packages/ai/src/otel/middleware.ts | Core middleware implementation with V1/V2 support and unified auto-detection |
| packages/ai/src/otel/AxiomWrappedLanguageModelV1.ts | Refactored to use middleware internally while preserving API |
| packages/ai/src/otel/AxiomWrappedLanguageModelV2.ts | Refactored to use middleware internally while preserving API |
| packages/ai/test/otel/middleware.test.ts | Comprehensive test suite for middleware functionality |
| Various test files | Added proper telemetry initialization and cleanup |
| Example files | Updated to demonstrate new middleware usage patterns |
Files not reviewed (1)
- pnpm-lock.yaml: Language not supported
Comments suppressed due to low confidence (7)
packages/ai/src/otel/wrapTool.ts:61
- [nitpick] The variable name `_error` follows the convention for unused variables but may be confusing. Consider using a more descriptive name like `_serializationError` since this is specifically catching JSON serialization errors.
`} catch (_error) {`
packages/ai/src/otel/wrapTool.ts:73
- [nitpick] The variable name `_error` follows the convention for unused variables but may be confusing. Consider using a more descriptive name like `_serializationError` since this is specifically catching JSON serialization errors.
`} catch (_error) {`
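An illustration of the rename these two comments suggest; the surrounding helper is hypothetical, not the actual wrapTool.ts code:

```ts
// Hypothetical helper showing the suggested catch-variable name; not the real
// wrapTool.ts implementation.
function safeStringify(value: unknown): string {
  try {
    return JSON.stringify(value);
  } catch (_serializationError) {
    // JSON.stringify can throw (e.g. on circular references); fall back to a
    // best-effort string and deliberately ignore the error itself.
    return String(value);
  }
}
```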
packages/ai/package.json:58
- The version `2.0.0-beta.9` should be verified to exist. Based on my knowledge cutoff in January 2025, I cannot confirm this specific beta version exists.
`"@ai-sdk/anthropicv2": "npm:@ai-sdk/anthropic@2.0.0-beta.9",`
packages/ai/package.json:60
- The version `2.0.0-beta.12` should be verified to exist. Based on my knowledge cutoff in January 2025, I cannot confirm this specific beta version exists.
`"@ai-sdk/openaiv2": "npm:@ai-sdk/openai@2.0.0-beta.12",`
packages/ai/package.json:72
- The version `5.0.0-beta.28` should be verified to exist. Based on my knowledge cutoff in January 2025, I cannot confirm this specific beta version exists.
`"aiv5": "npm:ai@5.0.0-beta.28",`
examples/example-middleware/package.json:9
- The version `2.0.0-beta.12` should be verified to exist. Based on my knowledge cutoff in January 2025, I cannot confirm this specific beta version exists.
`"@ai-sdk/openaiv2": "npm:@ai-sdk/openai@2.0.0-beta.12",`
examples/example-middleware/package.json:12
- The version `5.0.0-beta.28` should be verified to exist. Based on my knowledge cutoff in January 2025, I cannot confirm this specific beta version exists.
`"aiv5": "npm:ai@5.0.0-beta.28",`
* feat: move to middleware
* prettier
* rename
* add tests, examples, and docs
* remove readme here
* shared typecheck/lint/format
* pass build
* fix wrapTool types
* fix import
Middleware
We are moving the Vercel AI SDK wrapper to the middleware pattern that is encouraged in their docs.
There are two different ways of using it (both are sketched below):
- `wrapLanguageModel` from the Vercel SDK itself: `wrapLanguageModel({ model, middleware: [AxiomAIMiddleware({ model })] })`
- `wrapAISDKModel(model)`, which now calls the new middleware internally. There is no user-facing change to this API.
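A sketch of both options; the Axiom import path below is an assumption, and the provider/model are only for illustration:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText, wrapLanguageModel } from 'ai';
// The import path for the Axiom helpers is an assumption; adjust to your setup.
import { AxiomAIMiddleware, wrapAISDKModel } from '@axiomhq/ai';

const base = openai('gpt-4o-mini');

// Option 1: the AI SDK's own middleware mechanism with Axiom's middleware.
const viaMiddleware = wrapLanguageModel({
  model: base,
  middleware: [AxiomAIMiddleware({ model: base })],
});

// Option 2: the existing wrapper API, unchanged for callers, now middleware-backed.
// (Shown for comparison; not used in the call below.)
const viaWrapper = wrapAISDKModel(base);

const { text } = await generateText({
  model: viaMiddleware,
  prompt: 'Say hello from the middleware example.',
});
console.log(text);
```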
There are also `axiomAIMiddlewareV1` and `axiomAIMiddlewareV2`, which can be used if it's not possible to pass the model (which is how we know which middleware version to use) to the function.

Other changes