From 9719c32a2ddcc673279999572d415e4a68ee5a94 Mon Sep 17 00:00:00 2001 From: Mano Toth Date: Wed, 24 Sep 2025 18:13:43 +0200 Subject: [PATCH 1/4] Add Gateway provider route to AI SDK --- .../observe/axiom-ai-sdk-instrumentation.mdx | 92 +++++-------------- 1 file changed, 25 insertions(+), 67 deletions(-) diff --git a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx index ebf8ba1e..30d9b555 100644 --- a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx +++ b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx @@ -11,6 +11,8 @@ This page explains how to set up instrumentation in your TypeScript generative A Axiom AI SDK is an open-source project and welcomes your contributions. For more information, see the [GitHub repository](https://github.com/axiomhq/ai). + +This page explains how to work with OpenAI. The process is similar for other LLMs. Alternatively, [instrument your app manually](/ai-engineering/observe/manual-instrumentation). For more information on instrumentation approaches, see [Introduction to Observe](/ai-engineering/observe). @@ -21,17 +23,23 @@ Follow the procedure in [Quickstart](/ai-engineering/quickstart) to set up Axiom ## Instrument AI SDK calls -Axiom AI SDK provides helper functions for Vercel’s [AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. - -The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call. +Axiom AI SDK provides helper functions for Vercel’s [AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call. - + + +To instrument calls with a Vercel AI SDK provider: + +1. Run the following in your terminal to install the Vercel AI SDK: + + ```sh + npm i ai + ``` -1. 
Run the following in your terminal to install the Vercel AI SDK and the OpenAI provider. +1. Install the AI SDK provider for the LLM you want to use. For example, run the following in your terminal to install the OpenAI provider. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers/ai-gateway). ```sh - npm i ai @ai-sdk/openai + npm i @ai-sdk/openai ``` 1. Create the file `src/shared/openai.ts` with the following content: @@ -46,86 +54,36 @@ The `wrapAISDKModel` function takes an existing AI model object and returns an i // Wrap the model to enable automatic tracing export const gpt4o = wrapAISDKModel(openaiProvider('gpt-4o')); - export const gpt4oMini = wrapAISDKModel(openaiProvider('gpt-4o-mini')); ``` - - -1. Run the following in your terminal to install the Vercel AI SDK and the Anthropic provider. - - ```sh - npm i ai @ai-sdk/anthropic - ``` - -1. Create the file `src/shared/anthropic.ts` with the following content: - - ```ts /src/shared/anthropic.ts - import { createAnthropic } from '@ai-sdk/anthropic'; - import { wrapAISDKModel } from 'axiom/ai'; - - const anthropicProvider = createAnthropic({ - apiKey: process.env.ANTHROPIC_API_KEY, - }); + - // Wrap the model to enable automatic tracing - export const claude35Sonnet = wrapAISDKModel(anthropicProvider('claude-3-5-sonnet-20241022')); - export const claude35Haiku = wrapAISDKModel(anthropicProvider('claude-3-5-haiku-20241022')); - ``` - - - +To instrument calls without a Vercel AI SDK provider, use the generic Vercel AI Gateway provider. For more information, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers/ai-gateway). -1. Run the following in your terminal to install the Vercel AI SDK and the Gemini provider. +1. Run the following in your terminal to install the Vercel AI SDK: ```sh - npm i ai @ai-sdk/google + npm i ai ``` -1. 
Create the file `src/shared/gemini.ts` with the following content:
-
-   ```ts /src/shared/gemini.ts
-   import { createGoogleGenerativeAI } from '@ai-sdk/google';
-   import { wrapAISDKModel } from 'axiom/ai';
-
-   const geminiProvider = createGoogleGenerativeAI({
-     apiKey: process.env.GEMINI_API_KEY,
-   });
-
-   // Wrap the model to enable automatic tracing
-   export const gemini20Flash = wrapAISDKModel(geminiProvider('gemini-2.0-flash-exp'));
-   export const gemini15Pro = wrapAISDKModel(geminiProvider('gemini-1.5-pro'));
-   ```
-
-
-
-
-1. Run the following in your terminal to install the Vercel AI SDK and the Grok provider.
-
-   ```sh
-   npm i ai @ai-sdk/xai
-   ```
-
-1. Create the file `src/shared/grok.ts` with the following content:
+1. Create the file `src/shared/openai.ts` with the following content:
 
-   ```ts /src/shared/grok.ts
-   import { createXai } from '@ai-sdk/xai';
-   import { wrapAISDKModel } from 'axiom/ai';
+   ```ts /src/shared/openai.ts
+   import { createGateway } from 'ai';
+   import { wrapAISDKModel } from 'axiom/ai';
 
-   const grokProvider = createXai({
-     apiKey: process.env.XAI_API_KEY,
+   const gateway = createGateway({
+     apiKey: process.env.AI_GATEWAY_API_KEY,
    });
 
    // Wrap the model to enable automatic tracing
-   export const grokBeta = wrapAISDKModel(grokProvider('grok-beta'));
-   export const grok2Mini = wrapAISDKModel(grokProvider('grok-2-mini'));
+   export const gpt4o = wrapAISDKModel(gateway('openai/gpt-4o'));
    ```
 
-The rest of the page explains how to work with OpenAI. The process is similar for other LLMs.
-
 ## Add context
 
 The `withSpan` function allows you to add crucial business context to your traces. It creates a parent span around your LLM call and attaches metadata about the `capability` and `step` that you execute.
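The `wrapAISDKModel` helper this patch documents follows a common higher-order wrapping pattern: take a callable, return a callable with the same signature that records a span around every invocation. The sketch below illustrates that pattern only; it is not Axiom's implementation, and the span shape, the span name, and the `fakeModel` stand-in are all invented for the example:

```typescript
type Span = { name: string; startMs: number; endMs?: number; error?: string };

// In-memory trace sink; a real SDK would export OpenTelemetry spans instead.
const spans: Span[] = [];

// Wrap an async function so every call records timing and errors in a span.
function withTracing<A extends unknown[], R>(
  name: string,
  fn: (...args: A) => Promise<R>,
): (...args: A) => Promise<R> {
  return async (...args: A) => {
    const span: Span = { name, startMs: Date.now() };
    spans.push(span);
    try {
      return await fn(...args);
    } catch (err) {
      span.error = String(err);
      throw err;
    } finally {
      span.endMs = Date.now();
    }
  };
}

// Stand-in for a model call; the real wrapper delegates to the provider.
const fakeModel = async (prompt: string) => `echo: ${prompt}`;
const tracedModel = withTracing('gen_ai.call', fakeModel);

tracedModel('hello').then((out) => {
  // The wrapped function returns the same result; a span was also recorded.
  console.log(out, spans.length);
});
```

Because the wrapped function is a drop-in replacement for the original, the wrapped model exports above can be passed straight to AI SDK calls such as `generateText`.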
From 64d944eea410d48bac14b8d10073eb23a0d9d4b2 Mon Sep 17 00:00:00 2001 From: Mano Toth Date: Wed, 24 Sep 2025 21:15:27 +0200 Subject: [PATCH 2/4] Fix link --- ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx index 30d9b555..287e2d7d 100644 --- a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx +++ b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx @@ -36,7 +36,7 @@ To instrument calls with a Vercel AI SDK provider: npm i ai ``` -1. Install the AI SDK provider for the LLM you want to use. For example, run the following in your terminal to install the OpenAI provider. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers/ai-gateway). +1. Install the AI SDK provider for the LLM you want to use. For example, run the following in your terminal to install the OpenAI provider. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers). ```sh npm i @ai-sdk/openai From 6d833bbc4b91b5d4cf4b029e13cdd568cf58725e Mon Sep 17 00:00:00 2001 From: Mano Toth Date: Mon, 29 Sep 2025 11:01:55 +0200 Subject: [PATCH 3/4] Add back previous tabs --- .../observe/axiom-ai-sdk-instrumentation.mdx | 110 ++++++++++++++---- 1 file changed, 90 insertions(+), 20 deletions(-) diff --git a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx index 287e2d7d..b3967df2 100644 --- a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx +++ b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx @@ -11,8 +11,6 @@ This page explains how to set up instrumentation in your TypeScript generative A Axiom AI SDK is an open-source project and welcomes your contributions. 
For more information, see the [GitHub repository](https://github.com/axiomhq/ai). - -This page explains how to work with OpenAI. The process is similar for other LLMs. Alternatively, [instrument your app manually](/ai-engineering/observe/manual-instrumentation). For more information on instrumentation approaches, see [Introduction to Observe](/ai-engineering/observe). @@ -23,23 +21,17 @@ Follow the procedure in [Quickstart](/ai-engineering/quickstart) to set up Axiom ## Instrument AI SDK calls -Axiom AI SDK provides helper functions for Vercel’s [AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call. - - - - -To instrument calls with a Vercel AI SDK provider: +Axiom AI SDK provides helper functions for [Vercel AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call. -1. Run the following in your terminal to install the Vercel AI SDK: +Choose one of the following options to use a Vercel AI SDK provider for the most common LLMs. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers). - ```sh - npm i ai - ``` + + -1. Install the AI SDK provider for the LLM you want to use. For example, run the following in your terminal to install the OpenAI provider. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers). +1. Run the following in your terminal to install the Vercel AI SDK and the OpenAI provider. ```sh - npm i @ai-sdk/openai + npm i ai @ai-sdk/openai ``` 1. 
Create the file `src/shared/openai.ts` with the following content: @@ -54,10 +46,87 @@ To instrument calls with a Vercel AI SDK provider: // Wrap the model to enable automatic tracing export const gpt4o = wrapAISDKModel(openaiProvider('gpt-4o')); + export const gpt4oMini = wrapAISDKModel(openaiProvider('gpt-4o-mini')); ``` - + + +1. Run the following in your terminal to install the Vercel AI SDK and the Anthropic provider. + + ```sh + npm i ai @ai-sdk/anthropic + ``` + +1. Create the file `src/shared/anthropic.ts` with the following content: + + ```ts /src/shared/anthropic.ts + import { createAnthropic } from '@ai-sdk/anthropic'; + import { wrapAISDKModel } from 'axiom/ai'; + + const anthropicProvider = createAnthropic({ + apiKey: process.env.ANTHROPIC_API_KEY, + }); + + // Wrap the model to enable automatic tracing + export const claude35Sonnet = wrapAISDKModel(anthropicProvider('claude-3-5-sonnet-20241022')); + export const claude35Haiku = wrapAISDKModel(anthropicProvider('claude-3-5-haiku-20241022')); + ``` + + + + +1. Run the following in your terminal to install the Vercel AI SDK and the Gemini provider. + + ```sh + npm i ai @ai-sdk/google + ``` + +1. Create the file `src/shared/gemini.ts` with the following content: + + ```ts /src/shared/gemini.ts + import { createGoogleGenerativeAI } from '@ai-sdk/google'; + import { wrapAISDKModel } from 'axiom/ai'; + + const geminiProvider = createGoogleGenerativeAI({ + apiKey: process.env.GEMINI_API_KEY, + }); + + // Wrap the model to enable automatic tracing + export const gemini20Flash = wrapAISDKModel(geminiProvider('gemini-2.0-flash-exp')); + export const gemini15Pro = wrapAISDKModel(geminiProvider('gemini-1.5-pro')); + ``` + + + + +1. Run the following in your terminal to install the Vercel AI SDK and the Grok provider. + + ```sh + npm i ai @ai-sdk/xai + ``` + +1. 
Create the file `src/shared/grok.ts` with the following content: + + ```ts /src/shared/grok.ts + import { createXai } from '@ai-sdk/xai'; + import { wrapAISDKModel } from 'axiom/ai'; + + const grokProvider = createXai({ + apiKey: process.env.XAI_API_KEY, + }); + + // Wrap the model to enable automatic tracing + export const grokBeta = wrapAISDKModel(grokProvider('grok-beta')); + export const grok2Mini = wrapAISDKModel(grokProvider('grok-2-mini')); + ``` + + + + +To instrument calls without a Vercel AI SDK provider, use the generic Vercel AI Gateway provider. + + To instrument calls without a Vercel AI SDK provider, use the generic Vercel AI Gateway provider. For more information, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers/ai-gateway). @@ -81,14 +150,15 @@ To instrument calls without a Vercel AI SDK provider, use the generic Vercel AI export const gpt4o = wrapAISDKModel(gateway('openai/gpt-4o')); ``` - - + + +The rest of the page explains how to work with OpenAI. The process is similar for other LLMs. ## Add context The `withSpan` function allows you to add crucial business context to your traces. It creates a parent span around your LLM call and attaches metadata about the `capability` and `step` that you execute. -```typescript /src/app/page.tsx +```ts /src/app/page.tsx import { withSpan } from 'axiom/ai'; import { generateText } from 'ai'; import { gpt4o } from '@/shared/openai'; @@ -122,7 +192,7 @@ For many AI capabilities, the LLM call is only part of the story. If your capabi The `wrapTool` helper takes your tool’s name and its definition and returns an instrumented version. This wrapper creates a dedicated child span for every tool execution, capturing its arguments, output, and any errors. 
-```typescript /src/app/generate-text/page.tsx +```ts /src/app/generate-text/page.tsx import { tool } from 'ai'; import { z } from 'zod'; import { wrapTool } from 'axiom/ai'; @@ -160,7 +230,7 @@ const { text, toolResults } = await generateText({ Example of how all three instrumentation functions work together in a single, real-world example: -```typescript /src/app/page.tsx expandable +```ts /src/app/page.tsx expandable import { withSpan, wrapAISDKModel, wrapTool } from 'axiom/ai'; import { generateText, tool } from 'ai'; import { createOpenAI } from '@ai-sdk/openai'; From 8ce30a5319a3c07196ed354f37a3647188ab71e7 Mon Sep 17 00:00:00 2001 From: Mano Toth Date: Mon, 29 Sep 2025 12:19:06 +0200 Subject: [PATCH 4/4] Simplify wording --- ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx index b3967df2..07158ade 100644 --- a/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx +++ b/ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx @@ -23,7 +23,7 @@ Follow the procedure in [Quickstart](/ai-engineering/quickstart) to set up Axiom Axiom AI SDK provides helper functions for [Vercel AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call. -Choose one of the following options to use a Vercel AI SDK provider for the most common LLMs. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers). +Choose one of the following common Vercel AI SDK providers. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers).
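Taken together, these patches describe a trace shape: `withSpan` opens a parent span carrying `capability` and `step` attributes, while wrapped models and tools record child spans inside it. The following is a minimal synchronous sketch of that nesting, for intuition only: the span names and attributes are invented, real calls are async, and the real SDK uses OpenTelemetry context propagation rather than an explicit stack.

```typescript
type Span = { name: string; attrs: Record<string, string>; children: Span[] };

const stack: Span[] = [];
const roots: Span[] = [];

// Run fn inside a new span, nested under the currently active span if any.
function inSpan<T>(name: string, attrs: Record<string, string>, fn: () => T): T {
  const span: Span = { name, attrs, children: [] };
  (stack.length ? stack[stack.length - 1].children : roots).push(span);
  stack.push(span);
  try {
    return fn();
  } finally {
    stack.pop();
  }
}

// Mirrors withSpan({ capability, step }, ...) wrapping a model call and a
// wrapTool-instrumented tool call; the names here are illustrative only.
inSpan('support-agent/find-tickets', { capability: 'support-agent', step: 'find-tickets' }, () => {
  inSpan('chat gpt-4o', {}, () => 'model response');
  inSpan('execute_tool findSimilarTickets', {}, () => ['ticket-1']);
});

// Logs the two child span names nested under the capability span.
console.log(roots[0].children.map((s) => s.name));
```

Whatever the mechanism, the resulting trace tree has this shape: one parent span per capability step, with one child span per model call and per tool execution.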