38 changes: 33 additions & 5 deletions ai-engineering/observe/axiom-ai-sdk-instrumentation.mdx
Follow the procedure in [Quickstart](/ai-engineering/quickstart) to set up Axiom.

## Instrument AI SDK calls

Axiom AI SDK provides helper functions for [Vercel AI SDK](https://ai-sdk.dev/docs) to wrap your existing AI model client. The `wrapAISDKModel` function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call.

Choose one of the following common Vercel AI SDK providers. For the full list of providers, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers).

<Tabs>
<Tab title="OpenAI">
</Tab>
</Tabs>
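The pattern in each tab is the same: create the provider client as usual, then wrap the model. A minimal sketch for OpenAI, assuming the `@ai-sdk/openai` package is installed (the exported name is illustrative):

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { wrapAISDKModel } from 'axiom/ai';

// Create the provider client, then wrap the model so every call
// through it emits trace data automatically.
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
export const gpt4o = wrapAISDKModel(openai('gpt-4o'));
```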


<Accordion title="Gateway provider">

To instrument calls without a Vercel AI SDK provider, use the generic Vercel AI Gateway provider. For more information, see the [Vercel documentation](https://ai-sdk.dev/providers/ai-sdk-providers/ai-gateway).

1. Run the following in your terminal to install the Vercel AI SDK:

```sh
npm i ai
```

1. Create the file `src/shared/openai.ts` with the following content:

```ts /src/shared/openai.ts
import { createGateway } from 'ai';
import { wrapAISDKModel } from 'axiom/ai';

const gateway = createGateway({
  // The gateway uses a Vercel AI Gateway key (AI_GATEWAY_API_KEY by default)
  apiKey: process.env.AI_GATEWAY_API_KEY,
});

// Wrap the model to enable automatic tracing
export const gpt4o = wrapAISDKModel(gateway('openai/gpt-4o'));
```

</Accordion>
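The exported, wrapped model is then used like any other AI SDK model, and calls through it are traced automatically. A minimal usage sketch (the prompt is illustrative):

```ts
import { generateText } from 'ai';
import { gpt4o } from '@/shared/openai';

// Calls through the wrapped model generate trace data automatically.
const { text } = await generateText({
  model: gpt4o,
  prompt: 'Say hello.',
});
console.log(text);
```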

The rest of the page explains how to work with OpenAI. The process is similar for other LLMs.

## Add context

The `withSpan` function allows you to add crucial business context to your traces. It creates a parent span around your LLM call and attaches metadata about the `capability` and `step` that you execute.

```ts /src/app/page.tsx
import { withSpan } from 'axiom/ai';
import { generateText } from 'ai';
import { gpt4o } from '@/shared/openai';

// withSpan creates a parent span around the LLM call and attaches the
// capability/step metadata. The names below are illustrative placeholders.
const result = await withSpan(
  { capability: 'ask_question', step: 'generate_answer' },
  () =>
    generateText({
      model: gpt4o,
      prompt: 'What is observability?',
    }),
);
```

For many AI capabilities, the LLM call is only part of the story. If your capability uses tools, you can trace their execution as well.

The `wrapTool` helper takes your tool’s name and its definition and returns an instrumented version. This wrapper creates a dedicated child span for every tool execution, capturing its arguments, output, and any errors.

```ts /src/app/generate-text/page.tsx
import { generateText, tool } from 'ai';
import { z } from 'zod';
import { wrapTool } from 'axiom/ai';
import { gpt4o } from '@/shared/openai';

// Hypothetical tool for illustration: wrapTool takes the tool's name and
// its definition, and returns an instrumented version.
const lookupTool = wrapTool(
  'lookup',
  tool({
    description: 'Look up the definition of a term',
    inputSchema: z.object({ term: z.string() }),
    execute: async ({ term }) => ({ definition: `Definition of ${term}` }),
  }),
);

const { text, toolResults } = await generateText({
  model: gpt4o,
  tools: { lookup: lookupTool },
  prompt: 'Look up "observability".',
});
```

The following shows how all three instrumentation functions work together in a single, real-world example:

```ts /src/app/page.tsx expandable
import { withSpan, wrapAISDKModel, wrapTool } from 'axiom/ai';
import { generateText, tool } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';
import { z } from 'zod';

// Wrap the model so every call through it is traced.
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
const gpt4o = wrapAISDKModel(openai('gpt-4o'));

// Wrap the tool so each execution gets its own child span.
// The tool itself is a hypothetical example.
const lookupTool = wrapTool(
  'lookup',
  tool({
    description: 'Look up the definition of a term',
    inputSchema: z.object({ term: z.string() }),
    execute: async ({ term }) => ({ definition: `Definition of ${term}` }),
  }),
);

// withSpan creates the parent span carrying capability/step context.
const { text } = await withSpan(
  { capability: 'ask_question', step: 'generate_answer' },
  () =>
    generateText({
      model: gpt4o,
      tools: { lookup: lookupTool },
      prompt: 'Look up "observability".',
    }),
);
```