
add llm client option for AI SDK experimental telemetry config #759


Open
wants to merge 2 commits into base: main

Conversation


@dinmukhamedm dinmukhamedm commented May 15, 2025

Note: apologies, the evals branch recommended in your contributing guide doesn't seem to exist.

why

To enhance observability of AI SDK calls in Laminar (or any other tracing backend).

This allows users to configure telemetry for AI SDK calls.

what changed

Added a new aiSdkTelemetrySettings field to ClientOptions and passed it down to the generateText and generateObject calls.
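The threading described above can be sketched as follows. This is a minimal sketch, not the actual Stagehand source: TelemetrySettings and the AISdkClient internals are simplified stand-ins, and buildGenerateTextArgs is a hypothetical helper used only to show how the stored settings reach the AI SDK's experimental_telemetry parameter.

```typescript
// Simplified stand-in for the AI SDK's TelemetrySettings type.
interface TelemetrySettings {
  isEnabled?: boolean;
  tracer?: unknown; // an OpenTelemetry Tracer in the real SDK
}

// Simplified stand-in for Stagehand's ClientOptions.
interface ClientOptions {
  apiKey?: string;
  // New optional field added by this PR (name taken from the diff).
  aiSdkTelemetrySettings?: TelemetrySettings;
}

// Stand-in for the options object passed to the AI SDK's generateText.
interface GenerateTextArgs {
  prompt: string;
  experimental_telemetry?: TelemetrySettings;
}

class AISdkClient {
  constructor(private telemetrySettings?: TelemetrySettings) {}

  // Hypothetical helper: the client forwards its stored settings as
  // `experimental_telemetry` on each generateText/generateObject call.
  buildGenerateTextArgs(prompt: string): GenerateTextArgs {
    return {
      prompt,
      experimental_telemetry: this.telemetrySettings,
    };
  }
}

const options: ClientOptions = {
  aiSdkTelemetrySettings: { isEnabled: true },
};
const client = new AISdkClient(options.aiSdkTelemetrySettings);
const args = client.buildGenerateTextArgs("hello");
```

The key point is that the setting is stored once on the client at construction time and then attached to every call, so callers configure telemetry in one place.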

test plan

Tested manually with a simple Laminar script:

import { Laminar, getTracer } from "@lmnr-ai/lmnr";
import { Stagehand } from "@browserbasehq/stagehand";
import dotenv from "dotenv";
dotenv.config();

Laminar.initialize({
  instrumentModules: {
    stagehand: Stagehand,
  },
});

async function example() {
  const stagehand = new Stagehand({
    modelName: "anthropic/claude-3-7-sonnet-latest",
    modelClientOptions: {
      apiKey: process.env.ANTHROPIC_API_KEY,
      // New option from this PR: forwarded as `experimental_telemetry`
      // to the AI SDK's generateText/generateObject calls.
      aiSdkTelemetrySettings: {
        isEnabled: true,
        tracer: getTracer(),
      },
    },
    env: "LOCAL",
    verbose: 0,
  });
  await stagehand.init();
  const page = stagehand.page;

  await page.goto("https://www.lmnr.ai/");

  await page.act("open the blogs page");
  // ...

  await stagehand.close();
}

(async () => {
  await example();
  await Laminar.flush();
})().then(() => {
  console.log('done');
});


changeset-bot bot commented May 15, 2025

⚠️ No Changeset found

Latest commit: 01faf26

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.



@greptile-apps greptile-apps bot left a comment


PR Summary

Added support for AI SDK experimental telemetry configuration to enhance observability for AI SDK calls, particularly for integration with Laminar.

  • Added aiSdkExperimentalTelemetry field in types/stagehand.ts ConstructorParams interface
  • Modified types/model.ts to include optional aiSdkTelemetrySettings in ClientOptions
  • Updated lib/llm/aisdk.ts to pass telemetry settings to generateObject and generateText calls
  • Added telemetry parameter handling in lib/llm/LLMProvider.ts for AISdkClient initialization
  • Consider adding version requirements and compatibility documentation for the ai package dependency


5 file(s) reviewed, 2 comment(s)

Comment on lines +48 to +50
export type ClientOptions = (OpenAIClientOptions | AnthropicClientOptions) & {
aiSdkTelemetrySettings?: TelemetrySettings;
};

style: Consider making this a discriminated union with a provider field to ensure type safety when using provider-specific options
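The reviewer's suggestion could look something like the sketch below. This is a hypothetical alternative, not code from the PR: the `provider` field and the `isOpenAI` type guard are illustrative, and the OpenAIClientOptions / AnthropicClientOptions shapes are simplified.

```typescript
interface TelemetrySettings {
  isEnabled?: boolean;
}

// Simplified stand-ins for the provider-specific option types.
interface OpenAIClientOptions { apiKey?: string; organization?: string; }
interface AnthropicClientOptions { apiKey?: string; }

// Discriminated union: the `provider` literal narrows which
// provider-specific options are valid for a given value.
type ClientOptions =
  | ({ provider: "openai" } & OpenAIClientOptions & {
      aiSdkTelemetrySettings?: TelemetrySettings;
    })
  | ({ provider: "anthropic" } & AnthropicClientOptions & {
      aiSdkTelemetrySettings?: TelemetrySettings;
    });

// Type guard: after this check, TypeScript knows `organization`
// is a valid property and `provider` is "openai".
function isOpenAI(
  o: ClientOptions
): o is Extract<ClientOptions, { provider: "openai" }> {
  return o.provider === "openai";
}

const opts: ClientOptions = {
  provider: "anthropic",
  apiKey: "example-key",
  aiSdkTelemetrySettings: { isEnabled: true },
};
```

With the intersection type in the current diff, nothing stops a caller from mixing OpenAI-only and Anthropic-only fields; the discriminant makes such mixes a compile-time error.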

@dinmukhamedm dinmukhamedm changed the title add constructor param for AI SDK experimental telemetry config add llm client option for AI SDK experimental telemetry config May 15, 2025
dinmukhamedm added a commit to lmnr-ai/stagehand that referenced this pull request May 15, 2025