
bug: Model name not set when using Vercel AI SDK and Azure OpenAI provider #5286

Open
vercel/ai
#6803
@oysteinborg-try

Description


Describe the bug

We are using the Vercel AI SDK with the Azure OpenAI provider, as documented here: https://sdk.vercel.ai/providers/ai-sdk-providers/azure
When invoking streamText, or any other Vercel AI SDK function, the model name does not get set by the LangfuseExporter.

const streamResult = streamText({
  model: azure('gpt-4o'),
  ...

Output from LangfuseExporter debug:

"id":"cb893129ba276daf",
"startTime":"2025-01-29T20:40:16.652Z",
"traceId":"8374ca67-8ecf-437b-b805-2eb850384aa",
"parentObservationId":"f7758888946238da",
"name":"ai.streamText.doStream",
"endTime":"2025-01-29T20:40:17.568Z",
"completionStartTime":"2025-01-29T20:40:17.044Z",
"model":"",
"modelParameters":{"toolChoice":"{\"type\":\"auto\"}","system":"azure-openai.chat","maxRetries":"2"}

Langfuse is able to track the tokens used, but there is no cost associated with an empty model name, and I can't get the regex of a custom model definition to match it.

To reproduce

Use the Vercel AI SDK with the Azure OpenAI provider and this integration: https://langfuse.com/docs/integrations/vercel-ai-sdk
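For reference, a minimal sketch of our setup. The resource name, deployment name, and prompt are placeholders; telemetry is enabled so spans reach the LangfuseExporter:

```typescript
// Minimal reproduction sketch (placeholder resource/deployment names).
import { createAzure } from '@ai-sdk/azure';
import { streamText } from 'ai';

// Azure OpenAI provider instance; resourceName and apiKey are placeholders.
const azure = createAzure({
  resourceName: 'my-resource',
  apiKey: process.env.AZURE_API_KEY,
});

const streamResult = streamText({
  model: azure('gpt-4o'), // Azure deployment name
  prompt: 'Hello',
  // Enable OpenTelemetry spans so the LangfuseExporter receives them.
  experimental_telemetry: { isEnabled: true },
});
```

With this setup, the exported `ai.streamText.doStream` span arrives at Langfuse with an empty "model" field, as shown in the debug output above.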

SDK and container versions

"@ai-sdk/azure": "^1.1.5",
"@ai-sdk/openai": "^1.1.5",
"@ai-sdk/provider": "^1.0.6",

"@opentelemetry/api-logs": "^0.57.1",
"@opentelemetry/instrumentation": "^0.57.1",
"@opentelemetry/sdk-logs": "^0.57.1",
"@opentelemetry/sdk-trace-base": "^1.30.1",

"@vercel/otel": "^1.10.1",
"ai": "^4.1.11",

"langfuse": "^3.33.1",
"langfuse-vercel": "^3.33.1",

Additional information

No response

Are you interested in contributing a fix for this bug?

No
