From 35a3d8e41232dea7a9bbfd4f4ab309178f7abf14 Mon Sep 17 00:00:00 2001
From: Lars Grammel
Date: Fri, 10 May 2024 20:46:02 +0200
Subject: [PATCH] docs: improve custom provider guide (#1554)

---
 .../01-custom-providers.mdx                   | 49 +++++++++++++------
 1 file changed, 35 insertions(+), 14 deletions(-)

diff --git a/content/providers/03-community-providers/01-custom-providers.mdx b/content/providers/03-community-providers/01-custom-providers.mdx
index 1578d05403..910b509bb1 100644
--- a/content/providers/03-community-providers/01-custom-providers.mdx
+++ b/content/providers/03-community-providers/01-custom-providers.mdx
@@ -6,14 +6,10 @@ description: Learn how to write a custom provider for the Vercel AI SDK
 
 # Writing a Custom Provider
 
 The Vercel AI SDK provides a [Language Model Specification](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
-You can write your own provider that adheres to the specification and it will be compatible with the AI SDK.
+You can write your own provider that adheres to the specification and it will be compatible with the Vercel AI SDK.
 
 You can find the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
-It can be imported from `'@ai-sdk/provider'`.
-
-We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).
-
-There are several reference implementations, e.g. a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).
+It can be imported from `'@ai-sdk/provider'`. We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).
 
 If you open-source a provider, we'd love to promote it here.
 Please send us a
@@ -21,7 +17,21 @@ There are several reference implement
 section.
 
-## Provider Entry Point
+## Provider Implementation Guide
+
+Implementing a custom language model provider involves several steps:
+
+- Creating an entry point
+- Adding a language model implementation
+- Mapping the input (prompt, tools, settings)
+- Processing the results (generate, streaming, tool calls)
+- Supporting object generation
+
+The best way to get started is to copy a reference implementation and modify it to fit your needs.
+Check out the [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral)
+to see how the project is structured, and feel free to copy the setup.
+
+### Creating an Entry Point
 
 Each AI SDK provider should follow the pattern of using a factory function that returns a provider
 instance and provide a default instance.
@@ -114,19 +124,30 @@ export function createCustomProvider(
 export const customProvider = createCustomProvider();
 ```
 
-## Language Model Implementation
+### Implementing the Language Model
+
+A [language model](https://github.com/vercel/ai/blob/main/packages/provider/src/language-model/v1/language-model-v1.ts) needs to implement:
+
+- metadata fields
+  - `specificationVersion: 'v1'` - always `'v1'`
+  - `provider: string` - name of the provider
+  - `modelId: string` - unique identifier of the model
+  - `defaultObjectGenerationMode` - default object generation mode, e.g. "json"
+- `doGenerate` method
+- `doStream` method
 
-Please refer to the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
+Check out the [Mistral language model](https://github.com/vercel/ai/blob/main/packages/mistral/src/mistral-chat-language-model.ts) as an example.
 
-We also provide utilities that make it easier to implement a custom provider.
-You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).
+At a high level, both `doGenerate` and `doStream` methods should:
 
-There are several reference implementations, e.g. a [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/openai)
-and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).
+1. **Map the prompt and the settings to the format required by the provider API.** This can be extracted, e.g. the Mistral provider contains a `getArgs` method.
+2. **Call the provider API.** You could e.g. use fetch calls or a library offered by the provider.
+3. **Process the results.** You need to convert the response to the format required by the AI SDK.
 
-## Errors
+### Errors
 
 The AI SDK provides [standardized errors](https://github.com/vercel/ai/tree/main/packages/provider/src/errors) that should be used by providers where possible.
 This will make it easy for users to debug them.
 
-## Retries, timeouts, and abort signals
+### Retries, timeouts, and abort signals
 
 The AI SDK will handle retries, timeouts, and aborting requests in a unified way.
 The model classes should not implement retries or timeouts themselves.
 Instead, they should use the `abortSignal` parameter to determine when the call should be aborted,
 and they should throw `ApiCallErrors` (or similar) with a correct `isRetryable` flag
 when errors such as network errors occur.
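
The entry-point pattern the patch describes (a `createCustomProvider` factory plus a default `customProvider` instance) can be sketched roughly as follows. The `CustomChatLanguageModel` class and the settings shape are illustrative placeholders, not part of the AI SDK:

```typescript
// Sketch of a provider entry point. CustomChatLanguageModel and
// CustomProviderSettings are hypothetical names for illustration.

interface CustomProviderSettings {
  baseURL?: string;
  apiKey?: string;
}

// Minimal stand-in for a language model implementation.
class CustomChatLanguageModel {
  constructor(
    readonly modelId: string,
    readonly settings: CustomProviderSettings,
  ) {}
}

// The provider is callable and also exposes named factory methods.
interface CustomProvider {
  (modelId: string): CustomChatLanguageModel;
  chat(modelId: string): CustomChatLanguageModel;
}

export function createCustomProvider(
  settings: CustomProviderSettings = {},
): CustomProvider {
  const createModel = (modelId: string) =>
    new CustomChatLanguageModel(modelId, settings);

  // Make the provider itself callable, then attach the factory methods.
  const provider = function (modelId: string) {
    return createModel(modelId);
  } as CustomProvider;
  provider.chat = createModel;
  return provider;
}

// Default instance, mirroring the pattern in the guide.
export const customProvider = createCustomProvider();
```

A consumer would then write `customProvider('some-model-id')` or pass custom settings via `createCustomProvider({ baseURL: ... })`.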
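The three `doGenerate` steps the patch lists (map, call, process) might look something like this for a provider that talks to an OpenAI-style HTTP API. The option and result shapes here are heavily simplified stand-ins for the real `LanguageModelV1` types, and the endpoint path is an assumption:

```typescript
// Simplified sketch of the map -> call -> process flow for doGenerate.
// GenerateOptions/GenerateResult are illustrative, not the SDK's real types.

interface GenerateOptions {
  prompt: { role: string; content: string }[];
  temperature?: number;
  abortSignal?: AbortSignal;
}

interface GenerateResult {
  text: string;
  finishReason: string;
  usage: { promptTokens: number; completionTokens: number };
}

class CustomChatLanguageModel {
  // Metadata fields required by the specification.
  readonly specificationVersion = 'v1';
  readonly provider = 'custom';
  readonly defaultObjectGenerationMode = 'json';

  constructor(
    readonly modelId: string,
    private baseURL: string,
  ) {}

  // Step 1: map the prompt and settings to the provider API format
  // (extracted into its own method, as the Mistral provider does with getArgs).
  getArgs(options: GenerateOptions) {
    return {
      model: this.modelId,
      messages: options.prompt,
      temperature: options.temperature,
    };
  }

  async doGenerate(options: GenerateOptions): Promise<GenerateResult> {
    const args = this.getArgs(options);

    // Step 2: call the provider API, forwarding the abort signal so the
    // AI SDK can cancel the request (no retries/timeouts in the model itself).
    const response = await fetch(`${this.baseURL}/chat/completions`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(args),
      signal: options.abortSignal,
    });
    const json = await response.json();

    // Step 3: convert the provider response into the SDK result shape.
    return {
      text: json.choices[0].message.content,
      finishReason: json.choices[0].finish_reason,
      usage: {
        promptTokens: json.usage.prompt_tokens,
        completionTokens: json.usage.completion_tokens,
      },
    };
  }
}
```

`doStream` follows the same three steps, but step 3 converts a stream of provider events into SDK stream parts instead of a single result.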
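The retry guidance in the last section — throw an error carrying a correct `isRetryable` flag instead of retrying inside the model — could be sketched like this. The `ApiCallError` class and the status-code classification below are simplified stand-ins for the SDK's standardized errors, not its actual implementation:

```typescript
// Hypothetical simplified error type with the isRetryable flag the guide
// describes; the AI SDK's real standardized errors are richer than this.
class ApiCallError extends Error {
  constructor(
    message: string,
    readonly statusCode: number | undefined,
    readonly isRetryable: boolean,
  ) {
    super(message);
    this.name = 'ApiCallError';
  }
}

// One plausible classification: treat transient statuses, and a missing
// status (e.g. a network error), as retryable; client errors are not.
function toApiCallError(
  statusCode: number | undefined,
  message: string,
): ApiCallError {
  const isRetryable =
    statusCode === undefined || // network error, no HTTP response
    statusCode === 408 || // request timeout
    statusCode === 429 || // rate limited
    statusCode >= 500; // server error
  return new ApiCallError(message, statusCode, isRetryable);
}
```

The AI SDK's retry loop can then inspect `isRetryable` to decide whether to re-issue the call, while the model class itself stays free of retry logic.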