Commit: update
gitworkflows committed Apr 11, 2024 (1 parent f0fe620, commit b028504)
Showing 161 changed files with 340 additions and 340 deletions.
2 changes: 1 addition & 1 deletion .changeset/twenty-crabs-roll.md
Original file line number Diff line number Diff line change
@@ -2,4 +2,4 @@
'ai': patch
---

- Breaking change: extract experimental AI core provider packages. They can now be imported with e.g. import { openai } from '@ai-sdk/openai' after adding them to a project.
+ Breaking change: extract experimental AI core provider packages. They can now be imported with e.g. import { openai } from '@khulnasoft/openai' after adding them to a project.
4 changes: 2 additions & 2 deletions .eslintrc.js
@@ -1,7 +1,7 @@
module.exports = {
root: true,
- // This tells ESLint to load the config from the package `eslint-config-vercel-ai`
- extends: ['vercel-ai'],
+ // This tells ESLint to load the config from the package `eslint-config-khulnasoft-ai`
+ extends: ['khulnasoft-ai'],
settings: {
next: {
rootDir: ['apps/*/'],
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/1.bug_report.yml
@@ -5,7 +5,7 @@ body:
- type: markdown
attributes:
value: |
- This template is to report bugs for the AI SDK. If you need help with your own project, feel free to [start a new thread in our discussions](https://github.com/vercel/ai/discussions).
+ This template is to report bugs for the AI SDK. If you need help with your own project, feel free to [start a new thread in our discussions](https://github.com/khulnasoft/ai/discussions).
- type: textarea
attributes:
label: Description
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/2.feature_request.yml
@@ -5,7 +5,7 @@ body:
- type: markdown
attributes:
value: |
- This template is to propose new features for the AI SDK. If you need help with your own project, feel free to [start a new thread in our discussions](https://github.com/vercel/ai/discussions).
+ This template is to propose new features for the AI SDK. If you need help with your own project, feel free to [start a new thread in our discussions](https://github.com/khulnasoft/ai/discussions).
- type: textarea
attributes:
label: Feature Description
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/config.yml
@@ -1,5 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Ask a question
- url: https://github.com/vercel/ai/discussions
+ url: https://github.com/khulnasoft/ai/discussions
about: Please ask questions in our discussions forum.
2 changes: 1 addition & 1 deletion docs/pages/docs/acknowledgements.mdx
@@ -13,4 +13,4 @@ This library is created by [Vercel](https://vercel.com) and [Next.js](https://ne
- Malte Ubl ([@cramforce](https://twitter.com/cramforce)) - [Vercel](https://vercel.com)
- Justin Ridgewell ([@jridgewell](https://github.com/jridgewell)) - [Vercel](https://vercel.com)

- [Contributors](https://github.com/vercel/ai/graphs/contributors)
+ [Contributors](https://github.com/khulnasoft/ai/graphs/contributors)
12 changes: 6 additions & 6 deletions docs/pages/docs/ai-core/anthropic.mdx
@@ -13,27 +13,27 @@ It creates language model objects that can be used with the `generateText` and `

## Setup

- The Anthropic provider is available in the `@ai-sdk/anthropic` module. You can install it with
+ The Anthropic provider is available in the `@khulnasoft/anthropic` module. You can install it with

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
- pnpm add @ai-sdk/anthropic
+ pnpm add @khulnasoft/anthropic
```

</Tab>
<Tab>

```bash
- npm i @ai-sdk/anthropic
+ npm i @khulnasoft/anthropic
```

</Tab>
<Tab>

```bash
- yarn add @ai-sdk/anthropic
+ yarn add @khulnasoft/anthropic
```

</Tab>
@@ -44,7 +44,7 @@ yarn add @ai-sdk/anthropic
You can import `Anthropic` from `ai/anthropic` and initialize a provider instance with various settings:

```ts
- import { Anthropic } from '@ai-sdk/anthropic';
+ import { Anthropic } from '@khulnasoft/anthropic';

const anthropic = new Anthropic({
baseUrl: '', // optional base URL for proxies etc.
@@ -55,7 +55,7 @@ const anthropic = new Anthropic({
The AI SDK also provides a shorthand `anthropic` import with an Anthropic provider instance that uses defaults:

```ts
- import { anthropic } from '@ai-sdk/anthropic';
+ import { anthropic } from '@khulnasoft/anthropic';
```

## Messages Models
22 changes: 11 additions & 11 deletions docs/pages/docs/ai-core/custom-provider.mdx
@@ -14,21 +14,21 @@ import { Callout } from 'nextra-theme-docs';
The AI SDK provides a language model specification.
You can write your own providers that adhere to the AI SDK language model specification and they will be compatible with the AI Core functions.

- You can find the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
- It can be imported from `'@ai-sdk/provider'`.
+ You can find the Language Model Specification in the [AI SDK repository](https://github.com/khulnasoft/ai/tree/main/packages/provider/src/language-model/v1).
+ It can be imported from `'@khulnasoft/ai-sdk-provider'`.

- We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).
+ We also provide utilities that make it easier to implement a custom provider. You can find them in the `@khulnasoft/ai-sdk-provider-utils` package ([source code](https://github.com/khulnasoft/ai/tree/main/packages/provider-utils)).

- There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/openai)
- and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).
+ There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/khulnasoft/ai/tree/main/packages/openai)
+ and a [Mistral reference implementation](https://github.com/khulnasoft/ai/tree/main/packages/mistral).

## Provider Facade

A custom provider should follow the pattern of using a provider facade with factory methods for the specific providers.
An instance of the custom provider class with default settings can be exported for convenience.

```ts filename="custom-provider-facade.ts"
- import { generateId, loadApiKey } from ''@ai-sdk/provider-utils'';
+ import { generateId, loadApiKey } from '@khulnasoft/ai-sdk-provider-utils';
import { CustomChatLanguageModel } from './custom-chat-language-model';
import { CustomChatModelId, CustomChatSettings } from './mistral-chat-settings';

@@ -78,16 +78,16 @@ export const customprovider = new CustomProvider();
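The facade snippet above is truncated in this diff view. As a hedged, self-contained sketch of the pattern it describes (a provider class exposing a factory method, plus an exported default instance), with all class and method names assumed for illustration rather than taken from the real package:

```typescript
// Illustrative sketch of the provider-facade pattern; the class and method
// names are assumptions, not the actual @khulnasoft package API.
interface CustomChatSettings {
  temperature?: number;
}

class CustomChatLanguageModel {
  constructor(
    readonly modelId: string,
    readonly settings: CustomChatSettings,
  ) {}
}

class CustomProvider {
  // Factory method: returns a configured language model object.
  chat(
    modelId: string,
    settings: CustomChatSettings = {},
  ): CustomChatLanguageModel {
    return new CustomChatLanguageModel(modelId, settings);
  }
}

// An instance with default settings is exported for convenience.
export const customprovider = new CustomProvider();
```

Callers would then write something like `customprovider.chat('model-id')`, mirroring how the reference implementations expose `mistral.chat(...)` and `openai.chat(...)`.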

## Language Model Implementation

- Please refer to the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
+ Please refer to the Language Model Specification in the [AI SDK repository](https://github.com/khulnasoft/ai/tree/main/packages/provider/src/language-model/v1).

- We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).
+ We also provide utilities that make it easier to implement a custom provider. You can find them in the `@khulnasoft/ai-sdk-provider-utils` package ([source code](https://github.com/khulnasoft/ai/tree/main/packages/provider-utils)).

- There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/openai)
- and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).
+ There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/khulnasoft/ai/tree/main/packages/openai)
+ and a [Mistral reference implementation](https://github.com/khulnasoft/ai/tree/main/packages/mistral).

### Errors

- The AI SDK provides [standardized errors](https://github.com/vercel/ai/tree/main/packages/provider/src/errors) that should be used by providers where possible.
+ The AI SDK provides [standardized errors](https://github.com/khulnasoft/ai/tree/main/packages/provider/src/errors) that should be used by providers where possible.
This will make it easier for users to debug them.
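To make the idea concrete, here is a simplified stand-in (not the SDK's actual error class, whose definition lives in the linked `errors` directory) showing how a standardized error can carry structured, debuggable fields:

```typescript
// Simplified stand-in for a standardized provider error; field names are
// illustrative assumptions, not the real @khulnasoft error API.
class ProviderAPICallError extends Error {
  constructor(
    message: string,
    readonly url: string,
    readonly statusCode: number,
    // Retryable by default for rate limits and server-side failures.
    readonly isRetryable: boolean = statusCode === 429 || statusCode >= 500,
  ) {
    super(message);
    this.name = 'ProviderAPICallError';
  }
}

// A provider implementation would throw it on failed HTTP responses:
function throwOnBadResponse(url: string, status: number): void {
  if (status >= 400) {
    throw new ProviderAPICallError(
      `Request to ${url} failed with status ${status}`,
      url,
      status,
    );
  }
}
```

Because the error exposes fields like `statusCode` and `isRetryable` rather than only a message string, callers can branch on them programmatically instead of parsing text.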

### Retries, timeouts, and abort signals
12 changes: 6 additions & 6 deletions docs/pages/docs/ai-core/google.mdx
@@ -11,27 +11,27 @@ It creates language model objects that can be used with the `generateText`, `str

## Setup

- The Google provider is available in the `@ai-sdk/google` module. You can install it with
+ The Google provider is available in the `@khulnasoft/google` module. You can install it with

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
- pnpm add @ai-sdk/google
+ pnpm add @khulnasoft/google
```

</Tab>
<Tab>

```bash
- npm i @ai-sdk/google
+ npm i @khulnasoft/google
```

</Tab>
<Tab>

```bash
- yarn add @ai-sdk/google
+ yarn add @khulnasoft/google
```

</Tab>
@@ -42,7 +42,7 @@ yarn add @ai-sdk/google
You can import `Google` from `ai/google` and initialize a provider instance with various settings:

```ts
- import { Google } from '@ai-sdk/google';
+ import { Google } from '@khulnasoft/google';

const google = new Google({
baseUrl: '', // optional base URL for proxies etc.
@@ -53,7 +53,7 @@ const google = new Google({
The AI SDK also provides a shorthand `google` import with a Google provider instance that uses defaults:

```ts
- import { google } from '@ai-sdk/google';
+ import { google } from '@khulnasoft/google';
```

## Generative AI Models
4 changes: 2 additions & 2 deletions docs/pages/docs/ai-core/index.mdx
@@ -29,7 +29,7 @@ Here is a simple example for `generateText`:

```ts
import { experimental_generateText } from 'ai';
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';

const { text } = await experimental_generateText({
model: openai.chat('gpt-3.5-turbo'),
@@ -77,6 +77,6 @@ The AI SDK contains the following providers:
- [Google Provider](/docs/ai-core/google) (`ai/google`)
- [Anthropic Provider](/docs/ai-core/anthropic) (`ai/anthropic`)

- The AI SDK also provides a [language model specification](https://github.com/vercel/ai/tree/main/packages/core/spec/language-model/v1) that you can use to implement [custom providers](/docs/ai-core/custom-provider).
+ The AI SDK also provides a [language model specification](https://github.com/khulnasoft/ai/tree/main/packages/core/spec/language-model/v1) that you can use to implement [custom providers](/docs/ai-core/custom-provider).

![AI SDK Diagram](/images/ai-sdk-diagram.png)
12 changes: 6 additions & 6 deletions docs/pages/docs/ai-core/mistral.mdx
@@ -11,27 +11,27 @@ It creates language model objects that can be used with the `generateText`, `str

## Setup

- The Mistral provider is available in the `@ai-sdk/mistral` module. You can install it with
+ The Mistral provider is available in the `@khulnasoft/mistral` module. You can install it with

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
- pnpm add @ai-sdk/mistral
+ pnpm add @khulnasoft/mistral
```

</Tab>
<Tab>

```bash
- npm i @ai-sdk/mistral
+ npm i @khulnasoft/mistral
```

</Tab>
<Tab>

```bash
- yarn add @ai-sdk/mistral
+ yarn add @khulnasoft/mistral
```

</Tab>
@@ -42,7 +42,7 @@ yarn add @ai-sdk/mistral
You can import `Mistral` from `ai/mistral` and initialize a provider instance with various settings:

```ts
- import { Mistral } from '@ai-sdk/mistral';
+ import { Mistral } from '@khulnasoft/mistral';

const mistral = new Mistral({
baseUrl: '', // optional base URL for proxies etc.
@@ -53,7 +53,7 @@ const mistral = new Mistral({
The AI SDK also provides a shorthand `mistral` import with a Mistral provider instance that uses defaults:

```ts
- import { mistral } from '@ai-sdk/mistral';
+ import { mistral } from '@khulnasoft/mistral';
```

## Chat Models
12 changes: 6 additions & 6 deletions docs/pages/docs/ai-core/openai.mdx
@@ -11,27 +11,27 @@ It creates language model objects that can be used with the `generateText`, `str

## Setup

- The OpenAI provider is available in the `@ai-sdk/openai` module. You can install it with
+ The OpenAI provider is available in the `@khulnasoft/openai` module. You can install it with

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
- pnpm add @ai-sdk/openai
+ pnpm add @khulnasoft/openai
```

</Tab>
<Tab>

```bash
- npm i @ai-sdk/openai
+ npm i @khulnasoft/openai
```

</Tab>
<Tab>

```bash
- yarn add @ai-sdk/openai
+ yarn add @khulnasoft/openai
```

</Tab>
@@ -42,7 +42,7 @@ yarn add @ai-sdk/openai
You can import `OpenAI` from `ai/openai` and initialize a provider instance with various settings:

```ts
- import { OpenAI } from '@ai-sdk/openai'
+ import { OpenAI } from '@khulnasoft/openai'

const openai = new OpenAI({
baseUrl: '', // optional base URL for proxies etc.
@@ -54,7 +54,7 @@ const openai = new OpenAI({
The AI SDK also provides a shorthand `openai` import with an OpenAI provider instance that uses defaults:

```ts
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';
```

## Chat Models
8 changes: 4 additions & 4 deletions docs/pages/docs/ai-core/stream-text.mdx
@@ -90,7 +90,7 @@ by the [`useCompletion`](/docs/api-reference/use-completion) hook.

```ts
import { StreamingTextResponse, experimental_streamText } from 'ai';
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';

export const runtime = 'edge';

@@ -114,7 +114,7 @@ by the [`useChat`](/docs/api-reference/use-chat) hook.

```ts
import { StreamingTextResponse, experimental_streamText } from 'ai';
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';

export const runtime = 'edge';

@@ -135,7 +135,7 @@ export async function POST(req: Request) {

```ts
import { ExperimentalMessage, experimental_streamText } from 'ai';
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';
import * as readline from 'node:readline/promises';

const terminal = readline.createInterface({
@@ -181,7 +181,7 @@ import {
ToolResultPart,
experimental_streamText,
} from 'ai';
- import { openai } from '@ai-sdk/openai';
+ import { openai } from '@khulnasoft/openai';
import * as readline from 'node:readline/promises';

const terminal = readline.createInterface({
@@ -4,7 +4,7 @@

The `HuggingFaceStream` function is a utility that transforms the output from an array of text generation models hosted on [Hugging Face.co](https://huggingface.co) into a `ReadableStream`. The transformation uses an `AsyncGenerator` as provided by the [Hugging Face Inference SDK](https://huggingface.co/docs/huggingface.js/inference/README)'s `hf.textGenerationStream` method. This feature enables you to handle AI responses in real-time by means of a readable stream.

- While `HuggingFaceStream` is compatible with _most_ Hugging Face text generation models, the rapidly evolving landscape of models may result in certain new or niche models not being supported. If you encounter a model that isn't supported, we encourage you to [open an issue](https://github.com/vercel/ai/issues/new).
+ While `HuggingFaceStream` is compatible with _most_ Hugging Face text generation models, the rapidly evolving landscape of models may result in certain new or niche models not being supported. If you encounter a model that isn't supported, we encourage you to [open an issue](https://github.com/khulnasoft/ai/issues/new).

To ensure that AI responses are comprised purely of text without any delimiters that could pose issues when rendering in chat or completion modes, we standardize and remove special end-of-response tokens. If your use case requires a different handling of responses, you can fork and modify this stream to meet your specific needs.
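The token-stripping step described above can be sketched as a small string transform. The token list here is an assumption for illustration; the actual `HuggingFaceStream` maintains its own set internally:

```typescript
// Hedged sketch: remove common end-of-response tokens from a streamed chunk.
// The token list is illustrative, not the library's authoritative set.
const END_OF_RESPONSE_TOKENS = ['</s>', '<|endoftext|>', '<|end|>'];

function stripEndTokens(chunk: string): string {
  let cleaned = chunk;
  for (const token of END_OF_RESPONSE_TOKENS) {
    // split/join removes every occurrence without regex-escaping concerns.
    cleaned = cleaned.split(token).join('');
  }
  return cleaned;
}
```

Applying such a transform per chunk keeps the readable stream free of delimiter tokens that would otherwise leak into rendered chat or completion output.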

2 changes: 1 addition & 1 deletion docs/pages/docs/api-reference/stream-data.mdx
@@ -9,7 +9,7 @@ import { Callout } from 'nextra-theme-docs';
# `StreamData`

The `StreamData` class allows you to stream arbitrary data to the client alongside your LLM response.
- For information on the implementation, see the associated [pull request](https://github.com/vercel/ai/pull/425).
+ For information on the implementation, see the associated [pull request](https://github.com/khulnasoft/ai/pull/425).

## Usage

8 changes: 4 additions & 4 deletions docs/pages/docs/guides/frameworks/nextjs-app.mdx
@@ -176,7 +176,7 @@ export default function MyComponent() {

## Examples

- - [next-openai](https://github.com/vercel/ai/tree/main/examples/next-openai)
- - [next-replicate](https://github.com/vercel/ai/tree/main/examples/next-replicate)
- - [next-huggingface](https://github.com/vercel/ai/tree/main/examples/next-huggingface)
- - [next-langchain](https://github.com/vercel/ai/tree/main/examples/next-langchain)
+ - [next-openai](https://github.com/khulnasoft/ai/tree/main/examples/next-openai)
+ - [next-replicate](https://github.com/khulnasoft/ai/tree/main/examples/next-replicate)
+ - [next-huggingface](https://github.com/khulnasoft/ai/tree/main/examples/next-huggingface)
+ - [next-langchain](https://github.com/khulnasoft/ai/tree/main/examples/next-langchain)
4 changes: 2 additions & 2 deletions docs/pages/docs/guides/frameworks/nuxt.mdx
@@ -81,5 +81,5 @@ To see the full list of options for `useChat`, see the [API reference](/docs/api

## Examples

- - [nuxt-openai](https://github.com/vercel/ai/tree/main/examples/nuxt-openai)
- - [nuxt-langchain](https://github.com/vercel/ai/tree/main/examples/nuxt-langchain)
+ - [nuxt-openai](https://github.com/khulnasoft/ai/tree/main/examples/nuxt-openai)
+ - [nuxt-langchain](https://github.com/khulnasoft/ai/tree/main/examples/nuxt-langchain)
4 changes: 2 additions & 2 deletions docs/pages/docs/guides/frameworks/solidjs.mdx
@@ -84,8 +84,8 @@ export const POST = async (event: APIEvent) => {
};
```

- You can visit the full SolidStart example [on GitHub](https://github.com/vercel/ai/blob/main/examples/solidstart-openai).
+ You can visit the full SolidStart example [on GitHub](https://github.com/khulnasoft/ai/blob/main/examples/solidstart-openai).

## Examples

- - [solidstart-openai](https://github.com/vercel/ai/tree/main/examples/solidstart-openai)
+ - [solidstart-openai](https://github.com/khulnasoft/ai/tree/main/examples/solidstart-openai)