Commit

feat (ai/svelte): add useAssistant (#1593)
Co-authored-by: Jake Hall <jake.hallslm@gmail.com>
lgrammel and jaycoolslm committed May 15, 2024
1 parent 7588999 commit 18a9655
Showing 11 changed files with 547 additions and 46 deletions.
5 changes: 5 additions & 0 deletions .changeset/thick-jars-hear.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
---
'ai': patch
---

feat (ai/svelte): add useAssistant
6 changes: 5 additions & 1 deletion content/docs/05-ai-sdk-ui/03-openai-assistants.mdx
@@ -5,7 +5,11 @@ description: Learn how to use the useAssistant hook.

# OpenAI Assistants

The `useAssistant` hook allows you to handle the client state when interacting with an OpenAI compatible assistant API. This hook is useful when you want to integrate assistant capabilities into your application, with the UI updated automatically as the assistant is streaming its execution.
The `useAssistant` hook allows you to handle the client state when interacting with an OpenAI compatible assistant API.
This hook is useful when you want to integrate assistant capabilities into your application,
with the UI updated automatically as the assistant is streaming its execution.

The `useAssistant` hook is currently supported with `ai/react` and `ai/svelte`.

## Example

12 changes: 10 additions & 2 deletions content/docs/07-reference/ai-sdk-ui/03-use-assistant.mdx
@@ -5,14 +5,22 @@ description: Reference documentation for the useAssistant hook in the AI SDK UI

# `useAssistant`

Allows you to handle the client state when interacting with an OpenAI compatible assistant API. This hook is useful when you want to integrate assistant capabilities into your application, with the UI updated automatically as the assistant is streaming its execution.
Allows you to handle the client state when interacting with an OpenAI compatible assistant API.
This hook is useful when you want to integrate assistant capabilities into your application,
with the UI updated automatically as the assistant is streaming its execution.

This works in conjunction with [`AssistantResponse`]() in the backend.
This works in conjunction with [`AssistantResponse`](/docs/reference/stream-helpers/assistant-response) in the backend.

`useAssistant` is currently supported with `ai/react` and `ai/svelte`.

## Import

### React

<Snippet text={`import { useAssistant } from "ai/react"`} prompt={false} />

### Svelte

<Snippet text={`import { useAssistant } from "ai/svelte"`} prompt={false} />

<ReferenceTable packageName="react" functionName="useAssistant" />
117 changes: 117 additions & 0 deletions examples/sveltekit-openai/src/routes/api/assistant/+server.ts
@@ -0,0 +1,117 @@
import type { RequestHandler } from './$types';

import { env } from '$env/dynamic/private';

import { AssistantResponse } from 'ai';
import OpenAI from 'openai';

const openai = new OpenAI({
apiKey: env.OPENAI_API_KEY || '',
});

const homeTemperatures = {
bedroom: 20,
'home office': 21,
'living room': 21,
kitchen: 22,
bathroom: 23,
};

export const POST = (async ({ request }) => {
// Parse the request body
const input: {
threadId: string | null;
message: string;
} = await request.json();

// Create a thread if needed
const threadId = input.threadId ?? (await openai.beta.threads.create({})).id;

// Add a message to the thread
const createdMessage = await openai.beta.threads.messages.create(threadId, {
role: 'user',
content: input.message,
});

return AssistantResponse(
{ threadId, messageId: createdMessage.id },
async ({ forwardStream, sendDataMessage }) => {
// Run the assistant on the thread
const runStream = openai.beta.threads.runs.stream(threadId, {
assistant_id:
env.ASSISTANT_ID ??
(() => {
throw new Error('ASSISTANT_ID is not set');
})(),
});

// forward the run status; this streams message deltas to the client
let runResult = await forwardStream(runStream);

// status can be: queued, in_progress, requires_action, cancelling, cancelled, failed, completed, or expired
while (
runResult?.status === 'requires_action' &&
runResult.required_action?.type === 'submit_tool_outputs'
) {
const tool_outputs =
runResult.required_action.submit_tool_outputs.tool_calls.map(
(toolCall: any) => {
const parameters = JSON.parse(toolCall.function.arguments);

switch (toolCall.function.name) {
case 'getRoomTemperature': {
const temperature =
homeTemperatures[
parameters.room as keyof typeof homeTemperatures
];

return {
tool_call_id: toolCall.id,
output: temperature.toString(),
};
}

case 'setRoomTemperature': {
const oldTemperature =
homeTemperatures[
parameters.room as keyof typeof homeTemperatures
];

homeTemperatures[
parameters.room as keyof typeof homeTemperatures
] = parameters.temperature;

sendDataMessage({
role: 'data',
data: {
oldTemperature,
newTemperature: parameters.temperature,
description: `Temperature in ${parameters.room} changed from ${oldTemperature} to ${parameters.temperature}`,
},
});

return {
tool_call_id: toolCall.id,
output: `temperature set successfully`,
};
}

default:
throw new Error(
`Unknown tool call function: ${toolCall.function.name}`,
);
}
},
);

runResult = await forwardStream(
openai.beta.threads.runs.submitToolOutputsStream(
threadId,
runResult.id,
{ tool_outputs },
),
);
}
},
);
}) satisfies RequestHandler;
@@ -0,0 +1,63 @@
# Home Automation Assistant Example

## Setup

### Create OpenAI Assistant

[OpenAI Assistant Website](https://platform.openai.com/assistants)

Create a new assistant, enable Code Interpreter, and add the following functions and instructions to it.

Then add the assistant id to the `.env` file as `ASSISTANT_ID=your-assistant-id`.
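A minimal `.env` for this example could look like the following (both values are placeholders; the server route reads `OPENAI_API_KEY` and `ASSISTANT_ID` from the private environment):

```
OPENAI_API_KEY=your-openai-api-key
ASSISTANT_ID=your-assistant-id
```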

### Instructions

```
You are an assistant with access to a home automation system. You can get and set the temperature in the bedroom, home office, living room, kitchen and bathroom.
The system uses temperature in Celsius. If the user requests Fahrenheit, you should convert the temperature to Fahrenheit.
```
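Since the instructions ask the assistant to convert Celsius readings to Fahrenheit on request, the expected conversion is the standard one; a quick sketch (not part of the example app):

```typescript
// Standard Celsius-to-Fahrenheit conversion the instructions ask for.
function celsiusToFahrenheit(celsius: number): number {
  return celsius * (9 / 5) + 32;
}

// e.g. the bedroom's 20 °C reading
console.log(celsiusToFahrenheit(20)); // → 68
```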

### getRoomTemperature function

```json
{
"name": "getRoomTemperature",
"description": "Get the temperature in a room",
"parameters": {
"type": "object",
"properties": {
"room": {
"type": "string",
"enum": ["bedroom", "home office", "living room", "kitchen", "bathroom"]
}
},
"required": ["room"]
}
}
```

### setRoomTemperature function

```json
{
"name": "setRoomTemperature",
"description": "Set the temperature in a room",
"parameters": {
"type": "object",
"properties": {
"room": {
"type": "string",
"enum": ["bedroom", "home office", "living room", "kitchen", "bathroom"]
},
"temperature": { "type": "number" }
},
"required": ["room", "temperature"]
}
}
```
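The `enum` in both schemas constrains `room` to the five known rooms. As a sketch of how a handler could validate parsed tool-call arguments against that enum before dispatching (the `isRoom` guard is hypothetical, not part of the example server):

```typescript
// The same room list as the JSON schema enums above.
const rooms = ['bedroom', 'home office', 'living room', 'kitchen', 'bathroom'] as const;
type Room = (typeof rooms)[number];

// Hypothetical guard: narrow an untyped tool-call argument to a known room.
function isRoom(value: unknown): value is Room {
  return typeof value === 'string' && rooms.includes(value as Room);
}

// Arguments arrive as a JSON string on the tool call.
const parameters = JSON.parse('{"room":"kitchen","temperature":21}');
if (!isRoom(parameters.room)) {
  throw new Error(`Unknown room: ${parameters.room}`);
}
```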

## Run

1. Run `pnpm dev` in `examples/sveltekit-openai`
2. Go to http://localhost:5173/assistant
46 changes: 46 additions & 0 deletions examples/sveltekit-openai/src/routes/assistant/+page.svelte
@@ -0,0 +1,46 @@
<script>
import { useAssistant } from 'ai/svelte';
const { messages, input, submitMessage } = useAssistant({
api: '/api/assistant',
});
</script>

<svelte:head>
<title>Home</title>
<meta name="description" content="Svelte demo app" />
</svelte:head>

<section>
<h1>useAssistant</h1>
<ul>
{#each $messages as m}
<strong>{m.role}</strong>
{#if m.role !== 'data'}
{m.content}
{/if}
{#if m.role === 'data'}
<pre>{JSON.stringify(m.data, null, 2)}</pre>
{/if}
<br/>
<br/>
{/each}
</ul>
<form on:submit={submitMessage}>
<input bind:value={$input} />
<button type="submit">Send</button>
</form>
</section>

<style>
section {
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
flex: 0.6;
}
h1 {
width: 100%;
}
</style>
51 changes: 8 additions & 43 deletions packages/core/react/use-assistant.ts
@@ -4,10 +4,12 @@ import { isAbortError } from '@ai-sdk/provider-utils';
import { useCallback, useRef, useState } from 'react';
import { generateId } from '../shared/generate-id';
import { readDataStream } from '../shared/read-data-stream';
import { CreateMessage, Message } from '../shared/types';
import { abort } from 'node:process';

export type AssistantStatus = 'in_progress' | 'awaiting_message';
import {
AssistantStatus,
CreateMessage,
Message,
UseAssistantOptions,
} from '../shared/types';

export type UseAssistantHelpers = {
/**
@@ -16,7 +18,7 @@ export type UseAssistantHelpers = {
messages: Message[];

/**
* setState-powered method to update the messages array.
* Update the message store with a new array of messages.
*/
setMessages: React.Dispatch<React.SetStateAction<Message[]>>;

@@ -83,42 +85,6 @@ Abort the current request immediately, keep the generated tokens if any.
error: undefined | unknown;
};

export type UseAssistantOptions = {
/**
* The API endpoint that accepts a `{ threadId: string | null; message: string; }` object and returns an `AssistantResponse` stream.
* The threadId refers to an existing thread with messages (or is `null` to create a new thread).
* The message is the next message that should be appended to the thread and sent to the assistant.
*/
api: string;

/**
* An optional string that represents the ID of an existing thread.
* If not provided, a new thread will be created.
*/
threadId?: string;

/**
* An optional literal that sets the mode of credentials to be used on the request.
* Defaults to "same-origin".
*/
credentials?: RequestCredentials;

/**
* An optional object of headers to be passed to the API endpoint.
*/
headers?: Record<string, string> | Headers;

/**
* An optional, additional body object to be passed to the API endpoint.
*/
body?: object;

/**
* An optional callback that will be called when the assistant encounters an error.
*/
onError?: (error: Error) => void;
};

export function useAssistant({
api,
threadId: threadIdParam,
@@ -254,8 +220,7 @@ export function useAssistant({
}

case 'error': {
const errorObj = new Error(value);
setError(errorObj);
setError(new Error(value));
break;
}
}
2 changes: 2 additions & 0 deletions packages/core/shared/types.ts
@@ -1,6 +1,8 @@
import { ToolCall as CoreToolCall } from '../core/generate-text/tool-call';
import { ToolResult as CoreToolResult } from '../core/generate-text/tool-result';

export * from './use-assistant-types';

// https://github.com/openai/openai-node/blob/07b3504e1c40fd929f4aae1651b83afc19e3baf8/src/resources/chat/completions.ts#L146-L159
export interface FunctionCall {
/**
38 changes: 38 additions & 0 deletions packages/core/shared/use-assistant-types.ts
@@ -0,0 +1,38 @@
// Define a type for the assistant status
export type AssistantStatus = 'in_progress' | 'awaiting_message';

export type UseAssistantOptions = {
/**
* The API endpoint that accepts a `{ threadId: string | null; message: string; }` object and returns an `AssistantResponse` stream.
* The threadId refers to an existing thread with messages (or is `null` to create a new thread).
* The message is the next message that should be appended to the thread and sent to the assistant.
*/
api: string;

/**
* An optional string that represents the ID of an existing thread.
* If not provided, a new thread will be created.
*/
threadId?: string;

/**
* An optional literal that sets the mode of credentials to be used on the request.
* Defaults to "same-origin".
*/
credentials?: RequestCredentials;

/**
* An optional object of headers to be passed to the API endpoint.
*/
headers?: Record<string, string> | Headers;

/**
* An optional, additional body object to be passed to the API endpoint.
*/
body?: object;

/**
* An optional callback that will be called when the assistant encounters an error.
*/
onError?: (error: Error) => void;
};
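For reference, a minimal object satisfying this options type could look like the sketch below (the endpoint path is illustrative, and the type is re-declared here in trimmed form so the snippet is self-contained):

```typescript
// Trimmed re-declaration of UseAssistantOptions for a self-contained sketch.
type UseAssistantOptions = {
  api: string;
  threadId?: string;
  credentials?: 'omit' | 'same-origin' | 'include'; // RequestCredentials in the real type
  headers?: Record<string, string>;
  body?: object;
  onError?: (error: Error) => void;
};

// `api` is the only required field; everything else has a default or is optional.
const options: UseAssistantOptions = {
  api: '/api/assistant',
  credentials: 'same-origin',
  onError: error => console.error(error.message),
};

console.log(options.api); // → /api/assistant
```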
