4 changes: 4 additions & 0 deletions .github/workflows/e2e.yml
@@ -15,6 +15,8 @@ jobs:
cancel-in-progress: true
permissions:
contents: read
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

steps:
- name: Checkout
@@ -41,6 +43,8 @@ jobs:

- name: Run Playwright smoke tests
run: pnpm e2e:smoke
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

- name: Upload Playwright report
if: always()
201 changes: 201 additions & 0 deletions docs/content/docs/plugins/ai-chat.mdx
@@ -0,0 +1,201 @@
---
title: AI Chat Plugin
description: Add AI-powered chat functionality with conversation history, streaming, and customizable models
---

import { Tabs, Tab } from "fumadocs-ui/components/tabs";
import { Callout } from "fumadocs-ui/components/callout";

## Installation

Follow these steps to add the AI Chat plugin to your Better Stack setup.

### 1. Add Plugin to Backend API

Import and register the AI Chat backend plugin in your `better-stack.ts` file:

```ts title="lib/better-stack.ts"
import { betterStack } from "@btst/stack"
import { aiChatBackendPlugin } from "@btst/stack/plugins/ai-chat/api"
import { openai } from "@ai-sdk/openai"
// ... your adapter imports

const { handler, dbSchema } = betterStack({
basePath: "/api/data",
plugins: {
aiChat: aiChatBackendPlugin({
model: openai("gpt-4o"), // Or any LanguageModel from AI SDK
hooks: {
onBeforeChat: async (messages, context) => {
// Optional: Add authorization logic
return true
},
}
})
},
adapter: (db) => createMemoryAdapter(db)({})
})

export { handler, dbSchema }
```

The `aiChatBackendPlugin()` requires a `model` parameter (a `LanguageModel` from the AI SDK) and accepts optional hooks for customizing its behavior (authorization, logging, and so on).

<Callout type="info">
**Model Configuration:** You can use any model from the AI SDK, including OpenAI, Anthropic, Google, and more. Make sure to install the corresponding provider package (e.g., `@ai-sdk/openai`) and set up your API keys in environment variables.
</Callout>
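
If you prefer to pass the API key explicitly instead of relying on the provider's default environment variable lookup, here is a minimal sketch using `createOpenAI` from `@ai-sdk/openai` (the plugin itself only needs the resulting `LanguageModel`):

```ts title="lib/better-stack.ts"
import { createOpenAI } from "@ai-sdk/openai"

// Configure the provider explicitly rather than relying on the default
// OPENAI_API_KEY environment lookup
const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
})

// Then pass it to the plugin as before:
// aiChat: aiChatBackendPlugin({ model: openai("gpt-4o") })
```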

### 2. Add Plugin to Client

Register the AI Chat client plugin in your `better-stack-client.tsx` file:

```tsx title="lib/better-stack-client.tsx"
import { createStackClient } from "@btst/stack/client"
import { aiChatClientPlugin } from "@btst/stack/plugins/ai-chat/client"
import { QueryClient } from "@tanstack/react-query"

const getBaseURL = () =>
typeof window !== 'undefined'
? (process.env.NEXT_PUBLIC_BASE_URL || window.location.origin)
: (process.env.BASE_URL || "http://localhost:3000")

export const getStackClient = (queryClient: QueryClient) => {
const baseURL = getBaseURL()
return createStackClient({
plugins: {
aiChat: aiChatClientPlugin({
apiBaseURL: baseURL,
apiBasePath: "/api/data",
})
}
})
}
```

**Required configuration:**
- `apiBaseURL`: Base URL for API calls
- `apiBasePath`: Path where your API is mounted (e.g., `/api/data`)

### 3. Generate Database Schema

After adding the plugin, generate your database schema using the CLI:

```bash
npx @btst/cli generate --orm prisma --config lib/better-stack.ts
```

This generates the schema definitions for the conversation and message tables. Run your ORM's migrations to apply them to your database.

For more details on the CLI and all available options, see the [CLI documentation](/cli).

## Usage

The AI Chat plugin provides two routes:

- `/chat` - Start a new conversation
- `/chat/:id` - Resume an existing conversation

The plugin automatically handles:
- Creating and managing conversations
- Saving messages to the database
- Streaming AI responses in real-time
- Conversation history persistence

## Customization

### Backend Hooks

Customize backend behavior with optional hooks:

<AutoTypeTable path="../packages/better-stack/src/plugins/ai-chat/api/plugin.ts" name="AiChatBackendHooks" />

**Example usage:**

```ts title="lib/better-stack.ts"
import { aiChatBackendPlugin, type AiChatBackendHooks } from "@btst/stack/plugins/ai-chat/api"

const chatHooks: AiChatBackendHooks = {
onBeforeChat: async (messages, context) => {
// Add authorization logic
const authHeader = context.headers?.get("authorization")
if (!authHeader) {
return false // Deny access
}
return true
},
onAfterChat: async (conversationId, messages, context) => {
// Log conversation or trigger webhooks
console.log("Chat completed:", conversationId)
},
}

const { handler, dbSchema } = betterStack({
plugins: {
aiChat: aiChatBackendPlugin({
model: openai("gpt-4o"),
hooks: chatHooks
})
},
// ...
})
```

### Model Configuration

You can configure different models (and, where supported, tools):

```ts title="lib/better-stack.ts"
import { openai } from "@ai-sdk/openai"
import { anthropic } from "@ai-sdk/anthropic"

// Use OpenAI
aiChat: aiChatBackendPlugin({
model: openai("gpt-4o"),
})

// Or use Anthropic
aiChat: aiChatBackendPlugin({
model: anthropic("claude-3-5-sonnet-20241022"),
})

// With tools (if your model supports it)
aiChat: aiChatBackendPlugin({
model: openai("gpt-4o"),
// Tools configuration would go here if supported
})
```
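
If you want to switch providers without code changes, one approach is to pick the model from an environment variable. A minimal sketch (the `AI_PROVIDER` variable name is illustrative, not something the plugin reads itself):

```ts title="lib/better-stack.ts"
import { openai } from "@ai-sdk/openai"
import { anthropic } from "@ai-sdk/anthropic"

// AI_PROVIDER is an illustrative environment variable, not a plugin convention
const model =
  process.env.AI_PROVIDER === "anthropic"
    ? anthropic("claude-3-5-sonnet-20241022")
    : openai("gpt-4o")

// Then: aiChat: aiChatBackendPlugin({ model })
```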

## API Endpoints

The plugin provides the following endpoints:

- `POST /api/data/chat` - Send a message and receive streaming response
- `GET /api/data/conversations` - List all conversations
- `GET /api/data/conversations/:id` - Get a conversation with messages
- `POST /api/data/conversations` - Create a new conversation
- `DELETE /api/data/conversations/:id` - Delete a conversation
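
These are plain HTTP endpoints, so you can also call them outside the provided client plugin. A minimal sketch (the response shape in the type annotation is an assumption; check the types generated for your setup):

```ts
async function deleteFirstConversation() {
  // List all conversations (shape assumed to be an array of objects with an id)
  const res = await fetch("/api/data/conversations")
  const conversations: Array<{ id: string }> = await res.json()

  // Delete the first conversation, if one exists
  if (conversations.length > 0) {
    await fetch(`/api/data/conversations/${conversations[0].id}`, {
      method: "DELETE",
    })
  }
}
```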

## Client Components

The plugin exports a `ChatInterface` component that you can use directly:

```tsx
import { ChatInterface } from "@btst/stack/plugins/ai-chat/client"

export default function ChatPage() {
return (
<ChatInterface
apiPath="/api/data/chat"
initialMessages={[]}
/>
)
}
```
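
To resume an existing conversation in your own page, one hedged approach is to load it from the conversations endpoint and seed `initialMessages`. A sketch for a Next.js dynamic route (the file location, base-URL handling, and the `messages` field on the response are assumptions, not guarantees of the plugin's API):

```tsx title="app/pages/chat/[id]/page.tsx"
import { ChatInterface } from "@btst/stack/plugins/ai-chat/client"

export default async function ResumeChatPage({
  params,
}: {
  params: Promise<{ id: string }>
}) {
  const { id } = await params
  const baseURL = process.env.NEXT_PUBLIC_BASE_URL || "http://localhost:3000"

  // The response shape (a `messages` array) is assumed; verify against your setup
  const res = await fetch(`${baseURL}/api/data/conversations/${id}`)
  const conversation = await res.json()

  return (
    <ChatInterface
      apiPath="/api/data/chat"
      initialMessages={conversation.messages ?? []}
    />
  )
}
```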

## Features

- **Streaming Responses**: Real-time streaming of AI responses using AI SDK v5
- **Conversation History**: Automatic persistence of conversations and messages
- **Customizable Models**: Use any LanguageModel from the AI SDK
- **Authorization Hooks**: Add custom authentication and authorization logic
- **Type-Safe**: Full TypeScript support with proper types from AI SDK

6 changes: 6 additions & 0 deletions docs/content/docs/plugins/index.mdx
@@ -16,6 +16,12 @@ Better Stack provides a collection of full-stack plugins that you can easily int
icon={<BookOpen size={20} />}
description="Content management, editor, drafts, publishing, SEO, RSS feeds."
/>
<Card
title="AI Chat Plugin"
href="/plugins/ai-chat"
icon={<BookOpen size={20} />}
description="AI-powered chat with conversation history, streaming, and customizable models."
/>
<Card
title="Building Plugins"
href="/plugins/development"
8 changes: 6 additions & 2 deletions e2e/playwright.config.ts
@@ -35,6 +35,7 @@ export default defineConfig({
HOST: "127.0.0.1",
BASE_URL: "http://localhost:3003",
NEXT_PUBLIC_BASE_URL: "http://localhost:3003",
OPENAI_API_KEY: process.env.OPENAI_API_KEY || "",
},
},
{
@@ -49,6 +50,7 @@
PORT: "3004",
HOST: "127.0.0.1",
BASE_URL: "http://localhost:3004",
OPENAI_API_KEY: process.env.OPENAI_API_KEY || "",
},
},
{
@@ -63,6 +65,7 @@
PORT: "3005",
HOST: "127.0.0.1",
BASE_URL: "http://localhost:3005",
OPENAI_API_KEY: process.env.OPENAI_API_KEY || "",
},
},
],
@@ -76,17 +79,18 @@
"**/*.todos.spec.ts",
"**/*.auth-blog.spec.ts",
"**/*.blog.spec.ts",
"**/*.chat.spec.ts",
],
},
{
name: "tanstack:memory",
use: { baseURL: "http://localhost:3004" },
testMatch: ["**/*.blog.spec.ts"],
testMatch: ["**/*.blog.spec.ts", "**/*.chat.spec.ts"],
},
{
name: "react-router:memory",
use: { baseURL: "http://localhost:3005" },
testMatch: ["**/*.blog.spec.ts"],
testMatch: ["**/*.blog.spec.ts", "**/*.chat.spec.ts"],
},
],
});
45 changes: 45 additions & 0 deletions e2e/tests/smoke.chat.spec.ts
@@ -0,0 +1,45 @@
import { test, expect } from "@playwright/test";

const hasOpenAiKey =
typeof process.env.OPENAI_API_KEY === "string" &&
process.env.OPENAI_API_KEY.trim().length > 0;

if (!hasOpenAiKey) {
// eslint-disable-next-line no-console -- surfaced only when tests are skipped
console.warn(
"Skipping AI chat smoke tests: OPENAI_API_KEY is not available in the environment.",
);
}

test.skip(
!hasOpenAiKey,
"OPENAI_API_KEY is required to run AI chat smoke tests.",
);

test.describe("AI Chat Plugin", () => {
test("should start a new conversation and send a message", async ({
page,
}) => {
// 1. Navigate to the chat page
await page.goto("/pages/chat");

// 2. Verify initial state
await expect(page.getByText("Start a conversation...")).toBeVisible();
await expect(page.getByPlaceholder("Type a message...")).toBeVisible();

// 3. Send a message
const input = page.getByPlaceholder("Type a message...");
await input.fill("Hello, world!");
// Use Enter key or find the submit button
await page.keyboard.press("Enter");

// 4. Verify user message appears
await expect(page.getByText("Hello, world!")).toBeVisible({
timeout: 5000,
});

// 5. Verify AI response appears (using real OpenAI, so response content varies, but should exist)
// We wait for the AI message container - look for prose class in assistant messages
await expect(page.locator(".prose").nth(1)).toBeVisible({ timeout: 30000 });
});
});
3 changes: 3 additions & 0 deletions examples/nextjs/app/page.tsx
@@ -34,6 +34,9 @@ export default function Home() {
<Button className="text-destructive" variant="link" asChild>
<Link href="/pages/blog/new">New Post</Link>
</Button>
<Button className="text-destructive" variant="link" asChild>
<Link href="/pages/chat">Chat</Link>
</Button>
</div>
</main>
</div>
7 changes: 6 additions & 1 deletion examples/nextjs/lib/better-stack-client.tsx
@@ -1,6 +1,7 @@
import { createStackClient } from "@btst/stack/client"
import { todosClientPlugin } from "@/lib/plugins/todo/client/client"
import { blogClientPlugin } from "@btst/stack/plugins/blog/client"
import { aiChatClientPlugin } from "@btst/stack/plugins/ai-chat/client"
import { QueryClient } from "@tanstack/react-query"

// Get base URL function - works on both server and client
@@ -84,7 +85,11 @@ export const getStackClient = (
);
},
}
}),
aiChat: aiChatClientPlugin({
apiBaseURL: baseURL,
apiBasePath: "/api/data",
})
}
})
}
}
7 changes: 6 additions & 1 deletion examples/nextjs/lib/better-stack.ts
@@ -2,6 +2,8 @@ import { createMemoryAdapter } from "@btst/adapter-memory"
import { betterStack } from "@btst/stack"
import { todosBackendPlugin } from "./plugins/todo/api/backend"
import { blogBackendPlugin, type BlogBackendHooks } from "@btst/stack/plugins/blog/api"
import { aiChatBackendPlugin } from "@btst/stack/plugins/ai-chat/api"
import { openai } from "@ai-sdk/openai"

// Define blog hooks with proper types
// NOTE: This is the main API at /api/data - kept auth-free for regular tests
@@ -64,7 +66,10 @@ const { handler, dbSchema } = betterStack({
basePath: "/api/data",
plugins: {
todos: todosBackendPlugin,
blog: blogBackendPlugin(blogHooks)
blog: blogBackendPlugin(blogHooks),
aiChat: aiChatBackendPlugin({
model: openai("gpt-4o"),
})
},
adapter: (db) => createMemoryAdapter(db)({})
})
3 changes: 3 additions & 0 deletions examples/nextjs/package.json
@@ -13,6 +13,9 @@
"dependencies": {
"@btst/adapter-memory": "^1.0.2",
"@btst/stack": "workspace:*",
"ai": "^5.0.94",
"@ai-sdk/react": "^2.0.94",
"@ai-sdk/openai": "^2.0.68",
"@next/bundle-analyzer": "^16.0.0",
"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-dropdown-menu": "^2.1.16",