- resources
- user chat attachment
- database domain/topic knowledge base
- indexing
- from raw Markdown to JSON (`_scripts`): leverages the Obsidian community plugin DataviewJS
- embedding database storage (postgres vector plugin)
- Table Schema: `schema.ts`
- Save Chat
- Response messages: assistant messages, server-side tool calls and results
- Request messages: user messages, client-side tool results
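The request/response split above can be sketched as a discriminated union; the type and field names here are illustrative, not the project's actual schema:

```typescript
// Hypothetical categorization of chat messages by direction (sketch).
type ResponseMessage =
  | { kind: 'assistant-text'; content: string }
  | { kind: 'server-tool-call'; toolName: string }
  | { kind: 'server-tool-result'; result: unknown };

type RequestMessage =
  | { kind: 'user-text'; content: string }
  | { kind: 'client-tool-result'; result: unknown };

// Direction a message travels: request (client -> server) or response.
function direction(msg: ResponseMessage | RequestMessage): 'request' | 'response' {
  switch (msg.kind) {
    case 'user-text':
    case 'client-tool-result':
      return 'request';
    default:
      return 'response';
  }
}
```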
Note
Save the client-side tool's result message from the request message.
The client-side tool's tool-call arrives in the assistant response message, while the tool-result arrives in the next request message if the user sends further questions based on the tool's result. The message ids of the two assistant messages (`onToolCall`) are the same, so when saving both messages to the database we need to generate a new message id for the tool-result.
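A minimal sketch of that id regeneration before persisting; the `ChatMessage` shape and `dedupeToolResultId` helper are hypothetical names for illustration, not the project's actual API:

```typescript
import { randomUUID } from 'crypto';

// Hypothetical message shape; the real columns live in schema.ts.
interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string;
}

// The tool-call (assistant response) and the tool-result (next request)
// arrive with the same message id, so give the tool-result a fresh id
// before both rows are written to the database.
function dedupeToolResultId(
  toolCall: ChatMessage,
  toolResult: ChatMessage
): ChatMessage {
  if (toolResult.id === toolCall.id) {
    return { ...toolResult, id: randomUUID() };
  }
  return toolResult;
}
```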
- Request/Response for `streamText()` at `\app\(chat)\api\chat`: update the chat-saving logic to include topic metadata
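One way the topic metadata could be attached when saving, as a hedged sketch; the `ChatRecord` and `TopicMetadata` shapes are assumptions, not the project's actual schema:

```typescript
// Hypothetical shapes; the real columns are defined in schema.ts.
interface TopicMetadata {
  topicId: string;
  topicInputs: Record<string, unknown>;
}

interface ChatRecord {
  id: string;
  userId: string;
  messages: unknown[];
  topic?: TopicMetadata;
}

// Merge the selected topic into the chat row before it is persisted,
// so saved chats can later be filtered and replayed by topic.
function withTopicMetadata(
  chat: ChatRecord,
  topic: TopicMetadata | undefined
): ChatRecord {
  return topic ? { ...chat, topic } : chat;
}
```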
- Send attachments along with messages at `\app\(chat)\api\files\upload` and store the files in the Blob Store (images only, not RAG).
- Expand `allowedFileTypes` to include document attachments as user-end RAG resources.
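Expanding the allow-list might look like the sketch below; the exact MIME types are an assumption, not the template's actual list:

```typescript
// Image types the upload route already accepts (assumption), plus
// document types added so uploads can serve as user-end RAG resources.
const allowedFileTypes: string[] = [
  'image/jpeg',
  'image/png',
  // Added for RAG document attachments:
  'application/pdf',
  'text/markdown',
  'text/plain',
];

function isAllowedFileType(mimeType: string): boolean {
  return allowedFileTypes.includes(mimeType);
}
```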
Not touched:
- get Chats By UserId at `\app\(chat)\api\history`
- user rating at `\app\(chat)\api\vote`
- Block Document at `\app\(chat)\api\document`
/app/api/chat/route.ts
Note
`sendExtraMessageFields`
When `sendExtraMessageFields: true` is enabled in `useChat`, extra fields such as `id` and `createdAt` are added to each message. In particular, an assistant-role message may also be given a `revisionId`.
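The resulting message shape can be sketched as plain data; `revisionId` appearing on assistant messages follows the note above, and the field list is an assumption rather than the SDK's full type:

```typescript
// Sketch of a message once sendExtraMessageFields is enabled.
interface ExtendedMessage {
  id: string;
  createdAt: Date;
  role: 'user' | 'assistant';
  content: string;
  revisionId?: string; // may appear on assistant-role messages
}

const assistantMessage: ExtendedMessage = {
  id: 'msg-1',
  createdAt: new Date(),
  role: 'assistant',
  content: 'Here is your chart.',
  revisionId: 'rev-1',
};
```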
- Display Message from interactive session
- Display Message from DataBase
- Chat Header: Model Selection, Topic Selection
- Chat TopicInputValues
- Suggested Action: `append`
- Server-side tool: `execute`
- Client-side tool: `onToolCall`
Note
`onToolCall` triggered by `append`
The `onToolCall` function uses `topicInputValues` directly, but because `onToolCall` is passed to the `useChat` hook, it does not automatically see updates to `topicInputValues`: due to React's closure behavior, the callback captures the value from the render in which it was defined.
Original version (issue):

```tsx
const [topicInputValues, setTopicInputValues] = useState<TopicInputs>(() => {....});

async function onToolCall(
  { toolCall }: { toolCall: ToolCall<string, unknown> }
): Promise<string | undefined | NatalChartDataOriginal[]> {
  if (topicInputValues.topicId === 'topic-numerology') {
    console.log('topicInputValues.solarDateStr', topicInputValues.solarDateStr);
    const astrolabeData = bySolar(
      topicInputValues.solarDateStr,
      topicInputValues.timeIndex,
      topicInputValues.gender
    );
    ...
  }
};
```

Fixed version (use `useRef` to track state changes):
```tsx
const [topicInputValues, setTopicInputValues] = useState<TopicInputs>(() => {....});
const topicInputValuesRef = useRef(topicInputValues);
useEffect(() => {
  topicInputValuesRef.current = topicInputValues;
}, [topicInputValues]);

async function onToolCall(
  { toolCall }: { toolCall: ToolCall<string, unknown> }
): Promise<string | undefined | NatalChartDataOriginal[]> {
  const currentTopicInputValues = topicInputValuesRef.current; // Get the latest value
  if (currentTopicInputValues.topicId === 'topic-numerology') {
    console.log('topicInputValues.solarDateStr', currentTopicInputValues.solarDateStr);
    const astrolabeData = bySolar(
      currentTopicInputValues.solarDateStr,
      currentTopicInputValues.timeIndex,
      currentTopicInputValues.gender
    );
    ...
  }
};
```

- bazi feature
- natal chart feature
- zhouyi feature
- weather
- coding
- document editing
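The `useRef` fix above can be demonstrated outside React with a plain mutable ref object; this sketch only illustrates the closure behavior, not the hook itself, and all names are illustrative:

```typescript
type TopicInputs = { topicId: string };

let state: TopicInputs = { topicId: 'topic-bazi' };
const ref = { current: state }; // stands in for topicInputValuesRef

// Captures the value passed at creation time; never sees later updates.
function makeStaleReader(captured: TopicInputs): () => string {
  return () => captured.topicId;
}
const readStale = makeStaleReader(state);

// Reads through the ref at call time, so it always sees the latest value.
const readFresh = (): string => ref.current.topicId;

// Simulate a state update plus the useEffect that syncs the ref.
state = { topicId: 'topic-numerology' };
ref.current = state;

readStale(); // 'topic-bazi' (the stale capture)
readFresh(); // 'topic-numerology' (the latest value via the ref)
```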
```jsonc
// .vscode/settings.json
{
  // mute tailwind unknownAtRules warning
  "files.associations": {
    "*.css": "tailwindcss"
  }
}
```

```shell
tree -L 4 --gitignore
```
Following is from the original template's README.
- Next.js App Router
- Advanced routing for seamless navigation and performance
- React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
- Unified API for generating text, structured objects, and tool calls with LLMs
- Hooks for building dynamic chat and generative user interfaces
- Supports OpenAI (default), Anthropic, Cohere, and other model providers
- shadcn/ui
- Styling with Tailwind CSS
- Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
- Vercel Postgres powered by Neon for saving chat history and user data
- Vercel Blob for efficient file storage
- NextAuth.js
- Simple and secure authentication
This template ships with OpenAI gpt-4o as the default. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:
You will need to use the environment variables defined in .env.example to run Next.js AI Chatbot. It's recommended you use Vercel Environment Variables for this, but a .env file is all that is necessary.
Note: You should not commit your `.env` file, or you will expose secrets that allow others to control access to your various OpenAI and authentication provider accounts.
- Install Vercel CLI: `npm i -g vercel`
- Link local instance with Vercel and GitHub accounts (creates `.vercel` directory): `vercel link`
- Download your environment variables: `vercel env pull`

```shell
pnpm install
pnpm dev
```

Your app template should now be running on localhost:3000.
Note
Highlights information that users should take into account, even when skimming.
Tip
Optional information to help a user be more successful.
Important
Crucial information necessary for users to succeed.
Warning
Critical content demanding immediate user attention due to potential risks.
Caution
Negative potential consequences of an action.