A full-stack AI chat application with advanced tool orchestration and streaming support. Built with a Next.js frontend and a Node.js backend, it exposes OpenAI-compatible API endpoints with server-side tool calling.
- Bootstrap repo
- Backend proxy with streaming (OpenAI-compatible)
- Rate limiting (in-memory per-IP; see the sketch after this list)
- Basic chat UI (streaming toggle, model select, abort)
- Responses API with conversation continuity
- Testing infrastructure (Jest for backend & frontend)
- Markdown rendering with syntax highlighting
- Development tooling (ESLint, Prettier)
- Docker development environment
- Tool orchestration system (server-side, up to 10 iterations)
- Enhanced UI components (quality controls, floating dropdowns)
- Advanced streaming (tool events, thinking support)
- Conversation persistence (SQLite database with migrations)
- Conversation history UI integration
- System prompt / temperature controls
- Auth & per-user limits (planned)
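As an illustration of the in-memory per-IP rate limiting listed above, a fixed-window Express middleware could look roughly like this. This is a sketch only; the window size, request cap, and function name are made-up values, not the project's actual configuration.

```js
// Sketch of a fixed-window, in-memory, per-IP rate limiter for Express.
// WINDOW_MS and MAX_REQUESTS are illustrative values, not the project's settings.
const WINDOW_MS = 60_000;
const MAX_REQUESTS = 30;
const hits = new Map(); // ip -> { count, windowStart }

export function rateLimit(req, res, next) {
  const now = Date.now();
  const entry = hits.get(req.ip);
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    hits.set(req.ip, { count: 1, windowStart: now }); // start a new window for this IP
    return next();
  }
  if (entry.count >= MAX_REQUESTS) {
    return res.status(429).json({ error: 'Too many requests' }); // over the limit
  }
  entry.count += 1;
  next();
}
```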
- Server-side tool calling with a unified orchestrator (see the sketch below)
- Iterative workflows with thinking support
- Supports both streaming and non-streaming modes
- Smart error handling and timeout management
- Quality slider for response control (quick/balanced/thorough)
- Improved dropdown components with floating UI positioning
- Responsive design with accessibility features
- Real-time streaming with tool event display
- OpenAI-compatible API (Chat Completions + Responses)
- SQLite database with migrations for conversation persistence
- Comprehensive test coverage (Jest)
- ESLint/Prettier code quality tools
- Docker development environment with hot reload
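For orientation, a simplified sketch of how a server-side tool loop like the one described above can work is shown below. The helper names (`callModel`, the `tools` registry shape) and message format are illustrative, not the project's actual internals; the real orchestrator also handles streaming, thinking events, and timeouts.

```js
// Illustrative tool-orchestration loop (not the project's actual code):
// call the model, run any requested tools server-side, feed the results back,
// and stop after a bounded number of iterations.
const MAX_ITERATIONS = 10;

async function orchestrate(messages, tools, callModel) {
  for (let i = 0; i < MAX_ITERATIONS; i++) {
    const reply = await callModel(messages);              // one assistant turn
    if (!reply.tool_calls || reply.tool_calls.length === 0) {
      return reply;                                       // final answer, no tools requested
    }
    messages.push(reply);                                 // keep the assistant's tool request
    for (const call of reply.tool_calls) {
      const tool = tools[call.function.name];
      if (!tool) throw new Error(`Unknown tool: ${call.function.name}`);
      const args = tool.validate(JSON.parse(call.function.arguments));
      const result = await tool.handler(args);            // execute the tool
      messages.push({
        role: 'tool',
        tool_call_id: call.id,
        content: JSON.stringify(result),                  // hand the result back to the model
      });
    }
  }
  throw new Error(`Tool loop exceeded ${MAX_ITERATIONS} iterations`);
}
```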
```bash
# backend
cp backend/.env.example backend/.env
npm --prefix backend install
npm --prefix backend run dev

# frontend (in separate terminal)
cp frontend/.env.example frontend/.env.local
npm --prefix frontend install
npm --prefix frontend run dev
```
Visit http://localhost:3000 (backend on :3001).
Note: Set `OPENAI_API_KEY` in `backend/.env` before starting. Optionally set `TAVILY_API_KEY` to enable the web search tool.
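Once the backend is running, you can exercise the OpenAI-compatible API directly. A minimal sketch, assuming the backend exposes the standard `/v1/chat/completions` route on port 3001 (verify the exact path against the backend's route definitions):

```js
// Sketch of a direct call to the backend's OpenAI-compatible endpoint.
// The route, model name, and payload shape are assumptions; check the backend routes.
async function ask(prompt) {
  const res = await fetch('http://localhost:3001/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'gpt-4o-mini',                          // any model your upstream key allows
      messages: [{ role: 'user', content: prompt }],
      stream: false,                                 // set true to receive SSE chunks instead
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

ask('Hello!').then(console.log);
```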
```bash
cp backend/.env.example backend/.env  # ensure API key set
docker compose -f docker-compose.yml up --build
```

Then open http://localhost:3000.
Images:
- backend: minimal prod deps only
- frontend: multi-stage (deps → build → runtime) with build-time `NEXT_PUBLIC_API_BASE`
To rebuild frontend with a different API base:

```bash
docker compose build --build-arg NEXT_PUBLIC_API_BASE=http://backend:3001 frontend
docker compose up -d frontend
```
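Because `NEXT_PUBLIC_*` variables are inlined by Next.js at build time, changing the API base requires rebuilding the frontend image. A small illustrative helper (not the project's actual code) showing how the value is typically consumed:

```js
// Next.js replaces process.env.NEXT_PUBLIC_API_BASE with the build-time value,
// so the fallback below only applies when the variable was not set at build time.
const API_BASE = process.env.NEXT_PUBLIC_API_BASE ?? 'http://localhost:3001';

export function apiUrl(path) {
  return `${API_BASE}${path}`;
}
```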
```bash
cp backend/.env.example backend/.env  # ensure API key set
docker compose -f docker-compose.dev.yml up --build
```
Frontend on http://localhost:3000 with hot reload enabled.
The application includes a server-side tool registry in `backend/src/lib/tools.js`. To add a new tool:
- Define the tool in the registry with a `validate` function that checks its arguments
- Implement the `handler` function that does the work
- The tool is then automatically available through the orchestration system
Example:

```js
export const tools = {
  get_weather: {
    // Check and normalize the arguments supplied by the model
    validate: (args) => {
      if (!args || typeof args.city !== 'string') {
        throw new Error('get_weather requires a "city" argument of type string');
      }
      return { city: args.city };
    },
    // Do the actual work and return a JSON-serializable result
    handler: async ({ city }) => ({ tempC: 22, city }),
  },
};
```
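For illustration, the orchestrator's contract with a tool reduces to validate-then-handle. A hypothetical direct invocation of the example tool above:

```js
// Hypothetical direct use of the example tool, showing the validate -> handler contract.
const args = tools.get_weather.validate({ city: 'Lisbon' });  // throws on bad input
const result = await tools.get_weather.handler(args);         // { tempC: 22, city: 'Lisbon' }
```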
```bash
# Backend tests
npm --prefix backend test

# Frontend tests
npm --prefix frontend test
```
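As a rough example of the kind of test these suites contain (the file path is illustrative and assumes the example `get_weather` tool above exists in the registry):

```js
// backend/__tests__/tools.test.js (illustrative path)
import { tools } from '../src/lib/tools.js';

describe('get_weather tool', () => {
  test('validate rejects a missing city', () => {
    expect(() => tools.get_weather.validate({})).toThrow(/city/);
  });

  test('handler resolves with the requested city', async () => {
    const args = tools.get_weather.validate({ city: 'Lisbon' });
    await expect(tools.get_weather.handler(args)).resolves.toMatchObject({ city: 'Lisbon' });
  });
});
```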
```bash
# Backend linting
npm --prefix backend run lint

# Frontend linting
npm --prefix frontend run lint
```