A demo app showcasing agent-sdk-go — the Temporal-first AI agent SDK for Go. Built with a React UI and Go REST API, with durable workflow-backed conversations.
This is a demo and is not intended for production use.
Most agent frameworks run in-process — if your server restarts, the agent run is lost. agent-sdk-go is Temporal-first, so every agent run is a durable workflow:
- Durable conversations — chat history and agent runs survive server restarts
- Long-running agents — conversations can run for extended periods without losing state
- Automatic retries — failed LLM calls retry automatically via Temporal
- agent-sdk-go — AI agent SDK for Go
- React Router 7 + Vite — UI
- Tailwind CSS v4 — Styling
- react-markdown + remark-gfm — Message bubbles render Markdown (GFM)
- Docker — Docker Engine with Docker Compose (the `docker compose` CLI; Compose v2 is bundled with Docker Desktop and current Engine installs).
- LLM access — An API key from a supported provider (for example OpenAI or an OpenAI-compatible HTTP API). Add it to `server/.env` in the Configuration section below.
- A local copy of this project — You need the files on your computer before you can run anything (clone with Git or download a ZIP; see Get the code below).
Follow these steps in order. Every shell command assumes your current directory is the repository root: the folder that contains `docker-compose.yml`.
- Clone with Git (recommended):

  ```shell
  git clone https://github.com/agenticenv/agent-chat.git
  cd agent-chat
  ```

  Use your fork’s URL if you forked the repo. After `cd`, you should see `docker-compose.yml` in that directory.

- Or download a ZIP — On GitHub, open Code → Download ZIP, unzip it, then open a terminal and `cd` into the unzipped folder (the one that contains `docker-compose.yml`).
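Because every later command assumes you are at the repository root, a tiny POSIX-shell helper can confirm you are in the right place before continuing. This is purely illustrative and not part of the repo; the function name is made up:

```shell
# Sketch (not part of the repo): verify a directory looks like the
# repository root by checking for docker-compose.yml before running
# any compose commands.
check_repo_root() {
  dir="${1:-.}"
  if [ -f "$dir/docker-compose.yml" ]; then
    echo "OK: found docker-compose.yml in $dir"
  else
    echo "error: no docker-compose.yml in $dir (cd into the repo root first)" >&2
    return 1
  fi
}

# Example: run from the folder you cloned into
# check_repo_root .
```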
Agent Chat reads `server/.env` for LLM settings. If `LLM_API_KEY` is missing, the Agent Chat API will not start and containers may fail or restart.
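For orientation, a filled-in `server/.env` (created from `server/.env.example` in the steps below) might look like this sketch; every value shown is a placeholder:

```env
# Required: the API will not start without it
LLM_API_KEY=your-real-key-here

# Optional: shown with the documented defaults
LLM_PROVIDER=openai
LLM_MODEL=gpt-4o

# Optional: omit to use the built-in default prompt
# AGENT_SYSTEM_PROMPT=You are a helpful assistant.
```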
- Copy the example file:

  ```shell
  cp server/.env.example server/.env
  ```
- Required

  | Variable | You must… |
  | --- | --- |
  | `LLM_API_KEY` | Set to your real LLM API key. An empty placeholder means Agent Chat cannot start. |

- Optional — LLM (defaults are fine for OpenAI)

  - `LLM_PROVIDER` — default `openai`
  - `LLM_MODEL` — default `gpt-4o`
  - `LLM_BASE_URL` — set only for a custom or Azure-style HTTP endpoint; use an `LLM_MODEL` your provider supports

- Optional — agent

  - `AGENT_SYSTEM_PROMPT` — how Agent Chat behaves (role, tone, rules). Omit to use the built-in default.
  - `AGENT_NAME`, `AGENT_DESCRIPTION`, `AGENT_CONVERSATION_WINDOW_SIZE` — labeling and how much chat history is kept in context; see `server/.env.example`.
Full variable list and behavior: `server/README.md`.
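With `server/.env` in place, you can sanity-check that the required key is actually set before starting the stack. This is a minimal sketch using `grep`; the function name and argument are made up for illustration:

```shell
# Sketch (not part of the repo): fail fast if LLM_API_KEY is missing
# or empty in an env file, mirroring the startup requirement above.
require_llm_api_key() {
  env_file="${1:-server/.env}"
  if grep -Eq '^LLM_API_KEY=[^[:space:]]+' "$env_file" 2>/dev/null; then
    echo "OK: LLM_API_KEY is set in $env_file"
  else
    echo "error: LLM_API_KEY is missing or empty in $env_file" >&2
    return 1
  fi
}

# Example:
# require_llm_api_key server/.env
```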
Agent Chat runs with Docker Compose from the repository root.
- Start the stack (Postgres, Temporal, API, UI):

  ```shell
  docker compose up -d --build
  ```

- Open Agent Chat: http://localhost:3000 — use the chat in your browser.

- (Optional) Temporal UI: http://localhost:8233 — view Temporal workflow executions for Agent Chat.
- Stop the stack:

  ```shell
  docker compose down
  ```