
feat: multi-turn conversation via Ollama /api/chat#16

Merged
quiet-node merged 5 commits into main from worktree-humming-coalescing-lobster on Apr 2, 2026
Conversation

@quiet-node
Owner

Summary

  • Switch from stateless /api/generate to /api/chat so the assistant remembers prior messages within a session
  • Conversation history owned by Rust backend (ConversationHistory with epoch-based stale-write prevention), matching the industry-standard pattern used by ChatGPT, Claude, and Open WebUI
  • Frontend simplified — sends only the new user message; backend assembles full messages array, streams response, and persists it
  • Configurable system prompt via THUKI_SYSTEM_PROMPT env var (loaded by dotenvy so .env files work for both Vite and Rust)
  • Cancelled generations are excluded from history to prevent partial responses from corrupting Ollama context
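The epoch-based stale-write prevention mentioned above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: only `ConversationHistory` and `ChatMessage` are named in the PR; the field layout and method names here are assumptions. A session reset bumps the epoch, so any write carrying an older epoch (e.g. from a stream that started before the overlay closed) is dropped.

```rust
use std::sync::Mutex;

// Illustrative message type; the PR only names ChatMessage, not its fields.
#[derive(Clone, Debug, PartialEq)]
struct ChatMessage {
    role: String,
    content: String,
}

// Sketch of epoch-based stale-write prevention: the epoch and the
// message list live under one Mutex so they change atomically together.
struct ConversationHistory {
    inner: Mutex<(u64, Vec<ChatMessage>)>, // (epoch, messages)
}

impl ConversationHistory {
    fn new() -> Self {
        Self { inner: Mutex::new((0, Vec::new())) }
    }

    // Snapshot the current epoch before starting a generation.
    fn epoch(&self) -> u64 {
        self.inner.lock().unwrap().0
    }

    // Clear history and invalidate any in-flight writes.
    fn reset(&self) {
        let mut guard = self.inner.lock().unwrap();
        guard.0 += 1;
        guard.1.clear();
    }

    // Persist a message only if the session has not been reset since `epoch`.
    fn push_if_current(&self, epoch: u64, msg: ChatMessage) -> bool {
        let mut guard = self.inner.lock().unwrap();
        if guard.0 == epoch {
            guard.1.push(msg);
            true
        } else {
            false
        }
    }
}
```

A generation task would snapshot `epoch()` before streaming and call `push_if_current` afterwards, so a reply that outlives its session is discarded rather than appended to the fresh conversation.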

Test plan

  • Verify multi-turn conversation works: ask a follow-up question and confirm the assistant remembers prior context
  • Verify session reset: close and reopen the overlay, confirm conversation starts fresh
  • Verify cancel mid-stream: stop a generation, then ask a new question — confirm partial response is not echoed back as context
  • Verify system prompt: set THUKI_SYSTEM_PROMPT in .env, restart app, confirm the model follows it
  • Run bun run test:all — 155 frontend + 57 backend tests pass
  • Run bun run test:coverage — 100% coverage maintained
  • Run bun run validate-build — lint, format, typecheck all clean

🤖 Generated with Claude Code

quiet-node and others added 5 commits April 1, 2026 21:54

feat: multi-turn conversation via Ollama /api/chat with backend-managed history

Switch from stateless /api/generate to /api/chat so the assistant remembers
prior messages within a session. Conversation history is owned by the Rust
backend (Mutex<Vec<ChatMessage>>), matching the industry-standard pattern
used by ChatGPT, Claude, and Open WebUI. The frontend sends only the new
user message; the backend assembles the full messages array, streams the
response, and persists it. Adds a configurable system prompt via
THUKI_SYSTEM_PROMPT env var with a sensible default.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
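The "backend assembles the full messages array" step the commit describes can be sketched like this. It is a simplified illustration assuming the `ChatMessage` shape used by Ollama's /api/chat (`role` + `content`); the function name `assemble_messages` is invented for the example.

```rust
// Illustrative message type matching Ollama's /api/chat role/content shape.
#[derive(Clone, Debug, PartialEq)]
struct ChatMessage {
    role: String,
    content: String,
}

// Sketch: system prompt first, then the persisted history, then the new
// user message. The frontend only supplies `user_input`; everything else
// comes from backend state.
fn assemble_messages(
    system_prompt: &str,
    history: &[ChatMessage],
    user_input: &str,
) -> Vec<ChatMessage> {
    let mut messages = Vec::with_capacity(history.len() + 2);
    messages.push(ChatMessage {
        role: "system".to_string(),
        content: system_prompt.to_string(),
    });
    messages.extend_from_slice(history);
    messages.push(ChatMessage {
        role: "user".to_string(),
        content: user_input.to_string(),
    });
    messages
}
```

This ordering (system, history, new user turn) is what lets /api/chat carry context across turns even though each HTTP request is independent.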
- Check cancel_token.is_cancelled() after streaming to skip history
  persistence for cancelled generations, preventing partial responses
  from corrupting Ollama context on subsequent turns.
- Add dotenvy to load .env files at Rust startup so THUKI_SYSTEM_PROMPT
  works consistently with Vite's VITE_* vars for developers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
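The cancel check described above can be sketched with a stand-in token. The real code calls `cancel_token.is_cancelled()` after streaming (likely a tokio-style cancellation token); the `AtomicBool`-backed `CancelToken` and `persist_if_complete` here are illustrative substitutes.

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Stand-in for the real cancel token; only the is_cancelled() check
// mirrors the commit's logic.
struct CancelToken(AtomicBool);

impl CancelToken {
    fn new() -> Self {
        Self(AtomicBool::new(false))
    }
    fn cancel(&self) {
        self.0.store(true, Ordering::SeqCst);
    }
    fn is_cancelled(&self) -> bool {
        self.0.load(Ordering::SeqCst)
    }
}

// After the stream ends, persist the assistant reply only if the
// generation ran to completion; a cancelled partial is discarded so it
// never re-enters the Ollama context on the next turn.
fn persist_if_complete(token: &CancelToken, reply: String, history: &mut Vec<String>) {
    if !token.is_cancelled() {
        history.push(reply);
    }
}
```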
Replaces the terse one-liner with a full prompt that establishes Thuki
as a context-aware floating secretary (thư ký in Vietnamese), defines
concise response style, explains how to handle quoted context from the
host app, and lists core strengths — giving the model a clear persona
for every session regardless of whether THUKI_SYSTEM_PROMPT is set.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
The `Some(val)` arm of the env-var restoration match was never reached
because THUKI_SYSTEM_PROMPT is unset in the test environment, dropping
line coverage below 100%. Replaced the branching match with an
unconditional `remove_var` call — semantically identical in CI and
eliminates the untested branch.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
The `if let Some(val) = original` restore guard in
`load_system_prompt_returns_default_when_unset` was never entered
because THUKI_SYSTEM_PROMPT is unset in CI, making the Some arm dead
code and dropping line coverage below 100%. Dropped the unreachable
restoration entirely, matching the pattern used in the other two
env-var prompt tests.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
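The cleanup pattern these two coverage fixes converge on can be sketched as a small test helper: rather than a `match`/`if let` that restores a previous value (a branch CI can never exercise when the variable is unset there), the cleanup removes the variable unconditionally. The helper name `with_env_var` is invented for illustration.

```rust
use std::env;

// Sketch of the settled test-cleanup pattern: set the variable, run the
// test body, then remove it unconditionally. There is no Some(val)
// restoration branch left to go untested.
fn with_env_var<F: FnOnce()>(key: &str, value: &str, body: F) {
    env::set_var(key, value);
    body();
    env::remove_var(key); // unconditional: no dead restore branch
}
```

This trades perfect restoration of a pre-existing value for branch-free cleanup, which is the right call when tests never run with the variable already set.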
@quiet-node quiet-node merged commit f79b403 into main Apr 2, 2026
3 checks passed
@quiet-node quiet-node deleted the worktree-humming-coalescing-lobster branch April 2, 2026 04:34
quiet-node added a commit that referenced this pull request Apr 10, 2026
quiet-node added a commit that referenced this pull request Apr 10, 2026
quiet-node added a commit that referenced this pull request Apr 11, 2026
This was referenced Apr 11, 2026
