feat: multi-turn conversation via Ollama /api/chat #16
Merged
quiet-node merged 5 commits into main on Apr 2, 2026
Conversation
feat: multi-turn conversation via Ollama /api/chat with backend-managed history

Switch from stateless /api/generate to /api/chat so the assistant remembers prior messages within a session. Conversation history is owned by the Rust backend (Mutex<Vec<ChatMessage>>), matching the industry-standard pattern used by ChatGPT, Claude, and Open WebUI. The frontend sends only the new user message; the backend assembles the full messages array, streams the response, and persists it. Adds a configurable system prompt via the THUKI_SYSTEM_PROMPT env var with a sensible default.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
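The backend-owned history pattern described above can be sketched in Rust. This is an illustrative outline, not the actual Thuki code: the type and method names (`ConversationState`, `assemble`, `persist_reply`) are assumptions, and the real backend streams the reply rather than receiving it whole.

```rust
use std::sync::Mutex;

// Hypothetical message shape matching the entries in Ollama's
// /api/chat `messages` array.
#[derive(Clone, Debug, PartialEq)]
struct ChatMessage {
    role: String, // "system" | "user" | "assistant"
    content: String,
}

// The backend owns the transcript; the frontend only ever sends
// the new user turn.
struct ConversationState {
    history: Mutex<Vec<ChatMessage>>,
}

impl ConversationState {
    fn new(system_prompt: &str) -> Self {
        Self {
            history: Mutex::new(vec![ChatMessage {
                role: "system".into(),
                content: system_prompt.into(),
            }]),
        }
    }

    // Append the user turn and return the full messages array
    // to send to /api/chat.
    fn assemble(&self, user_text: &str) -> Vec<ChatMessage> {
        let mut history = self.history.lock().unwrap();
        history.push(ChatMessage {
            role: "user".into(),
            content: user_text.into(),
        });
        history.clone()
    }

    // Persist the assistant reply once streaming completes.
    fn persist_reply(&self, reply: &str) {
        self.history.lock().unwrap().push(ChatMessage {
            role: "assistant".into(),
            content: reply.into(),
        });
    }
}
```

Because the full messages array is rebuilt from backend state on every turn, the model sees the whole session without the frontend ever holding conversation state.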
fix: prevent cancelled partials in history + load .env for backend

- Check cancel_token.is_cancelled() after streaming to skip history persistence for cancelled generations, preventing partial responses from corrupting Ollama context on subsequent turns.
- Add dotenvy to load .env files at Rust startup so THUKI_SYSTEM_PROMPT works consistently with Vite's VITE_* vars for developers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
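The cancellation guard can be sketched as follows. `CancelToken` here is a minimal stand-in for whatever cancellation primitive the backend actually uses (likely something like tokio_util's `CancellationToken`); `finish_turn` is an assumed name.

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Minimal stand-in for a real cancellation token.
struct CancelToken(AtomicBool);

impl CancelToken {
    fn new() -> Self {
        CancelToken(AtomicBool::new(false))
    }
    fn cancel(&self) {
        self.0.store(true, Ordering::SeqCst);
    }
    fn is_cancelled(&self) -> bool {
        self.0.load(Ordering::SeqCst)
    }
}

// Only persist the assistant reply when streaming ran to completion.
fn finish_turn(token: &CancelToken, reply: String, history: &mut Vec<(String, String)>) {
    if token.is_cancelled() {
        // A cancelled stream leaves a truncated reply; persisting it would
        // poison the context sent to Ollama on the next turn, so drop it.
        return;
    }
    history.push(("assistant".into(), reply));
}
```

The check happens after streaming finishes, so the user still sees the partial text in the UI; it just never enters the transcript that future turns are built from.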
feat: enrich DEFAULT_SYSTEM_PROMPT with Thuki's identity and style guide

Replaces the terse one-liner with a full prompt that establishes Thuki as a context-aware floating secretary (thư ký in Vietnamese), defines a concise response style, explains how to handle quoted context from the host app, and lists core strengths — giving the model a clear persona for every session regardless of whether THUKI_SYSTEM_PROMPT is set.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
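The env-var override with a built-in default presumably reduces to something like this; the constant's text is a placeholder, not the actual prompt, and `load_system_prompt` is the name used in the test descriptions above.

```rust
use std::env;

// Placeholder; the real DEFAULT_SYSTEM_PROMPT is a full identity and
// style guide for Thuki.
const DEFAULT_SYSTEM_PROMPT: &str = "You are Thuki, a context-aware floating secretary.";

// Env override first (dotenvy has already loaded .env at startup),
// built-in default otherwise.
fn load_system_prompt() -> String {
    env::var("THUKI_SYSTEM_PROMPT").unwrap_or_else(|_| DEFAULT_SYSTEM_PROMPT.to_string())
}
```

With this shape, developers customize the persona per machine via .env while every fresh checkout still gets a complete default.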
fix: remove dead coverage branch in system prompt cleanup tests

The `Some(val)` arm of the env-var restoration match was never reached because THUKI_SYSTEM_PROMPT is unset in the test environment, dropping line coverage below 100%. Replaced the branching match with an unconditional `remove_var` call — semantically identical in CI and eliminates the untested branch.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
fix: remove dead if-let branch in system prompt unset test

The `if let Some(val) = original` restore guard in `load_system_prompt_returns_default_when_unset` was never entered because THUKI_SYSTEM_PROMPT is unset in CI, making the Some arm dead code and dropping line coverage below 100%. Dropped the unreachable restoration entirely, matching the pattern used in the other two env-var prompt tests.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
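A minimal sketch of the resulting test shape, with names assumed from the commit text and a placeholder default prompt:

```rust
use std::env;

fn load_system_prompt() -> String {
    env::var("THUKI_SYSTEM_PROMPT").unwrap_or_else(|_| "default prompt".to_string())
}

// Test shape after the fix: cleanup is an unconditional remove_var,
// which is a no-op when the variable is already unset. The old
// `if let Some(val) = original { set_var(...) }` restore is gone —
// in CI the variable is never set, so that branch was dead code.
fn load_system_prompt_returns_default_when_unset() {
    // set_var/remove_var are unsafe as of edition 2024.
    unsafe { env::remove_var("THUKI_SYSTEM_PROMPT") }; // ensure a clean slate
    assert_eq!(load_system_prompt(), "default prompt");
}
```

Dropping the conditional restore trades a theoretical local-environment restoration for a branch-free test that CI can actually cover in full.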
This was referenced Apr 5, 2026
quiet-node added a commit that referenced this pull request on Apr 10, 2026
* feat: multi-turn conversation via Ollama /api/chat with backend-managed history

  Switch from stateless /api/generate to /api/chat so the assistant remembers prior messages within a session. Conversation history is owned by the Rust backend (Mutex<Vec<ChatMessage>>), matching the industry-standard pattern used by ChatGPT, Claude, and Open WebUI. The frontend sends only the new user message; the backend assembles the full messages array, streams the response, and persists it. Adds a configurable system prompt via the THUKI_SYSTEM_PROMPT env var with a sensible default.

* fix: prevent cancelled partials in history + load .env for backend

  - Check cancel_token.is_cancelled() after streaming to skip history persistence for cancelled generations, preventing partial responses from corrupting Ollama context on subsequent turns.
  - Add dotenvy to load .env files at Rust startup so THUKI_SYSTEM_PROMPT works consistently with Vite's VITE_* vars for developers.

* feat: enrich DEFAULT_SYSTEM_PROMPT with Thuki's identity and style guide

  Replaces the terse one-liner with a full prompt that establishes Thuki as a context-aware floating secretary (thư ký in Vietnamese), defines a concise response style, explains how to handle quoted context from the host app, and lists core strengths — giving the model a clear persona for every session regardless of whether THUKI_SYSTEM_PROMPT is set.

* fix: remove dead coverage branch in system prompt cleanup tests

  The `Some(val)` arm of the env-var restoration match was never reached because THUKI_SYSTEM_PROMPT is unset in the test environment, dropping line coverage below 100%. Replaced the branching match with an unconditional `remove_var` call — semantically identical in CI and eliminates the untested branch.

* fix: remove dead if-let branch in system prompt unset test

  The `if let Some(val) = original` restore guard in `load_system_prompt_returns_default_when_unset` was never entered because THUKI_SYSTEM_PROMPT is unset in CI, making the Some arm dead code and dropping line coverage below 100%. Dropped the unreachable restoration entirely, matching the pattern used in the other two env-var prompt tests.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Logan Nguyen <lg.131.dev@gmail.com>
quiet-node added a commit that referenced this pull request on Apr 10, 2026
feat: multi-turn conversation via Ollama /api/chat with backend-managed history
quiet-node added a commit that referenced this pull request on Apr 11, 2026
feat: multi-turn conversation via Ollama /api/chat with backend-managed history
This was referenced Apr 11, 2026
Summary

- Switch from stateless /api/generate to /api/chat so the assistant remembers prior messages within a session
- Conversation history is owned by the Rust backend (ConversationHistory with epoch-based stale-write prevention), matching the industry-standard pattern used by ChatGPT, Claude, and Open WebUI
- Configurable system prompt via the THUKI_SYSTEM_PROMPT env var (loaded by dotenvy so .env files work for both Vite and Rust)

Test plan

- Set THUKI_SYSTEM_PROMPT in .env, restart the app, confirm the model follows it
- bun run test:all — 155 frontend + 57 backend tests pass
- bun run test:coverage — 100% coverage maintained
- bun run validate-build — lint, format, typecheck all clean

🤖 Generated with Claude Code