Conversation
Pull request overview
This PR adds project instruction support and enhances user profile settings with a base conversation style preference. It integrates project-level instructions into the chat completion flow and renames the nickname field to nick_name for consistency while maintaining backward compatibility.
- Adds `base_style` enum (Concise, Friendly, Professional) to profile settings with "Friendly" as default
- Implements project instruction loading and injection via both direct prepending and ProjectInstructionModule
- Renames `nickname` to `nick_name` with backward-compatible JSON unmarshaling
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| services/llm-api/migrations/000005_create_user_settings.up.sql | Updates profile_settings default JSON to include base_style and rename nickname to nick_name |
| services/llm-api/internal/interfaces/httpserver/handlers/usersettingshandler/user_settings_handler.go | Adds validation for base_style enum values |
| services/llm-api/internal/interfaces/httpserver/handlers/chathandler/memory_handler.go | Optimizes LoadMemoryContext to accept pre-loaded settings to avoid redundant database queries |
| services/llm-api/internal/interfaces/httpserver/handlers/chathandler/chat_handler.go | Integrates project service, loads project instructions, and passes settings to prompt orchestration |
| services/llm-api/internal/domain/usersettings/user_settings.go | Adds BaseStyle enum, renames Nickname to NickName, implements backward-compatible JSON marshaling |
| services/llm-api/internal/domain/prompt/types.go | Extends Context with ProjectInstruction and Profile fields |
| services/llm-api/internal/domain/prompt/processor.go | Registers ProjectInstructionModule and UserProfileModule with appropriate priorities |
| services/llm-api/internal/domain/prompt/modules.go | Implements ProjectInstructionModule and UserProfileModule for prompt orchestration |
| services/llm-api/cmd/server/wire_gen.go | Updates dependency injection to pass projectService to ChatHandler |
| docs/api/llm-api/user-settings-api.md | Documents base_style field, nick_name rename, validation rules, and usage examples |
```go
request.Messages = h.resolveMediaPlaceholders(ctx, reqCtx, request.Messages)

// Ensure project instruction is the first system message when available
if projectInstruction != "" {
```
[nitpick] Project instruction is injected here unconditionally, and then ProjectInstructionModule (priority -10) attempts injection again during prompt orchestration. While deduplication via markers prevents actual duplication, this causes unnecessary work when orchestration is enabled. Consider either: (1) only injecting here when h.promptProcessor == nil, or (2) having ProjectInstructionModule.ShouldApply check for existing markers to avoid redundant Apply calls.
Suggested change:

```diff
-if projectInstruction != "" {
+if h.promptProcessor == nil && projectInstruction != "" {
```
```go
}
profile := promptCtx.Profile

// Apply when any personalization field is present (base style defaults to Friendly so non-empty).
```
[nitpick] The comment states "Apply when any personalization field is present" but the logic includes profile.BaseStyle != "" which is always true since BaseStyle defaults to "Friendly". This means the module will apply even for users with no custom personalization. If this is intentional (to inject default style instructions), consider updating the comment to clarify this behavior, e.g., "Apply when base style or any personalization field is set (base style defaults to Friendly)".
Suggested change:

```diff
-// Apply when any personalization field is present (base style defaults to Friendly so non-empty).
+// Apply when base style or any personalization field is set (base style defaults to Friendly, so this module will apply even for users with no custom personalization).
```
```go
if strings.TrimSpace(string(style)) != "" {
	return fmt.Sprintf("Use the user's preferred style: %s.", style)
}
```
[nitpick] The default case handles non-empty unknown styles by formatting them as "Use the user's preferred style: %s." However, this could be misleading if an invalid BaseStyle value somehow bypasses validation. Since BaseStyle is validated at the API layer (lines 83-87 in user_settings_handler.go), consider returning an empty string for invalid values instead, or document that this fallback is intentional for extensibility.
```go
	return messages
}

// getProjectInstruction loads the project instruction for the conversation, falling back to the stored snapshot.
```
The function comment says "falling back to the stored snapshot" but the implementation actually prioritizes the snapshot first (lines 481-485) and falls back to loading from the project service (lines 496-505). The comment should be corrected to: "loads the project instruction for the conversation, preferring the stored snapshot if available".
Suggested change:

```diff
-// getProjectInstruction loads the project instruction for the conversation, falling back to the stored snapshot.
+// getProjectInstruction loads the project instruction for the conversation, preferring the stored snapshot if available.
```
```go
if conv == nil || h.projectService == nil {
	return ""
}
if ctx != nil && ctx.Err() != nil {
```
The context cancellation check on line 477 accesses ctx.Err() without first verifying ctx is not nil. The nil check should be performed first: if ctx == nil || ctx.Err() != nil. While the function may currently be called with a non-nil context, this could lead to a panic if the calling pattern changes.
Suggested change:

```diff
-if ctx != nil && ctx.Err() != nil {
+if ctx == nil || ctx.Err() != nil {
```
```go
proj, err := h.projectService.GetProjectByPublicIDAndUserID(ctx, projectID, userID)
if err != nil {
	log := logger.GetLogger()
	log.Warn().
		Err(err).
		Str("conversation_id", conv.PublicID).
		Str("project_id", projectID).
		Msg("failed to load project instruction")
	return ""
}
```
[nitpick] This function loads the project from the database on every chat request when EffectiveInstructionSnapshot is nil but ProjectPublicID is set. Consider caching the project instruction or ensuring the snapshot is always populated when a conversation is associated with a project to avoid repeated database queries in high-volume scenarios.