Feat/project instruction #265

Merged
locnguyen1986 merged 2 commits into main from feat/project-instruction on Nov 25, 2025
Conversation

locnguyen1986 (Collaborator) commented Nov 25, 2025

  • Adds base_style enum (Concise, Friendly, Professional) to profile settings with "Friendly" as default
  • Implements project instruction loading and injection via both direct prepending and ProjectInstructionModule
  • Renames nickname to nick_name with backward-compatible JSON unmarshaling

Copilot AI (Contributor) left a comment

Pull request overview

This PR adds project instruction support and enhances user profile settings with a base conversation style preference. It integrates project-level instructions into the chat completion flow and renames the nickname field to nick_name for consistency while maintaining backward compatibility.

  • Adds base_style enum (Concise, Friendly, Professional) to profile settings with "Friendly" as default
  • Implements project instruction loading and injection via both direct prepending and ProjectInstructionModule
  • Renames nickname to nick_name with backward-compatible JSON unmarshaling

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 6 comments.

| File | Description |
| --- | --- |
| services/llm-api/migrations/000005_create_user_settings.up.sql | Updates the profile_settings default JSON to include base_style and renames nickname to nick_name |
| services/llm-api/internal/interfaces/httpserver/handlers/usersettingshandler/user_settings_handler.go | Adds validation for base_style enum values |
| services/llm-api/internal/interfaces/httpserver/handlers/chathandler/memory_handler.go | Optimizes LoadMemoryContext to accept pre-loaded settings, avoiding redundant database queries |
| services/llm-api/internal/interfaces/httpserver/handlers/chathandler/chat_handler.go | Integrates the project service, loads project instructions, and passes settings to prompt orchestration |
| services/llm-api/internal/domain/usersettings/user_settings.go | Adds the BaseStyle enum, renames Nickname to NickName, and implements backward-compatible JSON unmarshaling |
| services/llm-api/internal/domain/prompt/types.go | Extends Context with ProjectInstruction and Profile fields |
| services/llm-api/internal/domain/prompt/processor.go | Registers ProjectInstructionModule and UserProfileModule with appropriate priorities |
| services/llm-api/internal/domain/prompt/modules.go | Implements ProjectInstructionModule and UserProfileModule for prompt orchestration |
| services/llm-api/cmd/server/wire_gen.go | Updates dependency injection to pass projectService to ChatHandler |
| docs/api/llm-api/user-settings-api.md | Documents the base_style field, the nick_name rename, validation rules, and usage examples |


request.Messages = h.resolveMediaPlaceholders(ctx, reqCtx, request.Messages)

// Ensure project instruction is the first system message when available
if projectInstruction != "" {
Copilot AI commented Nov 25, 2025:

[nitpick] Project instruction is injected here unconditionally, and then ProjectInstructionModule (priority -10) attempts injection again during prompt orchestration. While deduplication via markers prevents actual duplication, this causes unnecessary work when orchestration is enabled. Consider either: (1) only injecting here when h.promptProcessor == nil, or (2) having ProjectInstructionModule.ShouldApply check for existing markers to avoid redundant Apply calls.

Suggested change:
- if projectInstruction != "" {
+ if h.promptProcessor == nil && projectInstruction != "" {
}
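
Option (2) from the review, a marker check in ShouldApply, could look roughly like this. The marker string, Message type, and function name here are illustrative assumptions, not the repository's actual identifiers:

```go
package main

import (
	"fmt"
	"strings"
)

// marker is a hypothetical sentinel used to tag injected project instructions.
const marker = "<!-- project-instruction -->"

// Message is a minimal stand-in for the chat message type.
type Message struct {
	Role    string
	Content string
}

// shouldApply skips the module when any system message already
// carries the marker, avoiding a redundant Apply call.
func shouldApply(messages []Message) bool {
	for _, m := range messages {
		if m.Role == "system" && strings.Contains(m.Content, marker) {
			return false
		}
	}
	return true
}

func main() {
	msgs := []Message{{Role: "system", Content: marker + " Be terse."}}
	fmt.Println(shouldApply(msgs)) // false
	fmt.Println(shouldApply(nil))  // true
}
```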
profile := promptCtx.Profile

// Apply when any personalization field is present (base style defaults to Friendly so non-empty).
Copilot AI commented Nov 25, 2025:

[nitpick] The comment states "Apply when any personalization field is present" but the logic includes profile.BaseStyle != "" which is always true since BaseStyle defaults to "Friendly". This means the module will apply even for users with no custom personalization. If this is intentional (to inject default style instructions), consider updating the comment to clarify this behavior, e.g., "Apply when base style or any personalization field is set (base style defaults to Friendly)".

Suggested change:
- // Apply when any personalization field is present (base style defaults to Friendly so non-empty).
+ // Apply when base style or any personalization field is set (base style defaults to Friendly, so this module will apply even for users with no custom personalization).
Comment on lines +338 to +340
if strings.TrimSpace(string(style)) != "" {
return fmt.Sprintf("Use the user's preferred style: %s.", style)
}
Copilot AI commented Nov 25, 2025:

[nitpick] The default case handles non-empty unknown styles by formatting them as "Use the user's preferred style: %s." However, this could be misleading if an invalid BaseStyle value somehow bypasses validation. Since BaseStyle is validated at the API layer (lines 83-87 in user_settings_handler.go), consider returning an empty string for invalid values instead, or document that this fallback is intentional for extensibility.

Suggested change:
if strings.TrimSpace(string(style)) != "" {
    return fmt.Sprintf("Use the user's preferred style: %s.", style)
}
return messages
}
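
The reviewer's suggestion of returning an empty string for unknown styles could be sketched as below. The enum values match the PR description, but the function shape is an assumption rather than the PR's actual code:

```go
package main

import "fmt"

// BaseStyle mirrors the enum described in the PR summary.
type BaseStyle string

const (
	StyleConcise      BaseStyle = "Concise"
	StyleFriendly     BaseStyle = "Friendly"
	StyleProfessional BaseStyle = "Professional"
)

// styleInstruction returns a prompt fragment for known styles and an
// empty string for anything else, so an invalid stored value is
// silently ignored rather than echoed into the prompt.
func styleInstruction(style BaseStyle) string {
	switch style {
	case StyleConcise, StyleFriendly, StyleProfessional:
		return fmt.Sprintf("Use the user's preferred style: %s.", style)
	default:
		return ""
	}
}

func main() {
	fmt.Println(styleInstruction(StyleFriendly))
	fmt.Println(styleInstruction("Sarcastic") == "") // true
}
```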

// getProjectInstruction loads the project instruction for the conversation, falling back to the stored snapshot.
Copilot AI commented Nov 25, 2025:

The function comment says "falling back to the stored snapshot" but the implementation actually prioritizes the snapshot first (lines 481-485) and falls back to loading from the project service (lines 496-505). The comment should be corrected to: "loads the project instruction for the conversation, preferring the stored snapshot if available".

Suggested change:
- // getProjectInstruction loads the project instruction for the conversation, falling back to the stored snapshot.
+ // getProjectInstruction loads the project instruction for the conversation, preferring the stored snapshot if available.
if conv == nil || h.projectService == nil {
return ""
}
if ctx != nil && ctx.Err() != nil {
Copilot AI commented Nov 25, 2025:

The context cancellation check on line 477 accesses ctx.Err() without first verifying ctx is not nil. The nil check should be performed first: if ctx == nil || ctx.Err() != nil. While the function may currently be called with a non-nil context, this could lead to a panic if the calling pattern changes.

Suggested change:
- if ctx != nil && ctx.Err() != nil {
+ if ctx == nil || ctx.Err() != nil {
Comment on lines +496 to +505
proj, err := h.projectService.GetProjectByPublicIDAndUserID(ctx, projectID, userID)
if err != nil {
log := logger.GetLogger()
log.Warn().
Err(err).
Str("conversation_id", conv.PublicID).
Str("project_id", projectID).
Msg("failed to load project instruction")
return ""
}
Copilot AI commented Nov 25, 2025:

[nitpick] This function loads the project from the database on every chat request when EffectiveInstructionSnapshot is nil but ProjectPublicID is set. Consider caching the project instruction or ensuring the snapshot is always populated when a conversation is associated with a project to avoid repeated database queries in high-volume scenarios.
@locnguyen1986 locnguyen1986 merged commit e9c4d04 into main Nov 25, 2025
13 checks passed
@github-project-automation github-project-automation bot moved this to QA in Jan Nov 25, 2025
@locnguyen1986 locnguyen1986 deleted the feat/project-instruction branch November 25, 2025 17:24

Labels: None yet

Projects: Status: QA

2 participants