Conversation
Pull request overview
This pull request implements AI conversation history functionality for the backend, adding support for maintaining conversation context across multiple AI interactions. The changes enable tracking of conversation history and response IDs for different AI providers (OpenAI and Bedrock).
Changes:
- Added a database migration to add a `response_id` column to the `ai_chat_message` table
- Extended the AI chat message entity and repository to support storing and retrieving response IDs
- Modified the AI request use case to build messages with conversation history and pass response IDs through the AI provider configuration
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| backend/src/migrations/1769790101930-AddResponseIdToAiChatMessage.ts | Database migration adding response_id column (has timestamp issue) |
| backend/src/entities/ai/ai-conversation-history/ai-chat-messages/ai-chat-message.entity.ts | Added response_id field to entity |
| backend/src/entities/ai/ai-conversation-history/ai-chat-messages/repository/ai-chat-message-repository.interface.ts | Added findLastAiMessageForChat method and responseId parameter to saveMessage |
| backend/src/entities/ai/ai-conversation-history/ai-chat-messages/repository/ai-chat-message-repository.extension.ts | Implemented new repository methods for conversation history |
| backend/src/entities/ai/use-cases/request-info-from-table-with-ai-v7.use.case.ts | Integrated conversation history building and response ID tracking into the AI request flow |
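Judging by the filename, the migration follows TypeORM's `<timestamp>-<ClassName>.ts` convention, so its body is presumably close to this sketch (the `varchar` column type is an assumption; `isNullable: true` matches the nullable schema noted in the comments below):

```ts
import { MigrationInterface, QueryRunner, TableColumn } from 'typeorm';

// Sketch of the migration's likely shape; the varchar type is an assumption,
// and isNullable: true matches the nullable schema mentioned in the review.
export class AddResponseIdToAiChatMessage1769790101930 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.addColumn(
      'ai_chat_message',
      new TableColumn({ name: 'response_id', type: 'varchar', isNullable: true }),
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.dropColumn('ai_chat_message', 'response_id');
  }
}
```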
```ts
const config: AIProviderConfig = {};
if (this.aiProvider === AIProviderType.OPENAI && previousResponseId) {
  config.previousResponseId = previousResponseId;
}
```
The aiProvider is hardcoded to AIProviderType.BEDROCK (line 40), but there's conditional logic checking for AIProviderType.OPENAI (lines 107-109). Unless the aiProvider can be changed at runtime or through configuration, this OpenAI-specific code will never execute. If OpenAI support is intended, consider making the aiProvider configurable. If not, the OpenAI-specific logic should be removed to avoid confusion.
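If OpenAI support is intended, the provider could be resolved from configuration instead of a constant. A minimal sketch, assuming an `AI_PROVIDER` environment variable and a Bedrock fallback (both names are illustrative, not part of this PR):

```ts
import { AIProviderType } from './ai-provider.enum'; // import path assumed

// Resolve the active provider from the environment instead of hardcoding it.
// AI_PROVIDER and the Bedrock default are illustrative assumptions.
export function resolveAiProvider(): AIProviderType {
  return process.env.AI_PROVIDER === 'OPENAI'
    ? AIProviderType.OPENAI
    : AIProviderType.BEDROCK;
}
```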
```ts
  foundConnection,
);

if (this.aiProvider === AIProviderType.OPENAI && lastResponseId) {
```
This conditional check for OpenAI will never be true since aiProvider is hardcoded to AIProviderType.BEDROCK (line 40). The same issue exists on lines 107-109. Consider removing this dead code or making the aiProvider configurable.
Suggested change:

```diff
- if (this.aiProvider === AIProviderType.OPENAI && lastResponseId) {
+ if (lastResponseId) {
```
```ts
if (this.aiProvider === AIProviderType.OPENAI) {
  const lastAiMessage = await this._dbContext.aiChatMessageRepository.findLastAiMessageForChat(chatId);
  const previousResponseId = lastAiMessage?.response_id || null;
  const messages = new MessageBuilder().system(systemPrompt).human(userMessage).build();
  return { messages, previousResponseId };
}
```
This conditional check for OpenAI will never be true since aiProvider is hardcoded to AIProviderType.BEDROCK (line 40). The entire if-block (lines 385-390) will never execute, meaning this OpenAI-specific optimization is dead code.
Suggested change:

```diff
- if (this.aiProvider === AIProviderType.OPENAI) {
-   const lastAiMessage = await this._dbContext.aiChatMessageRepository.findLastAiMessageForChat(chatId);
-   const previousResponseId = lastAiMessage?.response_id || null;
-   const messages = new MessageBuilder().system(systemPrompt).human(userMessage).build();
-   return { messages, previousResponseId };
- }
```
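For context, if the OpenAI path is kept (i.e. the provider becomes configurable), `previousResponseId` maps onto the Responses API's `previous_response_id` parameter, which restores the earlier conversation server-side so only the new user message needs to be sent. A rough sketch using the official `openai` Node SDK (the model name is a placeholder):

```ts
import OpenAI from 'openai';

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Chain a follow-up turn onto a previous response: OpenAI reloads the
// earlier context server-side, so only the new user message is sent.
async function continueConversation(userMessage: string, previousResponseId?: string) {
  const response = await client.responses.create({
    model: 'gpt-4o', // placeholder model
    input: userMessage,
    ...(previousResponseId ? { previous_response_id: previousResponseId } : {}),
  });
  return { text: response.output_text, responseId: response.id };
}
```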
```ts
builder.human(userMessage);
```
For non-OpenAI providers with existing chats, the user message is saved to the database first (line 90), then retrieved as part of the history (line 392), and then added again to the message builder (line 403). This results in the current user message being included twice in the conversation context sent to the AI provider. This duplication could confuse the AI and waste tokens. Consider either excluding the most recent user message from the history retrieval, or not saving it until after building the messages.
Suggested change:

```diff
- builder.human(userMessage);
```
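The suggestion above removes the re-add; the other option from the comment is to trim the just-saved user message off the retrieved history before replaying it. A sketch of that variant, assuming history entries have a `{ role, content }` shape and that `MessageBuilder` exposes an `ai()` counterpart to `human()` (neither is confirmed by this diff):

```ts
// Drop the just-saved user message from the tail of the retrieved history
// so it enters the builder exactly once, via builder.human(userMessage).
// The { role, content } entry shape is an assumption for illustration.
const last = history[history.length - 1];
const trimmedHistory =
  last && last.role === MessageRole.human && last.content === userMessage
    ? history.slice(0, -1)
    : history;

for (const entry of trimmedHistory) {
  if (entry.role === MessageRole.human) {
    builder.human(entry.content);
  } else {
    builder.ai(entry.content); // assumed builder method for AI turns
  }
}
builder.human(userMessage);
```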
```ts
if (accumulatedResponse) {
  await this._dbContext.aiChatMessageRepository.saveMessage(
    foundUserAiChat.id,
    accumulatedResponse,
    MessageRole.ai,
    lastResponseId,
  );
```
The lastResponseId could be null if no response chunks contain a responseId. Saving a null response_id to the database is valid per the schema (nullable: true), but consider whether this scenario should be logged or handled specially, as it might indicate an unexpected issue with the AI provider's response format.
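If that scenario is worth surfacing, a guard before the save would cover it. A sketch assuming a NestJS-style `Logger` (the logger context and message wording are illustrative):

```ts
import { Logger } from '@nestjs/common'; // assuming a NestJS backend

if (accumulatedResponse) {
  // A null responseId is valid per the schema, but may indicate an
  // unexpected provider response format, so record it before saving.
  if (!lastResponseId) {
    Logger.warn(`No responseId in AI response for chat ${foundUserAiChat.id}`, 'AiRequestUseCase');
  }
  await this._dbContext.aiChatMessageRepository.saveMessage(
    foundUserAiChat.id,
    accumulatedResponse,
    MessageRole.ai,
    lastResponseId,
  );
}
```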