feat: story prompt customisation, LLM Q&A chat, and filter execution fix #142
Conversation
Code Review
This pull request introduces an interactive Q&A feature for network traffic stories, enabling users to ask the LLM specific questions about the analysis. Key changes include the addition of a StoryChat frontend component, new backend endpoints for Q&A and context-aware story generation, and updates to the tshark filtering logic. The review feedback suggests improving the API controllers by using UUID types for path variables to leverage Spring's automatic validation and removing redundant fully qualified class names in the service layer.
…fix (#135)

Story enhancements:
- Add optional additional-context textarea to story generation (frontend + backend)
- Collapse StoryInfoCard by default; move context field inside it below the info
- Add StoryChat component: LLM Q&A with continuous conversation history, suggested follow-up questions generated per response, scroll-within-card (no page jump)
- LLM generates 3 dynamic suggested questions when story is created (suggestedQuestions in JSON schema); follow-up questions regenerated after each answer
- Fix Hibernate PreparedStatementDetails warning: remove @CreationTimestamp, mark created_at as insertable=false (DB DEFAULT CURRENT_TIMESTAMP handles it)

Filter fix:
- Replace -Y (single-pass display filter) with -2 -R (two-pass read filter) in both validateDisplayFilter and filterPackets; tshark 3.6.2 silently drops -T fields output when -Y is combined with it
- Fix tcp.flags.rst → tcp.flags.reset: the invalid field name caused tshark to reject the entire command and return zero output regardless of filter matches

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
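The filter fix above can be sketched as the argument list the service would now build. This is an illustrative reconstruction, not the PR's actual `FilterService` code: the class and method names here are hypothetical, but the flag change (`-Y` replaced by `-2 -R`) and the corrected `tcp.flags.reset` field name follow the commit message.

```java
import java.util.List;

// Hypothetical sketch of the tshark invocation after the fix.
// -2 enables two-pass analysis; -R applies a read filter on the
// second pass. The single-pass -Y display filter is avoided because
// tshark 3.6.2 silently drops -T fields output when -Y is combined
// with it.
public class TsharkArgsSketch {
    public static List<String> buildFilterArgs(String pcapPath, String filter) {
        return List.of(
            "tshark",
            "-r", pcapPath,
            "-2",               // two-pass analysis
            "-R", filter,       // read filter, applied on the second pass
            "-T", "fields",     // field-based output, kept intact under -2 -R
            "-e", "frame.number"
        );
    }

    public static void main(String[] args) {
        // e.g. the corrected field name: tcp.flags.reset, not tcp.flags.rst
        System.out.println(buildFilterArgs("capture.pcap", "tcp.flags.reset == 1"));
    }
}
```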
Force-pushed from 416ab7b to 9b5a7ec

/gemini review
Code Review
This pull request introduces an interactive Q&A feature for network traffic stories, allowing users to ask follow-up questions to the LLM. Key changes include new backend endpoints and service logic for handling conversation history and suggested questions, along with the addition of a StoryChat frontend component. The StoryInfoCard was also updated to allow users to provide additional context during story generation. Furthermore, tshark command arguments were adjusted for improved filtering compatibility. The review feedback suggests enhancing type safety by using UUID for path variables and simplifying redundant class references.
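The conversation-history handling the review describes can be sketched as follows. This is a hypothetical illustration, assuming the service flattens prior Q&A turns plus the new question into a single prompt for the LLM call; the record and method names are invented for the sketch and are not the PR's actual code.

```java
import java.util.List;

// Hypothetical sketch: flatten the conversation history carried in the
// request (prior question/answer pairs) plus the new question into one
// prompt string for a single LLM call.
public class ChatPromptSketch {
    // Stand-in for the PR's StoryQuestionRequest.HistoryEntry DTO.
    public record HistoryEntry(String question, String answer) {}

    public static String buildPrompt(List<HistoryEntry> history, String newQuestion) {
        StringBuilder sb = new StringBuilder();
        for (HistoryEntry e : history) {
            sb.append("Q: ").append(e.question()).append('\n');
            sb.append("A: ").append(e.answer()).append('\n');
        }
        // The new question goes last, so the model answers it in context.
        sb.append("Q: ").append(newQuestion).append('\n');
        return sb.toString();
    }
}
```

Sending the full history with every request keeps the backend stateless: no server-side session is needed, at the cost of a prompt that grows with the conversation.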
- Use UUID path variable types directly in StoryController (generateStory, askQuestion) so Spring handles conversion and returns 400 for invalid UUIDs instead of 500
- Remove redundant UUID.fromString() calls now that path vars are already UUID
- Use StoryQuestionRequest.HistoryEntry (short form) instead of fully qualified name since com.tracepcap.story.dto.* is already imported

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
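The reason behind the first change can be shown with plain JDK code (no Spring here, just the underlying behaviour): `UUID.fromString` throws `IllegalArgumentException` on malformed input. Called inside a handler body, that surfaces as an unhandled 500; declared as a typed `@PathVariable UUID`, Spring's built-in converter fails before the handler runs and maps the failure to a 400.

```java
import java.util.UUID;

// Demonstrates the failure mode behind the review suggestion:
// UUID.fromString throws IllegalArgumentException for malformed input,
// which a controller must catch itself to avoid a 500. Typed UUID path
// variables move that validation into Spring's conversion layer.
public class UuidCheckSketch {
    public static boolean isValidUuid(String raw) {
        try {
            UUID.fromString(raw);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }
}
```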
Summary
- `StoryChat` component with continuous LLM conversation, dynamic suggested questions on generation and after each answer
- Removed `@CreationTimestamp` from `StoryEntity.created_at`; let DB `DEFAULT CURRENT_TIMESTAMP` handle it instead of causing a `PreparedStatementDetails` warning in the batch pipeline
- tshark silently drops `-T fields` output when `-Y` is combined with it; replaced with `-2 -R` (two-pass read filter) in both `validateDisplayFilter` and `filterPackets`

Changes
Backend
- `GenerateStoryRequest`: optional `additionalContext` body field for `POST /story/generate/{fileId}`
- `StoryQuestionRequest`: question + full conversation `history` for `POST /story/{storyId}/ask`
- `StoryAnswerResponse`: includes `followUpQuestions: List<String>` alongside `answer`
- `StoryService`: threads `additionalContext` into the user prompt; `askQuestion` returns JSON `{answer, followUpQuestions}` in one LLM call; `suggestedQuestions` added to story generation schema and parsed
- `StoryEntity`: `created_at` marked `insertable=false`, `@CreationTimestamp` removed
- `FilterService`: `validateDisplayFilter` and `filterPackets` changed from `-Y` to `-2 -R`

Frontend
- `StoryInfoCard`: collapsible (default closed), context textarea inside card body with suggestion placeholder text
- `StoryChat`: chat bubble UI, scrolls within card (no page jump), suggestion buttons above input, suggestions refreshed after each answer, conversation history sent with every request
- `StoryPage`: wires context + storyId through to both components
- `Story` type: `suggestedQuestions?: string[]`

Test plan
🤖 Generated with Claude Code