fix: chat file attachments - routing, vision API, and thumbnail display #5291
Merged
Conversation
Route messages with file attachments to file_chat_question node, add thread-safe queue operations via call_soon_threadsafe for sync-to-async bridging in file chat streaming. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
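The sync-to-async bridge described above can be sketched as follows. This is a minimal, illustrative version (class and method names mirror the PR, but the code is not the project's exact implementation): a synchronous producer running on a worker thread must not call `queue.put_nowait` on an `asyncio.Queue` directly; scheduling the call through `loop.call_soon_threadsafe` is safe.

```python
import asyncio
import threading

class AsyncStreamingCallback:
    def __init__(self):
        self.queue = asyncio.Queue()
        # Construct inside a running event loop so the captured loop
        # reference is valid for call_soon_threadsafe.
        self._loop = asyncio.get_running_loop()

    def put_data_nowait(self, text: str) -> None:
        # Safe to call from any thread: the put is executed on the loop thread.
        self._loop.call_soon_threadsafe(self.queue.put_nowait, text)

    def end_nowait(self) -> None:
        # None acts as an end-of-stream sentinel in this sketch.
        self._loop.call_soon_threadsafe(self.queue.put_nowait, None)

async def consume():
    cb = AsyncStreamingCallback()
    # Simulate a synchronous SDK streaming chunks from a worker thread.
    threading.Thread(
        target=lambda: (cb.put_data_nowait("chunk"), cb.end_nowait())
    ).start()
    chunks = []
    while (item := await cb.queue.get()) is not None:
        chunks.append(item)
    return chunks

print(asyncio.run(consume()))  # → ['chunk']
```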
Ensure a chat session exists before processing file attachments, so file_ids are properly associated with the session. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Images uploaded with purpose="vision" are incompatible with the Assistants API. Route all-image chats through Chat Completions with base64-encoded images instead. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
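A sketch of the Chat Completions routing for all-image chats: the image is inlined as a base64 data URL in an `image_url` content part, sidestepping file uploads entirely. The prompt, bytes, and model mentioned in the comment are placeholders; only the message shape follows the Chat Completions vision format.

```python
import base64

def image_message(prompt: str, image_bytes: bytes, mime: str = "image/png") -> dict:
    """Build a Chat Completions user message with an inline base64 image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {
                "type": "image_url",
                "image_url": {"url": f"data:{mime};base64,{b64}"},
            },
        ],
    }

msg = image_message("What is in this image?", b"\x89PNG...")
# The message would then be passed to
# client.chat.completions.create(model=..., messages=[msg], stream=True)
```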
Replace transfer_manager bulk upload with individual blob uploads that call make_public() so thumbnail URLs don't return 403. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
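The per-blob upload described above might look like the sketch below, assuming the `google-cloud-storage` client library. Bucket name, object prefix, and the helper itself are hypothetical; the key point is that each blob gets `make_public()` so its public URL serves without a 403.

```python
def gcs_public_url(bucket: str, blob_name: str) -> str:
    # Canonical URL for a publicly readable GCS object.
    return f"https://storage.googleapis.com/{bucket}/{blob_name}"

def upload_chat_files(bucket_name: str, local_paths: list[str]) -> list[str]:
    """Upload files one by one and make each public (hypothetical helper)."""
    from google.cloud import storage  # requires google-cloud-storage

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    urls = []
    for path in local_paths:
        blob = bucket.blob(f"chat_files/{path}")
        blob.upload_from_filename(path)
        blob.make_public()  # without this, unauthenticated thumbnail GETs return 403
        urls.append(blob.public_url)
    return urls
```

The trade-off versus `transfer_manager` is losing parallel bulk upload in exchange for being able to set per-object ACLs as each upload completes.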
Use FileImage for local paths instead of CachedNetworkImage, so thumbnails display immediately for just-sent messages without waiting for GCS network access. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Override thumbnail with selectedFiles local path in addMessageLocally so images display immediately instead of relying on GCS URLs. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Contributor
Greptile Summary

This PR fixes multiple issues with chat file attachments across the full stack.

Backend changes:
- Route messages with file attachments to the `file_chat_question` node in LangGraph
- Make `AsyncStreamingCallback` thread-safe via `call_soon_threadsafe`
- Auto-create a chat session when files are attached but no session exists
- Route all-image chats through the Chat Completions API with base64-encoded images
- Call `blob.make_public()` on individual GCS uploads so thumbnail URLs are publicly readable

Flutter changes:
- Use `FileImage` for local paths so thumbnails display immediately
- Override the thumbnail with the selected file's local path in `addMessageLocally`

Issues found:
- `callback` is not checked for `None` before its methods are called
- `asyncio.get_event_loop()` is deprecated in the constructor
Confidence Score: 3/5
Important Files Changed
Sequence Diagram

```mermaid
sequenceDiagram
    participant Flutter as Flutter App
    participant Backend as FastAPI Backend
    participant Graph as LangGraph Router
    participant FileChat as FileChatTool
    participant OpenAI as OpenAI API
    participant GCS as Google Cloud Storage

    Flutter->>GCS: Upload image file
    GCS-->>Flutter: Return GCS URL
    Note over GCS: blob.make_public()
    Flutter->>Flutter: Set local path as thumbnail
    Flutter->>Backend: POST /v2/messages (with file_ids)
    Backend->>Backend: Auto-create chat_session if needed
    Backend->>Graph: execute_graph_chat_stream()
    Graph->>Graph: determine_conversation_type()
    alt Message has file attachments
        Graph->>FileChat: file_chat_question()
        FileChat->>FileChat: Check if all files are images
        alt All images
            FileChat->>OpenAI: Chat Completions API (vision)
            Note over FileChat,OpenAI: Uses gpt-4.1 with base64 images
            OpenAI-->>FileChat: Stream response
            FileChat->>Graph: Stream via AsyncStreamingCallback
        else Non-image files
            FileChat->>OpenAI: Assistants API
            OpenAI-->>FileChat: Stream response
        end
    end
    Graph-->>Backend: Stream chunks
    Backend-->>Flutter: SSE stream
    Flutter->>Flutter: Display with local thumbnail
```
Last reviewed commit: 2650ef7
Comment on lines +143 to +145

```python
callback.put_data_nowait(delta.content)
output_list.append(delta.content)
callback.end_nowait()
```
Contributor
callback not checked for None before calling methods
Suggested change

```diff
-callback.put_data_nowait(delta.content)
-output_list.append(delta.content)
-callback.end_nowait()
+if callback:
+    callback.put_data_nowait(delta.content)
+output_list.append(delta.content)
+if callback:
+    callback.end_nowait()
```
```python
class AsyncStreamingCallback(BaseCallbackHandler):
    def __init__(self):
        self.queue = asyncio.Queue()
        self._loop = asyncio.get_event_loop()
```
Contributor
`asyncio.get_event_loop()` is deprecated here; use `asyncio.get_running_loop()` instead, or store the loop lazily when it is first needed in the `call_soon_threadsafe` calls.
Suggested change

```diff
-        self._loop = asyncio.get_event_loop()
+        self._loop = None  # Lazy initialization
```
```diff
 def put_thought_nowait(self, text):
-    self.queue.put_nowait(f"think: {text}")
+    self._loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")
```
Contributor
Implement lazy loop initialization here as well, if the change above is applied.
Suggested change

```diff
-        self._loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")
+        loop = self._loop or asyncio.get_running_loop()
+        loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")
```
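Putting the reviewer's two suggestions together, a lazily initialized callback might look like the sketch below. The class and `bind_loop` helper are illustrative, not the project's code: the constructor never touches `asyncio.get_event_loop()`, and the loop is captured only once code is running inside an event loop.

```python
import asyncio

class AsyncStreamingCallback:
    def __init__(self):
        self.queue = asyncio.Queue()
        self._loop = None  # captured lazily, never via get_event_loop()

    def bind_loop(self) -> None:
        # Call once from async code, e.g. at the start of the stream handler.
        self._loop = asyncio.get_running_loop()

    def put_thought_nowait(self, text: str) -> None:
        # Fall back to the running loop if bind_loop was never called.
        loop = self._loop or asyncio.get_running_loop()
        loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")

async def main():
    cb = AsyncStreamingCallback()
    cb.bind_loop()
    cb.put_thought_nowait("routing")
    return await cb.queue.get()

assert asyncio.run(main()) == "think: routing"
```

Note the fallback only works when the caller is on the loop thread; worker threads must rely on the loop captured by `bind_loop`.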
Glucksberg pushed a commit to Glucksberg/omi-local that referenced this pull request on Apr 28, 2026:
…ay (BasedHardware#5291)

## Summary
- **Backend routing**: Enable `file_chat_question` node in LangGraph — messages with file attachments now route correctly instead of falling through to no-context conversation
- **Vision API**: Use Chat Completions API (not Assistants API) for image-only chats — files uploaded with `purpose="vision"` are incompatible with Assistants API
- **Thread safety**: Fix `AsyncStreamingCallback` to use `call_soon_threadsafe` for sync-to-async queue bridging in file chat streaming
- **Chat session**: Auto-create chat session when files are attached but no session exists yet
- **GCS thumbnails**: Make uploaded chat file thumbnails publicly readable via `blob.make_public()`
- **Flutter thumbnails**: Use local file paths (`FileImage`) for just-sent image thumbnails instead of relying on GCS network URLs

## Test plan
- [ ] Attach an image in chat and send a message — AI should analyze the image content
- [ ] Image thumbnail should display in the sent message bubble (not show error icon)
- [ ] Attach a non-image file and ask about its contents — should route through Assistants API
- [ ] Follow-up questions about previously uploaded files should still work

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Summary
- Enable the `file_chat_question` node in LangGraph — messages with file attachments now route correctly instead of falling through to no-context conversation
- Use the Chat Completions API (not the Assistants API) for image-only chats — files uploaded with `purpose="vision"` are incompatible with the Assistants API
- Fix `AsyncStreamingCallback` to use `call_soon_threadsafe` for sync-to-async queue bridging in file chat streaming
- Auto-create a chat session when files are attached but no session exists yet
- Make uploaded chat file thumbnails publicly readable via `blob.make_public()`
- Use local file paths (`FileImage`) for just-sent image thumbnails instead of relying on GCS network URLs

Test plan
- [ ] Attach an image in chat and send a message — AI should analyze the image content
- [ ] Image thumbnail should display in the sent message bubble (not show error icon)
- [ ] Attach a non-image file and ask about its contents — should route through Assistants API
- [ ] Follow-up questions about previously uploaded files should still work
🤖 Generated with Claude Code