
fix: chat file attachments - routing, vision API, and thumbnail display#5291

Merged
kodjima33 merged 6 commits into main from fix/chat-file-attachments on Mar 3, 2026

Conversation

@kodjima33
Collaborator

Summary

  • Backend routing: Enable file_chat_question node in LangGraph — messages with file attachments now route correctly instead of falling through to no-context conversation
  • Vision API: Use Chat Completions API (not Assistants API) for image-only chats — files uploaded with purpose="vision" are incompatible with Assistants API
  • Thread safety: Fix AsyncStreamingCallback to use call_soon_threadsafe for sync-to-async queue bridging in file chat streaming
  • Chat session: Auto-create chat session when files are attached but no session exists yet
  • GCS thumbnails: Make uploaded chat file thumbnails publicly readable via blob.make_public()
  • Flutter thumbnails: Use local file paths (FileImage) for just-sent image thumbnails instead of relying on GCS network URLs
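A minimal sketch of the first bullet's routing change, assuming a LangGraph-style conditional edge (the node name file_chat_question is from this PR; the message shape and the fallback branch name are illustrative):

```python
# Sketch of the conditional routing described above. The node name
# file_chat_question comes from this PR; the message dict shape and the
# no-context branch name are illustrative assumptions.

def route_message(message: dict) -> str:
    """Return the next LangGraph node for an incoming chat message."""
    if message.get("file_ids"):
        # Previously this branch was disabled, so messages with
        # attachments fell through to the no-context conversation path.
        return "file_chat_question"
    return "no_context_conversation"
```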

Test plan

  • Attach an image in chat and send a message — AI should analyze the image content
  • Image thumbnail should display in the sent message bubble (not show error icon)
  • Attach a non-image file and ask about its contents — should route through Assistants API
  • Follow-up questions about previously uploaded files should still work

🤖 Generated with Claude Code

kodjima33 and others added 6 commits March 2, 2026 20:06
Route messages with file attachments to file_chat_question node,
add thread-safe queue operations via call_soon_threadsafe for
sync-to-async bridging in file chat streaming.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Ensure a chat session exists before processing file attachments,
so file_ids are properly associated with the session.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Images uploaded with purpose="vision" are incompatible with the
Assistants API. Route all-image chats through Chat Completions
with base64-encoded images instead.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace transfer_manager bulk upload with individual blob uploads
that call make_public() so thumbnail URLs don't return 403.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Use FileImage for local paths instead of CachedNetworkImage,
so thumbnails display immediately for just-sent messages
without waiting for GCS network access.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Override thumbnail with selectedFiles local path in addMessageLocally
so images display immediately instead of relying on GCS URLs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
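The vision commit above switches image-only chats to Chat Completions with base64 data URLs. A self-contained sketch of that payload construction follows; the helper names are invented, but the data URL shape matches the public Chat Completions vision format:

```python
import base64
import mimetypes
from pathlib import Path

# MIME types the vision endpoint accepts; used to decide which chats
# are "all images" and can take the Chat Completions path.
IMAGE_MIMES = {"image/png", "image/jpeg", "image/gif", "image/webp"}

def is_image(filename: str) -> bool:
    mime, _ = mimetypes.guess_type(filename)
    return mime in IMAGE_MIMES

def build_vision_messages(question: str, image_paths: list[str]) -> list[dict]:
    """Build a Chat Completions messages payload with inline base64
    data URLs. Sketch only: function names are hypothetical, but the
    content-part structure follows the documented vision format."""
    content: list[dict] = [{"type": "text", "text": question}]
    for p in image_paths:
        mime = mimetypes.guess_type(p)[0] or "image/png"
        b64 = base64.b64encode(Path(p).read_bytes()).decode()
        content.append({
            "type": "image_url",
            "image_url": {"url": f"data:{mime};base64,{b64}"},
        })
    return [{"role": "user", "content": content}]
```

The resulting list would be passed as `messages=` to a Chat Completions call, bypassing the Assistants API entirely for image-only chats.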
@kodjima33 kodjima33 merged commit a81e633 into main Mar 3, 2026
1 check passed
@kodjima33 kodjima33 deleted the fix/chat-file-attachments branch March 3, 2026 01:07
@greptile-apps
Contributor

greptile-apps Bot commented Mar 3, 2026

Greptile Summary

This PR fixes multiple issues with chat file attachments across the full stack:

Backend changes:

  • Enabled file_chat_question routing in LangGraph to properly handle messages with file attachments
  • Added vision API support using Chat Completions API for image-only chats (Assistants API incompatible with purpose="vision" files)
  • Fixed AsyncStreamingCallback thread safety by using call_soon_threadsafe for sync-to-async queue operations
  • Auto-creates chat session when files are attached but no session exists
  • Made GCS thumbnails publicly readable via blob.make_public()

Flutter changes:

  • Uses local file paths as thumbnails for immediate display before GCS upload completes
  • Added FileImage support to display images from local paths

Issues found:

  • Critical: _ask_vision_stream doesn't check if callback is None before calling methods - will crash if callback is not provided
  • Minor: AsyncStreamingCallback uses deprecated asyncio.get_event_loop() in __init__ - should use lazy initialization or asyncio.get_running_loop()

Confidence Score: 3/5

  • This PR has one critical bug that will cause crashes in production if callback is None
  • The PR addresses important functionality across backend routing, vision API integration, and Flutter UI, but _ask_vision_stream calls callback methods without None-checking, which will raise AttributeError when no callback is supplied; there is also a minor deprecated-asyncio-API usage. The core logic changes are sound, but the callback bug must be fixed.
  • backend/utils/other/chat_file.py requires immediate attention for the callback None-check bug

Important Files Changed

Filename Overview
backend/utils/other/chat_file.py added vision API support for image-only chats using Chat Completions API, but callback None-check missing in _ask_vision_stream
backend/utils/retrieval/graph.py enabled file_chat_question routing and fixed thread safety with call_soon_threadsafe, minor improvement possible with lazy loop initialization
backend/routers/chat.py added auto-creation of chat session when files are attached but session doesn't exist
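The backend/routers/chat.py row above can be sketched as a small guard that runs before message processing; the db helper names here are hypothetical:

```python
# Sketch of the auto-create-session guard described above.
# The db helpers (get_active_session, create_session,
# add_files_to_session) are hypothetical stand-ins for the real layer.

def ensure_chat_session(uid: str, file_ids: list, db):
    """Return the user's active chat session, creating one on the fly
    when files are attached but no session exists yet, so file_ids are
    associated with a session."""
    session = db.get_active_session(uid)
    if session is None and file_ids:
        session = db.create_session(uid)
    if session is not None and file_ids:
        db.add_files_to_session(session["id"], file_ids)
    return session
```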

Sequence Diagram

sequenceDiagram
    participant Flutter as Flutter App
    participant Backend as FastAPI Backend
    participant Graph as LangGraph Router
    participant FileChat as FileChatTool
    participant OpenAI as OpenAI API
    participant GCS as Google Cloud Storage

    Flutter->>GCS: Upload image file
    GCS-->>Flutter: Return GCS URL
    Note over GCS: blob.make_public()
    Flutter->>Flutter: Set local path as thumbnail
    Flutter->>Backend: POST /v2/messages (with file_ids)
    Backend->>Backend: Auto-create chat_session if needed
    Backend->>Graph: execute_graph_chat_stream()
    Graph->>Graph: determine_conversation_type()
    alt Message has file attachments
        Graph->>FileChat: file_chat_question()
        FileChat->>FileChat: Check if all files are images
        alt All images
            FileChat->>OpenAI: Chat Completions API (vision)
            Note over FileChat,OpenAI: Uses gpt-4.1 with base64 images
            OpenAI-->>FileChat: Stream response
            FileChat->>Graph: Stream via AsyncStreamingCallback
        else Non-image files
            FileChat->>OpenAI: Assistants API
            OpenAI-->>FileChat: Stream response
        end
    end
    Graph-->>Backend: Stream chunks
    Backend-->>Flutter: SSE stream
    Flutter->>Flutter: Display with local thumbnail

Last reviewed commit: 2650ef7
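The make_public() change called out in the summary can be sketched as follows. The Bucket/Blob methods used (blob, upload_from_string, make_public, public_url) are real google-cloud-storage client API; the surrounding function and its parameters are assumptions:

```python
# Sketch of per-blob uploads replacing a transfer_manager bulk upload,
# which offers no per-blob hook to set public access. Function name and
# argument shape are illustrative; the Blob methods are real GCS API.

def upload_chat_files_public(bucket, files: dict) -> dict:
    """Upload each file as its own blob and make it public so the
    thumbnail URL returns 200 instead of 403.

    bucket: a google.cloud.storage Bucket (or a stub with the same
    interface); files: mapping of blob name -> bytes.
    """
    urls = {}
    for name, data in files.items():
        blob = bucket.blob(name)
        blob.upload_from_string(data)
        blob.make_public()  # grants allUsers READER on this object
        urls[name] = blob.public_url
    return urls
```

Note that make_public() requires the bucket to use fine-grained (non-uniform) ACLs; with uniform bucket-level access enabled the call raises an error.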

Contributor

@greptile-apps greptile-apps Bot left a comment


6 files reviewed, 3 comments

Edit Code Review Agent Settings | Greptile

Comment on lines +143 to +145
callback.put_data_nowait(delta.content)
output_list.append(delta.content)
callback.end_nowait()
Contributor


callback is not checked for None before its methods are called, which will raise AttributeError when no callback is supplied

Suggested change
-    callback.put_data_nowait(delta.content)
+    if callback:
+        callback.put_data_nowait(delta.content)
     output_list.append(delta.content)
-    callback.end_nowait()
+    if callback:
+        callback.end_nowait()

class AsyncStreamingCallback(BaseCallbackHandler):
    def __init__(self):
        self.queue = asyncio.Queue()
        self._loop = asyncio.get_event_loop()
Contributor


asyncio.get_event_loop() is deprecated in async contexts - use asyncio.get_running_loop() or store the loop lazily when first needed in call_soon_threadsafe calls

Suggested change
-        self._loop = asyncio.get_event_loop()
+        self._loop = None  # lazy initialization


    def put_thought_nowait(self, text):
-        self.queue.put_nowait(f"think: {text}")
+        self._loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")
Contributor


implement lazy loop initialization if changed above

Suggested change
-        self._loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")
+        loop = self._loop or asyncio.get_running_loop()
+        loop.call_soon_threadsafe(self.queue.put_nowait, f"think: {text}")

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!
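Folding the two suggestions above together, a non-deprecated shape for the callback captures the running loop at construction time (so it must be constructed on the event-loop thread) and crosses threads only via call_soon_threadsafe. A sketch, not the repo's code:

```python
import asyncio
import threading  # only needed by callers producing from worker threads

class AsyncStreamingCallback:
    """Bridge synchronous producer callbacks into an asyncio.Queue.

    Sketch of the pattern discussed in the review: get_running_loop()
    replaces the deprecated get_event_loop(), and the queue is only
    ever touched on the loop thread via call_soon_threadsafe.
    """

    def __init__(self):
        self.queue: asyncio.Queue = asyncio.Queue()
        # Requires construction inside async code so the loop exists.
        self._loop = asyncio.get_running_loop()

    def put_data_nowait(self, text: str) -> None:
        # Safe to call from any thread.
        self._loop.call_soon_threadsafe(self.queue.put_nowait, text)

    def end_nowait(self) -> None:
        # None acts as a sentinel telling the consumer to stop.
        self._loop.call_soon_threadsafe(self.queue.put_nowait, None)
```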

Glucksberg pushed a commit to Glucksberg/omi-local that referenced this pull request Apr 28, 2026
…ay (BasedHardware#5291)

