@rcII rcII commented Dec 7, 2025

Summary

Fixes #9639

The async_post_call_streaming_iterator_hook function in litellm/proxy/utils.py was broken:

  1. It was a sync function (def), not an async generator
  2. It returned an AsyncGenerator without ever iterating it
  3. Callback generators were chained but never consumed, so the callbacks never actually processed any chunks
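The root cause is that calling an async generator function does not run any of its body; the body only executes when the generator is iterated. A minimal standalone demonstration (hypothetical names, not litellm code):

```python
import asyncio

async def logging_hook(chunks):
    # Hypothetical callback: its body only runs once someone iterates it.
    async for chunk in chunks:
        print(f"hook saw: {chunk}")
        yield chunk

async def source():
    for c in ("a", "b"):
        yield c

async def main():
    gen = logging_hook(source())  # nothing printed yet: body not started
    return [chunk async for chunk in gen]  # iteration drives the hook

print(asyncio.run(main()))  # -> ['a', 'b'] (and the hook prints run)
```

Returning `gen` without iterating it, as the old hook did, means the `print` line above would never execute.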

Changes

  1. Changed def to async def, making the function an async generator
  2. Added async for chunk in current_response: yield chunk to actually drive the chained callback generators
  3. Added unit tests verifying the fix

The Bug (Before)

def async_post_call_streaming_iterator_hook(self, response, ...):  # NOT async!
    for _callback in litellm.callbacks:
        response = _callback.async_post_call_streaming_iterator_hook(...)  # Assigns an AsyncGenerator
    return response  # Returns the UNCONSUMED generator - callbacks never execute!

The Fix (After)

async def async_post_call_streaming_iterator_hook(self, response, ...):  # Now async!
    current_response = response
    for _callback in litellm.callbacks:
        current_response = _callback.async_post_call_streaming_iterator_hook(...)

    # Actually iterate and yield chunks
    async for chunk in current_response:
        yield chunk
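The chaining pattern can be sketched end to end as a runnable example (hypothetical callback names; the real hook lives in litellm/proxy/utils.py and iterates litellm.callbacks):

```python
import asyncio

async def upper_hook(chunks):
    # Hypothetical callback: transforms each chunk.
    async for chunk in chunks:
        yield chunk.upper()

async def tag_hook(chunks):
    # Hypothetical callback: annotates each chunk.
    async for chunk in chunks:
        yield f"[{chunk}]"

async def streaming_iterator_hook(response, callbacks):
    # Mirrors the fixed proxy hook: wrap the stream with each callback,
    # then drive the whole chain by iterating the outermost generator.
    current_response = response
    for callback in callbacks:
        current_response = callback(current_response)
    async for chunk in current_response:
        yield chunk

async def source():
    for c in ("hello", "world"):
        yield c

async def main():
    return [c async for c in streaming_iterator_hook(source(), [upper_hook, tag_hook])]

print(asyncio.run(main()))  # -> ['[HELLO]', '[WORLD]']
```

Because each callback wraps the previous generator, iterating only the outermost one pulls chunks through the entire chain.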

Test Plan

  • Added 4 unit tests in tests/test_litellm/proxy/hooks/test_async_post_call_streaming_iterator_hook.py
  • Tests verify async generator behavior, callback chaining, empty callbacks, and error propagation
  • All tests pass
  • Black formatting applied
  • Ruff linting passes
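The error-propagation behavior being tested can be approximated with a standalone sketch (hypothetical names, not the actual test file): an exception raised inside a callback generator should surface to whoever iterates the chain, after any chunks yielded before the failure.

```python
import asyncio

async def failing_hook(chunks):
    # Hypothetical callback that raises partway through the stream.
    async for chunk in chunks:
        if chunk == "boom":
            raise ValueError("callback failure")
        yield chunk

async def source():
    for c in ("ok", "boom"):
        yield c

async def consume():
    seen = []
    try:
        async for chunk in failing_hook(source()):
            seen.append(chunk)
    except ValueError as exc:
        return seen, str(exc)

print(asyncio.run(consume()))  # -> (['ok'], 'callback failure')
```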

Related

…ates async generators

The async_post_call_streaming_iterator_hook function was broken:
1. Was a sync function (def) not async generator
2. Returned AsyncGenerator without iterating it
3. Callback generators were chained but never consumed

This fix:
1. Makes the function an async generator (async def + yield)
2. Actually iterates through the chained callbacks with 'async for'
3. Properly yields chunks to the caller

Fixes BerriAI#9639

@krrishdholakia krrishdholakia merged commit eb689a1 into BerriAI:main Dec 8, 2025
6 of 7 checks passed

Development

Successfully merging this pull request may close these issues.

[Bug]: Cannot Modify Streaming Responses: async_post_call_streaming_iterator_hook Not Executed in Proxy

3 participants