
Problem with proxy and streaming #3092

@markowanga

Description


I'm trying to process a streaming chat completion (returning content to the customer as it arrives), and I need to route the traffic through a proxy. The problem: when the proxy is enabled, the response does not stream — all chunks arrive only after the whole response has been generated, so there is no incremental text output.

import asyncio
from typing import Optional

from httpx import AsyncClient
from openai import AsyncStream, AsyncOpenAI
from openai.types.chat import ChatCompletionChunk


async def get_openai_stream_agenerator() -> AsyncStream[ChatCompletionChunk]:
    client = AsyncOpenAI(
        http_client=AsyncClient(
            # when I comment these two lines streaming is ok
            proxy="http://localhost:8080",  # I'm using mitmproxy with basic configuration
            verify=False,
        )
    )
    messages = [
        {"role": "system", "content": "Return details about asking person"},
        {"role": "user", "content": "Iga Świątek"},
    ]
    response: AsyncStream[ChatCompletionChunk] = await client.chat.completions.create(
        model='gpt-4-0613',
        messages=messages,
        stream=True,
    )  # type: ignore
    return response


def get_delta_argument(chunk: ChatCompletionChunk) -> Optional[str]:
    # chunk.dict() is deprecated in openai 1.x (pydantic v2);
    # access the delta content via attributes instead
    if len(chunk.choices) > 0:
        return chunk.choices[0].delta.content
    return None


async def get_response_generator() -> None:
    async for it in await get_openai_stream_agenerator():
        value = get_delta_argument(it)
        if value:
            print(value, end="")
    print()


if __name__ == '__main__':
    asyncio.run(get_response_generator())
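A crude way to confirm the symptom objectively is to record each chunk's arrival time inside the `async for` loop and check whether the timestamps are spread out (true streaming) or clustered at the end (the proxy buffered the whole body). A minimal sketch — the helper names below are my own, not part of openai or httpx:

```python
import time
from typing import List


def chunk_spread(arrival_times: List[float]) -> float:
    """Seconds between the first and last recorded chunk arrival."""
    return max(arrival_times) - min(arrival_times) if arrival_times else 0.0


def looks_buffered(arrival_times: List[float], min_spread: float = 0.5) -> bool:
    """True when several chunks arrived almost simultaneously,
    which suggests the response was buffered upstream rather than streamed."""
    return len(arrival_times) > 1 and chunk_spread(arrival_times) < min_spread
```

Usage: inside `get_response_generator`, append `time.monotonic()` to a list on every iteration, then call `looks_buffered` on it afterwards. With the proxy disabled I'd expect a spread of several seconds; with the proxy enabled, near zero.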

OS: macOS
Python version: Python v3.11.7
Library version: openai 1.12.0, httpx 0.26.0
