RuntimeWarning: coroutine 'AsyncAPIClient.post' was never awaited #1265

@RahulVerma989

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Getting the following RuntimeWarning while using the create_and_stream() method.

Error logs:

2024-03-22 20:05:30 /usr/src/app/app/services/openai_service.py:111: RuntimeWarning: coroutine 'AsyncAPIClient.post' was never awaited
2024-03-22 20:05:30   stream_or_run = await self.client.beta.threads.runs.create_and_stream(thread_id=thread_id, assistant_id=self.assistant_id, metadata=metadata, event_handler=event_handler)
2024-03-22 20:05:30 Object allocated at (most recent call last):
2024-03-22 20:05:30   File "/usr/local/lib/python3.9/site-packages/openai/resources/beta/threads/runs/runs.py", lineno 1322
2024-03-22 20:05:30     request = self._post(

To Reproduce

  1. Install openai 1.14.2 (tracemalloc is part of the Python standard library).
  2. Start tracemalloc with tracemalloc.start().
  3. Run the async create_and_stream() method using the example code below.
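Step 2's effect can be sketched with a self-contained snippet (no openai dependency; `post` here is a stand-in coroutine, not the library's method): once tracemalloc is started, Python can attach an allocation traceback to the "never awaited" warning, which is the "Object allocated at" block in the logs above.

```python
import gc
import tracemalloc
import warnings

tracemalloc.start()  # step 2: enables the "Object allocated at" traceback

async def post():  # illustrative stand-in for AsyncAPIClient.post
    return "response"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    coro = post()  # coroutine object is created here...
    del coro       # ...and collected without ever being awaited
    gc.collect()

# The garbage collector emitted the same RuntimeWarning as in the logs.
assert any("was never awaited" in str(w.message) for w in caught)
```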

Example code:

from typing_extensions import override

from openai import AsyncOpenAI, AsyncAssistantEventHandler

client = AsyncOpenAI()

class EventHandler(AsyncAssistantEventHandler):
  @override
  async def on_text_created(self, text) -> None:
    print("\nassistant > ", end="", flush=True)

  @override
  async def on_text_delta(self, delta, snapshot):
    print(delta.value, end="", flush=True)

  @override
  async def on_tool_call_created(self, tool_call):
    print(f"\nassistant > {tool_call.type}\n", flush=True)

  @override
  async def on_tool_call_delta(self, delta, snapshot):
    if delta.type == 'code_interpreter':
      if delta.code_interpreter.input:
        print(delta.code_interpreter.input, end="", flush=True)
      if delta.code_interpreter.outputs:
        print("\n\noutput >", flush=True)
        for output in delta.code_interpreter.outputs:
          if output.type == "logs":
            print(f"\n{output.logs}", flush=True)

# `thread` and `assistant` are created elsewhere; run this inside an event loop
async with client.beta.threads.runs.create_and_stream(
  thread_id=thread.id,
  assistant_id=assistant.id,
  instructions="Please address the user as Jane Doe. The user has a premium account.",
  event_handler=EventHandler(),
) as stream:
  await stream.until_done()

Example code was taken from https://platform.openai.com/docs/assistants/overview?context=with-streaming and updated for asynchronous operation.
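Note that the traceback in the logs shows the manager being awaited (`stream_or_run = await ...create_and_stream(...)`). Since create_and_stream is a plain `def` that builds the request coroutine eagerly and returns an async context manager, awaiting the return value instead of entering it with `async with` leaves the inner coroutine un-awaited. A self-contained sketch of that shape (all names here are illustrative stand-ins, not the library's actual implementation):

```python
import asyncio

class StreamManager:
    """Toy stand-in for AsyncAssistantStreamManager (illustrative only)."""
    def __init__(self, request_coro):
        self._request_coro = request_coro  # un-awaited coroutine held here

    async def __aenter__(self):
        # The request coroutine is only awaited once the manager is entered.
        return await self._request_coro

    async def __aexit__(self, *exc):
        return False

async def fake_post():
    return "event stream"

def create_and_stream():
    # Plain `def`, like the library method: creates the coroutine eagerly
    # and returns the manager without awaiting anything itself.
    return StreamManager(fake_post())

async def main():
    # Wrong: `await create_and_stream()` raises TypeError (the manager is
    # not awaitable) and the held coroutine is never awaited.
    # Right: enter the manager with `async with`.
    async with create_and_stream() as stream:
        return stream

result = asyncio.run(main())
print(result)  # event stream
```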

Code snippets

The relevant code in runs.py, under the AsyncRuns class:

    def create_and_stream(
        self,
        *,
        assistant_id: str,
        additional_instructions: Optional[str] | NotGiven = NOT_GIVEN,
        instructions: Optional[str] | NotGiven = NOT_GIVEN,
        metadata: Optional[object] | NotGiven = NOT_GIVEN,
        model: Optional[str] | NotGiven = NOT_GIVEN,
        tools: Optional[Iterable[AssistantToolParam]] | NotGiven = NOT_GIVEN,
        thread_id: str,
        event_handler: AsyncAssistantEventHandlerT | None = None,
        # Use the following arguments if you need to pass additional parameters to the API that aren't available via kwargs.
        # The extra values given here take precedence over values defined on the client or passed to this method.
        extra_headers: Headers | None = None,
        extra_query: Query | None = None,
        extra_body: Body | None = None,
        timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
    ) -> (
        AsyncAssistantStreamManager[AsyncAssistantEventHandler]
        | AsyncAssistantStreamManager[AsyncAssistantEventHandlerT]
    ):
        """Create a Run stream"""
        if not thread_id:
            raise ValueError(f"Expected a non-empty value for `thread_id` but received {thread_id!r}")

        extra_headers = {
            "OpenAI-Beta": "assistants=v1",
            "X-Stainless-Stream-Helper": "threads.runs.create_and_stream",
            "X-Stainless-Custom-Event-Handler": "true" if event_handler else "false",
            **(extra_headers or {}),
        }
        request = self._post(
            f"/threads/{thread_id}/runs",
            body=maybe_transform(
                {
                    "assistant_id": assistant_id,
                    "additional_instructions": additional_instructions,
                    "instructions": instructions,
                    "metadata": metadata,
                    "model": model,
                    "stream": True,
                    "tools": tools,
                },
                run_create_params.RunCreateParams,
            ),
            options=make_request_options(
                extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
            ),
            cast_to=Run,
            stream=True,
            stream_cls=AsyncStream[AssistantStreamEvent],
        )
        return AsyncAssistantStreamManager(request, event_handler=event_handler or AsyncAssistantEventHandler())
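Because `self._post(...)` creates the coroutine before the manager is ever entered, one generic way to close the warning window (a sketch of a common deferral pattern, not the library's actual fix) is to hide coroutine creation behind a zero-argument callable:

```python
import asyncio

class LazyStreamManager:
    """Generic deferred-request pattern (illustrative, not the library's code)."""
    def __init__(self, make_request):
        self._make_request = make_request  # zero-arg callable, not a coroutine

    async def __aenter__(self):
        # The coroutine is created *and* awaited here, in one step.
        return await self._make_request()

    async def __aexit__(self, *exc):
        return False

async def fake_post():
    return "run created"

async def main():
    manager = LazyStreamManager(fake_post)
    # Even if `manager` were discarded here, no "never awaited" warning
    # could fire: no coroutine object exists until __aenter__ runs.
    async with manager as response:
        return response

result = asyncio.run(main())
print(result)  # run created
```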

OS

macOS

Python version

Python 3.10.13

Library version

openai v1.14.2

Metadata

Assignees: No one assigned

Labels: bug (Something isn't working)