
chore(weave): Fixes Stream response format in OpenAI #1726

Merged 11 commits into master on Jun 10, 2024

Conversation


@tssweeney tssweeney commented Jun 6, 2024

Our OpenAI integration was not returning a Stream object from the sync streaming API. This was not caught in the past because most people just use it for its iterator interface and everything works. However, Langchain actually uses the context-block (enter/exit) dunder methods. To bring our integration back into compliance, I implemented a WeaveStream class, the sister class to WeaveAsyncStream. These now follow a common pattern and are true subclasses of the original openai classes.

This should have no functional change for users and passes the existing tests.

This came to my attention because the growth team is implementing Langchain and hit this underlying issue.

 File "/opt/anaconda3/envs/weave/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 480, in _stream
    with self.client.create(messages=message_dicts, **params) as response:
AttributeError: __enter__
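
The failure above comes down to Python's context-manager protocol: a bare generator defines `__iter__` but not `__enter__`/`__exit__`, so `with ... as ...` raises `AttributeError`. Below is a minimal sketch of the difference; `FakeStream` is a hypothetical stand-in for `openai.Stream`, not the actual class:

```python
from typing import Iterator, List


class FakeStream:
    """Hypothetical stand-in for openai.Stream; yields pre-baked chunks."""

    def __init__(self, chunks: List[str]) -> None:
        self._chunks = chunks
        self.closed = False

    def __iter__(self) -> Iterator[str]:
        yield from self._chunks

    # The context-manager dunders that Langchain relies on:
    def __enter__(self) -> "FakeStream":
        return self

    def __exit__(self, exc_type, exc, tb) -> None:
        self.close()

    def close(self) -> None:
        self.closed = True


# A bare generator has __iter__ but no __enter__, reproducing the error:
def bare_gen() -> Iterator[str]:
    yield "chunk"


stream = FakeStream(["a", "b"])
with stream as s:  # works: __enter__/__exit__ are defined
    collected = list(s)

has_enter = hasattr(bare_gen(), "__enter__")  # False: the original bug
```

Subclassing the real `Stream` (as this PR does) gives the wrapper these dunders for free, which is why the fix restores Langchain compatibility.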

    choice["finish_reason"] = chunk_choice.finish_reason
    if chunk_choice.logprobs:
        choice["logprobs"] = chunk_choice.logprobs

def _process_chunk(self, chunk: ChatCompletionChunk) -> None:
tssweeney (Collaborator, Author):
This looks like a huge change, but if you hide whitespace, you see that it is just moving some logic to a helper function.

finish_run(result_with_usage.model_dump(exclude_unset=True))

return _stream_create_gen() # type: ignore
base_stream = self._base_create(*args, **kwargs)
tssweeney (Collaborator, Author):

This logic is moved into the class.

finish_run=finish_run,
)

return stream
tssweeney (Collaborator, Author):

Notice we get rid of type ignore!

circle-job-mirror bot commented Jun 6, 2024
@@ -84,6 +84,41 @@ async def __aiter__(self) -> AsyncIterator[ChatCompletionChunk]:
self._finish_run(result_with_usage.model_dump(exclude_unset=True))


class WeaveStream(Stream):
tssweeney (Collaborator, Author):

New class - pretty simple. It follows the same pattern as the async class above and uses the same code that was previously in a function below.
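
The common pattern described here (iterate over the base stream, accumulate chunks, then fire the finish callback once the stream is exhausted) can be sketched roughly as follows. `AccumulatingStream` and `on_finish` are hypothetical stand-ins for illustration, not Weave's actual classes:

```python
from typing import Callable, Iterable, Iterator, List


class AccumulatingStream:
    """Hypothetical sketch: wraps a chunk iterable, accumulates chunks
    while iterating, and calls on_finish when exhausted."""

    def __init__(
        self,
        chunks: Iterable[str],
        on_finish: Callable[[List[str]], None],
    ) -> None:
        self._chunks = chunks
        self._on_finish = on_finish
        self._acc: List[str] = []

    def __iter__(self) -> Iterator[str]:
        for chunk in self._chunks:
            self._acc.append(chunk)
            yield chunk
        # Stream exhausted: report the accumulated result.
        self._on_finish(self._acc)


results: List[List[str]] = []
list(AccumulatingStream(["he", "llo"], results.append))
```

Keeping this logic in a class (rather than a free generator function) is what lets both the sync and async variants share one shape while remaining true `Stream` subclasses.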

@tssweeney tssweeney marked this pull request as ready for review June 6, 2024 23:51
@tssweeney tssweeney requested a review from a team as a code owner June 6, 2024 23:51
@@ -369,10 +369,52 @@ def teardown():
unpatch()


class MockSyncResponse:
tssweeney (Collaborator, Author):

Follows the same pattern as the async version.
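
A mock sync response of this kind might look roughly like the following; `MockSyncResponse` here is a simplified illustration emitting SSE-style `data:` lines, not the actual test fixture from this PR:

```python
import json
from typing import Iterator, List


class MockSyncResponse:
    """Hypothetical sketch of a sync mock: replays canned payloads
    as SSE-style lines, mirroring the async mock's shape."""

    def __init__(self, payloads: List[dict]) -> None:
        self._payloads = payloads

    def iter_lines(self) -> Iterator[str]:
        for payload in self._payloads:
            yield "data: " + json.dumps(payload)
        yield "data: [DONE]"  # sentinel that ends an OpenAI stream


resp = MockSyncResponse([{"id": "1"}, {"id": "2"}])
lines = list(resp.iter_lines())
```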

@tssweeney tssweeney merged commit 17febaa into master Jun 10, 2024
24 checks passed
@tssweeney tssweeney deleted the tim/fix_openai_stream_api branch June 10, 2024 21:50
@github-actions github-actions bot locked and limited conversation to collaborators Jun 10, 2024
        raise NotImplementedError("Function calls not supported")
    # function call
    if chunk_choice.delta.function_call:
        raise NotImplementedError("Function calls not supported")
Contributor:

@tssweeney: I would like to understand why function calls are not supported here. I know this is deprecated in the OpenAI SDK in favor of tool calls, but libraries like LangChain currently still use this feature, and this causes the following agent-related test case in LangChain to break:

def test_agent_run_with_tools(
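
One possible direction, purely illustrative and not Weave's actual implementation: the deprecated `function_call` deltas could be accumulated across chunks the same way tool-call deltas are merged, instead of raising `NotImplementedError`. The `merge_function_call` helper below is hypothetical:

```python
from typing import Dict, Optional


def merge_function_call(
    acc: Optional[Dict[str, str]], delta: Dict[str, str]
) -> Dict[str, str]:
    """Hypothetical helper: merge one streamed function_call delta
    into the running accumulator (name and arguments arrive in pieces)."""
    if acc is None:
        acc = {"name": "", "arguments": ""}
    if "name" in delta:
        acc["name"] += delta["name"]
    if "arguments" in delta:
        acc["arguments"] += delta["arguments"]
    return acc


acc: Optional[Dict[str, str]] = None
deltas = [
    {"name": "get_weather"},
    {"arguments": '{"city": '},
    {"arguments": '"Paris"}'},
]
for delta in deltas:
    acc = merge_function_call(acc, delta)
```

Since `function_call` is a single object per choice (unlike `tool_calls`, which are indexed), the merge is simpler: just concatenate the `name` and `arguments` fragments as they stream in.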

3 participants