# Add openai v2 adapter (langchain-ai#14063)
### Description

Starting from [openai version 1.0.0](https://github.com/openai/openai-python/tree/17ac6779958b2b74999c634c4ea4c7b74906027a#module-level-client), the camel-case form `openai.ChatCompletion` is no longer supported and has been changed to the lowercase `openai.chat.completions`. In addition, the returned object only accepts attribute access instead of index access:

```python
import openai

# optional; defaults to `os.environ['OPENAI_API_KEY']`
openai.api_key = '...'

# all client options can be configured just like the `OpenAI` instantiation counterpart
openai.base_url = "https://..."
openai.default_headers = {"x-foo": "true"}

completion = openai.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.choices[0].message.content)
```

So I implemented a compatible adapter that supports both attribute access and index access:

```python
In [1]: from langchain.adapters import openai as lc_openai
   ...: messages = [{"role": "user", "content": "hi"}]

In [2]: result = lc_openai.chat.completions.create(
   ...:     messages=messages, model="gpt-3.5-turbo", temperature=0
   ...: )

In [3]: result.choices[0].message
Out[3]: {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}

In [4]: result["choices"][0]["message"]
Out[4]: {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}

In [5]: result = await lc_openai.chat.completions.acreate(
   ...:     messages=messages, model="gpt-3.5-turbo", temperature=0
   ...: )

In [6]: result.choices[0].message
Out[6]: {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}

In [7]: result["choices"][0]["message"]
Out[7]: {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}

In [8]: for rs in lc_openai.chat.completions.create(
   ...:     messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
   ...: ):
   ...:     print(rs.choices[0].delta)
   ...:     print(rs["choices"][0]["delta"])
   ...:
{'role': 'assistant', 'content': ''}
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': 'Hello'}
{'content': '!'}
{'content': '!'}

In [20]: async for rs in await lc_openai.chat.completions.acreate(
    ...:     messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
    ...: ):
    ...:     print(rs.choices[0].delta)
    ...:     print(rs["choices"][0]["delta"])
    ...:
{'role': 'assistant', 'content': ''}
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': 'Hello'}
{'content': '!'}
{'content': '!'}
...
```

### Twitter handle

[lin_bob57617](https://twitter.com/lin_bob57617)
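The dual-access behavior shown above can be sketched as a small dict subclass whose keys are also readable as attributes. This is a minimal illustration of the idea, not the adapter's actual implementation; the class and helper names (`AttrDict`, `_wrap`) are hypothetical:

```python
class AttrDict(dict):
    """Hypothetical sketch: a dict that also supports attribute access,
    wrapping nested dicts/lists so both styles work at every level."""

    def __getattr__(self, name):
        try:
            return self[name]  # __getitem__ wraps nested values below
        except KeyError as e:
            raise AttributeError(name) from e

    def __getitem__(self, key):
        return _wrap(dict.__getitem__(self, key))


def _wrap(value):
    # Recursively convert nested containers so chained access keeps working.
    if isinstance(value, dict):
        return AttrDict(value)
    if isinstance(value, list):
        return [_wrap(v) for v in value]
    return value


result = AttrDict(
    {"choices": [{"message": {"role": "assistant", "content": "Hello!"}}]}
)
print(result.choices[0].message.content)            # attribute access
print(result["choices"][0]["message"]["content"])   # index access
```

Both lines print the same string, mirroring the `result.choices[0].message` / `result["choices"][0]["message"]` pair in the session above.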