AttributeError: 'AIMessageChunk' object has no attribute 'text' #18024

Closed
FloTeu opened this issue Feb 23, 2024 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature 🔌: openai Primarily related to OpenAI integrations

Comments

@FloTeu

FloTeu commented Feb 23, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.

Example Code

Install the following dependencies:

pip install openai
pip install google-search-results
pip install langchain # version 0.1.9
pip install numexpr

Run the following Python code (and add your OpenAI API key):

from langchain import hub
from langchain import LLMMathChain, SerpAPIWrapper
from langchain.agents import Tool
from langchain.chat_models import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent

import os
os.environ['OPENAI_API_KEY'] = str("xxx")

llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613")
llm_math_chain = LLMMathChain.from_llm(llm=llm, verbose=True)
tools = [
    Tool(
        name="Calculator",
        func=llm_math_chain.run,
        description="useful for when you need to answer questions about math"
    )
]

hub_prompt: object = hub.pull("hwchase17/openai-tools-agent")
agent = create_openai_functions_agent(llm, tools, hub_prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)
input_prompt = "What is the square root of the year of birth of the founder of Space X?"
agent_executor.invoke({"input": input_prompt})

Error Message and Stack Trace (if applicable)

AttributeError: 'AIMessageChunk' object has no attribute 'text'

Trace
Traceback (most recent call last):
File "/Users/fteutsch/Desktop/PythonProjects/private/digiprod-gen/bug_report.py", line 31, in
agent_executor.invoke({"input": input_prompt})
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 163, in invoke
raise e
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 153, in invoke
self._call(inputs, run_manager=run_manager)
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1391, in _call
next_step_output = self._take_next_step(
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1097, in _take_next_step
[
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1097, in
[
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1125, in _iter_next_step
output = self.agent.plan(
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 387, in plan
for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2427, in stream
yield from self.transform(iter([input]), config, **kwargs)
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2414, in transform
yield from self._transform_stream_with_config(
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1494, in _transform_stream_with_config
chunk: Output = context.run(next, iterator) # type: ignore
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2378, in _transform
for output in final_pipeline:
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1032, in transform
for chunk in input:
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4167, in transform
yield from self.bound.transform(
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1042, in transform
yield from self.stream(final, config, **kwargs)
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 250, in stream
raise e
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 234, in stream
for chunk in self._stream(
File "/Users/fteutsch/Library/Caches/pypoetry/virtualenvs/digiprod-gen-qSAcCc-Q-py3.10/lib/python3.10/site-packages/langchain_community/chat_models/openai.py", line 418, in _stream
run_manager.on_llm_new_token(chunk.text, chunk=cg_chunk)
AttributeError: 'AIMessageChunk' object has no attribute 'text'

Description

Since updating langchain from 0.1.7 to the latest version 0.1.9, I get the exception mentioned above.
I have already located the cause and possibly the fix as well.

libs/community/langchain_community/chat_models/openai.py
lines 414-418

cg_chunk = ChatGenerationChunk(
    message=chunk, generation_info=generation_info
)
if run_manager:
    run_manager.on_llm_new_token(chunk.text, chunk=cg_chunk)

chunk has the type AIMessageChunk, which does not have a text attribute, whereas cg_chunk has the type ChatGenerationChunk, which does (and in version 0.1.7 that was the object passed here).
The fix would probably be:

cg_chunk = ChatGenerationChunk(
    message=chunk, generation_info=generation_info
)
if run_manager:
    run_manager.on_llm_new_token(cg_chunk.text, chunk=cg_chunk)

Line 510 in the same file has the same issue.
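
Until a release containing the fix is available, a possible temporary workaround (an assumption based on the versions mentioned in this report, not something verified here) is to pin langchain back to the last version that did not show this behaviour:

pip install langchain==0.1.7 # version before the regression described above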

System Info

langchain==0.1.9
langchain-community==0.0.22
langchain-core==0.1.26
langchainhub==0.1.14

macOs
python 3.10

@dosubot dosubot bot added 🔌: openai Primarily related to OpenAI integrations 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Feb 23, 2024
@eyurtsev eyurtsev self-assigned this Feb 23, 2024
@eyurtsev
Collaborator

eyurtsev commented Feb 23, 2024

confirmed

@eyurtsev
Collaborator

Fix merged. Closing issue -- will be available in next release.

@FloTeu we recommend transitioning to the partner package for OpenAI rather than using the community version.
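
For reference, a minimal sketch of what that switch could look like for the example above, assuming the langchain-openai package is installed (pip install langchain-openai); the constructor arguments are just the ones from the original report:

# Sketch: use the partner package instead of the community implementation.
# Assumes `pip install langchain-openai` has been run.
from langchain_openai import ChatOpenAI

# Same arguments as in the original example; the rest of the agent code stays unchanged.
llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613")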

@magedhelmy1

magedhelmy1 commented Feb 25, 2024

Fix merged. Closing issue -- will be available in next release.

@FloTeu we recommend transitioning to the partner package for OpenAI rather than using the community version.

@eyurtsev
I am also facing this issue. Can you please explain what you meant in your comment by the partner package for OpenAI rather than the community version? It was not clear.

This is the code causing the issue:

from colorama import Fore, Style
from langchain.adapters import openai as lc_openai

async def stream_response(model, messages, temperature, max_tokens, llm_provider, websocket=None):
    paragraph = ""
    response = ""

    for chunk in lc_openai.ChatCompletion.create(
            model=model,
            messages=messages,
            temperature=temperature,
            max_tokens=max_tokens,
            provider=llm_provider,
            stream=True,
    ):
        content = chunk["choices"][0].get("delta", {}).get("content")
        if content is not None:
            response += content
            paragraph += content
            if "\n" in paragraph:
                if websocket is not None:
                    await websocket.send_json({"type": "report", "output": paragraph})
                else:
                    print(f"{Fore.GREEN}{paragraph}{Style.RESET_ALL}")
                paragraph = ""
    return response
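
For comparison, here is a rough sketch (an assumption, not a verified replacement for the adapter call above) of how a similar streaming loop could look against the partner package's chat model, where each streamed chunk is an AIMessageChunk exposing content rather than text:

# Rough sketch only; assumes `pip install langchain-openai`.
# Model and temperature mirror the parameters of the function above.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo-0613", temperature=0)

def stream_text(messages) -> str:
    # messages: LangChain message objects, e.g. HumanMessage instances.
    response = ""
    for chunk in llm.stream(messages):
        # Each chunk is an AIMessageChunk; the token text lives on `.content`.
        if chunk.content:
            response += chunk.content
    return response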

asaf added a commit to crossid/accessbot that referenced this issue Feb 26, 2024
Downgraded LangChain until issue below is resolved:
langchain-ai/langchain#18024

resolves #6