
ValueError: Got unexpected message type: AIMessageChunk #804

Closed
1nnovat1on opened this issue Oct 16, 2023 · 14 comments · Fixed by langchain-ai/langchain#15327
Labels
bug Something isn't working

Comments

@1nnovat1on

Policy and info

  • Maintainers will close issues that have been stale for 14 days if they contain relevant answers.
  • Adding the label "sweep" will automatically turn the issue into a coded pull request. Works best for mechanical tasks. More info/syntax at: https://docs.sweep.dev/

Expected Behavior

I typed in a prompt after setting it up. When it responds with clarifying questions, I expect to be able to respond.

Current Behavior

It fails with the error message in the title.

Failure Information

Windows, GPT-4, CMD.

Steps to Reproduce

If possible, provide detailed steps for reproducing the issue.

  1. pip install gpt-engineer
  2. Create a folder
  3. Run gpt-engineer "C:\python\test_project" "gpt-4"
  4. Get back some clarifying questions and respond to a few
  5. The error then appears

Failure Logs

│ s.py:336 in _message_from_dict │
│ │
│ 333 │ elif _type == "function": │
│ 334 │ │ return FunctionMessage(**message["data"]) │
│ 335 │ else: │
│ ❱ 336 │ │ raise ValueError(f"Got unexpected message type: {_type}") │
│ 337 │
│ 338 │
│ 339 def messages_from_dict(messages: List[dict]) -> List[BaseMessage]: │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ _type = 'AIMessageChunk' │ │
│ │ message = { │ │
│ │ │ 'type': 'AIMessageChunk', │ │
│ │ │ 'data': { │ │
│ │ │ │ 'content': 'Summary of areas that need clarification:\n\n1. The specific │ │
│ │ type of reinforcement'+377, │ │
│ │ │ │ 'additional_kwargs': {}, │ │
│ │ │ │ 'type': 'AIMessageChunk', │ │
│ │ │ │ 'example': False │ │
│ │ │ } │ │
│ │ } │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ValueError: Got unexpected message type: AIMessageChunk

@1nnovat1on 1nnovat1on added bug Something isn't working triage Interesting but stale issue. Will be close if inactive for 3 more days after label added. labels Oct 16, 2023
@sirpeebs

I just had the same error and found this via a Google search. I'm running the most recent version of gpt-engineer and have previously had other versions work in my WSL setup on this same box.

/home/pb/venvs/v1/lib/python3.11/site-packages/langchain/schema/messages.py:336 in │
│ _message_from_dict │
│ │
│ 333 │ elif _type == "function": │
│ 334 │ │ return FunctionMessage(**message["data"]) │
│ 335 │ else: │
│ ❱ 336 │ │ raise ValueError(f"Got unexpected message type: {_type}") │
│ 337 │
│ 338 │
│ 339 def messages_from_dict(messages: List[dict]) -> List[BaseMessage]: │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ _type = 'AIMessageChunk' │ │
│ │ message = { │ │
│ │ │ 'type': 'AIMessageChunk', │ │
│ │ │ 'data': { │ │
│ │ │ │ 'content': 'Summary of areas that need clarification:\n\n1. The specific │ │
│ │ functionalities and c'+626, │ │
│ │ │ │ 'additional_kwargs': {}, │ │
│ │ │ │ 'type': 'AIMessageChunk', │ │
│ │ │ │ 'example': False │ │
│ │ │ } │ │
│ │ } │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ValueError: Got unexpected message type: AIMessageChunk

@wwencel

wwencel commented Oct 18, 2023

Traceback (most recent call last):

  File "/Users/wlodek/dev/venv-gpt-eng/bin/gpt-engineer", line 8, in <module>
    sys.exit(app())

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/gpt_engineer/main.py", line 96, in main
    messages = step(ai, dbs)

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/gpt_engineer/steps.py", line 192, in gen_clarified_code
    messages = AI.deserialize_messages(dbs.logs[clarify.__name__])

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/gpt_engineer/ai.py", line 216, in deserialize_messages
    return list(messages_from_dict(json.loads(jsondictstr)))  # type: ignore

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/langchain/schema/messages.py", line 348, in messages_from_dict
    return [_message_from_dict(m) for m in messages]

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/langchain/schema/messages.py", line 348, in <listcomp>
    return [_message_from_dict(m) for m in messages]

  File "/Users/wlodek/dev/venv-gpt-eng/lib/python3.10/site-packages/langchain/schema/messages.py", line 336, in _message_from_dict
    raise ValueError(f"Got unexpected message type: {_type}")

ValueError: Got unexpected message type: AIMessageChunk

Same error here, but it looks like a langchain issue.

@0x1pikachu

Traceback (most recent call last):

  File "<frozen runpy>", line 198, in _run_module_as_main

  File "<frozen runpy>", line 88, in _run_code

  File "E:\VSC Workspaces\venv\Scripts\gpt-engineer.exe\__main__.py", line 7, in <module>
    sys.exit(app())
             ^^^^^

               ^^^^^^^^^^^^^

  File "E:\VSC Workspaces\venv\Lib\site-packages\gpt_engineer\steps.py", line 192, in gen_clarified_code
    messages = AI.deserialize_messages(dbs.logs[clarify.__name__])
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\VSC Workspaces\venv\Lib\site-packages\gpt_engineer\ai.py", line 216, in deserialize_messages
    return list(messages_from_dict(json.loads(jsondictstr)))  # type: ignore
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\VSC Workspaces\venv\Lib\site-packages\langchain\schema\messages.py", line 348, in messages_from_dict
    return [_message_from_dict(m) for m in messages]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\VSC Workspaces\venv\Lib\site-packages\langchain\schema\messages.py", line 348, in <listcomp>
    return [_message_from_dict(m) for m in messages]
            ^^^^^^^^^^^^^^^^^^^^^

  File "E:\VSC Workspaces\venv\Lib\site-packages\langchain\schema\messages.py", line 336, in _message_from_dict
    raise ValueError(f"Got unexpected message type: {_type}")

ValueError: Got unexpected message type: AIMessageChunk

Same here. Anything I can do?

@ATheorell ATheorell removed the triage Interesting but stale issue. Will be close if inactive for 3 more days after label added. label Oct 18, 2023
@ATheorell
Collaborator

I'm a little confused: is this the same problem as in #802?
What confuses me is that without passing --clarify (which is not in the reproduction steps), gpt-engineer in the latest version (0.1.0) no longer asks clarifying questions. In #802, it is explicitly stated that the --clarify flag is passed.

@paulternate

paulternate commented Oct 19, 2023

I traced the error back a little bit to a missing handler in C:\Python\lib\site-packages\langchain\schema\messages.py and was able to get past it by adding this to the _message_from_dict function right before the else statement.

elif _type == "AIMessageChunk":
    return AIMessageChunk(**message["data"])

The whole function now looks like this:

def _message_from_dict(message: dict) -> BaseMessage:
    _type = message["type"]
    if _type == "human":
        return HumanMessage(**message["data"])
    elif _type == "ai":
        return AIMessage(**message["data"])
    elif _type == "system":
        return SystemMessage(**message["data"])
    elif _type == "chat":
        return ChatMessage(**message["data"])
    elif _type == "function":
        return FunctionMessage(**message["data"])
    elif _type == "AIMessageChunk":
        return AIMessageChunk(**message["data"])
    else:
        raise ValueError(f"Got unexpected message type: {_type}")
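
For anyone who wants to verify the workaround locally, a quick round-trip check might look like this (a minimal sketch, assuming the patched install and that the helpers are importable from langchain.schema.messages, as in the tracebacks above):

# Sanity check: serialize an AIMessageChunk and read it back with the patched handler.
from langchain.schema.messages import AIMessageChunk, messages_from_dict, messages_to_dict

chunk = AIMessageChunk(content="Summary of areas that need clarification: ...")
serialized = messages_to_dict([chunk])      # [{'type': 'AIMessageChunk', 'data': {...}}]
restored = messages_from_dict(serialized)   # raised ValueError before the patch
print(type(restored[0]).__name__, restored[0].content)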

@pralad-p
Contributor

pralad-p commented Oct 21, 2023

I think the way gpt-engineer handles deserialization of user-entered messages (the OP says it didn't happen on the prompt file input, but rather on the subsequent messages) could indicate that the implicit properties added during (de)serialization (or this AIMessageChunk type) have a strong dependency on how langchain modifies its schema to fit its own requirements. What I mean is that langchain might previously have been quite loose about the type of messages it received, but now (I assume due to extra features) needs the type separation. Maybe it's a good idea to brainstorm ways to loosen this coupling. Additional tests in this department would be the first step toward knowing the scope of the problem.
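
As a rough illustration only (not a concrete patch against the actual gpt-engineer code), one way to loosen that coupling would be to normalize streamed chunk messages into plain messages on gpt-engineer's side before they ever reach langchain's (de)serializers:

# Illustrative sketch, not the real gpt-engineer implementation: collapse
# AIMessageChunk into a plain AIMessage before serializing, so the stored log
# never contains a message type that langchain's _message_from_dict may not handle.
import json
from typing import List

from langchain.schema.messages import (
    AIMessage,
    AIMessageChunk,
    BaseMessage,
    messages_from_dict,
    messages_to_dict,
)


def serialize_messages(messages: List[BaseMessage]) -> str:
    normalized = [
        AIMessage(content=m.content) if isinstance(m, AIMessageChunk) else m
        for m in messages
    ]
    return json.dumps(messages_to_dict(normalized))


def deserialize_messages(jsondictstr: str) -> List[BaseMessage]:
    return list(messages_from_dict(json.loads(jsondictstr)))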

@ibrahim-sowunmi

elif _type == "AIMessageChunk":
    return AIMessageChunk(**message["data"])

Fixed it for me on mac.

python3 -c "import site; print(site.getsitepackages())"
Then open the schema file from that directory (open . on macOS) and edit it.
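
If it helps, the exact file can also be located straight from Python instead of browsing site-packages (a small convenience snippet; it assumes the module path shown in the tracebacks above):

# Print where the installed langchain keeps the messages module that needs editing.
import langchain.schema.messages as messages

print(messages.__file__)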

@ATheorell
Collaborator

@wwencel @0x1pikachu
Can you rerun on the latest main commit or at tag 0.1.0?

The stack trace
File "E:\VSC Workspaces\venv\Lib\site-packages\gpt_engineer\ai.py", line 216, in deserialize_messages
return list(messages_from_dict(json.loads(jsondictstr))) # type: ignore
is not up to date.

@ATheorell
Collaborator

@pralad-p agreed.
I think the first thing that needs to be done is to switch to better dependency management, so we get fewer unpleasant surprises when a dependency is updated.

@MWals

MWals commented Oct 22, 2023

elif _type == "AIMessageChunk":
    return AIMessageChunk(**message["data"])

Fixed it for me on mac.

python3 -c "import site; print(site.getsitepackages())" Then open the schema file from that directory (open . on macOS) and edit it.

This also fixed the problem for me on Mac. Please consider integrating this into the code.

@ATheorell
Collaborator

@MWals @ibrahim-sowunmi @paulternate
The code you are referring to is inside langchain and not gpt-engineer. Thus, we cannot modify it. It would be awesome if you could request this change at https://github.com/langchain-ai/langchain

@ATheorell
Collaborator

A direct fix here, instead of modifying langchain, is probably to downgrade the langchain version in your environment.
So far, nobody has provided an example that I can reproduce, which would help me debug this.
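
For anyone reporting back, a standard-library-only snippet to capture the installed versions could look like this (gpt-engineer and langchain are assumed to be the installed distribution names):

# Record the installed versions when reporting back (importlib.metadata is stdlib on Python 3.8+).
from importlib.metadata import version

print("gpt-engineer", version("gpt-engineer"))
print("langchain", version("langchain"))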

@ATheorell
Collaborator

This is now addressed in PR #833 by restricting the langchain version. Reinstalling from main branch source should solve it.

hwchase17 added a commit to langchain-ai/langchain that referenced this issue Dec 29, 2023
… breaks message history runnable. (#15327)

- **Description:** fix parse issue for AIMessageChunk when using 
  - **Issue:** #14511
  - **Dependencies:** none
  - **Twitter handle:** none

Taken from this fix:
gpt-engineer-org/gpt-engineer#804 (comment)


---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
@jonnolen

The update from this issue landed in upstream langchain really quickly: langchain-ai/langchain#15327. Credit to @paulternate.
