
Object of type SystemMessage is not JSON serializable #9288

Closed
jiangying000 opened this issue Aug 16, 2023 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: models Related to LLMs or chat model modules

Comments

@jiangying000
Contributor

jiangying000 commented Aug 16, 2023

System Info

0.0.247

Who can help?

@hwchase17
@agola11

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

Run:

import json
from langchain.schema import SystemMessage

messages = []
messages.append(SystemMessage(content="abc1"))
json.dumps(messages, ensure_ascii=False)

which raises:
  File "C:\Users\64478\AppData\Local\Programs\Python\Python310\lib\json\__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "C:\Users\64478\AppData\Local\Programs\Python\Python310\lib\json\encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Users\64478\AppData\Local\Programs\Python\Python310\lib\json\encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "C:\Users\64478\AppData\Local\Programs\Python\Python310\lib\json\encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type SystemMessage is not JSON serializable
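The TypeError comes from Python's json encoder, which only handles built-in types; any other object falls through to the default handler, which raises. A minimal self-contained demonstration (using a hypothetical stand-in class, so it runs without LangChain installed):

```python
import json

# Hypothetical stand-in for langchain.schema.SystemMessage, so this
# demonstration runs without LangChain installed.
class SystemMessage:
    def __init__(self, content: str):
        self.content = content

try:
    json.dumps([SystemMessage(content="abc1")])
except TypeError as e:
    print(e)  # Object of type SystemMessage is not JSON serializable
```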

Expected behavior

I think this might not be a bug.

But it would be best if the JSON output were (since I am using OpenAI):

{"role": "system", "message": "abc1你好"}

How can I do this?
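One way to get that shape is to map each message into a plain dict before calling json.dumps. This is a sketch, not LangChain API: to_openai_dict is a helper introduced here for illustration, and the SystemMessage class is a minimal stand-in so the snippet runs without LangChain installed.

```python
import json

class SystemMessage:  # minimal stand-in for langchain.schema.SystemMessage
    def __init__(self, content: str):
        self.content = content

def to_openai_dict(msg: SystemMessage) -> dict:
    # Hypothetical helper: map a message object onto the OpenAI chat format.
    return {"role": "system", "content": msg.content}

messages = [SystemMessage(content="abc1你好")]
print(json.dumps([to_openai_dict(m) for m in messages], ensure_ascii=False))
# [{"role": "system", "content": "abc1你好"}]
```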


What the AI suggested does not behave as expected:

from langchain.load.dump import dumps
from langchain.schema import HumanMessage, SystemMessage

message1 = SystemMessage(content="abc1你好")
message2 = HumanMessage(content="123")
messages = [message1, message1]  # note: message1 appears twice; message2 is unused

print(dumps(messages))

is printing:

[{"lc": 1, "type": "constructor", "id": ["langchain", "schema", "messages", "SystemMessage"], "kwargs": {"content": "abc1\u4f60\u597d"}}, {"lc": 1, "type": "constructor", "id": ["langchain", "schema", "messages", "SystemMessage"], "kwargs": {"content": "abc1\u4f60\u597d"}}]

And how can I use ensure_ascii with langchain.load.dump?
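Whether dumps forwards keyword arguments such as ensure_ascii to json.dumps depends on the LangChain version, so a version-independent workaround is to round-trip the escaped string through the standard json module:

```python
import json

# Assume this is the ASCII-escaped string that langchain.load.dump.dumps returned.
escaped = '[{"kwargs": {"content": "abc1\\u4f60\\u597d"}}]'

# Re-serialize with ensure_ascii=False to restore readable non-ASCII characters.
unescaped = json.dumps(json.loads(escaped), ensure_ascii=False)
print(unescaped)  # [{"kwargs": {"content": "abc1你好"}}]
```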

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Aug 16, 2023

dosubot bot commented Aug 16, 2023

🤖

Based on the information you've provided, it seems like you're trying to serialize a SystemMessage object to JSON using Python's built-in json.dumps() function. However, this function doesn't know how to serialize custom objects like SystemMessage.

In the LangChain framework, you can use the dumps function provided in the dump.py file to serialize objects. This function is capable of serializing custom objects, including the SystemMessage object. Here's how you can use it:

from langchain.load.dump import dumps

message = SystemMessage(content="abc1")
messages = [message]
json_string = dumps(messages, ensure_ascii=False)

However, if you want the output JSON string to be {"role": "system", "message": "abc1"}, you would need to ensure that SystemMessage implements the to_json method. This method should return a dictionary that represents the JSON structure you want.

Here is a general example of how you might implement this:

# General sketch only: the real LangChain SystemMessage is a pydantic model,
# so in practice you would not override __init__ like this.
class SystemMessage:
    def __init__(self, role: str, message: str):
        self.role = role
        self.message = message

    def to_json(self):
        return {"role": self.role, "message": self.message}

Then, you can serialize a SystemMessage object to JSON like this:

message = SystemMessage("system", "abc1")
json_string = dumps(message)

This will produce the JSON string {"role": "system", "message": "abc1"}.

Please note that this is a general example and might need to be adjusted based on the actual implementation of the SystemMessage class in the LangChain framework.

I hope this helps! If you have any other questions, feel free to ask.



This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.


@TomTom101
Contributor

This should be no harder than writing:

 chat.save("chat.json")

However, save is not implemented for ChatPromptTemplate.

@TomTom101
Contributor

It does work, although not in your desired format @jiangying000, like so:

from langchain.load import dumps, loads
from langchain.prompts import ChatPromptTemplate

chat = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful bot"),
        ("human", "Tell me a joke about {topic}"),
    ]
)

with open("chat.json", "w") as fh:
    fh.write(dumps(chat))

with open("chat.json") as fh:
    chat = loads(fh.read())

chat


dosubot bot commented Feb 6, 2024

Hi, @jiangying000,

I'm helping the LangChain team manage our backlog and am marking this issue as stale. From what I understand, you were having trouble serializing a SystemMessage object to JSON and received a detailed response from me on how to achieve the expected JSON output. TomTom101 also contributed by providing an example of using the dumps function to save a ChatPromptTemplate object to a JSON file.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 6, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Feb 13, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 13, 2024