Object of type SystemMessage is not JSON serializable #9288
Comments
🤖 Based on the information you've provided, it seems like you're trying to serialize a `SystemMessage` object to JSON. In the LangChain framework, you can use the `dumps` function:

```python
from langchain.load.dump import dumps

message = SystemMessage(content="abc1")
messages = [message]
json_string = dumps(messages, ensure_ascii=False)
```

However, if you want the output JSON string in a different shape, you can add a method that returns exactly the fields you need. Here is a general example of how you might implement this:

```python
class SystemMessage(BaseMessage):
    def __init__(self, role: str, message: str):
        self.role = role
        self.message = message

    def to_json(self):
        return {"role": self.role, "message": self.message}
```

Then, you can serialize it:

```python
message = SystemMessage("system", "abc1")
json_string = dumps(message)
```

This will produce a JSON string containing the `role` and `message` fields. Please note that this is a general example and might need to be adjusted based on the actual implementation of the `SystemMessage` class. I hope this helps! If you have any other questions, feel free to ask.

Sources
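The `to_json` approach above can be shown end to end without LangChain installed; this is a minimal standalone sketch in which `SystemMessage` is a plain stand-in class, not the real LangChain one:

```python
import json


class SystemMessage:
    """Stand-in for illustration; not the real LangChain class."""

    def __init__(self, role: str, message: str):
        self.role = role
        self.message = message

    def to_json(self):
        # Return a plain dict, which json.dumps can serialize directly.
        return {"role": self.role, "message": self.message}


# json.dumps cannot serialize the object itself, so serialize its dict form:
msg = SystemMessage("system", "abc1")
json_string = json.dumps(msg.to_json())
print(json_string)  # {"role": "system", "message": "abc1"}
```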
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
This should be no harder than writing:

```python
chat.save("chat.json")
```

However
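The one-liner save/load convenience this comment asks for could be sketched as a small wrapper over the standard `json` module. Note `save_chat`/`load_chat` are hypothetical helpers written for this sketch, not a LangChain API:

```python
import json
import os
import tempfile


def save_chat(messages, path):
    """Write a list of (role, content) pairs as a JSON array of dicts."""
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(
            [{"role": r, "content": c} for r, c in messages],
            fh,
            ensure_ascii=False,
        )


def load_chat(path):
    """Read the JSON file back into a list of (role, content) pairs."""
    with open(path, encoding="utf-8") as fh:
        return [(m["role"], m["content"]) for m in json.load(fh)]


path = os.path.join(tempfile.gettempdir(), "chat.json")
save_chat([("system", "You are a helpful bot")], path)
print(load_chat(path))  # [('system', 'You are a helpful bot')]
```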
It does work, although not in your desired format @jiangying000, like so:

```python
from langchain.load import dumps, loads
from langchain.prompts import ChatPromptTemplate

chat = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful bot"),
        ("human", "Tell me a joke about {topic}"),
    ]
)

with open("chat.json", "w") as fh:
    fh.write(dumps(chat))

with open("chat.json") as fh:
    chat = loads(fh.read())

chat
```
Hi, @jiangying000, I'm helping the LangChain team manage our backlog and am marking this issue as stale. From what I understand, you were having trouble serializing a SystemMessage object to JSON and received a detailed response from me on how to achieve the expected JSON output. TomTom101 also contributed by providing an example of using `dumps` and `loads` to round-trip a `ChatPromptTemplate` through a file. Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!
System Info
0.0.247
Who can help?
@hwchase17
@agola11
Information
Related Components
Reproduction
run
Expected behavior
I think this might not be a bug.
But it would be best if the JSON output is (since I am using openai):
How can I do this?
What the AI says is not as expected:
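Since the desired output is presumably the OpenAI chat format (a list of `{"role": ..., "content": ...}` dicts), one way to get there is a small converter. This is a hedged sketch: `to_openai_format` and the `SystemMessage` stand-in class are hypothetical, written so the example runs without LangChain installed:

```python
import json


class SystemMessage:
    """Stand-in with a .content attribute, mimicking LangChain message objects."""

    def __init__(self, content):
        self.content = content


# Map message class names to OpenAI role strings.
ROLE_BY_TYPE = {
    "SystemMessage": "system",
    "HumanMessage": "user",
    "AIMessage": "assistant",
}


def to_openai_format(messages):
    """Convert message objects to OpenAI-style role/content dicts."""
    return [
        {"role": ROLE_BY_TYPE[type(m).__name__], "content": m.content}
        for m in messages
    ]


msgs = [SystemMessage("abc1")]
print(json.dumps(to_openai_format(msgs)))  # [{"role": "system", "content": "abc1"}]
```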
is printing:
And how can I use `ensure_ascii` with `langchain.load.dump`?
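What `ensure_ascii` does can be illustrated with the standard library `json` module (whether LangChain's `dumps` forwards this keyword is an assumption to verify against your installed version; this sketch uses plain `json` only to show the effect of the flag):

```python
import json

data = {"role": "system", "content": "你好"}

# Default: non-ASCII characters are escaped as \uXXXX sequences.
print(json.dumps(data))

# ensure_ascii=False keeps the characters as-is in the output string.
print(json.dumps(data, ensure_ascii=False))
```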