
Commit 1e39a22

Add langgraph doc (mem0ai#1712)
1 parent daa65e7 commit 1e39a22

File tree

2 files changed (+124, -0 lines)

Diff for: docs/examples/langgraph.mdx

+123
@@ -0,0 +1,123 @@
---
title: LangGraph with Mem0
---

This guide demonstrates how to create a personalized Customer Support AI Agent using LangGraph and Mem0. The agent retains information across interactions, enabling a personalized and efficient support experience.

## Overview

The Customer Support AI Agent leverages LangGraph for conversational flow and Mem0 for memory retention, creating a more context-aware and personalized support experience.

## Setup

Install the necessary packages using pip:

```bash
pip install langgraph langchain-openai mem0ai
```
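
Both `ChatOpenAI` and Mem0's default `Memory()` configuration call the OpenAI API, so an API key must be available before running the example. A minimal sketch, assuming you want to set it from Python (exporting `OPENAI_API_KEY` in your shell works just as well):

```python
import os

# Placeholder value; substitute your actual key, or export the
# OPENAI_API_KEY environment variable in your shell instead.
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
```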

## Full Code Example

Below is the complete code to create and interact with a Customer Support AI Agent using LangGraph and Mem0:

```python
from typing import Annotated, TypedDict, List
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_openai import ChatOpenAI
from mem0 import Memory
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage

llm = ChatOpenAI(model="gpt-4o")
mem0 = Memory()

# Define the State
class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str

graph = StateGraph(State)


def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]

    # Retrieve relevant memories for the latest user message
    memories = mem0.search(messages[-1].content, user_id=user_id)

    context = "Relevant information from previous conversations:\n"
    for memory in memories:
        context += f"- {memory['memory']}\n"

    system_message = SystemMessage(content=f"""You are a helpful customer support assistant. Use the provided context to personalize your responses and remember user preferences and past interactions.
{context}""")

    full_messages = [system_message] + messages
    response = llm.invoke(full_messages)

    # Store the interaction in Mem0
    mem0.add(f"User: {messages[-1].content}\nAssistant: {response.content}", user_id=user_id)
    return {"messages": [response]}


# Add nodes to the graph
graph.add_node("chatbot", chatbot)

# Add edge from START to chatbot
graph.add_edge(START, "chatbot")

# Add edge from chatbot back to itself so it can handle the next turn;
# run_conversation below exits the stream after the first response,
# so this self-loop does not run indefinitely.
graph.add_edge("chatbot", "chatbot")

compiled_graph = graph.compile()


def run_conversation(user_input: str, mem0_user_id: str):
    # No checkpointer is configured, so thread_id is informational here
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}

    for event in compiled_graph.stream(state, config):
        for value in event.values():
            if value.get("messages"):
                print("Customer Support:", value["messages"][-1].content)
                return  # Exit after printing the response


if __name__ == "__main__":
    print("Welcome to Customer Support! How can I assist you today?")
    mem0_user_id = "test123"
    while True:
        user_input = input("You: ")
        if user_input.lower() in ['quit', 'exit', 'bye']:
            print("Customer Support: Thank you for contacting us. Have a great day!")
            break
        run_conversation(user_input, mem0_user_id)
```

## Key Components

1. **State Definition**: The `State` class defines the structure of the conversation state: the running message list (merged via the `add_messages` reducer) and the Mem0 user ID. A short standalone sketch of the reducer follows this list.

2. **Chatbot Node**: The `chatbot` function handles the core logic, including:
   - Retrieving relevant memories
   - Preparing the context and system message
   - Generating responses
   - Storing interactions in Mem0

3. **Graph Setup**: The code sets up a `StateGraph` with the chatbot node and the necessary edges.

4. **Conversation Runner**: The `run_conversation` function manages the flow of the conversation, processing user input and displaying responses.
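
To make the `add_messages` reducer concrete, here is a minimal standalone sketch (not part of the agent itself) showing that it appends new messages to the existing list rather than replacing it:

```python
from langchain_core.messages import AIMessage, HumanMessage
from langgraph.graph.message import add_messages

# The reducer merges the current channel value with a node's update;
# the AI reply ends up appended after the existing human message.
existing = [HumanMessage(content="Hi, my order arrived damaged.")]
update = [AIMessage(content="Sorry to hear that! Let's arrange a replacement.")]

merged = add_messages(existing, update)
print([m.content for m in merged])  # both messages, in order
```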

## Usage

To use the Customer Support AI Agent:

1. Run the script.
2. Enter your queries when prompted.
3. Type 'quit', 'exit', or 'bye' to end the conversation.
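
You can also drive a single turn programmatically instead of using the interactive loop, assuming the definitions from the full example above are in scope:

```python
# One-off call: prints the agent's reply for a single query
run_conversation("Hi, I'd like to check the status of my refund.", "test123")
```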

## Key Points

- **Memory Integration**: Mem0 is used to store and retrieve relevant information from past interactions (see the standalone sketch after this list).
- **Personalization**: The agent uses past interactions to provide more contextual and personalized responses.
- **Flexible Architecture**: The LangGraph structure allows for easy expansion and modification of the conversation flow.
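
For reference, here is the memory layer in isolation. This is a rough sketch of the two Mem0 calls the agent relies on; the exact return shape of `search` can vary between mem0ai versions, so treat the result handling as illustrative:

```python
from mem0 import Memory

m = Memory()

# Store a fact scoped to a specific user...
m.add("User: I ordered a blue desk lamp last week.", user_id="test123")

# ...and retrieve it later with a semantic query.
for hit in m.search("What did the customer order?", user_id="test123"):
    print(hit["memory"])
```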

## Conclusion

This Customer Support AI Agent demonstrates the power of combining LangGraph for conversation management with Mem0 for memory retention. As the conversation progresses, the agent's responses become increasingly personalized, providing an improved support experience.

Diff for: docs/mint.json

+1
@@ -126,6 +126,7 @@
       "examples/overview",
       "examples/personal-ai-tutor",
       "examples/customer-support-agent",
+      "examples/langgraph",
       "examples/personal-travel-assistant"
     ]
   },
