-
Hello! I'm trying to instantiate an OpenAI Functions Agent with both memory and a custom system message, and I'm having a difficult time figuring out how to do it. One method lets me set the system message: it gives me an agent that calls tools and talks like a chicken, but has no memory. Another method lets me declare an agent with memory, but if I pass a system_message parameter as a kwarg, it never makes its way to the agent, and I end up with "You are a helpful AI assistant". How can I instantiate this agent with both memory and a custom system message?
Replies: 2 comments 3 replies
-
🤖 Hello,

Based on the information you provided and the context from the LangChain repository, it seems you want to instantiate an OpenAI Functions Agent with both memory and a custom system message. To achieve this, you can pass the system message through `agent_kwargs` when initializing the agent:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from langchain.schema import SystemMessage

system_message = SystemMessage(content="You are Jelly.")
agent_kwargs = {
    "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")],
    "system_message": system_message,
}
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

open_ai_agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    verbose=True,
    agent_kwargs=agent_kwargs,
    memory=memory,
)
```

This solution was confirmed to work in LangChain version 0.0.234. If you want to use the `OpenAIMultiFunctionsAgent` instead, a fuller example looks like this:

```python
from langchain.chains import LLMMathChain
from langchain.llms import OpenAI
from langchain.utilities import SerpAPIWrapper, SQLDatabase
from langchain_experimental.sql import SQLDatabaseChain
from langchain.agents import Tool, AgentType
from langchain.agents.openai_functions_multi_agent.base import OpenAIMultiFunctionsAgent
from langchain.chat_models import ChatOpenAI
from langchain.prompts import MessagesPlaceholder
from langchain.memory import ConversationBufferMemory

# Assuming you have a language model and tools
llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613")
search = SerpAPIWrapper()
llm_math_chain = LLMMathChain.from_llm(llm=llm, verbose=True)
db = SQLDatabase.from_uri("sqlite:///../../../../../notebooks/Chinook.db")
db_chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)

tools = [
    Tool(
        name="Search",
        func=search.run,
        description="useful for when you need to answer questions about current events. You should ask targeted questions",
    ),
    Tool(
        name="Calculator",
        func=llm_math_chain.run,
        description="useful for when you need to answer questions about math",
    ),
    Tool(
        name="FooBar-DB",
        func=db_chain.run,
        description="useful for when you need to answer questions about FooBar. Input should be in the form of a question containing full context",
    ),
]

# Placeholder that reserves a slot in the prompt for the conversation memory
memory_placeholder = MessagesPlaceholder(variable_name="memory")

# Create the memory; memory_key must match the placeholder's variable_name
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

# Create the agent
agent = OpenAIMultiFunctionsAgent.from_llm_and_tools(
    llm,
    tools,
    extra_prompt_messages=[memory_placeholder],
)
```

In this example, `extra_prompt_messages` injects the remembered conversation into the prompt. I hope this helps! If you have any other questions, feel free to ask.

Sources
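To see why both pieces are needed, here is a framework-free sketch of how a prompt with a custom system message and a memory placeholder is assembled. The function name `build_prompt` and the `(role, text)` tuples are illustrative, not LangChain internals:

```python
# Framework-free sketch: how a system message, remembered turns, and the new
# user input combine into one prompt. Illustrative only, not LangChain code.

def build_prompt(system_message, memory_messages, user_input):
    """System message first, then remembered turns, then the new input."""
    prompt = [("system", system_message)]
    prompt.extend(memory_messages)  # this is the slot the MessagesPlaceholder fills
    prompt.append(("human", user_input))
    return prompt

memory_messages = [("human", "Hi!"), ("ai", "Bawk! Hello there!")]
prompt = build_prompt("You are Jelly.", memory_messages, "What did I just say?")
# prompt starts with the system message and still contains the prior turns
```

The ordering is the whole point of the fix above: the custom system message and the remembered turns occupy separate slots, so configuring one does not displace the other.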
-
Figured it out.

```python
llm = ChatOpenAI()
```
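One detail worth calling out from the snippets in this thread: `memory_key` on the memory object must match `variable_name` on the `MessagesPlaceholder`, or the placeholder's slot stays empty. A minimal stand-in (not LangChain's implementation) shows the lookup by name:

```python
# Minimal stand-in for a conversation buffer memory. Not LangChain's
# implementation; it only illustrates why memory_key must match the
# MessagesPlaceholder's variable_name.

class BufferMemory:
    def __init__(self, memory_key="memory"):
        self.memory_key = memory_key
        self.messages = []

    def save_context(self, user_text, ai_text):
        # Record one human/AI exchange as (role, text) tuples
        self.messages.append(("human", user_text))
        self.messages.append(("ai", ai_text))

    def load_memory_variables(self):
        # The prompt template looks up placeholder variables by name, so the
        # key returned here must match variable_name on the placeholder.
        return {self.memory_key: list(self.messages)}

memory = BufferMemory(memory_key="memory")
memory.save_context("Hi!", "Hello! How can I help?")
variables = memory.load_memory_variables()
```

If the names differ (say, `memory_key="chat_history"` against `variable_name="memory"`), the lookup for `"memory"` finds nothing and the agent behaves as if it had no memory at all.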