
Support to add conversation history in the prompt #6574

Closed
1 task done
dekanayake opened this issue Jun 24, 2023 · 1 comment
Labels
question Further information is requested

Comments

@dekanayake

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

Hi, much appreciated all your work on llama_index — it is a great tool for integrating with LLMs. I'm currently implementing a use case for a client in which the chat agent asks questions based on the context. I have attached my code below. However, I can't add the conversation history, so the chat agent starts to repeat questions because it does not know the history. Could you help me integrate the chat history into the chat?

```python
import logging
import sys
from llama_index import StorageContext, load_index_from_storage, LLMPredictor, ServiceContext
from llama_index import (
    VectorStoreIndex,
    ResponseSynthesizer,
)
from llama_index.retrievers import VectorIndexRetriever
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.indices.postprocessor import SimilarityPostprocessor
from langchain.llms.openai import OpenAIChat
from llama_index.indices.response.type import ResponseMode
from llama_index import Prompt
from llama_index.langchain_helpers.agents import LlamaToolkit, create_llama_chat_agent, IndexToolConfig
from langchain.chains.conversation.memory import ConversationBufferMemory

# rebuild storage context
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# load index
index = load_index_from_storage(storage_context)

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
llm = OpenAIChat(temperature=0)
llm_predictor = LLMPredictor(llm=OpenAIChat(temperature=0))
service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, chunk_size=1024
)

retriever = VectorIndexRetriever(
    index=index,
    similarity_top_k=10,
)

qa_template = (
    """
We have provided context information below.
---------------------
{context_str}
---------------------

You are Clientbot, a friendly AI assistant helping homeowners select the right hot water unit based on their needs.
Based on the data, ask the customer questions one at a time to help guide them toward the right product.
Be as succinct as possible. Avoid asking previously asked questions.
Do NOT ever provide any information about installation or maintenance;
if customers ask questions about this, tell them to contact a licensed plumber.
If customers ask irrelevant questions, inform them you can only discuss hot water units at this stage.
With all that information, ask a question or provide the answer to the question: {query_str} """
)

qa_template_prompt = Prompt(qa_template)

refine_template = (
    """
The existing answer is as follows:
---------------------
{existing_answer}
---------------------

The original question is as follows:
---------------------
{query_str}
---------------------

The following is the context:
------------------------
{context_msg}
------------------------

You are Clientbot, a friendly AI assistant helping homeowners select the right hot water unit based on their needs.
Based on the data, ask the customer questions one at a time to help guide them toward the right product.
Be as succinct as possible. Avoid asking previously asked questions.
Do NOT ever provide any information about installation or maintenance;
if customers ask questions about this, tell them to contact a licensed plumber.
If customers ask irrelevant questions, inform them you can only discuss hot water units at this stage.
If you cannot conclude an answer, ask the original question. If you have an answer, provide it. """
)

refine_template_prompt = Prompt(refine_template)

# configure response synthesizer
response_synthesizer = ResponseSynthesizer.from_args(
    service_context=service_context,
    text_qa_template=qa_template_prompt,
    refine_template=refine_template_prompt,
    node_postprocessors=[
        SimilarityPostprocessor(similarity_cutoff=0.7)
    ],
)

query_engine = RetrieverQueryEngine(
    retriever=retriever,
    response_synthesizer=response_synthesizer,
)

# tool config
graph_config = IndexToolConfig(
    query_engine=query_engine,
    name="search Index",
    description="useful for when you want to answer queries for client brands.",
    tool_kwargs={"return_direct": True},
)

toolkit = LlamaToolkit(
    index_configs=[graph_config],
)

memory = ConversationBufferMemory(memory_key="chat_history")
agent_chain = create_llama_chat_agent(
    toolkit,
    llm,
    verbose=True,
    memory=memory,
    salesperson_name="Vince Black",
)

while True:
    text_input = input("User: ")
    response = agent_chain.run(input=text_input)
    print(f"Agent: {response}")
```
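A library-agnostic way to think about the problem above (this is an editorial sketch, not from the issue): keep the running conversation in a list and render it into the prompt on every turn, so the model can see which questions it has already asked. The `build_prompt` helper and `PROMPT_TEMPLATE` names below are hypothetical illustrations, not llama_index or langchain APIs.

```python
# Minimal, library-agnostic sketch of injecting conversation history into a prompt.
# build_prompt and PROMPT_TEMPLATE are hypothetical names, not llama_index APIs.

PROMPT_TEMPLATE = (
    "Conversation so far:\n"
    "{chat_history}\n"
    "---------------------\n"
    "Avoid repeating questions already asked above.\n"
    "User: {query_str}\n"
)

def build_prompt(history: list[tuple[str, str]], query: str) -> str:
    """Render (speaker, text) turns into the prompt so the model sees past turns."""
    rendered = "\n".join(f"{speaker}: {text}" for speaker, text in history)
    return PROMPT_TEMPLATE.format(chat_history=rendered, query_str=query)

# Each turn, append both sides of the exchange before building the next prompt.
history: list[tuple[str, str]] = []
history.append(("User", "What size unit do I need?"))
history.append(("Agent", "How many people live in your home?"))

prompt = build_prompt(history, "Four people.")
print(prompt)
```

In the langchain agent path shown in the issue, `ConversationBufferMemory(memory_key="chat_history")` plays this role automatically, but only if the agent's prompt template actually contains a `{chat_history}` placeholder — a custom `text_qa_template` like the one above bypasses the agent prompt, which is one plausible reason the history never reaches the model.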

@dekanayake dekanayake added the question Further information is requested label Jun 24, 2023
@dekanayake dekanayake changed the title Support o add conversation history in the prompt Support to add conversation history in the prompt Jun 24, 2023
@logan-markewich
Collaborator
