Sweep: [Bug]: GPTIndexChatMemory doesn't work as expected with langchain and the agent doesn't use the chat history #12
Here's the PR! #14.

💎 Sweep Pro: I used GPT-4 to create this ticket. You have 26 GPT-4 tickets left.
Step 1: 🔍 Code Search

I found the following snippets in your repository. I will now analyze these snippets and come up with a plan. Some code snippets I looked at are listed below; if some file is missing from here, you can mention the path in the ticket description.

- llama_index/llama_index/langchain_helpers/memory_wrapper.py, lines 1 to 189 in 50e6bfe
- llama_index/examples/langchain_demo/LangchainDemo.ipynb, lines 1 to 131 in 50e6bfe
- llama_index/llama_index/chat_engine/simple.py, lines 1 to 135 in 50e6bfe
- llama_index/llama_index/agent/openai_agent.py, lines 52 to 328 in 50e6bfe
Step 2: 🧐 Snippet Analysis

From looking through the relevant snippets, I decided to make the following modifications:
Step 3: 📝 Planning

I have created a plan for writing the pull request. I am now working on my plan and coding the required changes to address this issue. Here is the planned pull request: Fix memory update issue in GPTIndexChatMemory
Step 4: ⌨️ Coding

I have finished coding the issue. I am now reviewing it for completeness.

Step 5: 🔁 Code Review

Success! 🚀

I'm a bot that handles simple bugs and feature requests, but I might make mistakes. Please be kind!
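For context on what the planned fix targets: a LangChain memory only makes an agent "remember" if its save_context method stores each turn after a call and its load_memory_variables method returns that history on the next call; if either side fails, the agent behaves exactly as reported below. The following is a minimal, illustrative stand-in for an index-backed chat memory, not the actual GPTIndexChatMemory source; the class name and the plain-list storage are assumptions made for the sketch.

```python
from typing import Any, Dict, List

from langchain.schema import BaseMemory


class IndexChatMemorySketch(BaseMemory):
    """Illustrative stand-in for an index-backed chat memory (not the real GPTIndexChatMemory)."""

    memory_key: str = "chat_history"
    turns: List[str] = []  # stand-in for the LlamaIndex index that would hold the turns

    @property
    def memory_variables(self) -> List[str]:
        return [self.memory_key]

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        # The real wrapper queries its index with the new input; this sketch
        # simply returns everything stored so far.
        return {self.memory_key: "\n".join(self.turns)}

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        # The real wrapper inserts the turn into its index (which is where the
        # embeddings come from); this sketch appends it to a list.
        self.turns.append(f"Human: {list(inputs.values())[0]}")
        self.turns.append(f"AI: {list(outputs.values())[0]}")

    def clear(self) -> None:
        self.turns.clear()
```

GPTIndexChatMemory follows this same protocol but backs the storage with a LlamaIndex index, so the index is also where the conversation's embeddings end up.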
Bug Description
My code runs without any errors, but when I try to use GPTIndexChatMemory to embed my conversation and store all of it, the memory feature no longer works. I also want to save the memory in a folder together with its embeddings, but I can't.
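On the "save the memory in a folder with its embeddings" part: GPTIndexChatMemory wraps an ordinary LlamaIndex index, so one plausible approach (a minimal sketch assuming llama-index 0.7.x; the ./chat_memory_storage path is arbitrary, and `index` stands for whatever index was passed to GPTIndexChatMemory(index=...)) is to persist that index through its storage context and reload it on the next run:

```python
from llama_index import StorageContext, load_index_from_storage

# `index` is the LlamaIndex index that was passed to GPTIndexChatMemory(index=...).
# Persisting it writes the stored nodes, embeddings, and metadata to the folder.
index.storage_context.persist(persist_dir="./chat_memory_storage")

# On a later run, rebuild the same index from that folder and hand it to a new
# GPTIndexChatMemory so the earlier conversation is available again.
storage_context = StorageContext.from_defaults(persist_dir="./chat_memory_storage")
index = load_index_from_storage(storage_context)
```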
Clone of run-llama#6923.
Version
langchain: 0.0.230, llama-index: 0.7.4, Python: 3.10.11
Steps to Reproduce
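The reporter's exact snippet was not preserved in this clone; the sketch below is a representative setup of the kind described, loosely following the LangchainDemo notebook referenced above. The placeholder Echo tool, the ChatOpenAI model, the empty GPTListIndex, and the two prompts are assumptions, not the reporter's original code.

```python
from langchain.agents import Tool, initialize_agent
from langchain.chat_models import ChatOpenAI
from llama_index import GPTListIndex
from llama_index.langchain_helpers.memory_wrapper import GPTIndexChatMemory

# Back the chat memory with a LlamaIndex index; every completed turn is
# inserted into this index and queried back on later turns.
index = GPTListIndex([])
memory = GPTIndexChatMemory(
    index=index,
    memory_key="chat_history",
    query_kwargs={"response_mode": "compact"},
    return_source=True,
    return_messages=True,
)

# A placeholder tool so the conversational agent has something registered.
tools = [
    Tool(
        name="Echo",
        func=lambda q: q,
        description="Echoes the input back; placeholder for this sketch.",
    )
]

llm = ChatOpenAI(temperature=0)
agent = initialize_agent(
    tools, llm, agent="conversational-react-description", memory=memory
)

print(agent.run(input="Hi, my name is Zeyad"))   # first print statement
print(agent.run(input="Do you know my name?"))   # second print statement
```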
This is the output of the first print statement:
AI: Hello Zeyad! How can I assist you today?
print("Do you know my name?")
This is the output of the second print statement (unexpected output); a week ago it was working fine without any problems:
AI: As an AI language model, I don't have access to personal information unless you provide it to me. Therefore, I don't know your name unless you tell me. Is there anything specific you would like assistance with?
Expected behavior
The expected output for the second statement should be:
AI: Yes, you told me before that your name is Zeyad.
I really appreciate any help you can provide.
Relevant Logs/Tracebacks
No response