[Question]: Is it possible to integrate langchain ConversationBufferMemory? #12845
Comments
Integrating langchain's `ConversationBufferMemory` with llama-index is not supported out of the box, but it can be approached as an adapter exercise. To start, examine the memory interface that llama-index's chat engines consume. If you're addressing specific limitations of the built-in `ChatMemoryBuffer`, first pin down exactly which behavior it lacks. For a successful integration, closely follow the implementation patterns and interface requirements outlined in llama-index's memory management system, adapting langchain's memory class to that interface.
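The adapter approach described above can be sketched in plain Python. The classes below are simplified stand-ins, not the real langchain or llama-index types: `LangchainStyleBuffer` mimics `ConversationBufferMemory`'s raw-transcript storage, and the `put`/`get`/`reset` method names mirror llama-index's memory abstraction as an assumption about its shape.

```python
# Hypothetical sketch: wrap a langchain-style buffer memory behind the
# put/get/reset interface a llama-index chat engine would consume.
# All class and method names here are illustrative stand-ins.
from dataclasses import dataclass
from typing import List


@dataclass
class ChatMessage:
    role: str      # "user" or "assistant"
    content: str


class LangchainStyleBuffer:
    """Stand-in for ConversationBufferMemory: stores the raw transcript."""

    def __init__(self) -> None:
        self.messages: List[ChatMessage] = []

    def add_message(self, msg: ChatMessage) -> None:
        self.messages.append(msg)


class BufferMemoryAdapter:
    """Exposes the buffer through a llama-index-style memory interface."""

    def __init__(self, inner: LangchainStyleBuffer) -> None:
        self.inner = inner

    def put(self, msg: ChatMessage) -> None:
        self.inner.add_message(msg)

    def get(self) -> List[ChatMessage]:
        return list(self.inner.messages)

    def reset(self) -> None:
        self.inner.messages.clear()


memory = BufferMemoryAdapter(LangchainStyleBuffer())
memory.put(ChatMessage("user", "Hello"))
memory.put(ChatMessage("assistant", "Hi, how can I help?"))
history = memory.get()
```

The point of the indirection is that the chat engine only ever talks to the adapter, so any langchain memory variant could be swapped in behind it without touching engine code.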
@sura2k it's not possible currently, but could maybe be possible. I'm curious what ChatMemoryBuffer lacks that you need?
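For context on what a token-limited chat buffer does, here is a rough sketch in the spirit of `ChatMemoryBuffer`: it keeps only the most recent messages that fit under a token budget and drops older ones entirely rather than summarizing them, which is the gap the rest of this thread is about. Token counting here is a whitespace-split approximation, not a real tokenizer.

```python
# Sketch of a token-limited sliding window over chat history.
# Older messages that don't fit the budget are dropped outright.
from typing import List


def trim_to_token_limit(messages: List[str], token_limit: int) -> List[str]:
    """Return the longest suffix of `messages` whose total token count fits."""
    kept: List[str] = []
    used = 0
    for msg in reversed(messages):          # walk newest-to-oldest
        cost = len(msg.split())             # crude token estimate
        if used + cost > token_limit:
            break                           # older messages are discarded
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order


history = ["the quick brown fox", "jumps over", "the lazy dog today"]
window = trim_to_token_limit(history, token_limit=7)
# window keeps only the two most recent messages (6 tokens total)
```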
@logan-markewich sorry for the late reply. I actually have a requirement to use something similar to `ConversationSummaryBufferMemory`.
We would also like something along the lines of ConversationSummaryBufferMemory to be implemented in LlamaIndex. The reason is that for our use case, we want to keep latency and costs down for our end users. Using the full history up to the context length increases cost and latency for each follow-up question due to the increased number of tokens. Limiting the memory to the last X messages works, but is suboptimal, because the older messages may still have relevance. We would like messages older than the last X to be summarized, followed by the full text of the last X messages. @logan-markewich I can contribute to this if you are okay with adding this into LlamaIndex
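The behavior requested above can be sketched as a small condensing step: messages older than the last X are folded into a running summary, and the last X messages are kept verbatim. The `summarize` stub here is an assumption, since a real implementation would prompt an LLM for the summary.

```python
# Minimal sketch of summary-buffer memory: old messages are collapsed
# into a summary string, recent messages are kept as-is.
from typing import List, Tuple


def summarize(existing_summary: str, messages: List[str]) -> str:
    """Stub summarizer; a real version would call an LLM here."""
    joined = "; ".join(messages)
    return f"{existing_summary} | {joined}".strip(" |")


def condense_history(
    summary: str, messages: List[str], keep_last: int
) -> Tuple[str, List[str]]:
    """Fold messages older than the last `keep_last` into the summary."""
    if len(messages) <= keep_last:
        return summary, messages
    old, recent = messages[:-keep_last], messages[-keep_last:]
    return summarize(summary, old), recent


summary, recent = condense_history("", ["m1", "m2", "m3", "m4", "m5"], keep_last=2)
# summary covers m1..m3; recent holds m4 and m5 verbatim
```

The prompt sent to the model would then be the summary followed by the recent messages, which bounds token usage regardless of conversation length while retaining a compressed trace of older turns.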
Yea, go for it
Question Validation
Question
Is it possible to integrate langchain's `ConversationBufferMemory` with the llama-index chat engine? I'm checking on this because the current llama-index `ChatMemoryBuffer` is not sufficient.