
[Question]: Is it possible to integrate langchain ConversationBufferMemory? #12845

Closed
1 task done
sura2k opened this issue Apr 15, 2024 · 5 comments · Fixed by #13155
Labels
question Further information is requested

Comments

@sura2k

sura2k commented Apr 15, 2024

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

Is it possible to integrate langchain's ConversationBufferMemory with the llama-index chat engine?
I'm asking because the current llama-index ChatMemoryBuffer is not sufficient for my use case.

@sura2k sura2k added the question Further information is requested label Apr 15, 2024

dosubot bot commented Apr 15, 2024

Integrating langchain ConversationBufferMemory with the llama-index chat engine is indeed feasible. This process involves creating an adapter or bridge that allows llama-index to leverage langchain's ConversationBufferMemory for enhanced memory management. Specifically, you would need to either extend the BaseMemory class or develop a new class that adheres to the BaseMemory interface expected by llama-index, but internally utilizes langchain's memory management features.

To start, examine the BaseMemory class in llama-index for an understanding of the required interface and then look into how ConversationBufferMemory operates within langchain. This approach will enable you to incorporate langchain's advanced memory management capabilities into llama-index, potentially overcoming limitations you've encountered with the current ChatMemoryBuffer.

If you're addressing specific limitations of the ChatMemoryBuffer, such as token limit constraints, message role handling, or serialization issues, a tailored solution focusing on those aspects would be more effective. Understanding the exact features or capabilities you need will help in providing a more precise integration strategy.

For a successful integration, closely follow the implementation patterns and interface requirements outlined in llama-index's memory management system, adapting langchain's ConversationBufferMemory functionalities within that framework.
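The adapter approach Dosu describes can be sketched in a few lines. Note that this is an illustrative, self-contained sketch: the class and method names below (`LangchainMemoryAdapter`, `put`, `get`, `reset`, and the `BufferBackend` stand-in) are hypothetical, and a real integration would subclass llama-index's actual BaseMemory and wrap langchain's real ConversationBufferMemory rather than these stubs.

```python
# Hypothetical adapter sketch: expose a llama-index-style memory
# interface (put / get / reset) while delegating storage to a
# langchain-style buffer. All names here are illustrative.

from dataclasses import dataclass

@dataclass
class ChatMessage:
    role: str      # e.g. "user" or "assistant"
    content: str

class BufferBackend:
    """Stand-in for langchain's ConversationBufferMemory."""
    def __init__(self):
        self.messages = []
    def save_context(self, msg):
        self.messages.append(msg)
    def load_memory(self):
        return list(self.messages)
    def clear(self):
        self.messages.clear()

class LangchainMemoryAdapter:
    """Adapts the backend to the interface llama-index expects."""
    def __init__(self, backend=None):
        self._backend = backend or BufferBackend()
    def put(self, message: ChatMessage) -> None:
        self._backend.save_context(message)
    def get(self):
        return self._backend.load_memory()
    def reset(self) -> None:
        self._backend.clear()

memory = LangchainMemoryAdapter()
memory.put(ChatMessage("user", "Hello"))
memory.put(ChatMessage("assistant", "Hi there"))
print(len(memory.get()))  # 2
```

The point of the indirection is that the chat engine only ever talks to the adapter's interface, so the backend can later be swapped for any langchain memory class without touching engine code.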


@logan-markewich
Collaborator

@sura2k it's not currently possible, but could perhaps be made possible.

I'm curious what ChatMemoryBuffer lacks that you need?

@sura2k
Author

sura2k commented Apr 17, 2024

@logan-markewich sorry for the late reply. I actually have a requirement for something similar to ConversationSummaryBufferMemory

@shadyelgewily-slimstock
Contributor

shadyelgewily-slimstock commented Apr 25, 2024

We would also like something along the lines of ConversationSummaryBufferMemory to be implemented in LlamaIndex. The reason is that for our use case, we want to keep latency and costs down for our end users. Using the full history up to the context length increases costs and latency for each follow-up question due to the increased number of tokens. Limiting the memory to the last X messages works, but is suboptimal, because the older messages may still be relevant. We would like messages older than the last X to be summarized, followed by the full text of the last X messages.
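The summarize-then-append behavior described above can be sketched as follows. This is a minimal illustration, not the proposed LlamaIndex implementation: the `stub_summarize` function just counts messages, where a real implementation (like langchain's ConversationSummaryBufferMemory) would call an LLM to produce the summary, and `build_memory` is a hypothetical helper name.

```python
# Sketch: collapse messages older than the last `keep_last` into a
# summary, and keep the last `keep_last` messages verbatim.

def stub_summarize(messages):
    # Stand-in for an LLM summarization call.
    return "Summary of %d earlier messages." % len(messages)

def build_memory(history, keep_last=3, summarize=stub_summarize):
    """Return (summary_or_None, recent_messages) for a chat history."""
    if len(history) <= keep_last:
        return None, list(history)
    older, recent = history[:-keep_last], history[-keep_last:]
    return summarize(older), recent

history = ["msg %d" % i for i in range(1, 6)]  # 5 messages
summary, recent = build_memory(history, keep_last=3)
print(summary)  # Summary of 2 earlier messages.
print(recent)   # ['msg 3', 'msg 4', 'msg 5']
```

The trade-off this buys is exactly the one described: the prompt carries one short summary plus X full messages, so token count (and hence cost and latency) stays roughly constant as the conversation grows, while older context is not discarded outright.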

@logan-markewich I can contribute this if you are okay with adding it to LlamaIndex

@logan-markewich
Collaborator

Yea go for it
