
feat: add token buffer based memory, port of ConversationTokenBufferMemory in langchain python #1264

Conversation

@aseem2625 (Contributor) commented May 14, 2023

JS port of the Python version here.
k-window-based memory doesn't work in scenarios where the answer length cannot be predicted: OpenAI may return no response in such cases, or the answer may lack a natural STOP. This token-based buffer memory allows chat history to be inserted safely.

  • Local testing
  • Test cases - TBD after this warning is fixed. Not a blocker, since it falls back to an approximate tiktoken calculation; prior to this PR, langchainJS didn't have any token-based library, so this warning had never surfaced. Since it pollutes the global logs, it must be fixed in the original repo.
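The core idea behind a token buffer memory can be sketched in a few lines: walk the chat history from newest to oldest and keep messages until a token budget is exceeded. The sketch below is a hypothetical standalone illustration, not the actual langchainjs API; the token count is approximated as characters divided by four in place of a real tokenizer such as tiktoken.

```typescript
// Hypothetical sketch of token-buffer pruning (not the langchainjs API).
interface Message {
  role: string;
  content: string;
}

// Rough token estimate: ~4 characters per token (stand-in for tiktoken).
const approxTokens = (text: string): number => Math.ceil(text.length / 4);

// Keep the most recent messages whose combined estimated token count
// fits within maxTokens; older messages are dropped first.
function pruneToTokenLimit(history: Message[], maxTokens: number): Message[] {
  const kept: Message[] = [];
  let total = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = approxTokens(history[i].content);
    if (total + cost > maxTokens) break;
    total += cost;
    kept.unshift(history[i]); // preserve chronological order
  }
  return kept;
}
```

Unlike a fixed k-window, this bounds the prompt by tokens rather than by message count, so an unusually long answer cannot push the prompt past the model's context limit.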

@vercel
vercel bot commented May 14, 2023

The latest updates on your projects:

Name: langchainjs-docs — Status: ❌ Failed — Updated (UTC): May 14, 2023 4:21pm

@jacoblee93
Collaborator

Hey @aseem2625, is this ready for a look?

@jacoblee93 jacoblee93 self-assigned this May 23, 2023
@dmanresa-saes

This would be very helpful. If this branch isn't merged, is there a workaround to achieve what ConversationTokenBufferMemory does using what is currently implemented in the JS version of LangChain?

@jacoblee93
Collaborator

This was later merged in #3211.

@jacoblee93 jacoblee93 closed this Nov 16, 2023