Infinite AI memory? With automatic summary! #801

Closed
Jijiza34 opened this issue Jan 10, 2024 · 6 comments
Labels
invalid This doesn't seem right

Comments

@Jijiza34

Make the AI automatically summarize the oldest 1K tokens of the chat history, when it reaches the limit of its tokens.
Not only could this increase the AI's memory tenfold, it would also resemble human memory: the further something lies in the past for us humans, the fewer details we remember about it.

After a while, even the summaries could be summarized again, to make them even smaller, similar to how we humans can barely remember any details about something which happened a long time ago. This way, could we potentially have infinite AI memory, if the AI just continues summarizing past events so that they fit in its token limit?
Exciting!!
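The proposed loop could be sketched roughly as follows. This is a toy illustration, not Agnai code: `summarize()` stands in for a hypothetical model call, and token counts are approximated by word counts.

```python
# Toy sketch of the proposed rolling summarization.
# Assumptions (not from Agnai): summarize() is a placeholder for an LLM
# call, and tokens are approximated as whitespace-separated words.

TOKEN_LIMIT = 16  # context window (toy size)
CHUNK = 8         # oldest tokens to fold into a summary at a time

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def summarize(text: str) -> str:
    # Placeholder: a real implementation would call a model.
    # Here we just keep the first few words.
    return " ".join(text.split()[:3]) + " ..."

def compact(history: list[str]) -> list[str]:
    """Fold the oldest messages into a summary while over the limit."""
    while sum(count_tokens(m) for m in history) > TOKEN_LIMIT and len(history) > 1:
        taken, used = [], 0
        # Take the oldest messages up to roughly CHUNK tokens.
        while history and used < CHUNK:
            msg = history.pop(0)
            taken.append(msg)
            used += count_tokens(msg)
        # Replace them with a (shorter) summary at the front.
        history.insert(0, summarize(" ".join(taken)))
    return history
```

Note that each pass loses detail permanently, which is exactly the failure mode the comments below raise.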

@Endovior

This is really more of a model design suggestion instead of an Agnai issue... and, for that matter, an xkcd.com/1425 moment.

Sure, there are bots that you could tell to do this, but they aren't especially likely to do a good job, or to save that much space, and imposing harsh token constraints on how long the summary can be is more likely to make them not do a good job. With current models, you'd most likely just be encouraging repetition, as tends to happen whenever you feed an AI's output back into itself.

Getting the AI to understand the meaning of what it has in context well enough to know which details can be safely discarded (as opposed to randomly pattern-matching and hoping for the best) would represent a significant AI breakthrough.

@sceuick
Member

sceuick commented Jan 12, 2024

Agnai already uses RAG for old messages, but it needs another iteration to provide surrounding messages to provide better context for the "relevant" messages.
I'm not convinced that arbitrarily summarizing messages will provide "infinite memory". The context is still limited, and what happens when the conversation is 10x the size of the context window? As far as I understand it, this proposal still leaves massive gaps in the conversation that won't be summarized. The summarized text will also erase the language/personality/style used in the messages.

@sceuick sceuick added the invalid This doesn't seem right label Jan 12, 2024
@Jijiza34
Author

Getting the AI to understand the meaning of what it has in context well enough to know which details can be safely discarded (as opposed to randomly pattern-matching and hoping for the best) would represent a significant AI breakthrough.

GPT-3 can do that.

@lt455067
Contributor

Automatic summary has its place -- it can also be an embed. It has its flaws, though, in that the summaries aren't always accurate or useful in and of themselves. I'd love to see "summaries" added to Agnai eventually, automatic, button-push, or otherwise. There are a couple of characters that can make summaries (see the sharing channels), and I think a good first step is a "quick memory button", which you could combo with a summarizer character but which could also be used for lots of other purposes.

@lt455067
Contributor

I like the idea of "summarizing" in general, but it can definitely lose a lot of information and may or may not be as useful as we might suspect.

Let's say the chat history is 10x the context size, and we've summarized a full context window of chat history (say, 8k tokens) and stored it in a "summary" embedding somewhere in the library, with chat window demarcation to signify that it covers the messages between start and end timestamps.

Ignoring for a second the challenge of editing, deleting, and regenerating these messages, let's say it's just a straight 8k tokens' worth of existing, unedited chat messages. These problems aren't trivial, but the base case simplifies the conversation.

Now, if we summarized the "first" 8k context of chat history and then another 8k of chat history accumulates, we would summarize that separately. Now we have two summary embeds, possibly 500 tokens each (or less). If there's a summary context limit of 2000 tokens, then when we hit that limit (i.e. 4 summaries) we re-summarize all the summaries down to a single one. In this way, as the history grows to 10x the context window, the summaries continue to reduce and reduce.

This is one approach.
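The compaction rule described above (per-window summaries that get collapsed once their combined size exceeds a budget) could be sketched like this. Everything here is a toy illustration under assumptions: `summarize_fn` stands in for a real model call, and token counts are word counts with a tiny budget instead of 2000 tokens.

```python
# Hedged sketch of the hierarchical summary scheme described above.
# Assumption: summarize_fn is a placeholder for a model call; tokens are
# approximated as words; SUMMARY_BUDGET is a toy stand-in for 2000 tokens.

SUMMARY_BUDGET = 6  # toy equivalent of the 2000-token summary limit

def tokens(text: str) -> int:
    return len(text.split())

def add_summary(summaries: list[str], new_summary: str,
                summarize_fn) -> list[str]:
    """Append a per-window summary; collapse all summaries when over budget."""
    summaries = summaries + [new_summary]
    if sum(tokens(s) for s in summaries) > SUMMARY_BUDGET:
        # Re-summarize all summaries down to one, as proposed above.
        summaries = [summarize_fn(" ".join(summaries))]
    return summaries
```

Each collapse is lossy, so repeated rounds compound the information loss the later comments point out.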

Another is a "quick-memory add" combined with "summary message generations", so the user could generate a summary of "all previous chat messages", "automatically summarize the chat when the window closes", or similar. These messages could then be added to the chat history, optionally, or included only when the "quick memory add" button is pressed (which would add a memory entry to the current memory book, with a pop-up for editing memory settings like which memory book, keywords, weights, and priority). A quick memory add feature could combo nicely with a summarizer feature.

This request I think needs more thought, but I do think there's room here for something interesting.

@lt455067
Contributor

As I understand it, the current take is that the juice here isn't worth the squeeze, because summarization will drop a lot of valuable information from the chat history. That's likely true, and comboing it with a memory entry that we could edit could be a good fix. That said, this is not a trivial feature; it would be a large time commitment with lots of edge cases and design considerations were it ever to get added.

@sceuick sceuick closed this as completed Feb 17, 2024

4 participants