
nKeep Persistent Context + Persistent Context Token Counts #62

Conversation

@neCo2 neCo2 commented May 19, 2024

Built on #56
Made memory and world info (WI) use n_keep for llama.cpp and kobold.cpp so they won't get flushed out of the context. Since the opportunity arose, I also added token counts for the persistent context fields.

(Two screenshots of the UI changes were attached: firefox_2024-05-19_11-05-15, firefox_2024-05-19_11-25-28.)

@neCo2 neCo2 marked this pull request as draft May 21, 2024 08:56
@neCo2 neCo2 marked this pull request as ready for review May 23, 2024 16:47

neCo2 commented May 23, 2024

All right, I fixed my botched merge. Should be ready for review again.
Also, for some reason I remember you changing something related to world info, but I can't for the life of me find the change I'm looking for.
If I didn't hallucinate that change, please check whether this PR overwrote anything or would otherwise cause problems with it.


neCo2 commented May 23, 2024

A bit late to realize this, but am I misunderstanding the functionality of n_keep? Can the memory even be "flushed" out of the context like I thought it could? If not, a lot of this PR could be scrapped.

@lmg-anon (Owner) commented

Afaik, n_keep is just what we are already doing but more accurate (i.e., it keeps n tokens at the beginning of the context). No backend will 'flush the context' as long as the prompt is within the context window.
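To illustrate the mechanism being discussed, here is a minimal sketch (not this PR's actual code) of how a client might pin the persistent fields at the head of the prompt via llama.cpp's `n_keep` completion parameter. The `count_tokens` stub and field layout are assumptions for illustration; a real client would count tokens with the backend's `/tokenize` endpoint.

```python
import json

def count_tokens(text: str) -> int:
    # Placeholder stub: a whitespace split is NOT a real tokenizer,
    # it only stands in for a call to the backend's /tokenize endpoint.
    return len(text.split())

def build_payload(memory: str, world_info: str, chat_log: str) -> dict:
    """Build a llama.cpp /completion payload that pins memory + WI."""
    persistent = memory + world_info
    return {
        "prompt": persistent + chat_log,
        # n_keep: number of tokens from the start of the prompt that
        # llama.cpp retains when the context window is shifted.
        "n_keep": count_tokens(persistent),
        "n_predict": 128,
    }

payload = build_payload("Memory. ", "WI entry. ", "User: hello\n")
print(json.dumps(payload))
```

As noted above, `n_keep` only matters once the prompt plus generation overflows the context window and the backend shifts the context; until then the persistent prefix is never dropped either way.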

@lmg-anon (Owner) commented

Also for some reason I remember you changing something related to world info, but I can't for the life of me find the change I'm looking for.
If I didn't hallucinate that change, please check whether this PR overwrote anything or would otherwise cause problems with it.

Nah, I think you must be imagining things. As far as I remember, the only changes I made to the code related to WI were when I moved some code to the WorldInfoModal and the draft PR #63. But neither change really altered how WI works in any way.


neCo2 commented May 24, 2024

No backend will 'flush the context'

Well, that's an afternoon of coding down the drain. The token counts for Memory, AN, and WI could still be useful, but frankly, they don't seem all that important to me.
On the other hand, all the work's already done, and cutting the PR down to just the token counts should be trivial now, so maybe I'll open a new PR just for that?

moved some code to the WorldInfoModal and the draft PR #63

That's probably what I was thinking of then.

@neCo2 neCo2 closed this May 24, 2024
@neCo2 neCo2 deleted the keep-n-persistent-context-+-persistent-context-token-counts branch June 1, 2024 19:47