Knowledge as system prompt #100
Comments
I agree, this idea makes more sense than the current XML-like implementation. I put together a draft PR for this feature and deployed it here: https://knowledge-system-prompt.hollama.pages.dev/ I was planning on creating a new issue to ask for help with prompt engineering in general; it would give me a better idea of how the Knowledge feature is currently being used.
I checked out your branch locally and it does not send the custom prompt at all. One example of a system prompt is giving it a character, like
and it will share the wisdom of the Andes with us, or some other role-play. Another example was making the model impersonate a fictional character by giving it the wiki article on that character. Or, more "useful": I have some instructions for ChatGPT that could be imported there that don't make sense as knowledge:
Yeah, I think it does send the prompt, but the models I tried sort of ignore it. I think the system prompt needs to be formatted in a specific way for it to "really stick"; right now I'm just feeding the Knowledge directly.
No, I looked at the network traffic.
You are right, it wasn't being sent with the first prompt, but it was being sent on the second interaction, which is why I was seeing the correct response. I just re-deployed a fix for that issue; however, it is now sending the system prompt with every session message. Do you know if we should keep sending the same system prompt with every message, or limit it to the first one?
As far as I experienced it, it gets sent with every request; changing it was immediately reflected in the conversation.
Your patch works :-)
👌 I haven't had time yet, but I need to investigate whether sending the system prompt on every message is a good idea.
I just hit the context window: it forgot who it is and then the conversation fell apart. Oh, and that was even with the system prompt on every message. 🤔
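The context-window failure described here is a known pitfall: once the conversation exceeds the window, the oldest messages, which include the system prompt, are effectively truncated away. A common mitigation is to always keep the system message while trimming only the older turns. This is a hypothetical sketch, not Hollama's actual code; the message dicts follow the usual chat-API shape with `role` and `content` keys:

```python
def trim_history(messages, max_turns=10):
    """Keep the system message plus only the most recent turns, so the
    model's persona survives even when older context is dropped."""
    system = [m for m in messages if m["role"] == "system"][:1]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_turns:]
```

With a real token budget you would trim by token count rather than turn count, but the principle (pin the system message, drop the oldest turns) is the same.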
@binarynoise just pushed another patch so the system prompt is only sent once. Give it another shot. |
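The "sent only once" behavior can be sketched as follows. This is a hypothetical helper, not the actual patch; it assumes the message shape of Ollama's `/api/chat` endpoint, where the system prompt is simply the first entry in the `messages` array:

```python
def build_messages(system_prompt, history):
    # Prepend the system prompt exactly once; each request rebuilds the
    # full list from the stored history, so the prompt is never duplicated.
    return [{"role": "system", "content": system_prompt}, *history]
```

Because the whole history is resent with each request anyway, the system message still reaches the model on every turn; it just appears once in the list instead of being re-inserted per message.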
I don't know if it's related or not, but after some messages I get a lot of these (in both variants?) and the context gets lost:
That might be a different bug, but I'll look into it.
🎉 This issue has been resolved in version 0.5.0 🎉 The release is available on GitHub. Your semantic-release bot 📦🚀
Currently, knowledge is passed as an XML block before the first user input.
If it's not only information but instructions for the model (you are ..., pretend to be ...), it gets confused and interprets it as if I – the user – were doing these things.
When using the ollama CLI and setting the knowledge as the system prompt, the model behaves as expected.
The knowledge system itself is great, especially if multiple knowledges can later be passed to the model.
I'd suggest adding the possibility to set (one of) the knowledge(s) as the system prompt.
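What the ollama CLI does with a system prompt maps onto the API as a system-role message. A minimal sketch of the suggested request shape, assuming Ollama's `/api/chat` message format; the knowledge text (echoing the llama-in-the-Andes role-play mentioned above) and the model name are made-up examples:

```python
import json

# Hypothetical knowledge text standing in for a Knowledge entry.
knowledge = "You are a llama living in the Andes."
payload = {
    "model": "llama3",  # assumed name; any locally pulled model works
    "messages": [
        # The knowledge goes into a "system" message instead of an
        # XML-like block prepended to the first user message.
        {"role": "system", "content": knowledge},
        {"role": "user", "content": "Who are you?"},
    ],
}
print(json.dumps(payload, indent=2))
```

POSTing this payload to `/api/chat` should have the model answer in character, rather than interpreting the instructions as something the user said.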