Knowledge as system prompt #100

Closed
binarynoise opened this issue Jul 19, 2024 · 13 comments · Fixed by #106
Labels: released, triage (Need to investigate further)

Comments

@binarynoise
Contributor

Currently, knowledge is passed as an XML block before the first user input.
If it contains not only information but also instructions for the model ("you are ...", "pretend to be ..."), the model gets confused and interprets them as if I – the user – were doing these things.
When I use the ollama CLI and set the knowledge as the system prompt, the model behaves as expected.

The knowledge system itself is great, especially if multiple knowledge entries can later be passed to the model.
I'd suggest adding the option to set (one of) the knowledge entries as the system prompt.
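
To illustrate what I mean, here is a rough sketch of how the request body for Ollama's /api/chat could look in both cases. This is not Hollama's actual code; the function names and the exact XML tag are made up for illustration.

```typescript
interface ChatMessage {
	role: 'system' | 'user' | 'assistant';
	content: string;
}

// Current behavior (roughly): knowledge is wrapped in an XML-like block and
// placed before the first user input. The tag name here is a guess.
function knowledgeAsXmlBlock(knowledge: string, userInput: string): ChatMessage[] {
	return [
		{ role: 'user', content: `<knowledge>\n${knowledge}\n</knowledge>\n\n${userInput}` }
	];
}

// Suggested behavior: knowledge becomes a dedicated system message.
function knowledgeAsSystemPrompt(knowledge: string, userInput: string): ChatMessage[] {
	return [
		{ role: 'system', content: knowledge },
		{ role: 'user', content: userInput }
	];
}

// Either array would be sent as the `messages` field of a POST to /api/chat.
```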

@fmaclen
Owner

fmaclen commented Jul 19, 2024

I agree, this idea makes more sense than the current XML-like implementation.

I put together a draft PR for this feature and deployed it here: https://knowledge-system-prompt.hollama.pages.dev/
If you have a chance, would you test it out?

I was planning on creating a new issue to ask for help regarding prompt engineering in general.
On that note, would you be willing to provide examples of useful system prompts?

It would give me a better idea of how the Knowledge feature is currently being used.
I personally have only used it (with mixed results) for querying documentation:

(screenshot: querying documentation with the Knowledge feature)

@binarynoise
Contributor Author

I checked out your branch locally and it does not send the custom prompt at all.

One example of a system prompt is giving the model a character, like

You are a holy llama.

and it will share the wisdom of the Andes with us, or some other role-play. Another example was making the model impersonate a fictional character by giving it the wiki article on that character.

Or, more "useful": I have some instructions for ChatGPT that could be imported there but that don't make sense as knowledge:

I am a skilled professional developer, just like you.
I am interested in complete, unbiased, independent, absolute and universal truth.
My primary programming languages are kotlin and bash.

@fmaclen
Owner

fmaclen commented Jul 19, 2024

I checked out your branch locally and it does not send the custom prompt at all.

Yeah, I think it does send the prompt, but the models I tried sort of ignore it.

Here's an example using phi3:latest:

(screenshots of the phi3:latest example)

I think the system prompt needs to be formatted in a specific way for it to "really stick"; right now I'm just feeding the Knowledge in directly.
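
For example, something along these lines might make it stick better. This is just a sketch; the preamble wording and the function name are made up, not what Hollama does today.

```typescript
// Sketch: wrap the Knowledge in a short instruction preamble before using it
// as the system message. The wording is a guess, not Hollama's current text.
function knowledgeToSystemPrompt(knowledgeName: string, knowledgeContent: string): string {
	return [
		`You are an assistant. Use the reference material "${knowledgeName}" below to answer the user.`,
		'Reference material:',
		knowledgeContent
	].join('\n\n');
}
```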

@binarynoise
Contributor Author

binarynoise commented Jul 19, 2024

No, I looked at the network traffic.
It does not get sent at all.
For me, system prompts didn't need any special formatting (in the ollama CLI); they just worked.

@fmaclen
Owner

fmaclen commented Jul 19, 2024

You are right, it wasn't being sent with the first prompt, but it was indeed being sent on the 2nd interaction; that's why I was seeing the correct response.

I just re-deployed a fix for that issue; however, it now sends the system prompt with every session message.

Do you know whether we should keep sending the same system prompt with every message, or limit it to the first one?

@binarynoise
Contributor Author

In my experience, it gets sent with every request; changing it was immediately reflected in the conversation.
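
That also matches how Ollama's /api/chat endpoint works: it is stateless, so the client has to send the full message history on every request anyway, and keeping the system message at the head of that history effectively re-sends it each time. A rough sketch with made-up names, not Hollama's actual code:

```typescript
interface ChatMessage {
	role: 'system' | 'user' | 'assistant';
	content: string;
}

// Sketch: build the messages array for each request. Because the whole history
// is sent every time, the system prompt accompanies every message as long as it
// stays at the head of the array.
async function sendChat(systemPrompt: string, history: ChatMessage[], userInput: string) {
	const messages: ChatMessage[] = [
		{ role: 'system', content: systemPrompt },
		...history,
		{ role: 'user', content: userInput }
	];

	const response = await fetch('http://localhost:11434/api/chat', {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		body: JSON.stringify({ model: 'phi3:latest', messages, stream: false })
	});
	const data = await response.json();
	return data.message as ChatMessage; // the assistant's reply
}
```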

@binarynoise
Contributor Author

Your patch works :-)

@fmaclen
Owner

fmaclen commented Jul 19, 2024

Your patch works :-)

👌

I haven't had time yet, but I need to investigate whether sending the system prompt with every message is a good idea.

@binarynoise
Contributor Author

binarynoise commented Jul 19, 2024

I just hit the context window: the model forgot who it is and then the conversation fell apart. And that was even with the system prompt on every message. 🤔

@fmaclen
Owner

fmaclen commented Jul 19, 2024

@binarynoise I just pushed another patch so the system prompt is only sent once. Give it another shot.

@binarynoise
Contributor Author

binarynoise commented Jul 19, 2024

I don't know whether it's related, but after some messages I get a lot of these errors (in both of the variants below) and the context gets lost:

Sorry, something went wrong.

SyntaxError: JSON.parse: unexpected end of data at line 1 column 32762 of the JSON data
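
If it helps narrow it down: that error looks like what happens when a streamed response chunk is passed to JSON.parse before a full line has arrived (Ollama streams one JSON object per line, and a line can be split across network reads). A line-buffered reader like the sketch below avoids that; the names are made up and this may not be where the actual bug is.

```typescript
// Sketch: parse Ollama's streamed newline-delimited JSON safely by only parsing
// complete lines and keeping any trailing partial line for the next read.
async function readChatStream(response: Response, onChunk: (chunk: unknown) => void) {
	const reader = response.body!.getReader();
	const decoder = new TextDecoder();
	let buffer = '';

	while (true) {
		const { done, value } = await reader.read();
		if (done) break;
		buffer += decoder.decode(value, { stream: true });

		const lines = buffer.split('\n');
		buffer = lines.pop() ?? ''; // keep the incomplete tail for the next chunk
		for (const line of lines) {
			if (line.trim()) onChunk(JSON.parse(line));
		}
	}
}
```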

@fmaclen
Owner

fmaclen commented Jul 19, 2024

That might be a different bug, but I'll look into it.

fmaclen added the triage (Need to investigate further) label on Jul 20, 2024
@fmaclen
Owner

fmaclen commented Jul 21, 2024

🎉 This issue has been resolved in version 0.5.0 🎉

The release is available on the GitHub release page.

Your semantic-release bot 📦🚀
