
[CODE IMPROVEMENT] Add system prompt to the chat interface #548

Open
pascal-pfeiffer opened this issue Dec 20, 2023 · 2 comments
Labels
area/core Core code related issue

Comments

@pascal-pfeiffer
Collaborator

🔧 Proposed code refactoring

Add system prompt to the chat interface

Motivation

Removes the need to test system prompts outside of LLM Studio
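
A rough sketch of how the chat interface could prepend an optional system prompt when assembling the model input. The function name, token strings, and overall structure below are hypothetical placeholders, not LLM Studio's actual prompt-building code; the real special tokens would come from the experiment's prompt configuration.

```python
# Hypothetical illustration only -- not LLM Studio's actual API.
# Prepend an optional system prompt to the chat history before the
# conversation is turned into the model's input string.

def build_chat_prompt(
    history: list[tuple[str, str]],
    user_message: str,
    system_prompt: str = "",
    system_token: str = "<|system|>",
    prompt_token: str = "<|prompt|>",
    answer_token: str = "<|answer|>",
) -> str:
    """Assemble the prompt string fed to the model.

    `history` holds (question, answer) pairs from earlier turns; the token
    strings are placeholders for whatever the experiment's prompt settings
    define.
    """
    parts = []
    if system_prompt:
        # Insert the system prompt once, before the first turn.
        parts.append(f"{system_token}{system_prompt}")
    for question, answer in history:
        parts.append(f"{prompt_token}{question}{answer_token}{answer}")
    # Open the current turn and leave the answer for the model to generate.
    parts.append(f"{prompt_token}{user_message}{answer_token}")
    return "".join(parts)
```

With a system prompt input box in the chat UI, the entered text would simply be passed as `system_prompt`, so different system prompts can be tried without leaving LLM Studio.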

pascal-pfeiffer added the area/core (Core code related issue) label Dec 20, 2023
@FavioVazquez

It could be interesting to add a system prompt before fine-tuning, directly from an input box or something like that.

@maxjeblick
Contributor

maxjeblick commented Dec 20, 2023

> It could be interesting to add a system prompt before fine-tuning, directly from an input box or something like that.

This is probably not too helpful, as all fine-tuning samples would then get the exact same system prompt (i.e., there is no information to be learned from the system prompt).
