Chat ML settings #11

Open

kajuberdut opened this issue Jan 28, 2024 · 3 comments

Comments

kajuberdut commented Jan 28, 2024

I read through the code for how User String and Bot String are used, to make sure I'm setting them in a sane way for ChatML.

The ChatML format looks like:

<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
{response}<|im_end|>

A ChatML model will emit <|im_end|> at the end of generation, so I don't think that needs to be explicitly in the User String. However, a user won't want to manually add <|im_end|> when using quick instructions, so I added it to the start of the Bot String.

Does this seem like the correct "Custom" config for ChatML?
[screenshot: proposed "Custom" configuration]
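
For clarity, here's a rough Python sketch of how I'm assuming the two strings get stitched into the final prompt. The string values and the build_prompt helper are just my illustration, not the extension's actual code:

USER_STRING = "<|im_start|>user\n"                    # assumed value from my config
BOT_STRING = "<|im_end|>\n<|im_start|>assistant\n"    # assumed value from my config

def build_prompt(system_message: str, user_message: str) -> str:
    # Assemble a single-turn ChatML prompt from the custom strings.
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"{USER_STRING}{user_message}{BOT_STRING}"
    )

print(build_prompt("You are a helpful assistant.", "Hello!"))
# <|im_start|>system
# You are a helpful assistant.<|im_end|>
# <|im_start|>user
# Hello!<|im_end|>
# <|im_start|>assistant
# (generation then continues until the model emits <|im_end|>)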

@bridgesense

There's actually a bug here. The User and Bot strings are switched around.

[screenshot: user and bot switched]

This is an amazing extension!


bridgesense commented Feb 3, 2024

It could just be that way in Notebook B. It's a bit confusing, but not a real issue. I just started using this today. I suppose I should clarify: when using the custom strings, the response calls the user whatever is present in the Bot String.

@bridgesense

Never mind. It finally dawned on me what's happening here. This notebook is really geared for dialogue. You can simply tell it who's who during the custom instructions ...
