
Feature request: ChatML support #203

Closed
GameOverFlowChart opened this issue Dec 9, 2023 · 9 comments
Labels
enhancement New feature or request

Comments

@GameOverFlowChart

As #193 (comment) says, it's already planned, so this issue is just to follow the development. My tests with the AI show that this is really needed, and I expect big improvements from it. Hopefully ChatML will become the standard and no more new formats will be needed.

@danemadsen
Member

danemadsen commented Dec 10, 2023

ChatML is a newer feature of llama.cpp. Currently there are issues running the most up-to-date versions of llama.cpp, so we will look at ChatML support once we can get llama.cpp stable again.

danemadsen added a commit to danemadsen/maid that referenced this issue Dec 10, 2023
@danemadsen danemadsen added the enhancement label Dec 10, 2023
@GameOverFlowChart
Author

> ChatML is a newer feature of llama.cpp. Currently there are issues running the most up-to-date versions of llama.cpp, so we will look at ChatML support once we can get llama.cpp stable again.

Does only Maid have problems with the new llama.cpp, or are the people at llama.cpp aware of the problem on their end?

@danemadsen
Member

> > ChatML is a newer feature of llama.cpp. Currently there are issues running the most up-to-date versions of llama.cpp, so we will look at ChatML support once we can get llama.cpp stable again.
>
> Does only Maid have problems with the new llama.cpp, or are the people at llama.cpp aware of the problem on their end?

Pretty sure it's just a Maid thing, but also pretty sure I've fixed it with the latest Actions build.

@Ar57m
Contributor

Ar57m commented Dec 11, 2023

The stop generation button disappeared on Android.

@GameOverFlowChart
Author

GameOverFlowChart commented Dec 11, 2023

> The stop generation button disappeared on Android.

Does this problem have anything to do with this issue? Also, there was a stop generation button? I never saw that one.

@Ar57m
Contributor

Ar57m commented Dec 11, 2023

> > The stop generation button disappeared on Android.
>
> Does this problem have anything to do with this issue? Also, there was a stop generation button? I never saw that one.

No, I'm sorry to mention that here; I didn't want to open a new issue 😉. There was a red button some time ago, and now I see it has disappeared.

@danemadsen
Member

Not really sure what needs to be done to add ChatML support? You can already set an input prefix and input suffix through the user and response aliases.
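For context, ChatML wraps every turn in `<|im_start|>` / `<|im_end|>` markers, so in principle the existing prefix/suffix fields could be set to produce that layout. Below is a minimal sketch of what such a mapping would look like; the setting names are illustrative stand-ins, not Maid's actual configuration keys.

```python
# Hypothetical prefix/suffix values that would emulate ChatML.
# These variable names are illustrative only, not Maid's real options.
input_prefix = "<|im_start|>user\n"
input_suffix = "<|im_end|>\n<|im_start|>assistant\n"
system_prompt = "You are a helpful assistant."

def build_chatml_prompt(user_message: str) -> str:
    """Assemble a single-turn ChatML prompt from prefix/suffix strings."""
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"{input_prefix}{user_message}{input_suffix}"
    )

print(build_chatml_prompt("Hello!"))
```

If the aliases and prefix/suffix fields can hold those exact strings (including the newlines), the assembled prompt would match the ChatML layout that ChatML-tuned models expect.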

@danemadsen danemadsen closed this as not planned Dec 14, 2023
@danemadsen
Member

I added it in as a special prefix anyway.

@GameOverFlowChart
Author

> Not really sure what needs to be done to add ChatML support? You can already set an input prefix and input suffix through the user and response aliases.

Hmm, I'm a bit confused, because the build which I installed already has that as an option?

When I opened this issue I had problems with adding <|im_start|>system in the system prompt and <|im_start|>user / assistant, because somehow that made the app (or the model, I don't remember) crash.

It's hard to tell if the prompt is correct, because there is no way to see the raw input and output.

<|im_start|> is also a special token, if I'm not wrong, but with the right tokenizer it should work?
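Right, <|im_start|> and <|im_end|> are special tokens: they only map to single token IDs when the tokenizer is asked to parse special tokens; otherwise they get split into ordinary text pieces. A rough way to check this outside Maid is sketched below using llama-cpp-python; the `special` keyword on `tokenize()` is assumed from that binding's documentation and may vary between versions, and `model.gguf` is a placeholder path.

```python
# Sketch: check whether <|im_start|> tokenizes as one special token or
# as several plain-text tokens. Assumes llama-cpp-python is installed
# and a ChatML-tuned GGUF model is available at the placeholder path.
from llama_cpp import Llama

llm = Llama(model_path="model.gguf", verbose=False)

as_text    = llm.tokenize(b"<|im_start|>", add_bos=False)                # parsed as plain text
as_special = llm.tokenize(b"<|im_start|>", add_bos=False, special=True)  # special tokens enabled

# With a tokenizer that knows the ChatML tokens, `as_special` should be a
# single token ID, while `as_text` comes back as multiple text tokens.
print(as_text, as_special)
```

If the model's tokenizer doesn't define those tokens, the markers are just ordinary text and the model may behave unpredictably, which would fit the crashes described above.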
