
Add Command R chat template #6650

Merged — 4 commits merged into ggerganov:master from command-r-chat-template on Apr 14, 2024

Conversation

jc19chaoj
Contributor

Added the chat templates for c4ai-command-r-v01 and c4ai-command-r-plus in llama_chat_apply_template_internal, so that the --chat-template command-r option can be used when running Command R models with llama.cpp's OpenAI-compatible API server.

The command-r chat template follows this format: <BOS_TOKEN><|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{system}<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|USER_TOKEN|>{prompt}<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>{response}
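To make the format concrete, here is a rough Python sketch of how a message list could be rendered into the template above. This is purely illustrative and not the llama.cpp implementation (which is C++ code inside `llama_chat_apply_template_internal`); the function name and message structure are assumed, and the `.strip()` reflects the whitespace trimming mentioned in the commit list below.

```python
# Hypothetical sketch of the Command R chat format described above.
# NOT the actual llama.cpp code, which lives in C++
# (llama_chat_apply_template_internal).

def render_command_r(messages):
    """messages: OpenAI-style list of {"role": ..., "content": ...} dicts."""
    role_tokens = {
        "system": "<|SYSTEM_TOKEN|>",
        "user": "<|USER_TOKEN|>",
        "assistant": "<|CHATBOT_TOKEN|>",
    }
    out = "<BOS_TOKEN>"
    for m in messages:
        # Each turn: start token, role token, trimmed content, end token.
        out += ("<|START_OF_TURN_TOKEN|>" + role_tokens[m["role"]]
                + m["content"].strip() + "<|END_OF_TURN_TOKEN|>")
    # Open a chatbot turn so the model generates the response.
    out += "<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>"
    return out
```

For example, a system prompt plus one user message would render as the `{system}` and `{prompt}` slots shown above, ending with an open `<|CHATBOT_TOKEN|>` turn for the model to fill in.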

@ngxson
Collaborator

ngxson commented Apr 13, 2024

We're missing a test case for this template. Please follow the full procedure from this article to add a new template.

@satyaloka93

Thanks for submitting this, eagerly awaiting its approval!

@jc19chaoj
Contributor Author

> We're missing a test case for this template. Please follow the full procedure from this article to add a new template.

The chat template test for Command R has now been added.

@satyaloka93

Works for me! I had to drop my context to 2k on my RTX 4090 to get it working with the IQ4_XS quant.

@ngxson ngxson left a comment

@ngxson ngxson merged commit 04fbc5f into ggerganov:master Apr 14, 2024
56 of 59 checks passed
@jc19chaoj jc19chaoj deleted the command-r-chat-template branch April 15, 2024 01:37
tybalex pushed a commit to tybalex/function.cpp that referenced this pull request Apr 17, 2024
* Add chat template for command-r model series

* Fix indentation

* Add chat template test for command-r models and update the implementation to trim whitespaces

* Remove debug print
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

3 participants