
Add support for baichuan #365

Merged
merged 10 commits into from
Jul 17, 2023
Conversation

codethazine
Contributor

Closes #303

Should be merged after #364, as the model requires its remote tokenizer to run correctly.

@codethazine codethazine marked this pull request as draft July 5, 2023 09:43
@codethazine codethazine changed the title Add support for baichuan [WIP] Add support for baichuan Jul 5, 2023
@codethazine codethazine changed the title [WIP] Add support for baichuan Add support for baichuan Jul 5, 2023
@codethazine codethazine marked this pull request as ready for review July 5, 2023 17:10
@gesanqiu
Contributor

gesanqiu commented Jul 6, 2023

Thanks, but I think we still need to add a conversation template in FastChat, since the /v1/chat/completions API requires it. Alternatively, could you share a prompt template for /v1/completions? I can't wait to try this model.

@codethazine codethazine changed the title Add support for baichuan [WIP] Add support for baichuan Jul 6, 2023
@codethazine
Contributor Author

codethazine commented Jul 7, 2023

Here is a prompt template I came up with that works well for the completions endpoint:

curl http://localhost:8000/v1/completions \
    -d '{
        "model": "baichuan-inc/baichuan-7b",
        "prompt": "A chat between a curious user and an artificial intelligence assistant. \n### USER: Hello!\n### ASSISTANT:",
        "stop": "\n",
        "max_tokens": 42
    }'
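
The curl request above can also be built programmatically. A minimal sketch in Python (the server URL, model name, and prompt format are taken from the curl example above; the `build_prompt` helper is illustrative, not part of vLLM):

```python
import json

# System preamble and turn markers, copied from the prompt template above.
SYSTEM = ("A chat between a curious user and an artificial intelligence "
          "assistant. ")

def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt for the /v1/completions endpoint."""
    return f"{SYSTEM}\n### USER: {user_message}\n### ASSISTANT:"

# Same payload as the curl example; POST this as JSON to
# http://localhost:8000/v1/completions on a running vLLM server.
payload = {
    "model": "baichuan-inc/baichuan-7b",
    "prompt": build_prompt("Hello!"),
    "stop": "\n",        # stop at the end of the assistant's turn
    "max_tokens": 42,
}

print(json.dumps(payload, indent=2))
```

With `"stop": "\n"`, generation ends at the first newline, keeping the reply to a single assistant turn.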

@codethazine codethazine changed the title [WIP] Add support for baichuan Add support for baichuan Jul 7, 2023
@lucasjinreal

Any updates on this?

I see there are several PRs trying to add baichuan support now; please consider merging one, otherwise none of them can be properly supported.

Collaborator

@zhuohan123 zhuohan123 left a comment


Thank you for your excellent contribution! This PR looks good to me. Sorry that we have been busy over the past several weeks and didn't get a chance to review it sooner. I just tested this out and it works great!

@zhuohan123 zhuohan123 merged commit 20b0d88 into vllm-project:main Jul 17, 2023
2 checks passed
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
sjchoi1 pushed a commit to casys-kaist-internal/vllm that referenced this pull request May 7, 2024