
Support using the chat template in the tokenizer and the generation config in the model for the vLLM and OpenAI endpoints #3141


Triggered via pull request May 21, 2024 04:03
Status Success
Total duration 37s
Artifacts

python-package.yml

on: pull_request
Matrix: build

Annotations

1 warning
build (3.10)
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/setup-python@v4. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
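
The warning above can be resolved by bumping the two named actions to releases that run on Node.js 20 (actions/checkout@v4 and actions/setup-python@v5). A minimal sketch of the relevant part of python-package.yml, assuming a standard matrix build job; the job name and Python version are taken from this run, other steps are omitted:

```yaml
# Sketch: bump the deprecated Node.js 16 actions to Node.js 20 releases.
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10"]
    steps:
      - uses: actions/checkout@v4        # was actions/checkout@v3 (Node.js 16)
      - uses: actions/setup-python@v5    # was actions/setup-python@v4 (Node.js 16)
        with:
          python-version: ${{ matrix.python-version }}
```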