truncate parameter ignored for OpenAI chat_completions endpoint #1765

Open
calycekr opened this issue Mar 25, 2025 · 0 comments
Labels
bug Something isn't working

calycekr commented Mar 25, 2025

Bug description

The truncate parameter in the ChatUI configuration is not being applied when using the OpenAI chat_completions endpoint.

Root Cause

The chat_completions endpoint does not use the buildPrompt function, which is where the truncate parameter is handled. Since the truncation logic lives solely in buildPrompt, it is bypassed entirely for chat_completions requests, and the chat history is sent to vllm-openai or OpenAI without any truncation applied.

#1654
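To illustrate the missing behavior, a minimal sketch of what truncation before a chat_completions call could look like. This is not the actual chat-ui code: `Message` and `truncateMessages` are hypothetical names, and a character budget stands in for real token counting.

```typescript
// Hypothetical sketch: trim chat history to a budget before sending it
// to the chat_completions endpoint. Names and budget unit are illustrative.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

function truncateMessages(messages: Message[], maxChars: number): Message[] {
  // Always preserve a leading system message, if present.
  const system = messages[0]?.role === "system" ? [messages[0]] : [];
  const rest = messages.slice(system.length);

  // Walk backwards, keeping the most recent messages that fit the budget.
  let used = system.reduce((n, m) => n + m.content.length, 0);
  const kept: Message[] = [];
  for (let i = rest.length - 1; i >= 0; i--) {
    used += rest[i].content.length;
    if (used > maxChars) break;
    kept.unshift(rest[i]);
  }
  return [...system, ...kept];
}
```

In a real fix the budget would come from the model's truncate setting and be measured in tokens, but the shape of the logic (keep the system message, drop the oldest turns first) would be the same as what buildPrompt already does for non-OpenAI endpoints.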
