
[Question]: Configure OpenAI + vLLM (OpenAI endpoint) at the same time #1432

Answered by danny-avila
Tostino asked this question in Q&A

The only way to use both simultaneously will be a dedicated endpoint for the reverse proxy: #1344

It's not yet implemented; I will work on it soon.
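For context, a sketch of the current single-endpoint workaround this limitation refers to: LibreChat's `OPENAI_REVERSE_PROXY` variable can point the OpenAI endpoint at vLLM's OpenAI-compatible server, but doing so replaces the real OpenAI endpoint rather than running alongside it. The model name, port, and key below are illustrative placeholders, not values from this discussion.

```shell
# Start vLLM's OpenAI-compatible server (model name is an example placeholder)
python -m vllm.entrypoints.openai.api_server \
  --model mistralai/Mistral-7B-Instruct-v0.1 \
  --port 8000

# Then, in LibreChat's .env, route the OpenAI endpoint through vLLM.
# Note: this REPLACES api.openai.com for the OpenAI endpoint -- it does not
# add vLLM as a second endpoint, which is the limitation discussed above.
# OPENAI_REVERSE_PROXY=http://localhost:8000/v1
# OPENAI_API_KEY=placeholder-key-vllm-ignores-this
```

Once the dedicated endpoint lands, both should be configurable side by side instead of one overriding the other.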

Replies: 1 comment, 3 replies (between @danny-avila and @Tostino)
Answer selected by Tostino
Category
Q&A
Labels
question (Further information is requested)
2 participants
This discussion was converted from issue #1431 on December 25, 2023 17:05.