[Tracking] Support more OpenAI parameters for chat completion endpoint in REST API #683
Status: Closed (7 tasks done)
Labels: status: tracking (Tracking work in progress)
Overview
The OpenAI chat completion API supports a number of parameters that are not yet supported in the mlc-llm REST API, such as temperature, stop sequences, etc. This issue tracks adding support for these parameters so that they can be used by downstream dependents of the REST API, such as the Langchain integration.
Note that some parameters may require changes in the underlying chat module or `llm_chat`.
Action Items
- [x] temperature
- [x] top_p
- [x] n
- [x] stop
- [x] max_tokens
- [x] presence_penalty
- [x] frequency_penalty
See the OpenAI API Reference for details on each of these parameters.
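For reference, a minimal sketch of what a request body carrying these parameters might look like. This is an illustration only, not the actual mlc-llm schema: the class name, field defaults, and the `model` placeholder are assumptions; the defaults follow the OpenAI API Reference.

```python
from dataclasses import dataclass, field, asdict
from typing import List, Optional, Union

@dataclass
class ChatCompletionRequest:
    """Hypothetical request schema covering the tracked parameters."""
    messages: List[dict]
    model: str = "mlc-llm"                         # placeholder model name
    temperature: float = 1.0                       # sampling temperature, range [0, 2]
    top_p: float = 1.0                             # nucleus-sampling probability cutoff
    n: int = 1                                     # number of completions to return
    stop: Optional[Union[str, List[str]]] = None   # up to 4 stop sequence(s)
    max_tokens: Optional[int] = None               # cap on generated tokens
    presence_penalty: float = 0.0                  # range [-2, 2]
    frequency_penalty: float = 0.0                 # range [-2, 2]

# Build a request and serialize it to a JSON-ready dict,
# as it would be POSTed to a /v1/chat/completions-style endpoint.
req = ChatCompletionRequest(
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.7,
    stop=["\n\n"],
    max_tokens=64,
)
body = asdict(req)
```

A schema like this lets the REST layer validate and default each parameter before forwarding it to the chat module.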
Links to Related Issues and PRs