
[Tracking] Support more OpenAI parameters for chat completion endpoint in REST API #683

Closed
7 tasks done
sudeepag opened this issue Aug 7, 2023 · 1 comment
Labels
status: tracking Tracking work in progress

Comments

sudeepag (Collaborator) commented Aug 7, 2023

Overview

The OpenAI chat completion API supports a number of parameters that are not currently supported in the mlc-llm REST API, such as temperature, stop sequences, etc. This issue tracks adding support for these parameters so that they can be used by downstream dependencies of the REST API, such as the LangChain integration.

Note that some parameters may require changes in the underlying chat module or llm_chat.

Action Items

  • temperature
  • top_p
  • n
  • stop
  • max_tokens
  • presence_penalty
  • frequency_penalty

See the OpenAI API Reference for details on each of these parameters.
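Once supported, these parameters would be passed in the JSON body of a chat completion request, mirroring the OpenAI API. Below is a minimal sketch of such a request body; the host/port and the `/v1/chat/completions` path are assumptions for illustration (an OpenAI-compatible layout), not details confirmed in this issue, and the model name is a placeholder.

```python
import json

# Hypothetical request body for an OpenAI-compatible chat completion
# endpoint. Values shown are illustrative defaults, not recommendations.
payload = {
    "model": "local-model",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    # Parameters tracked by this issue:
    "temperature": 0.7,        # sampling temperature
    "top_p": 0.95,             # nucleus sampling cutoff
    "n": 1,                    # number of completions to return
    "stop": ["\n\n"],          # stop sequence(s)
    "max_tokens": 128,         # cap on generated tokens
    "presence_penalty": 0.0,   # penalize tokens already present in the text
    "frequency_penalty": 0.0,  # penalize tokens by their frequency so far

}

body = json.dumps(payload)
# e.g. requests.post("http://127.0.0.1:8000/v1/chat/completions", data=body)
print(sorted(payload.keys()))
```

The seven tracked parameters sit alongside the standard `model`/`messages` fields, so downstream clients (such as a LangChain integration) can forward them unchanged.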

Links to Related Issues and PRs

davidpissarra (Member) commented:
Concluded by #1107

Status: Done · 2 participants