
[OpenAI] Support logprobs and top_logprobs in ChatCompletion #333

Merged · 2 commits into mlc-ai:main · Mar 14, 2024

Conversation

@CharlieFRuan (Contributor) commented Mar 14, 2024:

Support `logprobs` and `top_logprobs` for the OpenAI API `ChatCompletion`. These fields are added to `GenerationConfig` as well. See the OpenAI API reference for the semantics of these fields.
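
For reference, a request exercising the new fields might look like the following (a hedged sketch; the surrounding call shape is illustrative, only the `logprobs` and `top_logprobs` fields come from this PR):

```typescript
const request = {
  messages: [{ role: "user", content: "Hello!" }],
  logprobs: true,   // return the log probability of each sampled token
  top_logprobs: 2,  // also return the 2 most likely alternatives per position
};
// Following OpenAI's schema, each choice then carries
// choice.logprobs.content: one entry per generated token with
// { token, logprob, top_logprobs: [...] }.
```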

Implementation-wise, since we need the actual probability distribution, we use `tvmjs.applySoftmaxWithTemperature()` followed by `tvmjs.sampleTopPFromProb()` instead of `tvmjs.sampleTopPFromLogits()`.
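
As a self-contained illustration of this path (plain TypeScript; not the actual tvmjs kernels or signatures): compute the full softmax so per-token log probabilities are available, read off the top-k entries for `top_logprobs`, then sample from the probabilities with top-p.

```typescript
// Illustrative sketch only; the real implementation uses tvmjs kernels.
function softmaxWithTemperature(logits: Float32Array, temperature: number): Float32Array {
  const scaled = logits.map((x) => x / temperature);
  const max = Math.max(...scaled);                    // subtract max for numerical stability
  const exps = scaled.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((x) => x / sum);
}

// Entries backing OpenAI-style `top_logprobs` for one sampling step.
function topLogprobs(probs: Float32Array, k: number) {
  return [...probs.keys()]
    .sort((a, b) => probs[b] - probs[a])
    .slice(0, k)
    .map((id) => ({ tokenId: id, logprob: Math.log(probs[id]) }));
}

// Nucleus (top-p) sampling directly from the probability distribution.
function sampleTopPFromProb(probs: Float32Array, topP: number): number {
  const order = [...probs.keys()].sort((a, b) => probs[b] - probs[a]);
  let cum = 0;
  const nucleus: number[] = [];
  for (const id of order) {
    nucleus.push(id);
    cum += probs[id];
    if (cum >= topP) break;
  }
  let r = Math.random() * cum;                        // renormalize over the nucleus
  for (const id of nucleus) {
    r -= probs[id];
    if (r <= 0) return id;
  }
  return nucleus[nucleus.length - 1];
}

// Toy 4-token vocabulary:
const probs = softmaxWithTemperature(new Float32Array([1.0, 3.0, 2.0, 0.5]), 1.0);
console.log(topLogprobs(probs, 2));          // two most likely tokens with their logprobs
console.log(sampleTopPFromProb(probs, 0.9)); // sampled token id
```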

Performance-wise, despite all sampling being done on the CPU, with `top_logprobs=2` we observe no degradation with Llama-2-7b-q4f32 on an M3 MacBook. Note that without `logprobs`, the exact same behavior (and hence performance) is maintained.

This PR also adds `CompletionUsage` for token counting.
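
The usage block follows OpenAI's `CompletionUsage` shape (sketched below; the exact WebLLM type names may differ slightly):

```typescript
interface CompletionUsage {
  prompt_tokens: number;      // tokens consumed by the input messages
  completion_tokens: number;  // tokens generated by the model
  total_tokens: number;       // prompt_tokens + completion_tokens
}
// e.g. reply.usage -> { prompt_tokens: 12, completion_tokens: 34, total_tokens: 46 }
```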

@CharlieFRuan CharlieFRuan marked this pull request as draft March 14, 2024 07:34
@CharlieFRuan (Contributor, Author) commented:
Depends on apache/tvm#16650 to expose `sampleTopPFromProb()`. However, we do not need to recompile the wasms, since nothing changes inside TVM itself (the method already exists; it was just not exposed in runtime.ts).

@CharlieFRuan (Contributor, Author) commented:
Below are the outputs of example/openai_api (non-streaming, then streaming):

non-streaming: (screenshot omitted)

streaming: (screenshot omitted)

@CharlieFRuan CharlieFRuan marked this pull request as ready for review March 14, 2024 16:15
@CharlieFRuan CharlieFRuan merged commit 6d665b9 into mlc-ai:main Mar 14, 2024
CharlieFRuan added a commit that referenced this pull request Mar 14, 2024
Changes in WebLLM:
- Stateful chat completion: #330
- OpenAI's `logit_bias`: #331
- OpenAI's `logprobs` and `top_logprobs`: #333

Changes in TVMjs:
- apache/tvm#16650
  - Fix param download issues (already reflected in 0.2.26, but at the time this PR was not merged yet)
  - Expose `sampleTopPFromProb` to support `logprobs` (new in 0.2.27)