This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

implement new sampling logic for llama.cpp #36

Closed
hlhr202 opened this issue May 1, 2023 · 0 comments · Fixed by #38
Assignees
Labels
enhancement New feature or request

Comments


hlhr202 commented May 1, 2023

llama.cpp introduced new sampling logic here:
ggerganov/llama.cpp#1126
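For context, the linked PR reworks llama.cpp's sampler into a set of composable steps (top-k, top-p, temperature, penalties, and so on) applied to the logits before drawing a token. Below is a minimal Python sketch of just the top-k / top-p / temperature path to illustrate the idea; the function and parameter names here are illustrative, not the actual llama.cpp API.

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax over raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample(logits, top_k=40, top_p=0.95, temperature=0.8, rng=random.random):
    # 1. Keep only the top_k highest-logit candidates.
    order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:top_k]
    # 2. Apply temperature, then convert to probabilities.
    probs = softmax([logits[i] / temperature for i in order])
    # 3. Nucleus (top-p) filtering: keep the smallest prefix whose mass >= top_p.
    kept, cum = [], 0.0
    for tok, p in zip(order, probs):
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    # 4. Renormalize and draw one token from the truncated distribution.
    total = sum(p for _, p in kept)
    r = rng() * total
    for tok, p in kept:
        r -= p
        if r <= 0:
            return tok
    return kept[-1][0]
```

With `rng` returning 0.0 the pipeline degenerates to greedy decoding (the highest-probability surviving token is returned), which is a convenient way to sanity-check the filtering steps.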

@hlhr202 hlhr202 self-assigned this May 1, 2023
@hlhr202 hlhr202 added the enhancement New feature or request label May 1, 2023
@hlhr202 hlhr202 linked a pull request May 2, 2023 that will close this issue