
feat: batch inference for nitro #41

Closed
tikikun opened this issue Oct 2, 2023 · 1 comment · Fixed by #101
Assignees: tikikun
Labels: P1: important (Important feature / fix), type: enhancement (New feature or request)
Milestone: Nitro v0.2

Comments

tikikun (Collaborator) commented Oct 2, 2023

Upstream from ggerganov/llama.cpp#3228.
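
For context, here is a minimal sketch of what batched decoding through llama.cpp's `llama_batch` API could look like: several prompts are packed into one batch, each under its own sequence id, and evaluated with a single `llama_decode` call. The helper name `decode_prompts_batched` is illustrative only and not part of Nitro or llama.cpp; the exact batch field layout follows the current upstream header and may differ from what #3228 originally shipped.

```cpp
#include "llama.h"

#include <vector>

// Decode several already-tokenized prompts in one llama_decode() call,
// giving each prompt its own sequence id so the KV cache keeps them separate.
static int decode_prompts_batched(llama_context * ctx,
                                  const std::vector<std::vector<llama_token>> & prompts) {
    size_t n_tokens_total = 0;
    for (const auto & p : prompts) {
        n_tokens_total += p.size();
    }

    llama_batch batch = llama_batch_init((int32_t) n_tokens_total, /*embd*/ 0,
                                         /*n_seq_max*/ (int32_t) prompts.size());

    for (size_t s = 0; s < prompts.size(); ++s) {
        for (size_t i = 0; i < prompts[s].size(); ++i) {
            const int32_t idx = batch.n_tokens;
            batch.token   [idx]    = prompts[s][i];
            batch.pos     [idx]    = (llama_pos) i;        // position within this sequence
            batch.n_seq_id[idx]    = 1;
            batch.seq_id  [idx][0] = (llama_seq_id) s;     // one sequence per prompt
            // request logits only for the last token of each prompt
            batch.logits  [idx]    = (i == prompts[s].size() - 1);
            batch.n_tokens++;
        }
    }

    const int ret = llama_decode(ctx, batch);   // one forward pass covers all prompts
    llama_batch_free(batch);
    return ret;
}
```

In a server setting this is what makes continuous batching worthwhile: requests arriving close together can share one forward pass instead of each paying for a separate decode.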

tikikun added the P0: critical (Mission critical) label on Oct 2, 2023
tikikun self-assigned this on Oct 2, 2023
0xSage added the P1: important (Important feature / fix) and type: enhancement (New feature or request) labels, and removed the P0: critical (Mission critical) label, on Oct 9, 2023
0xSage added this to the Nitro v0.2 milestone on Oct 9, 2023
tikikun (Collaborator, Author) commented Oct 26, 2023

Continuing work on this since the upstream is already somewhat stable.

tikikun linked a pull request (#101) on Nov 3, 2023 that will close this issue