LLaMa.cpp broken right now #24
Labels: closed | done (Fixed or otherwise implemented.)

Comments
Josh-XT commented: Can you try with the latest version?

Josh-XT added labels (Apr 22, 2023):
- type | report | bug (Confirmed bug in source code.)
- reply needed | waiting for response (Waiting for more info from the creator of the issue. If not responded to in a week, may be closed.)
- reply needed | please retest (Waiting for a retest from the creator of the issue. If not responded to in a week, may be closed.)
Had to do
eraviart added a commit to eraviart/Agent-LLM that referenced this issue (Apr 23, 2023).
Josh-XT added the closed | done label and removed the type | report | bug, reply needed | waiting for response, and reply needed | please retest labels (Apr 23, 2023).
Issue body: Using vicuna-13B

Error: Requested tokens exceed context window of 2000
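The error above occurs when the prompt tokens plus the requested completion tokens exceed the model's context window (2000 tokens for this configuration). A minimal sketch of one common mitigation, truncating the oldest prompt tokens to leave room for the completion, is shown below. The `fit_prompt` helper and the whitespace-based token count are hypothetical stand-ins, not the actual Agent-LLM or llama.cpp code; a real implementation would count tokens with the model's tokenizer.

```python
# Hypothetical sketch: keep prompt + requested completion within a fixed
# context window (2000 tokens, matching the error in this issue).
# str.split() is a stand-in for a real tokenizer's token count.

MAX_CONTEXT = 2000

def fit_prompt(prompt: str, max_new_tokens: int, max_context: int = MAX_CONTEXT) -> str:
    """Truncate the oldest prompt tokens so that
    prompt_tokens + max_new_tokens <= max_context."""
    tokens = prompt.split()
    budget = max_context - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens alone exceeds the context window")
    if len(tokens) > budget:
        tokens = tokens[-budget:]  # drop the oldest tokens, keep the recent tail
    return " ".join(tokens)

# A 3000-"token" prompt with room reserved for a 300-token completion:
long_prompt = " ".join(f"w{i}" for i in range(3000))
trimmed = fit_prompt(long_prompt, max_new_tokens=300)
print(len(trimmed.split()))  # 1700
```

Keeping the tail of the prompt (rather than the head) preserves the most recent conversation turns, which is usually what matters for agent loops.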