One question: does the scope of this potentially include making
We are planning to improve the UX of `llama-cli`; more details can be found in this issue. The plan is to migrate the code base of `llama-cli` into a `llama-server`-based client. This will effectively allow the CLI to inherit all of the features available on the server, including ones where the current `llama-cli` fails in certain cases.
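
To make the "`llama-server`-based client" idea concrete, here is a minimal sketch of what such a client does under the hood: it sends requests to the HTTP API that `llama-server` already exposes. The `/completion` endpoint, the `n_predict` parameter, and the `content` field of the response are part of the existing server API; the port, model, and prompt are assumptions made for the example.

```python
# Minimal sketch of a llama-server-based client.
# Assumes a server is already running locally, e.g.:  llama-server -m model.gguf
import json
import urllib.request

def complete(prompt: str, n_predict: int = 64) -> str:
    """Send a completion request to llama-server's /completion endpoint."""
    payload = json.dumps({
        "prompt": prompt,
        "n_predict": n_predict,
        "temperature": 0.0,   # greedy-ish sampling, just for a reproducible example
        "seed": 42,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://127.0.0.1:8080/completion",   # default llama-server port, assumed here
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming /completion response carries the generated text in "content".
        return json.loads(resp.read())["content"]

if __name__ == "__main__":
    print(complete("The capital of France is"))
```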
The current `llama-cli` will be moved to a new example called `llama-completion`, and the code will be kept simple to serve as a learning example. If you are already using `llama-cli` in a deterministic way in your pipeline, please consider switching to `llama-completion` if you encounter any problems.
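
For pipeline users, a hedged sketch of such a deterministic invocation: since the existing `llama-cli` code is planned to move to `llama-completion` largely as-is, the flags below are assumed to carry over from today's `llama-cli`; the binary and model paths are placeholders.

```python
# Sketch of driving llama-completion from a pipeline, assuming it keeps
# the current llama-cli flags (an assumption; check --help after the rename).
import subprocess

result = subprocess.run(
    [
        "./llama-completion",        # planned replacement for the current llama-cli
        "-m", "models/model.gguf",   # placeholder model path
        "-p", "Translate to French: cheese",
        "-n", "32",                  # number of tokens to generate
        "--temp", "0",               # greedy sampling for reproducible output
        "--seed", "42",
    ],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```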
The new `llama-cli` will have enhanced features (as mentioned above) and an improved user experience. This discussion has been created so that users can discuss issues and workarounds if needed.