
chat mode --no-stream option #248

Closed
vividfog opened this issue Sep 6, 2023 · 2 comments
Labels
enhancement New feature or request
vividfog commented Sep 6, 2023

I am writing a plugin, and my understanding is that this tells llm to always use --no-stream mode when these models are called:

class Chat(llm.Model):
    can_stream: bool = False

Is this all that's needed? The plugin should only run with stream=False, to manage upstream user expectations.

def execute(self, prompt, stream, response, conversation):

When debugging that method outside chat mode, stream correctly ends up being False, even without --no-stream.
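For context, the guard in my plugin looks roughly like this. This is a minimal standalone sketch: the Model base class here is a stub standing in for llm.Model, and the response/conversation arguments are simplified, so it runs without the llm package installed.

```python
# Standalone sketch of the can_stream guard; Model is a stub for llm.Model.

class Model:
    model_id = "base"


class Chat(Model):
    model_id = "mychat"       # hypothetical plugin model id
    can_stream: bool = False  # declare that this model cannot stream

    def execute(self, prompt, stream, response, conversation):
        # The host should pass stream=False when can_stream is False,
        # but in chat mode it currently passes stream=True anyway.
        if stream:
            raise RuntimeError("Sorry, this plugin doesn't support streaming yet.")
        # llm expects execute() to yield chunks of the response text.
        yield "Hello! How can I assist you today?"


model = Chat()

# Non-chat mode: stream is correctly False, so this works.
print("".join(model.execute("hi", stream=False, response=None, conversation=None)))

# Chat mode currently calls execute() with stream=True, triggering the error.
try:
    list(model.execute("hi", stream=True, response=None, conversation=None))
except RuntimeError as exc:
    print(exc)
```

Running this reproduces both behaviors from the transcript below: the normal reply with stream=False, and the "doesn't support streaming" error when the host forces stream=True.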

But it seems that stream=True is always sent to execute() when in chat mode, and using --no-stream via command line will disable the chat mode itself. Here are some test runs with the plugin, vs. OpenAI gpt4 alias. The behavior seems similar.

> llm "hi"
Hello! How can I assist you today?

> llm chat
Error: Sorry, this plugin doesn't support streaming yet.     # stream was set to True by the host

> llm --no-stream chat
Hello! How can I assist you today?     # chat mode not applied but there is a greeting

> llm chat -m gpt4
Chatting with gpt-4
Type 'exit' or 'quit' to exit

> llm --no-stream chat -m gpt4
Hello! How can I assist you today?     # chat mode not applied but there is a greeting

> llm chat -m gpt4 --no-stream
Usage: llm chat [OPTIONS]
Try 'llm chat --help' for help.
Error: No such option: --no-stream Did you mean --system?

> llm chat -o verbose     # the default model does have this option in non-chat mode
Usage: llm chat [OPTIONS]
Try 'llm chat --help' for help.
Error: No such option: -o

Honoring the plugin's own declared can_stream boolean would fix this for me. Allowing an explicit --no-stream while in chat mode would be nice too, but I don't personally have a strong use case for it, other than perhaps consistency.

Do you plan to support model options in chat mode? They are often needed with local models. The new mode is a very welcome addition, so thanks a lot for working on it.

vividfog commented Sep 6, 2023

#244 talks about chat options, that's good

@simonw simonw added the bug Something isn't working label Sep 10, 2023
@simonw simonw changed the title 0.10a0: chat mode and --no-stream mode don't work together / plugin can_stream not applied chat mode --no-stream option Sep 10, 2023
@simonw simonw added enhancement New feature or request and removed bug Something isn't working labels Sep 10, 2023
@simonw simonw closed this as completed in 5912bd4 Sep 10, 2023
simonw commented Sep 10, 2023

I tested llm chat --no-stream manually against both GPT-4 and Llama 2, it worked as intended in both cases.

simonw added a commit that referenced this issue Sep 10, 2023
@simonw simonw added this to the 0.10 milestone Sep 10, 2023
simonw added a commit that referenced this issue Sep 12, 2023