Mechanism for continuing an existing conversation #6

Closed · simonw opened this issue Apr 2, 2023 · 10 comments
Labels: enhancement (New feature or request)

Comments


simonw commented Apr 2, 2023

The ChatGPT endpoints only work for chatting if you manually send back your previous questions and responses:

https://til.simonwillison.net/gpt3/chatgpt-api

This tool could help with that, maybe through an llm chat command?

UPDATE: Or a -c/--continue option for continuing the most recent conversation.
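
As a rough sketch of what a -c/--continue option would have to do under the hood - accumulate the message history and send all of it back on each request. The endpoint URL and the role/content message shape come from the OpenAI API; everything else here (the variable names, the hard-coded model) is just for illustration, not the eventual implementation:

import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"
messages = []  # the accumulated conversation: {"role": ..., "content": ...}

def prompt(text):
    messages.append({"role": "user", "content": text})
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"model": "gpt-3.5-turbo", "messages": messages},
    )
    reply = response.json()["choices"][0]["message"]["content"]
    # Keep the assistant reply so the next call can send it back - that
    # is all "continuing" a conversation means for this API.
    messages.append({"role": "assistant", "content": reply})
    return reply

print(prompt("What do otters like to eat?"))
print(prompt("Where do they live?"))  # the second call sees the first exchange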

simonw added the enhancement (New feature or request) label Apr 2, 2023

simonw commented Apr 2, 2023

But how can it keep track of a specific conversation?

There could be a command to start a new chat; once started, future chat commands stay within that chat until you end it - or switch to another one.

Chats will make a lot more sense in the web interface, see:

Maybe there's a mode where a chat session is an interactive terminal process where you type in more text and hit enter to send a line at a time.
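
Something like this, very roughly - a hypothetical REPL loop on top of the prompt() helper sketched in the first comment, not actual code for this tool:

def chat_repl():
    print("Chatting - type 'exit' or hit Ctrl-D to quit")
    while True:
        try:
            line = input("> ")
        except EOFError:
            break
        if line.strip() in ("exit", "quit"):
            break
        print(prompt(line))  # each line is sent with the full history so far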


simonw commented Apr 2, 2023

The logging database schema should take chats into account as well.
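
One possible shape for that - purely illustrative, not the schema the tool actually ships: each logged prompt/response row gets a chat_id pointing at the first row of its conversation, so a later --continue can load every row with the same chat_id and replay it.

import sqlite3

db = sqlite3.connect("log.db")
db.execute("""
    create table if not exists log (
        id integer primary key,
        chat_id integer references log(id),
        model text,
        prompt text,
        response text,
        timestamp text
    )
""")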

simonw added a commit that referenced this issue Jun 14, 2023
Refs #6

* Add a chat_id to requests
* Fail early if the log.db is absent
* Automatically add the chat_id column.

Co-authored-by: Simon Willison <swillison@gmail.com>
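
The "automatically add the chat_id column" step could look something like this - a hypothetical migration against the log table sketched above, not the code from the commit:

# Check the existing columns and only ALTER TABLE if chat_id is missing,
# so the migration is safe to run on every startup.
columns = [row[1] for row in db.execute("pragma table_info(log)")]
if "chat_id" not in columns:
    db.execute("alter table log add column chat_id integer")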

simonw commented Jun 14, 2023

Demo of the freshly-merged #14 by @amjith:

$ llm 'What do otters like to eat?'
Otters are carnivores and they mainly eat fish, shellfish, crustaceans, and other aquatic animals. They have a varied diet, which includes clams, mussels, snails, crabs, crayfish, frogs, small mammals, and birds. They also enjoy eating insects and some vegetation occasionally.
$ llm -c 'where do they live?'
Usage: llm chatgpt [OPTIONS] [PROMPT]
Try 'llm chatgpt --help' for help.

Error: Invalid value for '-c' / '--continue': 'where do they live?' is not a valid integer.
$ llm 'where do they live?' -c
Otters are found in various parts of the world, inhabiting both freshwater and marine environments. There are 13 different species of otters, and their habitats vary depending on the species. 

River otters are commonly found in North America, from Alaska to Mexico, while sea otters live along the west coast of North America, from California to Alaska. Eurasian otters inhabit the rivers, lakes, and wetlands of Europe and Asia. 

Giant river otters, as their name suggests, are the largest of the otter species and can be found in the Amazon River basin of South America. 

Other species such as the smooth-coated otter, clawless otter and African clawless otter can be found in various parts of Asia and Africa, respectively.

It's a bit annoying that llm -c 'where do they live?' doesn't work - because -c takes an optional chat ID, it swallows the prompt as its value, so the option needs to come at the very end. I wonder if we can improve that.


simonw commented Jun 14, 2023

Idea: llm -c/--continue could always act as a flag. That way you can use -c anywhere in the command.

Then llm --chat 44 could be an alternative option which requires a valid chat ID argument.
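
Roughly this, in Click terms - the option names match the idea above, everything else is illustrative rather than the real CLI:

import click

@click.command()
@click.argument("prompt", required=False)
@click.option("-c", "--continue", "continue_", is_flag=True,
              help="Continue the most recent conversation")
@click.option("--chat", "chat_id", type=int,
              help="Continue the conversation with this chat ID")
def cli(prompt, continue_, chat_id):
    # Because -c is a plain flag it can appear anywhere:
    # llm -c 'where do they live?' and llm 'where do they live?' -c
    # now parse identically.
    click.echo(f"prompt={prompt!r} continue={continue_} chat_id={chat_id}")

if __name__ == "__main__":
    cli()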

simonw changed the title from "chat command, for chatting" to "Mechanism for continuing an existing conversation" Jun 14, 2023

simonw commented Jun 14, 2023

I think --continue should reuse the same model as the logged conversation, unless the user specifies -m directly.
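
i.e. something like this precedence rule (hypothetical helper, illustrative default):

def resolve_model(option_model, logged_model, default_model="gpt-3.5-turbo"):
    # -m on the command line wins; otherwise stick with the model recorded
    # for the conversation being continued; otherwise fall back to a default.
    return option_model or logged_model or default_model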


simonw commented Jun 14, 2023

I broke this - I forgot to set the default model, so now llm "prompt" complains that no model was specified.


simonw commented Jun 14, 2023

I'm having second thoughts about the --chat 123 option now - because maybe I want to use --chat as an option that starts an interactive chat UI instead?

simonw reopened this Jun 14, 2023

simonw commented Jun 14, 2023

Options for an interactive chat mode:

  • It's a new command - llm chat -m gpt-4 for example. This feels a bit odd since the current default command is actually llm chatgpt ... and llm chat feels confusing.
  • It's part of the default command: llm --chat -4 starts one running.

Maybe the llm chatgpt command is misnamed, especially since it can be used to work with GPT-4.


simonw commented Jun 14, 2023

I named the command llm chatgpt because I thought I'd have separate commands for bard, llama and so on, and because I thought the other OpenAI completion APIs (the non-chat ones, like GPT-3) might end up with a separate command.


simonw commented Jun 14, 2023

I'm closing this in favour of an issue to redesign the top-level set of commands.

simonw closed this as completed Jun 14, 2023
simonw added a commit that referenced this issue Jun 17, 2023
simonw added a commit that referenced this issue Jul 10, 2023
simonw added a commit that referenced this issue Jul 10, 2023
simonw added a commit that referenced this issue Jul 10, 2023