
Support openai chat #5

Closed
drorm opened this issue Mar 29, 2023 · 1 comment

Comments

@drorm
Owner

drorm commented Mar 29, 2023

Background
See https://platform.openai.com/chat for details about what chat means, but fundamentally, we're providing a way to send the following:

  # For example, with the openai Python client:
  import openai

  response = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=[
          {"role": "system", "content": "You are a helpful assistant."},
          {"role": "user", "content": "Who won the world series in 2020?"},
          {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
          {"role": "user", "content": "Where was it played?"}
      ]
  )

This is a similar experience to the one at https://chat.openai.com/chat.
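
As a rough sketch, assuming the openai Python client call above, the assistant's reply can then be read from the response:

  # Sketch only: the reply is the first choice's message content.
  reply = response["choices"][0]["message"]["content"]
  print(reply)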

Implementation details
There are two ways to start a chat:

  • CLI: gish -c. To keep the conversation going, you need to pass the flag each time, but it's simple to define something like
alias gchat='gish -c'

and then call gchat instead of gish when you want to be in a chat.

  • Interactive: chat. In interactive mode, your prompt changes from '> ' to 'chat >' to remind you that you're in chat mode. To exit chat mode, type 'exit'.

Limitations
Since GPT-3.5 is limited to 4K tokens, it is easy to hit that limit over the course of a conversation.
We currently have no way to prevent this, and probably don't handle the resulting error gracefully; that needs to be explored in a separate ticket.
https://chat.openai.com/chat seems to handle these limits gracefully, probably by summarizing the earlier conversation when it reaches the limit and then continuing.
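
One possible way to mitigate this (not implemented; the constants and helper names below are hypothetical, and the 4-characters-per-token estimate is only a crude stand-in for a real tokenizer) is to drop the oldest non-system messages until the history fits under a budget:

  # Rough sketch of history trimming to stay under the context limit.
  MAX_TOKENS = 4096          # GPT-3.5 context window
  RESERVED_FOR_REPLY = 1024  # leave room for the model's answer

  def estimate_tokens(messages):
      # Crude estimate: roughly 4 characters per token.
      return sum(len(m["content"]) // 4 for m in messages)

  def trim_history(messages):
      budget = MAX_TOKENS - RESERVED_FOR_REPLY
      trimmed = list(messages)
      # Keep the system message (index 0) and the newest exchanges;
      # drop the oldest user/assistant messages until we fit.
      while estimate_tokens(trimmed) > budget and len(trimmed) > 2:
          del trimmed[1]
      return trimmed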

We are sticking to the roles system, user, and assistant for this initial version.

@drorm
Owner Author

drorm commented Mar 30, 2023

Initially the plan was to provide an argument n to -c (or to "chat" in interactive mode), but on further thought it's probably not necessary: as long as you keep the chat going, each request includes all the previous messages in the current chat, and the extra syntax just gets awkward.
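
As a minimal sketch of why that works (the names below are illustrative, not gish's actual code): each turn just appends to a running message list and resends the whole list.

  import openai

  history = [{"role": "system", "content": "You are a helpful assistant."}]

  def chat_turn(user_input):
      history.append({"role": "user", "content": user_input})
      response = openai.ChatCompletion.create(
          model="gpt-3.5-turbo",
          messages=history,  # the entire conversation so far
      )
      reply = response["choices"][0]["message"]["content"]
      history.append({"role": "assistant", "content": reply})
      return reply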
