chatgpt output word limit #66
Comments
gpt-3.5-turbo has a hard limit of 4096 tokens, and there is currently no way to exceed it. We can only wait for new models to be released or for the restriction to be lifted.
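Since the cap itself cannot be raised, the usual workaround is to split a large input into chunks that each fit under the limit. A minimal sketch of that idea (the ~4-characters-per-token ratio is a rough assumption, not a real tokenizer; a library like tiktoken would count exactly):

```python
# Rough workaround for the 4096-token cap: split a large input into
# chunks that each fit within a token budget.  The 4-chars-per-token
# ratio is a crude heuristic, not an exact tokenization.
CHARS_PER_TOKEN = 4  # assumption; real token counts vary by text

def split_into_chunks(text: str, max_tokens: int = 3000) -> list[str]:
    """Split text into pieces of at most ~max_tokens tokens each,
    breaking on line boundaries where possible (a single line longer
    than the budget still becomes its own oversized chunk)."""
    budget = max_tokens * CHARS_PER_TOKEN
    chunks, current = [], ""
    for line in text.splitlines(keepends=True):
        if current and len(current) + len(line) > budget:
            chunks.append(current)
            current = ""
        current += line
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent as its own request (or its own turn of a conversation), leaving headroom in the budget for the model's reply.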
In Chat mode (with config.conversation_first set to true), could you add the ability to pipe in an input file, so that we can type 'continue' to make ChatGPT carry on from where it left off? The ChatGPT web UI (https://chat.openai.com/chat) can do this.
The only reason a stream ends before the output is finished is the max_tokens limit. It is the same with or without conversation mode. There is no ChatGPT-like 'continue'.
Yeah, I get that. Can you add the ability to pipe in an input file in Chat mode, so we can type 'continue' to let ChatGPT carry on from before (when the stream has ended but the output isn't finished)?
You can copy from
No, not fetching the output; I mean importing a large file.
Why not use command mode?
If you insist on using chat mode, you can use
The output is too long to be completed in one go; I have to type 'continue' manually in Chat mode.
I don't think aichat/gpt-3.5 will stop prematurely before using up the 4096 tokens. If a request/conversation produces no more output, it means either the response is complete or the tokens have run out. There is no 'continue' in aichat/gpt-3.5.
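For reference, the 'continue' behaviour on the ChatGPT web UI is just another turn of the conversation: when a reply is cut off by max_tokens (the API reports finish_reason == "length"), the client sends the partial reply back together with a "continue" message and stitches the pieces. A sketch of that loop, using a stub in place of a real API call (fake_chat and its two-round behaviour are illustrative only):

```python
# Sketch of the "continue" pattern: when the model stops because it hit
# max_tokens (finish_reason == "length"), send the partial answer back
# with a "continue" prompt and concatenate the pieces.
# `chat` is any callable taking a message list and returning
# (text, finish_reason); here a stub stands in for a real API call.

def complete_with_continue(chat, messages, max_rounds=5):
    full = ""
    for _ in range(max_rounds):
        text, finish_reason = chat(messages)
        full += text
        if finish_reason != "length":
            break  # "stop": the answer is genuinely finished
        # Feed the partial answer back and ask the model to go on.
        messages = messages + [
            {"role": "assistant", "content": text},
            {"role": "user", "content": "continue"},
        ]
    return full

# Stub model that needs two rounds to emit its full answer.
def fake_chat(messages, _state={"round": 0}):
    _state["round"] += 1
    if _state["round"] == 1:
        return "first half ", "length"
    return "second half", "stop"
```

Note this pattern only helps per-reply truncation; it cannot get around the overall context-window limit, since each round resends the growing conversation.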
When I pipe input/output, the output is too long to complete in one go. How can I get it to continue from the previous content? Or, in Chat mode (with config.conversation_first set to true), could you add the ability to pipe in an input file so that we can type 'continue' to let ChatGPT carry on from before?