token streaming to stdout #637
Comments
Same here, using a Mac M1 Max.
Same here, would love a fix. It's happening in an Ubuntu console. I turned off debug, but I'm not sure how to prevent this.
Same.
same
Same...
same
same |
It's LiteLLM printing that out. If you go to wherever it is installed on your system (`C:\Users\jim\miniconda3\Lib\site-packages\litellm\utils.py` on mine, using miniconda), you can comment out the line around 427, `print(f"stream result: {result}")`, to stop that. You will need to start a new session before it takes effect.
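If you'd rather not edit installed package files, a minimal sketch of another workaround is to capture stdout around the noisy call. Note that `noisy_library_call` below is a hypothetical stand-in, not part of litellm's API, and this approach swallows *all* prints from the wrapped code, not just the stray one:

```python
import contextlib
import io

def noisy_library_call():
    # Hypothetical stand-in for a library function that has a stray
    # debug print, like the one reported in this issue.
    print('stream result: {"choices": [...]}')  # the errant line
    return "Hello! How can I assist you today?"

# Redirect everything the call writes to stdout into a buffer so the
# debug noise never reaches the terminal; the return value is unaffected.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    result = noisy_library_call()

print(result)
```

Because this suppresses legitimate output too, it's only suitable around calls whose results you consume as return values rather than as printed text.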
Does it answer you with the question you asked included, so when you say "hi" the response is "hihello how can i help you"?
Not in my case. The suggested hack works, but the LiteLLM peeps should indeed fix this. Amazed this was even committed.
It worked. Thx a lot!
Has anyone raised this issue there (litellm)?
This is fixed: BerriAI/litellm#598. Let me know if any of y'all see this on an updated version. We're also working on adding testing to make sure errant print statements aren't output. Most of our print logic is behind print_verbose, which is useful for debugging errors.
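The `print_verbose` gating mentioned above can be sketched like this. This is an illustration of the general pattern, not litellm's actual source:

```python
VERBOSE = False  # flipped on only when debugging

def print_verbose(message):
    # Debug output is emitted only when explicitly enabled,
    # so normal runs stay silent.
    if VERBOSE:
        print(message)

print_verbose("stream result: ...")  # prints nothing by default
```

Routing all debug prints through one gated function means a single flag controls them, and an errant bare `print` stands out in review.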
As an FYI, here's how you force-upgrade litellm using pip to get the error to stop:

```shell
pip install --upgrade litellm
```

Cheers!
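To confirm the upgrade actually landed in the environment you run Open Interpreter from, you can check the installed version with the standard library (a quick sanity check, nothing litellm-specific):

```python
from importlib.metadata import version, PackageNotFoundError

try:
    # Reports the version installed in the current environment.
    print(version("litellm"))
except PackageNotFoundError:
    print("litellm is not installed in this environment")
```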
I'm going to close this one, as I believe it has been fixed. Please feel encouraged to reopen this issue if you experience the same behavior with the latest version.
Describe the bug
Versions after 0.16 stream verbose output to stdout, e.g.:
Hello! How can I assist you today?
Reproduce
Install the latest version.
Expected behavior
No streaming JSON printed to stdout.
Screenshots
No response
Open Interpreter version
Python version
3.11.6
Operating System name and version
Windows 11
Additional context
No response