
token streaming to stdout #637

Closed
veonua opened this issue Oct 13, 2023 · 15 comments
Labels
Bug Something isn't working

Comments


veonua commented Oct 13, 2023

Describe the bug

Versions after 0.1.6 print the raw streaming JSON to stdout. Example session:

hi

stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "role": "assistant",
        "content": ""
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": "Hello"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": "!"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " How"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " can"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " I"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " assist"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " you"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": " today"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {
        "content": "?"
      },
      "finish_reason": null
    }
  ]
}
stream result: {
  "id": "chatcmpl-897VdZUcQblQ4J5aHRIPfxQLk7Q0K",
  "object": "chat.completion.chunk",
  "created": 1697184509,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "delta": {},
      "finish_reason": "stop"
    }
  ]
}

Hello! How can I assist you today?

Reproduce

Install the latest version and send any message (e.g. "hi").

Expected behavior

The streaming JSON chunks should not be printed; only the final assembled response should appear.
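
For reference, a minimal sketch of how streamed chunks are normally consumed: print only each delta's text, never the raw chunk JSON. It assumes the OpenAI-style chunk schema shown in the log above; `stream` is a hypothetical iterable of such chunks.

```python
# Minimal sketch: print only the delta text from each streamed chunk,
# never the raw chunk JSON. `stream` is a hypothetical iterable of
# dicts shaped like the "stream result" objects logged above.
def print_stream(stream):
    for chunk in stream:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        if content:
            # Flush so tokens appear as they arrive.
            print(content, end="", flush=True)
    print()  # final newline once finish_reason is "stop"
```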

Screenshots

No response

Open Interpreter version

0.1.7

Python version

3.11.6

Operating System name and version

Windows 11

Additional context

No response

@veonua veonua added the Bug Something isn't working label Oct 13, 2023

owgit commented Oct 13, 2023

Same here, using a Mac M1 Max, even with "%debug false".

@couturelp

Same here, would love a fix. It's happening in the Ubuntu console. I turned off debug and am not sure how to prevent this.


grexzen commented Oct 13, 2023

Same.

@brisklad

same


xmi1an commented Oct 14, 2023

Same...


wxtt522 commented Oct 14, 2023

same


QIanGua commented Oct 14, 2023

same

@jakenuts

It's LiteLLM printing that out. If you go to wherever it is installed on your system ("C:\Users\jim\miniconda3\Lib\site-packages\litellm\utils.py" on mine, using miniconda), you can comment out the line around 427, print(f"stream result: {result}"), to stop it. You will need to start a new session before the change takes effect.
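
If the install path differs on your machine, one way to locate the file is from Python itself (a standard-library sketch, nothing LiteLLM-specific):

```python
# Print the path of the installed litellm/utils.py so you can edit it.
import os
import litellm

print(os.path.join(os.path.dirname(litellm.__file__), "utils.py"))
```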

@AdelElo13

> It's LiteLLM printing that out. If you go to wherever it is installed on your system, you can comment out the line around 427 […]

Does it also echo your prompt back in the answer, so that when you say "hi" the response is "hiHello! How can I assist you today?"?

@Panoplos

> Does it also echo your prompt back in the answer […]?

Not in my case. The suggested hack works, but the LiteLLM peeps should indeed fix this. Amazed this was even committed.


brisklad commented Oct 14, 2023 via email


QIanGua commented Oct 14, 2023

> Not in my case. The suggested hack works, but the LiteLLM peeps should indeed fix this.

Has anyone raised this issue over at LiteLLM?

Contributor

krrishdholakia commented Oct 15, 2023

This is fixed: BerriAI/litellm#598

Let me know if any of y'all still see this on an updated version.

We're also working on adding tests to make sure errant print statements aren't emitted. Most of our print logic is behind print_verbose, which is useful for debugging errors.
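
For illustration, a verbosity gate along those lines might look like the sketch below; the names are illustrative, not LiteLLM's actual internals.

```python
# Sketch of a verbosity-gated print helper in the spirit of print_verbose.
# `set_verbose` is an illustrative module-level flag, not LiteLLM's API.
set_verbose = False

def print_verbose(message: str) -> None:
    # Debug output is emitted only when verbosity is explicitly enabled,
    # so normal runs never write internal state to stdout.
    if set_verbose:
        print(message)
```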

@couturelp

As an FYI, here's how to force-upgrade litellm with pip so the output stops:

pip install --upgrade litellm

Cheers!
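
To confirm the upgrade took effect, you can check the installed version from Python (a sketch using the standard library; importlib.metadata requires Python 3.8+):

```python
# Print the installed litellm version to confirm the upgrade.
from importlib.metadata import version

print(version("litellm"))
```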

@ericrallen
Collaborator

I'm going to close this one as I believe it was fixed in 0.1.10.

Please feel encouraged to reopen this issue if you experience the same behavior with the latest version.
