In streaming mode, CLI sends each chunk back to the model as part of the conversation #379

@williammartin

Description

Describe the bug

With streaming mode on (default), each streamed chunk is being sent back to the LLM as a separate assistant message entry.
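A minimal sketch of the likely failure mode, assuming an OpenAI-style chat message history; the identifiers (`Message`, `messages`, `streamReply`) are illustrative and not taken from the CLI's actual source:

```typescript
// Hypothetical reconstruction of the bug; names are illustrative only.
type Message = { role: "user" | "assistant"; content: string };

const messages: Message[] = [];

async function streamReply(stream: AsyncIterable<string>): Promise<void> {
  // Buggy behavior (as reported): each streamed chunk is pushed as its own
  // assistant entry, so the next request replays the reply to the model as
  // many fragmented assistant messages.
  //
  //   for await (const chunk of stream) {
  //     messages.push({ role: "assistant", content: chunk });
  //   }

  // Expected behavior: render chunks as they arrive, but accumulate them
  // and append a single assistant message once the stream completes.
  let full = "";
  for await (const chunk of stream) {
    process.stdout.write(chunk); // live terminal output
    full += chunk;
  }
  messages.push({ role: "assistant", content: full });
}
```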

Affected version

0.0.348

Steps to reproduce the behavior

No response

Expected behavior

No response

Additional context

No response
