
unable to stream #258

Closed
hyusetiawan opened this issue Sep 21, 2023 · 2 comments · Fixed by #288

@hyusetiawan

I brought this up in another thread: the callbacks seem to only provide a token count: #46 (comment)

I want to surface this to a wider audience because the intent of the callback was to enable streaming, but it doesn't seem to satisfy that requirement, unless the intent is to stream the token count. So how do you actually stream the response?

@hyusetiawan hyusetiawan added the bug Something isn't working label Sep 21, 2023
@collindutter
Member

Hey @hyusetiawan!

Here is an example that uses a few different types of events to tap into the messages as Griptape goes through a Pipeline's Tasks.

import logging
from griptape.events import (
    StartTaskEvent,
    StartSubtaskEvent,
    FinishTaskEvent,
    FinishSubtaskEvent,
)
from griptape.tools import Calculator
from griptape.tasks import PromptTask, ToolkitTask
from griptape.structures import Pipeline


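# Register listeners per event type; each fires as the Pipeline reaches
# the corresponding point in a Task or Subtask.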
pipeline = Pipeline(
    event_listeners={
        StartSubtaskEvent: [
            lambda e: print(f"SUBTASK THOUGHT: {e.subtask.thought}"),
        ],
        FinishSubtaskEvent: [
            lambda e: print(f"SUBTASK OBSERVATION: {e.subtask.output.value}"),
        ],
        StartTaskEvent: [
            lambda e: print(f"TASK INPUT: {e.task.input.value}"),
        ],
        FinishTaskEvent: [
            lambda e: print(f"TASK OUTPUT: {e.task.output.value}"),
        ],
    },
    logger_level=logging.ERROR,  # disable regular griptape logging
)

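# Queue two tasks: a plain prompt, and a tool-using task that produces subtasks.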
pipeline.add_tasks(
    PromptTask("Tell me about large language models"),
    ToolkitTask("What is 10 ** 7", tools=[Calculator()]),
)

pipeline.run()

Though I still don't think this addresses your request for true streaming. To support that, I propose we add a new event, called something like CompletionStreamChunk, that would contain the chunks as they are streamed from the LLM provider.
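For illustration, here's a rough sketch of what consuming such an event could look like. Everything here is hypothetical until the feature lands; in particular, the CompletionStreamChunk name and its token attribute are placeholders:

# Hypothetical sketch only: CompletionStreamChunk does not exist yet, and
# its `token` attribute is a placeholder for whatever field ends up holding
# the chunk. Once added, it would presumably be imported alongside the other
# events, e.g. `from griptape.events import CompletionStreamChunk`.
from griptape.structures import Pipeline
from griptape.tasks import PromptTask

pipeline = Pipeline(
    event_listeners={
        CompletionStreamChunk: [
            # Print each chunk as it arrives, without a trailing newline,
            # so the response streams to the terminal.
            lambda e: print(e.token, end="", flush=True),
        ],
    },
)

pipeline.add_tasks(PromptTask("Tell me about large language models"))
pipeline.run()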

@vasinov
Member

vasinov commented Sep 21, 2023

> I propose that we add a new event called something like CompletionStreamChunk that will contain the chunks as they are streamed from the LLM provider.

I think that makes sense. The tricky part is that not all model providers support streaming, so it will have to be done at the prompt driver level.
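Roughly, I'd picture each prompt driver gating it behind a flag, something like the sketch below. All names here are illustrative, not a real driver API:

# Illustrative sketch only -- the `stream` flag, `try_stream`, `try_run`,
# and `publish_event` are stand-ins for whatever the real driver API becomes.
class ExamplePromptDriver:
    def __init__(self, stream: bool = False):
        # Drivers for providers without streaming support would simply
        # ignore or reject stream=True.
        self.stream = stream

    def run(self, prompt: str) -> str:
        if self.stream:
            chunks = []
            # Provider-specific generator yielding completion chunks.
            for chunk in self.try_stream(prompt):
                # Emit each chunk, e.g. as a CompletionStreamChunk event.
                self.publish_event(chunk)
                chunks.append(chunk)
            return "".join(chunks)
        else:
            # Fall back to a single blocking completion.
            return self.try_run(prompt)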

@vasinov vasinov added enhancement New feature or request and removed bug Something isn't working labels Sep 21, 2023
@collindutter collindutter self-assigned this Sep 27, 2023
@collindutter collindutter linked a pull request Sep 27, 2023 that will close this issue