
[Feat Request] Add AIStreamCallbacksAndOptions for simple text stream response. #1964

Closed
arnab710 opened this issue Jun 15, 2024 · 1 comment
Labels
ai/ui enhancement New feature or request wontfix This will not be worked on

Comments

@arnab710

Feature Description

For simple text stream responses as well (via toTextStreamResponse() or other methods), we should be able to use AIStreamCallbacksAndOptions (onStart, onToken, onFinish, etc.).

Use Case

The raw stream data protocol changed in version 3.1. After I upgraded to 3.1, StreamingTextResponse stopped streaming plain text responses; the new format looks like 0:"Je" 0:" suis" 0:"des".
The toAIStream() method, however, does support AIStreamCallbacksAndOptions:

```ts
const result = await streamText({
  model: openAi(model, {
    user: user.uid,
  }),
  messages,
});

const stream = result.toAIStream({
  async onStart() {
    console.debug('start streaming..');
    do_something();
  },
  onToken(token) {
    console.debug('Token received', token);
  },
  onCompletion: async (completion) => {
    console.debug('onCompletion called', completion);
  },
  async onFinal(completion) {
    console.debug('ending stream..');
    do_something();
  },
});

return new StreamingTextResponse(stream);
```

So, for simple text responses, I found the toTextStreamResponse method:

```ts
const result = await streamText({
  model: openAi(model, {
    user: user.uid,
  }),
  messages,
});

return result.toTextStreamResponse();
```

But as far as I know, it doesn't support AIStreamCallbacksAndOptions, which I need in my case. Specifically, I would like to know exactly when stream processing starts (i.e., the onStart use case).

(Note: this is for a Chrome extension, so useChat or useCompletion can't be used. I implemented an HTTP streaming response handler for streaming in the extension.)
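One possible workaround sketch, not SDK API: `withCallbacks` and its `TextStreamCallbacks` type below are hypothetical helpers, and this assumes the `streamText` result exposes a plain `textStream: ReadableStream<string>`. The idea is to pipe the text stream through a `TransformStream` that fires the callbacks itself, so the wire format stays plain text:

```typescript
// Hypothetical callback shape mirroring AIStreamCallbacksAndOptions.
type TextStreamCallbacks = {
  onStart?: () => void | Promise<void>;
  onToken?: (token: string) => void | Promise<void>;
  onFinal?: (full: string) => void | Promise<void>;
};

// Wrap a plain text stream so onStart fires before the first chunk,
// onToken fires per chunk, and onFinal fires when the stream closes.
function withCallbacks(
  stream: ReadableStream<string>,
  callbacks: TextStreamCallbacks,
): ReadableStream<string> {
  let started = false;
  let full = '';
  return stream.pipeThrough(
    new TransformStream<string, string>({
      async transform(chunk, controller) {
        if (!started) {
          started = true;
          await callbacks.onStart?.();
        }
        full += chunk;
        await callbacks.onToken?.(chunk);
        controller.enqueue(chunk); // pass the chunk through unchanged
      },
      async flush() {
        await callbacks.onFinal?.(full); // source stream has closed
      },
    }),
  );
}
```

The wrapped stream could then be returned as a plain-text response, e.g. `new Response(withCallbacks(result.textStream, callbacks).pipeThrough(new TextEncoderStream()), { headers: { 'Content-Type': 'text/plain; charset=utf-8' } })`.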

Additional context

...
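For reference, the extension-side HTTP streaming handler mentioned above can be sketched roughly as follows. `readTextStream` and the endpoint URL are placeholders, and this assumes the server returns a plain text/plain stream (e.g. from toTextStreamResponse):

```typescript
// Read a streamed plain-text HTTP response chunk by chunk,
// invoking onChunk for each decoded piece of text.
async function readTextStream(
  url: string,
  onChunk: (chunk: string) => void,
): Promise<string> {
  const res = await fetch(url, { method: 'POST' });
  if (!res.body) throw new Error('No response body to stream');
  // Decode the byte stream into text incrementally.
  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  let full = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    full += value;
    onChunk(value);
  }
  return full;
}
```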

@lgrammel lgrammel added enhancement New feature or request wontfix This will not be worked on ai/ui labels Jun 15, 2024
@lgrammel
Collaborator

lgrammel commented Jun 15, 2024

This is not possible for a text stream. By definition, it only transmits text chunks. If you want to use more complex objects, please use the AIStream instead.
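For anyone consuming the AI stream manually (e.g. in an extension), a minimal extraction sketch follows. It assumes the 3.1 wire format is newline-delimited `CODE:JSON` parts where code `0` carries a JSON-encoded text chunk, as in the `0:"Je"` example above; `extractText` is a hypothetical helper, not SDK API:

```typescript
// Collect the text parts from a raw AI data-stream payload,
// ignoring any non-text part codes.
function extractText(raw: string): string {
  let text = '';
  for (const line of raw.split('\n')) {
    const idx = line.indexOf(':');
    if (idx === -1) continue; // not a CODE:JSON part
    if (line.slice(0, idx) === '0') {
      // Code "0" parts are JSON-encoded text chunks.
      text += JSON.parse(line.slice(idx + 1));
    }
  }
  return text;
}
```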
