---
title: OpenAI Streaming Assistant Node
description: The OpenAI Streaming Assistant Node allows you to integrate the OpenAI Assistant API into your workflows and receive responses in a streamed format.
---
import { Callout } from 'nextra/components';
import Image from 'next/image';
import { Card, Cards } from 'nextra/components';
import Video from '../../components/video/index.jsx';
import streaming1 from '/public/integrations/ai-assistant/streaming-1.png';
The OpenAI Streaming Assistant Node allows you to interact with the powerful OpenAI Assistant API and receive intelligent responses generated by the OpenAI language model in a streamed format. This node supports the latest version of the OpenAI API, giving you access to its newest features and capabilities.
This is the streaming version of the OpenAI Assistant Node. If you are looking for the standard version, please refer to the [OpenAI Assistant Node](/ai-assistant/openai-assistant).

To get started quickly, BuildShip offers pre-built templates. Click on the templates below to clone them to your workspace and start using them right away!
![An Assistant that streams back a text response and returns the Thread ID as a response header.](../../public/remix/openai-streaming.png)

![BuildShip AI Chat Widget: A pre-built chat widget that integrates with the OpenAI Streaming Assistant.](../../public/remix/chat-widget.png)

The OpenAI Streaming Assistant Node accepts the same inputs as the OpenAI Assistant Node; [learn more](/ai-assistant/openai-assistant).
This output property is the stream object to which the AI assistant's response is piped. A client that receives this object can consume the response as a stream.
To have the workflow respond with the text stream, this **Response Stream** object must be the only value returned by the workflow's **Return node**.
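As a rough sketch of the client side, the streamed response can be consumed with the standard Fetch API and a `ReadableStream` reader. The workflow URL placeholder and the `thread-id` header name below are assumptions for illustration; substitute your workflow's actual trigger URL and whatever header name your Return node is configured to send.

```typescript
// Consume a streamed Response body chunk by chunk, invoking onChunk
// for each decoded piece of text and returning the full reply.
async function readAssistantStream(
  response: Response,
  onChunk: (text: string) => void
): Promise<string> {
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text); // e.g. append each chunk to the chat UI as it arrives
  }
  return full;
}

// Usage sketch: call the workflow's trigger URL (placeholder), read the
// Thread ID from a response header (assumed name), then stream the body.
async function askAssistant(message: string) {
  const response = await fetch("https://<your-workflow-url>", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
  });
  const threadId = response.headers.get("thread-id"); // assumed header name
  const reply = await readAssistantStream(response, (chunk) =>
    process.stdout.write(chunk)
  );
  return { threadId, reply };
}
```

Rendering each chunk as it arrives is what gives the chat its token-by-token "typing" effect, rather than waiting for the whole reply.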