rishi-raj-jain/sse-streaming-llm-response


Using Server-Sent Events (SSE) to stream LLM responses in Next.js

Introduction · Tech Stack + Features · Author

Introduction

Learn how to use Server-Sent Events (SSE) to stream LLM responses in Next.js with OpenAI.
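As a rough sketch of the server side of this pattern, the handler below shows how a Next.js App Router route (e.g. `app/api/chat/route.ts`) can emit SSE frames from a `ReadableStream`. The `fakeTokens` generator is a hypothetical stand-in for an OpenAI streaming completion; the header values and `[DONE]` sentinel follow common SSE practice rather than anything specific to this repository.

```typescript
// Hypothetical stand-in for an OpenAI streaming completion:
// yields one token at a time, as the real API would over the wire.
async function* fakeTokens() {
  for (const token of ["Hello", ", ", "world", "!"]) {
    yield token;
  }
}

export async function GET(): Promise<Response> {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const token of fakeTokens()) {
        // Each SSE message is a `data:` line followed by a blank line.
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(token)}\n\n`));
      }
      // A sentinel so the client knows the stream is complete.
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache, no-transform",
      Connection: "keep-alive",
    },
  });
}
```

Because the body is a stream, the browser (or an `EventSource`) receives tokens as they are produced instead of waiting for the full completion.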

Tech Stack + Features

Frameworks

  • Next.js – The React Framework for the Web.

  • Vercel AI SDK – An open source library for building AI-powered user interfaces.
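On the client, the Vercel AI SDK's hooks can consume the stream for you; to show the wire format itself, here is a small hedged sketch of a parser that extracts tokens from an SSE payload. The `data:` prefix, JSON-encoded tokens, and `[DONE]` sentinel are assumptions matching the server conventions above, not a documented SDK API.

```typescript
// Parse the `data:` lines of a Server-Sent Events payload into tokens.
// In a browser you would typically let EventSource or the AI SDK do this.
export function parseSSE(payload: string): string[] {
  const tokens: string[] = [];
  for (const line of payload.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const data = line.slice("data: ".length);
    if (data === "[DONE]") break; // sentinel marking end of stream
    tokens.push(JSON.parse(data)); // each token is JSON-encoded on its line
  }
  return tokens;
}
```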

Database

  • Upstash – Serverless database platform. We're going to use Upstash Redis to cache OpenAI API responses.
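The caching idea can be sketched as a cache-aside lookup keyed by the prompt. The `KV` interface below mirrors the `get`/`set` shape of the `@upstash/redis` client, but an in-memory `Map` stands in so the sketch runs without credentials; `cachedCompletion` and the injected `model` function are illustrative names, not this repository's actual helpers.

```typescript
// Minimal key-value shape, compatible with a Redis-style get/set client.
export interface KV {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<unknown>;
}

// In-memory stand-in for Upstash Redis, so the sketch is self-contained.
export class MemoryKV implements KV {
  private store = new Map<string, string>();
  async get(key: string) { return this.store.get(key) ?? null; }
  async set(key: string, value: string) { this.store.set(key, value); }
}

// Cache-aside: return the cached answer if present, otherwise call the
// model (e.g. OpenAI) and store the result for next time.
export async function cachedCompletion(
  kv: KV,
  prompt: string,
  model: (prompt: string) => Promise<string>,
): Promise<string> {
  const key = `completion:${prompt}`;
  const hit = await kv.get(key);
  if (hit !== null) return hit; // cache hit: skip the API round-trip
  const answer = await model(prompt);
  await kv.set(key, answer);
  return answer;
}
```

With the real Upstash client you would also likely set a TTL on the key so cached completions eventually expire.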

Artificial Intelligence

  • OpenAI – An artificial intelligence research lab; we use its API to generate the streamed completions.

  • LangChain – A framework for developing applications powered by language models.

UI

  • TailwindCSS – A CSS framework for rapid and responsive styling.

Platforms

  • Vercel – A cloud platform for deploying and scaling web applications.

Author