chu2bard/eventpipe

eventpipe

SSE streaming library for LLM responses. Parses Server-Sent Events from the OpenAI, Anthropic, and Google APIs and normalizes them into a common format.
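To illustrate what "normalizes into a common format" means, the sketch below maps each provider's streaming payload shape to a single `{ token }` record. This is an illustrative sketch, not eventpipe's internals; the function names are hypothetical, and the field paths follow the providers' public streaming formats:

```typescript
// Sketch only: each provider wraps token deltas differently, so a
// normalizer maps raw event payloads to one shared shape.
type Normalized = { token: string };

// OpenAI chat streams put the delta under choices[0].delta.content.
function normalizeOpenAI(data: any): Normalized | null {
  const token = data.choices?.[0]?.delta?.content;
  return token ? { token } : null;
}

// Anthropic streams emit typed events; text arrives in
// content_block_delta events under delta.text.
function normalizeAnthropic(data: any): Normalized | null {
  return data.type === "content_block_delta" && data.delta?.text
    ? { token: data.delta.text }
    : null;
}

console.log(normalizeOpenAI({ choices: [{ delta: { content: "hi" } }] }));
// → { token: "hi" }
```

Downstream code then only ever sees `Normalized` records, regardless of which API produced the stream.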

Install

```shell
npm install
npm run build
```

Usage

```typescript
import { processStream } from "eventpipe";

const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json", Authorization: `Bearer ${key}` },
  body: JSON.stringify({ model: "gpt-4o", messages: [...], stream: true }),
});

const text = await processStream(response, {
  onToken: (token) => process.stdout.write(token),
  onDone: (full) => console.log("\n---done---"),
  onError: (err) => console.error(err),
});
```

The provider is auto-detected from the response format. Pass the `provider` option to skip detection.
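For background on the parsing side: SSE events arrive as `data:` lines terminated by a blank line, and OpenAI-style streams end with a `data: [DONE]` sentinel. The parser below is a minimal, self-contained sketch of that wire format (a hypothetical helper, not part of eventpipe's API):

```typescript
// Sketch of SSE frame parsing (not eventpipe's actual code).
// Events are separated by blank lines; an event's payload is the
// concatenation of its "data:" lines. "[DONE]" is the end sentinel
// used by OpenAI-style streams and is dropped here.
function parseSSE(chunk: string): string[] {
  const payloads: string[] = [];
  for (const event of chunk.split("\n\n")) {
    const data = event
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice("data:".length).trim())
      .join("\n");
    if (data && data !== "[DONE]") payloads.push(data);
  }
  return payloads;
}

console.log(parseSSE('data: {"a":1}\n\ndata: [DONE]\n\n'));
// → [ '{"a":1}' ]
```

A real implementation also has to buffer partial frames across network chunks, since an event may be split between two `fetch` reads.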

License

MIT
