# ZeroJitter

Zero-layout-jitter streaming text renderer for LLM tokens.

A production-grade React/TypeScript library that streams LLM tokens into a jitter-free `<canvas>` surface. Text measurement and layout are offloaded to a Web Worker via a vendored copy of `@chenglou/pretext`, keeping the main thread free for 60 fps interactions.

## ✨ Features

- 🚀 **Zero layout thrashing.** All text measurement happens in a Web Worker; the main thread never calls `measureText()`.
- 🎨 **Canvas rendering.** Bypasses DOM layout entirely: no reflows, no forced synchronous layouts, no scrollbar jitter.
- ♿ **Fully accessible.** A parallel visually-hidden `aria-live` DOM mirror keeps screen readers in sync.
- 📐 **HiDPI / Retina.** Automatic `devicePixelRatio` scaling, with detection when the window moves between monitors.
- ⚡ **Viewport culling.** An O(log n) binary search paints only the visible lines; 10,000+ lines render smoothly.
- 🔤 **Font sync.** Layout blocks until custom fonts have loaded, so there is no flash of the wrong font.
- 📦 **Tree-shakeable.** Dual ESM + CJS output with `"sideEffects": false`.
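The viewport-culling feature above amounts to a binary search over sorted line offsets. A minimal sketch of the idea (the `LineBox` shape and function names here are illustrative, not ZeroJitter's actual internals):

```typescript
// A laid-out line as the worker might report it: top offset and height.
// (Hypothetical shape for illustration; not the library's real internals.)
interface LineBox {
  top: number;
  height: number;
}

// Binary-search the first line whose bottom edge extends below scrollTop.
// Lines are sorted by `top`, so this is O(log n) instead of a full scan.
function firstVisibleLine(lines: LineBox[], scrollTop: number): number {
  let lo = 0;
  let hi = lines.length - 1;
  let ans = lines.length;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (lines[mid].top + lines[mid].height > scrollTop) {
      ans = mid;
      hi = mid - 1;
    } else {
      lo = mid + 1;
    }
  }
  return ans;
}

// Paint only lines intersecting [scrollTop, scrollTop + viewportHeight).
function visibleRange(
  lines: LineBox[],
  scrollTop: number,
  viewportHeight: number
): [number, number] {
  const start = firstVisibleLine(lines, scrollTop);
  let end = start;
  while (end < lines.length && lines[end].top < scrollTop + viewportHeight) {
    end++;
  }
  return [start, end]; // paint lines[start..end)
}
```

Only the handful of lines in the returned range ever reach `fillText()`, regardless of how long the transcript grows.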

## 📦 Installation

```bash
npm install zero-jitter
```

Zero external runtime dependencies; the text layout engine is vendored. The only peer dependencies are:

```json
{
  "react": ">=18.0.0",
  "react-dom": ">=18.0.0"
}
```

## 🚀 Quick Start

```tsx
import { useRef, useEffect } from 'react';
import { ZeroJitter, type ZeroJitterHandle } from 'zero-jitter';

function ChatMessage() {
  const ref = useRef<ZeroJitterHandle>(null);

  useEffect(() => {
    const es = new EventSource('/api/stream');
    es.onmessage = (e) => ref.current?.appendText(e.data);
    return () => es.close();
  }, []);

  return <ZeroJitter ref={ref} font="16px Inter" maxHeight={400} />;
}
```

πŸ—οΈ Architecture

β”Œβ”€ Main Thread ──────────────────────────────────────────────┐
β”‚                                                            β”‚
β”‚  SSE tokens β†’ useZeroJitter hook β†’ postMessage β†’ Worker    β”‚
β”‚                                                            β”‚
β”‚  Worker response β†’ CanvasRenderer.paint() β†’ <canvas>       β”‚
β”‚                 β†’ AccessibilityMirror  β†’ <div aria-live>   β”‚
β”‚                                                            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

β”Œβ”€ Web Worker Thread ────────────────────────────────────────┐
β”‚                                                            β”‚
β”‚  Vendored pretext engine: prepareWithSegments() β†’ layout() β”‚
β”‚  (Intl.Segmenter, CJK, BiDi, emoji correction)            β”‚
β”‚  Returns: { lines[], totalHeight, lineCount }              β”‚
β”‚                                                            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Note: The text layout engine (pretext) is vendored into src/vendor/pretext/ (MIT licensed) rather than kept as an npm dependency. This eliminates single-author risk and enables future streaming optimizations (incremental prepare() for measuring only new tokens).
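The main-thread/worker handoff above could be typed roughly as follows. These message shapes are a sketch for illustration, not ZeroJitter's actual wire protocol; the `id`-based stale-reply filter is one common way to keep an async round trip ordered:

```typescript
// Request from the main thread to the layout worker.
// (Hypothetical shapes; the library's real protocol may differ.)
type LayoutRequest = {
  id: number;       // monotonically increasing; lets us discard stale replies
  text: string;     // accumulated text to lay out
  font: string;     // CSS font shorthand, e.g. '16px Inter'
  maxWidth: number; // wrap width in CSS pixels
};

// Reply from the worker after measuring and laying out.
type LayoutResponse = {
  id: number;
  lines: { text: string; top: number; width: number }[];
  totalHeight: number;
  lineCount: number;
};

// Main-thread side: accept only replies newer than anything already seen,
// so an out-of-order worker response can never repaint with stale layout.
function makeResponseFilter() {
  let latestSeen = -1;
  return (res: LayoutResponse): LayoutResponse | null => {
    if (res.id <= latestSeen) return null; // stale: a newer reply already landed
    latestSeen = res.id;
    return res;
  };
}
```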

## 📖 API

### `<ZeroJitter />` Component

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `font` | `string` | `'16px sans-serif'` | CSS font shorthand |
| `fontSize` | `number` | `16` | Font size in px |
| `lineHeight` | `number` | `fontSize × 1.5` | Line height in px |
| `color` | `string` | `'#000'` | Text color |
| `whiteSpace` | `'normal' \| 'pre-wrap'` | `'normal'` | White-space mode |
| `height` | `number \| 'auto'` | `'auto'` | Container height |
| `maxHeight` | `number` | — | Max height before scrolling |
| `autoScroll` | `boolean` | `true` | Auto-scroll on new content |
| `padding` | `number \| { top, right, bottom, left }` | `0` | Canvas padding |
| `ariaLive` | `'polite' \| 'assertive' \| 'off'` | `'polite'` | Screen reader announcement mode |
| `className` | `string` | — | Container CSS class |
| `style` | `CSSProperties` | — | Container inline styles |
| `workerUrl` | `string \| URL` | auto | Custom worker URL |
### `ZeroJitterHandle` (imperative ref)

```ts
interface ZeroJitterHandle {
  appendText(chunk: string): void;  // Append a token (no React re-render)
  setText(text: string): void;      // Replace all text
  clear(): void;                    // Clear text and reset
  layout: LayoutState;              // Current layout result
  containerRef: (node: HTMLElement | null) => void;
  fontReady: boolean;               // Have custom fonts finished loading?
}
```

### `useZeroJitter` Hook

For advanced use cases that need direct access to the layout engine:

```tsx
import { useZeroJitter } from 'zero-jitter';

function CustomRenderer() {
  const { appendText, layout, containerRef, fontReady } = useZeroJitter({
    font: '16px Inter',
    lineHeight: 24,
  });

  // Use layout.lines to render however you want
}
```

## 🛠️ Development

```bash
# Install dependencies
npm install

# Type check
npm run typecheck

# Build
npm run build

# Storybook
npm run storybook

# Lint
npm run lint
```

πŸ“ How It Works

  1. Token arrives β†’ appendText(chunk) appends to a useRef (zero React re-renders)
  2. rAF batch β†’ Multiple tokens within a frame are coalesced into one worker message
  3. Worker measures β†’ @chenglou/pretext does prepareWithSegments() + layoutWithLines()
  4. Result returns β†’ Worker posts { lines[], totalHeight } back to main thread
  5. Canvas paints β†’ Only visible lines are fillText()'d (O(log n) viewport culling)
  6. A11y updates β†’ Debounced (300ms) aria-live region mirrors text for screen readers
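Steps 1 and 2 above, buffering tokens and flushing once per frame, can be sketched as a small coalescer. The `schedule` parameter stands in for `requestAnimationFrame` so the sketch runs outside a browser; the names are illustrative, not the library's internals:

```typescript
// Coalesce many appendText() calls into one flush per animation frame.
// `schedule` is requestAnimationFrame in the browser; it is injected here
// so the sketch can be exercised without a DOM. (Illustrative only.)
function makeTokenCoalescer(
  flush: (batch: string) => void,
  schedule: (cb: () => void) => void
) {
  let buffer = '';
  let scheduled = false;
  return (chunk: string): void => {
    buffer += chunk;
    if (scheduled) return;   // a flush is already queued for this frame
    scheduled = true;
    schedule(() => {
      scheduled = false;
      const batch = buffer;
      buffer = '';
      flush(batch);          // one worker postMessage per frame
    });
  };
}
```

In the browser, `schedule` would be `requestAnimationFrame` and `flush` would post the batched text to the layout worker, so a burst of SSE tokens costs one message instead of dozens.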

### Why Canvas?

DOM text rendering triggers layout recalculation on every token append. In a streaming LLM chat UI, that means:

- Forced synchronous layouts (layout thrashing)
- Scrollbar position jumps (jitter)
- Dropped frames during rapid token arrival

Canvas `fillText()` bypasses the entire DOM layout pipeline. Combined with off-thread measurement, the main thread stays free for user interaction.

## 🔗 Companion: StreamMD

For streaming markdown that incrementally parses and re-renders only the active block, see StreamMD: block-level memoization with built-in syntax highlighting.

```
zero-jitter   → streaming plain text (canvas, zero reflows)
stream-md     → streaming markdown (smart DOM, incremental parsing)
```

Together, they cover both sides of the streaming LLM display problem.

## 📄 License

MIT
