Merged
108 changes: 70 additions & 38 deletions apps/website/content/docs-v2/getting-started/introduction.mdx
@@ -27,29 +27,16 @@ No RxJS. No manual subscriptions. No async pipes. Just Signals that work with An

## The Architecture

StreamResource sits between your Angular app and LangGraph Platform:
Watch a full conversation turn flow through the stack — from user input to rendered response:

<Steps>
<Step title="Your Angular app calls streamResource()">
Creates a reactive resource bound to a specific agent. All state is exposed as Signals.
</Step>
<Step title="FetchStreamTransport opens an SSE connection">
Sends HTTP POST to LangGraph Platform, receives Server-Sent Events with state updates.
</Step>
<Step title="LangGraph Platform runs your agent graph">
Executes nodes, calls tools, manages checkpoints. Streams results back in real-time.
</Step>
<Step title="Signals update your templates automatically">
As tokens arrive, `messages()` updates. Angular re-renders only the affected components.
</Step>
</Steps>
<ArchFlowDiagram />
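The lifecycle the diagram animates can be sketched as a plain reducer over a stream of events. This is a hedged illustration only — `SseEvent`, `runTurn`, and the event shapes are assumptions for the sketch, not the real `FetchStreamTransport` API:

```typescript
// Models one conversation turn: submit() starts the run, SSE events
// arrive from LangGraph Platform, and each one folds into reactive state.
type SseEvent =
  | { type: 'values'; messages: string[] } // state snapshot from the graph
  | { type: 'done' };                      // stream finished

function runTurn(events: SseEvent[]) {
  let status: 'loading' | 'resolved' = 'loading'; // submit() opens the stream
  let messages: string[] = [];
  for (const ev of events) {
    if (ev.type === 'values') messages = ev.messages; // Signal update per chunk
    else status = 'resolved';                          // terminal event
  }
  return { status, messages };
}

const result = runTurn([
  { type: 'values', messages: ['How do Signals work?'] },
  { type: 'values', messages: ['How do Signals work?', 'They update synchronously'] },
  { type: 'done' },
]);
console.log(result.status); // 'resolved' after the final event
```

In the real stack, the "fold each event into state" step is what keeps `messages()` and `status()` current without any manual subscription code.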

## Build Your Agent

LangGraph agents are Python programs defined as directed graphs. Here's a minimal chat agent using the example from this repository:

<Tabs items={['agent.py', 'langgraph.json']}>
<Tab>
<Tabs>
<Tab label="agent.py">

```python
# examples/chat-agent/src/chat_agent/agent.py
@@ -58,7 +45,7 @@ from langchain_core.runnables import RunnableConfig
from langgraph.graph import END, START, MessagesState, StateGraph
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
llm = ChatOpenAI(model="gpt-5-mini")

def call_model(state: MessagesState, config: RunnableConfig) -> dict:
"""Invoke the LLM with the current message history."""
@@ -79,7 +66,7 @@ graph = builder.compile()
```

</Tab>
<Tab>
<Tab label="langgraph.json">

```json
{
@@ -162,8 +149,8 @@ export const appConfig: ApplicationConfig = {
</Step>
<Step title="Build your chat component">

<Tabs items={['TypeScript', 'Template']}>
<Tab>
<Tabs>
<Tab label="chat.component.ts">

```typescript
// chat.component.ts
@@ -202,7 +189,7 @@ export class ChatComponent {
```

</Tab>
<Tab>
<Tab label="chat.component.html">

```html
<!-- chat.component.html -->
@@ -257,46 +244,91 @@ Open `http://localhost:4200` and start chatting with your agent. Messages stream

Here's what streamResource() gives you out of the box:

| Feature | Signal | Description |
|---------|--------|-------------|
| **Messages** | `chat.messages()` | Live message list, updates as tokens arrive |
| **Status** | `chat.status()` | Current state: idle, loading, resolved, error |
| **Thread persistence** | `threadId` option | Conversations survive page refreshes |
| **Interrupts** | `chat.interrupt()` | Agent pauses for human input |
| **History** | `chat.history()` | Full checkpoint timeline for time-travel |
| **Subagents** | `chat.subagents()` | Track delegated agent work |
| **Tool calls** | `chat.toolCalls()` | See what tools the agent is using |
<CardGroup cols={2}>
<Card title="Messages" href="/docs/guides/streaming">
`chat.messages()` — live message list that updates as each token arrives from the agent
</Card>
<Card title="Status" href="/docs/guides/streaming">
`chat.status()` — current state: idle, loading, resolved, or error
</Card>
<Card title="Thread Persistence" href="/docs/guides/persistence">
`threadId` option — conversations survive page refreshes via localStorage or backend
</Card>
<Card title="Interrupts" href="/docs/guides/interrupts">
`chat.interrupt()` — agent pauses for human approval, your UI handles the decision
</Card>
<Card title="Time Travel" href="/docs/guides/time-travel">
`chat.history()` — full checkpoint timeline for debugging and branching
</Card>
<Card title="Subagents" href="/docs/guides/subgraphs">
`chat.subagents()` — track delegated agent work across multiple graphs
</Card>
<Card title="Tool Calls" href="/docs/guides/streaming">
`chat.toolCalls()` — see what tools the agent is invoking in real-time
</Card>
<Card title="Testing" href="/docs/guides/testing">
`MockStreamTransport` — deterministic testing without a running server
</Card>
</CardGroup>
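The cards above describe what each Signal exposes; a framework-free sketch of how streaming deltas fold into a `messages()`-style store may make the shape concrete. This is illustrative only — the `apply`/`delta` names and the `createStore` helper are hypothetical, not the real streamResource internals:

```typescript
// Each streamed delta either extends the in-progress assistant message
// or starts a new one; reads go through a zero-arg function, Signal-style.
type Message = { role: 'user' | 'assistant'; content: string };

function createStore(initial: Message[]) {
  let state = initial;
  return {
    messages: () => state, // read like chat.messages()
    apply(chunk: { delta: string }) {
      const last = state[state.length - 1];
      if (last?.role === 'assistant') {
        // Append the token to the streaming assistant message (immutably).
        state = [...state.slice(0, -1), { ...last, content: last.content + chunk.delta }];
      } else {
        // First token of the turn: start a new assistant message.
        state = [...state, { role: 'assistant', content: chunk.delta }];
      }
    },
  };
}

const store = createStore([{ role: 'user', content: 'Hi' }]);
['Sig', 'nals ', 'stream.'].forEach(delta => store.apply({ delta }));
console.log(store.messages()[1].content); // 'Signals stream.'
```

Because reads are synchronous function calls, a template binding like `chat.messages()` re-evaluates cheaply on every change-detection pass — no async pipe required.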

## Deploy to Production

When you're ready to go live, deploy your agent to LangGraph Cloud.
When you're ready to go live, deploy your agent to LangGraph Cloud and point your Angular app to the deployment URL.

<Steps>
<Step title="Push your agent to GitHub">

Your agent code (the Python project with `langgraph.json`) needs to be in a GitHub repository.
Your agent code (the Python project with `langgraph.json`) needs to be in a GitHub repository. Make sure your `langgraph.json` references the correct graph entry point.

```bash
git init && git add . && git commit -m "initial agent"
gh repo create my-agent --public --source=. --push
```

</Step>
<Step title="Deploy via LangSmith">

Go to [LangSmith Deployments](https://smith.langchain.com) and click **+ New Deployment**. Connect your GitHub repo and deploy. This takes about 15 minutes.
Go to [LangSmith Deployments](https://smith.langchain.com) and click **+ New Deployment**. Connect your GitHub account, select your repository, and deploy. The first deployment takes about 15 minutes.

You'll receive a deployment URL like `https://my-agent-abc123.langsmith.dev`.

</Step>
<Step title="Update your Angular config">

Point `apiUrl` to your deployment URL:
Point `apiUrl` to your deployment URL and set up environment-based configuration:

```typescript
// environment.ts
export const environment = {
langgraphUrl: 'http://localhost:2024', // dev
};

// environment.prod.ts
export const environment = {
langgraphUrl: 'https://my-agent-abc123.langsmith.dev', // prod
};

// app.config.ts
provideStreamResource({
apiUrl: 'https://your-deployment.langsmith.dev',
apiUrl: environment.langgraphUrl,
})
```

</Step>
<Step title="Deploy your Angular app">

Deploy your Angular frontend to any hosting platform — Vercel, Netlify, AWS, or your own infrastructure. Because streamResource() is a stateless client of the LangGraph API, your frontend has no server-side state requirements.

```bash
ng build --configuration production
# Deploy dist/ to your hosting platform
```

</Step>
</Steps>

<Callout type="info" title="Stateless frontend">
Your Angular app is a stateless client. All agent state lives on LangGraph Platform. This means you can deploy your Angular app anywhere (CDN, edge, SSR) without state management concerns.
<Callout type="tip" title="Stateless architecture">
Your Angular app is a stateless client. All agent state — threads, checkpoints, memory — lives on LangGraph Platform. This means you can deploy your frontend anywhere (CDN, edge, SSR) without state management concerns. Scale your frontend independently of your agent infrastructure.
</Callout>

## What's Next
169 changes: 169 additions & 0 deletions apps/website/src/components/docs/ArchFlowDiagram.tsx
@@ -0,0 +1,169 @@
'use client';
import { useState, useEffect, useRef } from 'react';
import { tokens } from '../../../lib/design-tokens';

interface LogEntry {
time: string;
source: 'angular' | 'transport' | 'langgraph' | 'signal';
message: string;
}

const SCENARIO: { delay: number; chatBubble?: { role: 'user' | 'assistant'; text: string; streaming?: boolean }; log: LogEntry }[] = [
{ delay: 0, chatBubble: { role: 'user', text: 'How do Angular Signals work with streaming?' }, log: { time: '0.00s', source: 'angular', message: 'chat.submit({ messages: [userMsg] })' } },
{ delay: 800, log: { time: '0.02s', source: 'transport', message: 'POST /threads/t_8f3a/runs/stream → 200' } },
{ delay: 1200, log: { time: '0.04s', source: 'langgraph', message: 'Executing node: call_model (gpt-5-mini)' } },
{ delay: 2200, log: { time: '0.82s', source: 'langgraph', message: 'SSE event: { type: "values", messages: [...] }' } },
{ delay: 2600, log: { time: '0.84s', source: 'transport', message: 'Received chunk → messages$.next([...])' } },
{ delay: 2800, log: { time: '0.85s', source: 'signal', message: 'messages() updated → 2 messages' } },
{ delay: 3000, chatBubble: { role: 'assistant', text: 'Angular Signals', streaming: true }, log: { time: '0.86s', source: 'signal', message: 'status() → "loading"' } },
{ delay: 3400, chatBubble: { role: 'assistant', text: 'Angular Signals provide a synchronous', streaming: true }, log: { time: '1.12s', source: 'transport', message: 'Received chunk → values event' } },
{ delay: 3900, chatBubble: { role: 'assistant', text: 'Angular Signals provide a synchronous, reactive way to', streaming: true }, log: { time: '1.45s', source: 'signal', message: 'messages() updated → streaming token' } },
{ delay: 4500, chatBubble: { role: 'assistant', text: 'Angular Signals provide a synchronous, reactive way to track streaming state.', streaming: true }, log: { time: '1.82s', source: 'langgraph', message: 'SSE event: { type: "values", status: "done" }' } },
{ delay: 5200, chatBubble: { role: 'assistant', text: 'Angular Signals provide a synchronous, reactive way to track streaming state. Each token updates the Signal, and OnPush change detection re-renders automatically.' }, log: { time: '2.10s', source: 'signal', message: 'status() → "resolved" ✓' } },
{ delay: 6000, log: { time: '2.12s', source: 'angular', message: 'Template re-rendered (OnPush) — 1 component' } },
];

const SOURCE_COLORS: Record<string, { bg: string; text: string; label: string }> = {
angular: { bg: 'rgba(221,0,49,0.08)', text: '#c62828', label: 'ANGULAR' },
transport: { bg: 'rgba(100,80,200,0.08)', text: '#5e35b1', label: 'TRANSPORT' },
langgraph: { bg: 'rgba(0,64,144,0.08)', text: '#004090', label: 'LANGGRAPH' },
signal: { bg: 'rgba(16,185,129,0.08)', text: '#059669', label: 'SIGNAL' },
};

export function ArchFlowDiagram() {
const [logs, setLogs] = useState<LogEntry[]>([]);
const [bubbles, setBubbles] = useState<{ role: 'user' | 'assistant'; text: string; streaming?: boolean }[]>([]);
const [cycle, setCycle] = useState(0);
const logRef = useRef<HTMLDivElement>(null);

useEffect(() => {
const timeouts: ReturnType<typeof setTimeout>[] = [];

const runScenario = () => {
setLogs([]);
setBubbles([]);

SCENARIO.forEach((step, i) => {
timeouts.push(setTimeout(() => {
setLogs(prev => [...prev, step.log]);
if (step.chatBubble) {
setBubbles(prev => {
const bubble = step.chatBubble!;
// Streaming assistant updates replace the existing assistant bubble in place.
if (bubble.role === 'assistant') {
const existing = prev.findIndex(b => b.role === 'assistant');
if (existing >= 0) {
const updated = [...prev];
updated[existing] = bubble;
return updated;
}
}
return [...prev, bubble];
});
}
// Defer until after React commits the new log entry; setting scrollTop
// synchronously here would scroll to the pre-update height.
requestAnimationFrame(() => {
if (logRef.current) logRef.current.scrollTop = logRef.current.scrollHeight;
});
}, step.delay));
});

// Restart after completion
timeouts.push(setTimeout(() => {
setCycle(c => c + 1);
}, 8000));
};

runScenario();
return () => timeouts.forEach(clearTimeout);
}, [cycle]);

return (
<div style={{
width: '100%',
margin: '28px 0 36px',
borderRadius: 14,
overflow: 'hidden',
border: `1px solid ${tokens.glass.border}`,
boxShadow: tokens.glass.shadow,
background: 'rgba(255,255,255,0.5)',
backdropFilter: `blur(${tokens.glass.blur})`,
WebkitBackdropFilter: `blur(${tokens.glass.blur})`,
}}>
{/* Header bar */}
<div style={{
padding: '10px 16px',
borderBottom: `1px solid ${tokens.glass.border}`,
display: 'flex', alignItems: 'center', justifyContent: 'space-between',
background: 'rgba(255,255,255,0.6)',
}}>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<div style={{ width: 8, height: 8, borderRadius: '50%', background: '#FF5F57' }} />
<div style={{ width: 8, height: 8, borderRadius: '50%', background: '#FEBC2E' }} />
<div style={{ width: 8, height: 8, borderRadius: '50%', background: '#28C840' }} />
</div>
<span style={{ fontFamily: 'var(--font-mono)', fontSize: 11, color: tokens.colors.textMuted }}>streamResource() — live architecture flow</span>
<span style={{ fontFamily: 'var(--font-mono)', fontSize: 9, color: tokens.colors.textMuted, background: 'rgba(0,0,0,0.04)', padding: '2px 6px', borderRadius: 4 }}>localhost:4200</span>
</div>

<div style={{ display: 'flex', minHeight: 320 }}>
{/* Left: Chat simulation */}
<div style={{ flex: 1, padding: 16, borderRight: `1px solid ${tokens.glass.border}`, display: 'flex', flexDirection: 'column' }}>
<div style={{ fontFamily: 'var(--font-mono)', fontSize: 9, color: tokens.colors.textMuted, textTransform: 'uppercase', letterSpacing: '0.06em', marginBottom: 10, fontWeight: 600 }}>Chat Interface</div>

<div style={{ flex: 1, display: 'flex', flexDirection: 'column', gap: 8, overflow: 'hidden' }}>
{bubbles.map((b, i) => (
<div key={`${b.role}-${i}`} style={{
display: 'flex',
justifyContent: b.role === 'user' ? 'flex-end' : 'flex-start',
alignItems: 'flex-start', gap: 6,
}}>
{b.role === 'assistant' && (
<div style={{
width: 22, height: 22, borderRadius: 6, flexShrink: 0,
background: 'rgba(0,64,144,0.1)',
display: 'flex', alignItems: 'center', justifyContent: 'center',
fontFamily: 'var(--font-mono)', fontSize: 8, color: tokens.colors.accent, fontWeight: 700,
}}>AI</div>
)}
<div style={{
padding: '8px 12px', borderRadius: 10, maxWidth: '85%',
fontSize: 12, lineHeight: 1.5, color: b.role === 'user' ? '#fff' : tokens.colors.textPrimary,
background: b.role === 'user' ? tokens.colors.accent : 'rgba(0,0,0,0.04)',
}}>
{b.text}
{b.streaming && <span style={{ opacity: 0.5, animation: 'blink 0.6s infinite' }}>▊</span>}
</div>
</div>
))}
</div>
</div>

{/* Right: Console log */}
<div ref={logRef} style={{ flex: 1, padding: 12, background: '#1a1b26', overflow: 'auto', fontFamily: 'var(--font-mono)', fontSize: 10 }}>
<div style={{ fontSize: 9, color: '#4A527A', textTransform: 'uppercase', letterSpacing: '0.06em', marginBottom: 8, fontWeight: 600 }}>Developer Console</div>

{logs.map((log, i) => {
const sc = SOURCE_COLORS[log.source];
return (
<div key={i} style={{
display: 'flex', gap: 6, marginBottom: 4, alignItems: 'flex-start',
animation: 'fadeIn 0.2s ease-out',
}}>
<span style={{ color: '#4A527A', flexShrink: 0, width: 36, textAlign: 'right' }}>{log.time}</span>
<span style={{
padding: '1px 5px', borderRadius: 3, flexShrink: 0,
background: sc.bg, color: sc.text, fontSize: 8, fontWeight: 600,
minWidth: 62, textAlign: 'center',
}}>{sc.label}</span>
<span style={{ color: '#a9b1d6', wordBreak: 'break-all' }}>{log.message}</span>
</div>
);
})}

{logs.length === 0 && (
<div style={{ color: '#4A527A', fontStyle: 'italic' }}>Waiting for interaction...</div>
)}
</div>
</div>

<style>{`
@keyframes blink { 0%,100% { opacity:0.5; } 50% { opacity:0; } }
@keyframes fadeIn { from { opacity:0; transform:translateY(4px); } to { opacity:1; transform:translateY(0); } }
`}</style>
</div>
);
}
4 changes: 4 additions & 0 deletions apps/website/src/components/docs/MdxRenderer.tsx
@@ -5,6 +5,8 @@ import { Steps, Step } from './mdx/Steps';
import { Tabs, Tab } from './mdx/Tabs';
import { Card, CardGroup } from './mdx/Card';
import { CodeGroup } from './mdx/CodeGroup';
import { Pre } from './mdx/CodeBlock';
import { ArchFlowDiagram } from './ArchFlowDiagram';
import { DocsBreadcrumb } from './DocsBreadcrumb';
import { DocsPrevNext } from './DocsPrevNext';
import rehypePrettyCode from 'rehype-pretty-code';
@@ -19,6 +21,8 @@ const mdxComponents = {
Card,
CardGroup,
CodeGroup,
ArchFlowDiagram,
pre: Pre,
};

const rehypeOptions = {