From 160f06f51044f9160c840ecb3fca42f85d637b1b Mon Sep 17 00:00:00 2001 From: Brian Love Date: Sat, 4 Apr 2026 15:30:33 -0700 Subject: [PATCH 1/2] docs(website): massively expand LangGraph Basics with agent patterns MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit From 66 to 377 lines. Covers: - Core concepts: nodes, edges, state with detailed code - 4 agent patterns: ReAct, human-in-the-loop, multi-agent, persistence - Each pattern with Python graph code + Angular streamResource connection - Signal mapping table showing every LangGraph concept → Signal - Graph API vs Functional API comparison - Expanded What's Next with 6 cards --- .../docs-v2/concepts/langgraph-basics.mdx | 388 ++++++++++++++++-- apps/website/next-env.d.ts | 2 +- 2 files changed, 351 insertions(+), 39 deletions(-) diff --git a/apps/website/content/docs-v2/concepts/langgraph-basics.mdx b/apps/website/content/docs-v2/concepts/langgraph-basics.mdx index 4a8e27d94..b3f86ecaf 100644 --- a/apps/website/content/docs-v2/concepts/langgraph-basics.mdx +++ b/apps/website/content/docs-v2/concepts/langgraph-basics.mdx @@ -1,65 +1,377 @@ # LangGraph Basics -LangGraph is a framework for building stateful AI agents as directed graphs. This page explains the core concepts for Angular developers who are new to agent development. +LangGraph is a framework for building stateful AI agents as directed graphs. If you're an Angular developer building AI-powered applications, this page teaches you how LangGraph agents work and why streamResource() is the natural bridge between your frontend and your agent backend. -## Graphs, nodes, and edges + +Graphs give you explicit control over agent behavior. Instead of a black-box prompt-and-pray approach, you define exactly how your agent reasons, when it calls tools, and where it pauses for human input. Every step is visible, testable, and debuggable. 
+ + +## The Core Concepts + +A LangGraph agent has three building blocks: + +### Nodes — Functions That Do Work + +A node is a Python function that receives the current state, does something, and returns updated state. Every node has the same signature: + +```python +def my_node(state: State, config: RunnableConfig) -> dict: + # Read from state + messages = state["messages"] + + # Do work (call LLM, query DB, invoke tool) + response = llm.invoke(messages) + + # Return state updates (merged into existing state) + return {"messages": [response]} +``` + + +Nodes don't replace state — they return updates that get **merged** into the existing state. For lists like messages, LangGraph uses reducers (like `operator.add`) to accumulate entries instead of overwriting. + + +### Edges — Connections Between Nodes + +Edges define the execution flow. There are two types: + +**Normal edges** — always route to the next node: +```python +builder.add_edge(START, "call_model") # Start → call_model +builder.add_edge("call_model", END) # call_model → End +``` + +**Conditional edges** — route based on state: +```python +def should_continue(state: State) -> str: + last_msg = state["messages"][-1] + if last_msg.tool_calls: + return "tools" # Agent wants to use a tool + return END # Agent is done, return response + +builder.add_conditional_edges("call_model", should_continue) +``` + +### State — The Shared Memory + +All nodes read from and write to a shared state object. You define its shape as a Python `TypedDict`: + +```python +from typing_extensions import TypedDict, Annotated +from operator import add + +class State(TypedDict): + messages: Annotated[list, add] # Accumulates messages + plan: list[str] # Agent's current plan + results: dict # Tool results +``` + +This state is exactly what streamResource() exposes to your Angular app through Signals. 
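+
The reducer-based merge described above can be sketched in plain Python. The helper below is an illustrative simulation of the merge semantics only, not LangGraph's internal implementation; `apply_update` and the sample `State` fields are hypothetical:

```python
from operator import add
from typing import Annotated, TypedDict, get_args, get_origin, get_type_hints

class State(TypedDict):
    messages: Annotated[list, add]  # has a reducer: updates accumulate
    plan: list                      # no reducer: updates overwrite

def apply_update(state: dict, update: dict, schema=State) -> dict:
    """Merge a node's returned update into state, reducer-style."""
    hints = get_type_hints(schema, include_extras=True)
    merged = dict(state)
    for key, value in update.items():
        hint = hints.get(key)
        # Pull the reducer out of Annotated metadata, if one was declared
        reducer = get_args(hint)[-1] if get_origin(hint) is Annotated else None
        if reducer is not None and key in merged:
            merged[key] = reducer(merged[key], value)  # e.g. list concatenation
        else:
            merged[key] = value
    return merged

state = {"messages": ["hi"], "plan": ["draft outline"]}
state = apply_update(state, {"messages": ["hello!"], "plan": ["write intro"]})
print(state["messages"])  # ['hi', 'hello!'] (accumulated by the reducer)
print(state["plan"])      # ['write intro'] (overwritten, no reducer)
```

This is why a `call_model`-style node can return only the new message: the reducer appends it to the existing list instead of replacing the conversation history.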
+ +## Building Your First Agent + +Here's the simplest possible agent — a chat model that takes messages and responds: + + + -A LangGraph agent is a directed graph where: +```python +from langgraph.graph import END, START, MessagesState, StateGraph +from langchain_openai import ChatOpenAI - - -Each node performs one action — calling an LLM, querying a database, or making an API request. Nodes receive state and return updated state. - - -Edges connect nodes. Conditional edges route execution based on state, enabling branching logic. - - -All nodes read from and write to a shared state object. This state is what streamResource() exposes through its signals. - - +llm = ChatOpenAI(model="gpt-5-mini") -## How streamResource connects +def call_model(state: MessagesState) -> dict: + response = llm.invoke(state["messages"]) + return {"messages": [response]} -Your Angular app doesn't run the graph — LangGraph Platform does. streamResource() is the bridge: +# Build the graph: START → call_model → END +builder = StateGraph(MessagesState) +builder.add_node("call_model", call_model) +builder.add_edge(START, "call_model") +builder.add_edge("call_model", END) -1. Your component calls `submit()` with user input -2. FetchStreamTransport sends an HTTP POST to LangGraph Platform -3. The platform runs the graph and streams state updates via SSE -4. streamResource() updates its Signals as events arrive -5. Angular re-renders your templates automatically +graph = builder.compile() +``` + + + -## State design +```json +{ + "dependencies": ["."], + "graphs": { + "chat_agent": "./src/chat_agent/agent.py:graph" + }, + "env": ".env", + "python_version": "3.12" +} +``` -The generic type parameter in `streamResource()` defines your agent's state shape. + + ```typescript -// Simple chat state -streamResource<{ messages: BaseMessage[] }>({ ... 
}) - -// Rich agent state with custom fields -interface AgentState { - messages: BaseMessage[]; - plan: string[]; - currentStep: number; - results: Record; +// This is all you need on the Angular side +const chat = streamResource<{ messages: BaseMessage[] }>({ + assistantId: 'chat_agent', +}); + +// chat.messages() updates as the agent streams its response +// chat.status() tells you if it's idle, loading, or done +``` + + + + +## Agent Patterns + +The power of LangGraph is in the patterns you can build. Each pattern maps to specific streamResource() signals. + +### Pattern 1: ReAct Agent (Tool Calling) + +The agent reasons, decides to call a tool, observes the result, and loops until it has an answer. + +```python +from langchain_core.tools import tool +from langgraph.prebuilt import ToolNode + +@tool +def search_docs(query: str) -> str: + """Search the knowledge base.""" + return vector_store.similarity_search(query) + +tools = [search_docs] + +def call_model(state: State) -> dict: + response = llm.bind_tools(tools).invoke(state["messages"]) + return {"messages": [response]} + +def should_continue(state: State) -> str: + if state["messages"][-1].tool_calls: + return "tools" + return END + +builder = StateGraph(State) +builder.add_node("model", call_model) +builder.add_node("tools", ToolNode(tools)) +builder.add_edge(START, "model") +builder.add_conditional_edges("model", should_continue) +builder.add_edge("tools", "model") # Loop back after tool execution + +graph = builder.compile() +``` + +**Angular connection:** Track tool execution in real-time: +```typescript +const agent = streamResource({ + assistantId: 'react_agent', +}); + +// Watch tools execute +const activeTools = computed(() => agent.toolProgress()); +const completedTools = computed(() => agent.toolCalls()); +``` + +### Pattern 2: Human-in-the-Loop (Approval) + +The agent proposes an action and pauses. Your Angular UI shows an approval dialog. The user decides, and the agent resumes.
+ +```python +from langgraph.types import interrupt + +def propose_action(state: State) -> dict: + action = llm.invoke(state["messages"]) + # Pause execution — Angular will show approval UI + # interrupt() suspends the graph and returns the resume payload + decision = interrupt({ + "action": "send_email", + "to": "client@example.com", + "body": action.content, + }) + return {"pending_action": action.content if decision["approved"] else None} + +def execute_action(state: State) -> dict: + # Only runs after human approves + send_email(state["pending_action"]) + return {"messages": [{"role": "assistant", "content": "Email sent."}]} +``` + +**Angular connection:** The interrupt surfaces automatically: +```typescript +const agent = streamResource({ + assistantId: 'approval_agent', +}); + +// Show approval UI when agent pauses +const pendingAction = computed(() => agent.interrupt()); + +// User clicks approve → resume the agent +approve() { + agent.submit(null, { resume: { approved: true } }); } ``` - -For deeper LangGraph concepts (persistence, interrupts, memory), see the individual guide pages. +### Pattern 3: Multi-Agent Orchestration + +A supervisor agent delegates work to specialist sub-agents. Each sub-agent is its own graph.
+ +```python +def supervisor(state: State) -> dict: + routing = llm.invoke([ + {"role": "system", "content": "Route to: researcher, analyst, or writer"}, + *state["messages"] + ]) + # tool_calls entries are dicts, so index into them + return {"next_agent": routing.tool_calls[0]["args"]["agent"]} + +builder = StateGraph(State) +builder.add_node("supervisor", supervisor) +builder.add_node("researcher", researcher_subgraph) +builder.add_node("analyst", analyst_subgraph) +builder.add_node("writer", writer_subgraph) +builder.add_edge(START, "supervisor") +builder.add_conditional_edges("supervisor", lambda s: s["next_agent"]) +``` + +**Angular connection:** Track each sub-agent independently: +```typescript +const orchestrator = streamResource({ + assistantId: 'orchestrator', + subagentToolNames: ['researcher', 'analyst', 'writer'], +}); + +// See all active sub-agents +const workers = computed(() => orchestrator.activeSubagents()); +const workerCount = computed(() => workers().length); +``` + +### Pattern 4: Persistent Conversations + +Thread-based persistence means conversations survive page refreshes, browser restarts, and even server deployments.
+ +```python +from langgraph.checkpoint.postgres import PostgresSaver + +# from_conn_string is a context manager that opens the connection +with PostgresSaver.from_conn_string(DATABASE_URL) as checkpointer: + checkpointer.setup() # create checkpoint tables on first run + graph = builder.compile(checkpointer=checkpointer) + + # Each thread_id is a persistent conversation + result = graph.invoke( + {"messages": [user_message]}, + config={"configurable": {"thread_id": "user_123_session"}} + ) +``` + +**Angular connection:** Thread persistence is built into streamResource: +```typescript +const chat = streamResource({ + assistantId: 'chat_agent', + threadId: signal(localStorage.getItem('threadId')), + onThreadId: (id) => localStorage.setItem('threadId', id), +}); + +// User returns tomorrow — same thread, full history restored +// No code needed — streamResource handles it +``` + +## How streamResource() Bridges the Gap + +Here's why streamResource() is the natural Angular companion for LangGraph: + + + + +``` +Your Angular App + ↓ submit({ messages: [userMsg] }) +streamResource() + ↓ HTTP POST to LangGraph Platform +FetchStreamTransport + ↓ Creates thread, starts run +LangGraph Platform + ↓ Executes graph nodes + ↓ Streams SSE events back +FetchStreamTransport + ↓ Parses events into BehaviorSubjects +streamResource() + ↓ Converts to Angular Signals via toSignal() +Your Angular App + → Templates re-render automatically +``` + + + + +```typescript +// Every LangGraph concept maps to a Signal: + +// Agent state values +agent.value() // Signal — full state object + +// Conversation +agent.messages() // Signal — message history + +// Lifecycle +agent.status() // Signal — idle/loading/done +agent.isLoading() // Signal — is the agent running?
+ +// Human-in-the-loop +agent.interrupt() // Signal — agent is paused + +// Debugging +agent.history() // Signal — checkpoint timeline +agent.branch() // Signal — time-travel branch + +// Multi-agent +agent.subagents() // Signal — delegated agents +agent.activeSubagents() // Signal — running workers +agent.toolCalls() // Signal — tool results +``` + + + + + +You don't configure SSE, parse events, manage WebSocket connections, or handle reconnection. streamResource() does all of that. You call `submit()` and read Signals — that's the entire API surface for your Angular code. +## Graph API vs Functional API + +LangGraph offers two ways to define agents: + +**Graph API** (recommended for most cases): +```python +builder = StateGraph(State) +builder.add_node("model", call_model) +builder.add_edge(START, "model") +graph = builder.compile() +``` + +**Functional API** (for simpler workflows): +```python +from langgraph.func import entrypoint, task + +@entrypoint +async def agent(messages): + response = await call_model(messages) + return response +``` + +Both APIs produce the same output and work identically with streamResource(). Choose the Graph API when you need conditional routing, subgraphs, or interrupts. Choose the Functional API for simple, linear workflows. + ## What's Next - Understand the planning, tool-calling, and execution lifecycle. + Deep dive into the planning, tool-calling, and execution lifecycle - Stream token-by-token responses from your LangGraph agent. + Stream token-by-token responses with multiple stream modes + + + Build human-in-the-loop approval flows + + + Compose multi-agent systems with orchestrators + + + Thread-based conversation persistence - Learn how streamResource exposes agent state as Angular Signals. 
+ How Signals power streamResource's reactive model diff --git a/apps/website/next-env.d.ts b/apps/website/next-env.d.ts index c4b7818fb..fdbfe5258 100644 --- a/apps/website/next-env.d.ts +++ b/apps/website/next-env.d.ts @@ -1,6 +1,6 @@ /// /// -import "./.next/dev/types/routes.d.ts"; +import "./../../dist/apps/website/.next/types/routes.d.ts"; // NOTE: This file should not be edited // see https://nextjs.org/docs/app/api-reference/config/typescript for more information. From 4e1461a7c00f0873051e27b05b942870de784e81 Mon Sep 17 00:00:00 2001 From: Brian Love Date: Sat, 4 Apr 2026 16:09:55 -0700 Subject: [PATCH 2/2] fix(website): replace ASCII data flow diagram with Steps component --- .../docs-v2/concepts/langgraph-basics.mdx | 40 +++++++++++-------- apps/website/next-env.d.ts | 2 +- 2 files changed, 24 insertions(+), 18 deletions(-) diff --git a/apps/website/content/docs-v2/concepts/langgraph-basics.mdx b/apps/website/content/docs-v2/concepts/langgraph-basics.mdx index b3f86ecaf..7e9e10536 100644 --- a/apps/website/content/docs-v2/concepts/langgraph-basics.mdx +++ b/apps/website/content/docs-v2/concepts/langgraph-basics.mdx @@ -275,23 +275,29 @@ Here's why streamResource() is the natural Angular companion for LangGraph: -``` -Your Angular App - ↓ submit({ messages: [userMsg] }) -streamResource() - ↓ HTTP POST to LangGraph Platform -FetchStreamTransport - ↓ Creates thread, starts run -LangGraph Platform - ↓ Executes graph nodes - ↓ Streams SSE events back -FetchStreamTransport - ↓ Parses events into BehaviorSubjects -streamResource() - ↓ Converts to Angular Signals via toSignal() -Your Angular App - → Templates re-render automatically -``` + + +Calls `submit({ messages: [userMsg] })` to send user input + + +Passes input to the transport layer + + +Sends HTTP POST to LangGraph Platform, opens SSE connection + + +Executes graph nodes, calls tools, streams SSE events back + + +Parses SSE chunks into BehaviorSubjects + + +Converts BehaviorSubjects to Angular Signals via 
`toSignal()` + + +Templates re-render automatically via OnPush change detection + + diff --git a/apps/website/next-env.d.ts b/apps/website/next-env.d.ts index fdbfe5258..c4b7818fb 100644 --- a/apps/website/next-env.d.ts +++ b/apps/website/next-env.d.ts @@ -1,6 +1,6 @@ /// /// -import "./../../dist/apps/website/.next/types/routes.d.ts"; +import "./.next/dev/types/routes.d.ts"; // NOTE: This file should not be edited // see https://nextjs.org/docs/app/api-reference/config/typescript for more information.