# The Unified Data Pipeline Architecture

## Introduction

Historically, frontend frameworks have handled local API requests (like fetching a JSON file) and remote procedure calls (RPC) to backend services using completely different paradigms. You might use `fetch()` or `XMLHttpRequest` for local data, and a separate library or abstraction for WebSockets and RPC calls.

Neo.mjs introduces the **Unified Data Pipeline Architecture**. This architecture eliminates the boundary between local fetches and remote calls. Whether you are loading a local JSON file, executing a standard REST API call, subscribing to a continuous WebSocket stream, or dispatching an RPC command to a remote service, the data flows through the exact same Pipeline mechanism.

This guide explains the four core pillars of this architecture: **Pipelines**, **Connections**, **Parsers**, and **Normalizers**, and shows how they enable powerful features like "Turbo Mode" and Cross-Worker execution.

## The Four Pillars

A Data Pipeline in Neo.mjs is essentially an assembly line. Raw bytes or requests enter at the start, and fully formed, predictable JavaScript objects exit at the end, ready for a `Store` to consume.

### 1. Connections (The Transport Layer)
The Connection is the gateway to the outside world. Its **only** job is transport: it handles the low-level protocols and network handshakes, and returns the raw payload.

* `Neo.data.connection.Fetch`: Wraps the modern browser `fetch` API.
* `Neo.data.connection.Xhr`: Wraps the legacy `XMLHttpRequest` API.
* `Neo.data.connection.WebSocket`: Handles persistent, bidirectional socket connections.
* `Neo.data.connection.Stream`: A specialized transport that returns a raw `ReadableStream` (byte pipe), useful for chunked data.

### 2. Parsers (The Deserializer)
Connections often return data in formats that JavaScript cannot natively digest (e.g., text, byte streams, XML, CSV, NDJSON). The Parser takes the raw output from the Connection and translates it into JavaScript objects.

* `Neo.data.parser.Stream`: Takes a raw byte stream, chunks it by newlines, parses the NDJSON, and trickles the data forward.
* *Note: Standard JSON responses from `Fetch` or `Xhr` often don't need a formal parser if the native `.json()` method is sufficient.*

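To make the Parser's role concrete, here is a framework-agnostic sketch of newline-based NDJSON chunking. This is illustrative only, not the actual `Neo.data.parser.Stream` implementation:

```javascript
// Conceptual sketch: accumulate text chunks, split complete lines on '\n',
// parse each complete line as JSON, and keep any trailing partial line
// buffered until the next chunk arrives.
function createNdjsonParser(onRecord) {
    let buffer = '';

    return {
        // Feed a decoded text chunk (e.g. from a ReadableStream)
        push(chunk) {
            buffer += chunk;
            const lines = buffer.split('\n');
            buffer = lines.pop(); // keep the trailing partial line
            lines.filter(line => line.trim()).forEach(line => onRecord(JSON.parse(line)));
        },
        // Flush a final record that was not newline-terminated
        end() {
            if (buffer.trim()) {
                onRecord(JSON.parse(buffer));
                buffer = '';
            }
        }
    };
}
```

When the transport delivers raw bytes, a `TextDecoder` would sit between the `ReadableStream` and a parser like this.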
### 3. Normalizers (The Shaper)
Even if data is valid JSON, its shape might not match what your `Store` expects. Your backend might wrap the data in metadata (`{ success: true, payload: [...] }`), or use different property names. The Normalizer bridges this gap, flattening or mapping the data into the canonical structure defined by your `Model`.

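As an illustration, normalizing the envelope shape above could look like this plain-JavaScript sketch. The `user_name`/`user_role` backend names are hypothetical, and this is not the Neo.mjs normalizer API:

```javascript
// Conceptual sketch: unwrap a metadata envelope and map backend
// property names onto the canonical Model field names.
function normalizeUsers(response) {
    if (!response.success) {
        throw new Error('Backend reported a failure');
    }

    // Flatten the envelope and rename properties to match the Model
    return response.payload.map(item => ({
        id  : item.id,
        name: item.user_name,
        role: item.user_role
    }));
}
```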
### 4. The Pipeline (The Orchestrator)
The `Neo.data.Pipeline` ties these three pieces together. It manages the flow of data from Connection -> Parser -> Normalizer.

Crucially, the Pipeline is the **Worker Execution Boundary**. You can configure a Pipeline to execute its heavy lifting in the `App` Worker or offload it entirely to the `Data` Worker to prevent UI freezing during massive data loads.

---

## 1. A Basic Fetch Pipeline

Let's start with the most common scenario: fetching a standard JSON file.

When you define a `url` on a Store, Neo.mjs automatically creates a Pipeline under the hood, using an `Xhr` or `Fetch` connection. However, explicitly defining the pipeline gives you total control.

```javascript live-preview
import Button from '../../src/button/Base.mjs';
import Container from '../../src/container/Base.mjs';
import ConnectionFetch from '../../src/data/connection/Fetch.mjs';
import Model from '../../src/data/Model.mjs';
import Store from '../../src/data/Store.mjs';
import Table from '../../src/table/Container.mjs';

class UserModel extends Model {
    static config = {
        className  : 'Docs.UserModel',
        keyProperty: 'id',
        fields: [
            {name: 'id',   type: 'Integer'},
            {name: 'name', type: 'String'},
            {name: 'role', type: 'String'}
        ]
    }
}

class UserStore extends Store {
    static config = {
        className: 'Docs.UserStore',
        model    : UserModel,
        // The unified pipeline configuration
        pipeline : {
            connection: {
                module: ConnectionFetch,
                url   : '../../resources/data/users.json'
            }
        }
    }
}

const myStore = Neo.create(UserStore);

export default class Example extends Container {
    static config = {
        layout: {ntype: 'vbox', align: 'stretch'},
        items : [{
            module : Button,
            text   : 'Load Data via Pipeline',
            handler: () => myStore.load()
        }, {
            module : Table,
            flex   : 1,
            store  : myStore,
            columns: [
                {dataField: 'id',   text: 'ID'},
                {dataField: 'name', text: 'Name'},
                {dataField: 'role', text: 'Role'}
            ]
        }]
    }
}
```

---

## 2. Offloading to the Data Worker

If your JSON file is massive (e.g., 50,000 records), processing that fetch, parsing the JSON string, and converting it into Records inside the App Worker will cause a noticeable stutter in your UI.

Because the Pipeline acts as the Execution Boundary, you can simply tell it to execute in the `Data` Worker. The App Worker Pipeline becomes a lightweight proxy. It instructs the Data Worker to establish the Connection, parse the data, and send only the finalized chunks back via fast IPC (Inter-Process Communication).

To enable this, simply add `workerExecution: 'data'` to your pipeline config:

```javascript
pipeline: {
    workerExecution: 'data',
    connection: {
        className: 'Neo.data.connection.Fetch',
        url      : '../../resources/data/massive_dataset.json'
    }
}
```

*Note: When using `workerExecution: 'data'`, you must use string-based `className` references (e.g., `'Neo.data.connection.Fetch'`) instead of `module` imports for your connection/parser/normalizer. This ensures the configs can be cleanly serialized and sent to the other thread without dragging App-specific modules across the worker boundary.*


---

## 3. The RPC/WebSocket Universe

Because RPC and local fetching now share the same architecture, integrating a WebSocket backend is identical to setting up a local Fetch.

If your project defines an RPC API (via `remotes-api.json`), you don't even need to define the connection manually. You just reference the API endpoint, and the system dynamically constructs the WebSocket pipeline for you.

```javascript live-preview
import Container from '../../src/container/Base.mjs';
import Model from '../../src/data/Model.mjs';
import Store from '../../src/data/Store.mjs';
import Table from '../../src/table/Container.mjs';

// Assume remotes-api.json defines a WebSocket stream:
// "services": { "Backend": { "streams": { "LiveUsers": { "type": "websocket", "url": "wss://..." } } } }

class LiveUserModel extends Model {
    static config = {
        className  : 'Docs.LiveUserModel',
        keyProperty: 'id',
        fields: [
            {name: 'id',     type: 'Integer'},
            {name: 'status', type: 'String'}
        ]
    }
}

class LiveUserStore extends Store {
    static config = {
        className: 'Docs.LiveUserStore',
        model    : LiveUserModel,
        // Instead of 'url', we use the 'api' shortcut.
        // The Store automatically builds a Pipeline with a WebSocket connection.
        api      : 'MyApp.backend.LiveUsers'
    }
}

const liveStore = Neo.create(LiveUserStore);

export default class Example extends Container {
    static config = {
        layout: {ntype: 'vbox', align: 'stretch'},
        items : [{
            module : Table,
            flex   : 1,
            store  : liveStore,
            columns: [
                {dataField: 'id',     text: 'User ID'},
                {dataField: 'status', text: 'Status'}
            ]
        }]
    }
}
```

### Unsolicited Pushes & UI Reactivity

The true power of the Pipeline architecture shines with real-time data.

If a WebSocket Connection receives an unsolicited "push" from the server (e.g., `{ "id": 4, "status": "offline" }`), the Connection fires a `push` event. The Pipeline catches this, passes it through the Parser and Normalizer, and forwards it to the Store.

The Store intercepts the `push` event, automatically looks up the Record with ID 4, and calls `record.set({status: 'offline'})`. This triggers the surgical reactivity engine, updating **only that specific row** in the Table, without ever reloading the collection.

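The lookup-and-patch step can be sketched in isolation, with a plain `Map` standing in for the real Store. The names here are illustrative, not the Neo.mjs API:

```javascript
// Conceptual sketch: a minimal store that indexes records by id and
// applies unsolicited server pushes as in-place patches.
class MiniStore {
    constructor(records) {
        // Index records by id for O(1) push lookups
        this.map = new Map(records.map(rec => [rec.id, rec]));
    }

    // Called when the pipeline forwards an unsolicited server push
    onPush(data) {
        const record = this.map.get(data.id);

        if (record) {
            // Patch the existing record in place: only this row changes
            Object.assign(record, data);
        } else {
            // Unknown id: treat the push as an insert
            this.map.set(data.id, data);
        }
    }
}
```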
## Migration Path for Legacy Configs

If you are upgrading from an older version of Neo.mjs, your existing `url` and `api` configs on Stores are still fully supported.

* **Legacy `url`:** If you define `url: 'data.json'`, the Store automatically creates a Pipeline using a `Neo.data.connection.Xhr` connection behind the scenes.
* **Legacy `api`:** If you define `api: 'MyService'`, the Store resolves the API definition and builds the appropriate Pipeline (Fetch, Xhr, or WebSocket) dynamically.

However, to unlock advanced features like Data Worker offloading (`workerExecution: 'data'`) or custom Parsers, you must switch to explicitly defining the `pipeline` config block.
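
For reference, here is a legacy shorthand next to a plausible explicit equivalent. This is a sketch based on the configs shown above; the file name is illustrative:

```javascript
// Legacy shorthand: the Store builds an Xhr-based Pipeline implicitly
{
    model: UserModel,
    url  : 'data.json'
}

// Explicit pipeline: same data source, but now open to advanced options
// such as workerExecution: 'data' or a custom parser/normalizer
{
    model   : UserModel,
    pipeline: {
        connection: {
            className: 'Neo.data.connection.Xhr',
            url      : 'data.json'
        }
    }
}
```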