resources/content/release-notes/v12.1.0.md
We engineered a completely new hierarchical data foundation that fully supports nested, tree-shaped data:

* **Filtering (`filter`):** When you filter a `TreeStore`, it employs "Ancestor-Aware Filtering" via a top-down recursive evaluation. If a deeply nested child matches the search query, the engine automatically bubbles up a visibility flag, ensuring all of its parent nodes remain visible in the projection. This integrates perfectly with "Turbo Mode Soft Hydration", validating complex nested paths without the V8 garbage-collection penalty of instantiating thousands of Records.
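The bubbling behavior can be sketched in plain JavaScript. The `{name, children}` node shape and the `filterTree` helper below are illustrative assumptions for this sketch, not the actual `TreeStore` internals:

```javascript readonly
// Minimal sketch of ancestor-aware filtering over a nested node array.
// A node stays visible if it matches the predicate OR any descendant does,
// so matching children "bubble up" visibility to all of their ancestors.
function filterTree(nodes, predicate) {
    const visible = [];

    for (const node of nodes) {
        const children    = filterTree(node.children || [], predicate);
        const selfMatches = predicate(node);

        if (selfMatches || children.length > 0) {
            visible.push({...node, children});
        }
    }

    return visible;
}

const tree = [
    {name: 'root', children: [
        {name: 'docs',  children: [{name: 'api.md', children: []}]},
        {name: 'tests', children: []}
    ]}
];

// Only the branch leading to the matching leaf survives the projection.
const projection = filterTree(tree, node => node.name === 'api.md');
```

Because the recursion evaluates children before deciding on the parent, ancestor visibility falls out for free: no second pass over the tree is needed.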
* **Sorting (`doSort`):** Sorting is now perfectly hierarchical. The Store sorts siblings within their localized parent boundary, recursively maintaining the structural integrity before surgically re-calculating the visible projection array.
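A minimal sketch of sibling-scoped recursive sorting; the `{name, children}` node shape is an assumption for illustration, not the real record format:

```javascript readonly
// Siblings are sorted only within their own parent boundary, then each
// child list is sorted recursively, so the tree is never flattened and
// nodes never move between branches.
function sortTree(nodes, compare) {
    return [...nodes]
        .sort(compare)
        .map(node => ({...node, children: sortTree(node.children || [], compare)}));
}

const byName = (a, b) => a.name.localeCompare(b.name);

const sorted = sortTree([
    {name: 'b', children: [{name: 'z', children: []}, {name: 'a', children: []}]},
    {name: 'a', children: []}
], byName);
```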
* **Structural Integrity (The Split-Brain Fix):** Complex operations like `updateKey` have been overridden to automatically migrate descendant `parentId` references. In a traditional DOM, updating an ID deep in a tree is terrifying; in Neo, we simply update the O(1) lookup map and let the VDOM worker handle the visual delta. Full CRUD support is built-in.
```javascript readonly
// Syncing the Structural Layer in O(1) time without rebuilding the VDOM.
```

* **Async Subtree Loading:** For massive datasets that cannot be sent to the client at once, the `TreeStore` supports asynchronous, on-demand loading of child nodes when a user expands a parent branch. We also implemented explicit Error States and dedicated events to handle network failures gracefully within the UI.
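The expand-on-demand flow with an explicit error state can be sketched as follows. `expandNode`, `loadChildren`, and the node flags used here are illustrative assumptions, not the actual `TreeStore` API:

```javascript readonly
// Children are fetched only when a branch is first expanded; a failed
// request flips the node into a dedicated error state instead of
// breaking the tree or retrying silently.
async function expandNode(node, loadChildren) {
    if (node.isLoaded) return node; // already hydrated, nothing to do

    node.isLoading = true;

    try {
        node.children = await loadChildren(node.id);
        node.isLoaded = true;
        node.error    = null;
    } catch (e) {
        node.error = e.message; // surface the failure to the UI layer
    } finally {
        node.isLoading = false;
    }

    return node;
}
```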
To orchestrate complex data ingestion without freezing the UI, we completely overhauled the data pipeline layer:

* **The Worker Execution Boundary:** The most powerful feature of the new `Neo.data.Pipeline` is that it acts as a fluid execution boundary. Developers can now trivially toggle where the heavy lifting occurs via a simple config (`workerExecution: 'data' | 'app'`). A pipeline can run entirely inside the `App` worker for typical datasets, or be instantly offloaded to the `Data` worker. When offloaded, the App Worker Pipeline becomes a lightweight proxy—it instructs the Data Worker to establish the connection, parse massive JSON payloads, and stream only the finalized records back via fast IPC.
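The boundary toggle can be sketched as a single dispatch point. The config name mirrors the `workerExecution: 'data' | 'app'` flag above; `runPipeline`, `sendToDataWorker`, and the stage array are illustrative assumptions, not the real `Neo.data.Pipeline` API:

```javascript readonly
// The same pipeline definition either executes its stages locally or is
// proxied to the Data worker, based on one config flag.
function runPipeline(pipeline, payload) {
    if (pipeline.workerExecution === 'data') {
        // App worker acts as a lightweight proxy: heavy parsing happens
        // remotely, only finalized records come back over IPC.
        return pipeline.sendToDataWorker(payload);
    }

    // Typical datasets: run every stage inside the App worker itself.
    return pipeline.stages.reduce((data, stage) => stage(data), payload);
}

const appPipeline = {
    workerExecution: 'app',
    stages: [
        data    => JSON.parse(data),             // parser stage
        records => records.map(r => ({...r}))    // normalizer stage
    ]
};

const records = runPipeline(appPipeline, '[{"id":1},{"id":2}]');
```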
* **The Parser-Normalizer Split:** Data now flows through a strict, modular architecture: `Connection -> Parser -> Normalizer`. The new `Neo.data.normalizer.Base` (and its hierarchical sibling, `Tree`) acts as the final transformation layer, reshaping complex nested payloads from external APIs into predictable shapes before handing them to the Store.
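The `Connection -> Parser -> Normalizer` flow reduces to a simple composition. The envelope layout and field names below are illustrative assumptions; the connection is stubbed where a real fetch or socket would sit:

```javascript readonly
// Parser: extract the raw rows from the transport envelope.
const parser = response => response.data.items;

// Normalizer: reshape each nested row into the flat, predictable shape
// the Store expects.
const normalizer = rows => rows.map(row => ({
    id      : row.id,
    fullName: `${row.name.first} ${row.name.last}` // flatten nested name
}));

// Connection stub standing in for a real network call.
const connection = () => ({
    data: {items: [{id: 1, name: {first: 'Ada', last: 'Lovelace'}}]}
});

const storeData = normalizer(parser(connection()));
```

Because each stage owns exactly one concern, a Parser or Normalizer can be swapped per endpoint without touching the Connection or the Store.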
* **The Merged Universe (RPC API Integration):** Historically, Neo's Remote API (`Neo.remotes.Api`) generated typed proxy functions that bypassed data shaping. Now, these two architectures are merged. Developers can define `parser` and `normalizer` configurations directly inside the `remotes-api.json` manifest. When an App Worker ViewController invokes an RPC proxy, the Data Worker automatically intercepts the raw backend response, pipes it through the requested Pipeline components, and delivers perfectly shaped data back to the UI.
```json readonly
// remotes-api.json
{
    "MyApp.backend.Users": {
        "getUsers": {
            "pipeline": {
                "parser"    : "MyApp.parser.User",
                "normalizer": "MyApp.normalizer.User"
            }
        }
    }
}
```
* **Progressive Hydration & Delta-Aware Pipelines:** Modern backends often push lightweight "Quick Wins" (like IDs) immediately, streaming complex aggregations later via WebSockets as Operational Transforms. Pipelines are now natively "Delta-Aware." A specialized Parser can evaluate a proprietary backend Opcode, translate it into standard Deltas (Insert, Update, Delete), and feed it to the Store. The `RecordFactory` increments the entity version, and the VDOM Worker surgically patches only the affected grid cells without losing local UI state or resetting the collection.
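The opcode-to-delta translation can be sketched as below. The opcode numbers, frame layout, and record shape are illustrative assumptions, not a real backend protocol:

```javascript readonly
// A proprietary backend opcode is mapped to a standard delta and applied
// to the collection in place, so local UI state survives instead of the
// whole collection being reset.
const OPCODES = {1: 'insert', 2: 'update', 3: 'delete'};

function applyDelta(records, frame) {
    const action = OPCODES[frame.op];
    const index  = records.findIndex(r => r.id === frame.id);

    if (action === 'insert') {
        records.push({id: frame.id, ...frame.fields, version: 1});
    } else if (action === 'update' && index > -1) {
        Object.assign(records[index], frame.fields);
        records[index].version++; // entity versioning, RecordFactory-style
    } else if (action === 'delete' && index > -1) {
        records.splice(index, 1);
    }

    return records;
}

const data = [];
applyDelta(data, {op: 1, id: 7, fields: {stock: 10}}); // quick win: insert
applyDelta(data, {op: 2, id: 7, fields: {stock: 12}}); // later stream: update
```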
* **Dynamic Module Loading:** To keep the worker bundles instantaneously responsive, the `Data` worker dynamically requests and instantiates Connection, Parser, and Normalizer modules exactly when a proxied Pipeline requires them.
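The lazy-instantiation pattern can be sketched as a cached loader. The registry of loader functions stands in for real dynamic `import()` calls against module URLs; `loadStage` and the class names are illustrative assumptions:

```javascript readonly
// A Parser/Normalizer class is fetched and instantiated only the first
// time a pipeline needs it, then cached for every later request.
const moduleCache = new Map();

async function loadStage(className, registry) {
    if (!moduleCache.has(className)) {
        // In the real worker this would be: (await import(url)).default
        const Module = await registry[className]();
        moduleCache.set(className, new Module());
    }

    return moduleCache.get(className);
}
```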