From d3a832baa6979d0d5c435798cad246da94c0f3ab Mon Sep 17 00:00:00 2001 From: Chenglong Wang Date: Sat, 11 Apr 2026 20:18:05 -0700 Subject: [PATCH 1/6] data plugin ui plan --- design-docs/8-unified-data-source-panel.md | 384 +++++++++++++++++++++ tests/superset/.env.superset | 11 + tests/superset/README.md | 67 ++++ tests/superset/docker-compose.yml | 55 +++ tests/superset/init-superset.sh | 8 + tests/superset/sample_data.py | 179 ++++++++++ tests/superset/start.sh | 55 +++ 7 files changed, 759 insertions(+) create mode 100644 design-docs/8-unified-data-source-panel.md create mode 100644 tests/superset/.env.superset create mode 100644 tests/superset/README.md create mode 100644 tests/superset/docker-compose.yml create mode 100755 tests/superset/init-superset.sh create mode 100644 tests/superset/sample_data.py create mode 100755 tests/superset/start.sh diff --git a/design-docs/8-unified-data-source-panel.md b/design-docs/8-unified-data-source-panel.md new file mode 100644 index 00000000..f6c07d32 --- /dev/null +++ b/design-docs/8-unified-data-source-panel.md @@ -0,0 +1,384 @@ +# Unified Data Source Panel — File-Directory Approach + +## Status: Draft / Discussion + +## 1. Problem + +The current Superset plugin uses a two-tab layout (Dashboards tab + Datasets tab) that ultimately does the same thing: load a dataset into the workspace. As we add more data plugins (Superset, Metabase, databases, file uploads, etc.), users need a single, intuitive way to browse and import data from all sources. + +Additionally, there's no clear starting experience — before any data is loaded, the user sees a landing page with upload options and demos, but no persistent "data browser" that encourages exploration. + +## 2. Proposal + +### 2.1 File-Directory Panel on Left Side of Data Thread + +Add a collapsible **data source browser** on the left side, styled like a file system tree. 
Users can expand/collapse sources, browse their contents, and import data into the workspace with a single click. + +``` +DATA SOURCES (collapsible sidebar) +───────────────────────────────── +▸ 📂 Local Files + upload.csv + paste-data.tsv + +▾ 📂 Superset (connected) + ▾ 📊 Q3 Sales Dashboard + orders_fact (150k rows) [⊕] + product_dim (2k rows) [⊕] + region_hierarchy (500 rows) [⊕] + ▸ 📊 Customer Analytics + ▸ 📁 Ungrouped Datasets + raw_events (1M rows) [⊕] + +▸ 📂 MySQL — analytics-db + schema: public + ▸ users + ▸ events + +▸ 📂 Metabase (not connected) +``` + +**[⊕] = one click to import** into workspace (adds table to data thread). + +### 2.2 Hierarchy Design + +``` +Plugin (data source) + └─ Group (optional: dashboard, schema, folder) + └─ Table / Dataset +``` + +**Open question: should groups nest deeper?** + +| Approach | Example | Pros | Cons | +|----------|---------|------|------| +| **Flat (2 levels)** | Plugin → Tables | Simple, fast to scan | Databases with many schemas may be overwhelming | +| **Grouped (3 levels)** | Plugin → Group → Tables | Natural for dashboards, schemas | Deeper nesting = more clicks | +| **Plugin-defined** | Plugins define their own depth (Superset uses groups, file upload is flat) | Each plugin presents data naturally; respects the source's native structure | Slightly inconsistent tree depth | + +**Recommendation: Plugin-defined hierarchy.** The tree renders whatever structure the plugin provides — Data Formulator doesn't impose or flatten it. Each plugin knows its data best: Superset naturally groups by dashboard, a database plugin exposes schema → table, and file uploads are flat. This respects the source system's native organization and avoids lossy abstraction. + +### 2.3 Interaction Model + +| Action | Behavior | +|--------|----------| +| **Expand plugin** | If not connected, show login/connect prompt inline. If connected, fetch and show contents. | +| **Expand group (dashboard)** | Fetch datasets in that group. 
Shows row count and column info. | +| **Click [⊕] on a table** | If dataset fits within row limit → import directly. If it exceeds the limit → pop up a filter/column-selection dialog (see §2.5). | +| **Right-click / long-press** | Context menu: Force open filter dialog, custom table name. | +| **Drag table** | (Future) Drag into data thread to position in a specific chain. | +| **Search** | Filter tree by name across all sources. | +| **Switch plugin** | All plugins visible at once — no switching needed. Collapse ones you don't use. | + +### 2.4 Plugin Switching + +Since all plugins appear as top-level folders in one tree, there's **no need to switch** between them. Users just expand the source they want. This is better than tabs/dropdowns: +- No "which tab am I on?" confusion +- Easy to pull data from multiple sources in one session +- Collapsed plugins take minimal space + +### 2.5 Progressive Import: Auto-Filter for Large Datasets + +One button [⊕] handles both small and large datasets: + +**Small dataset (within row limit):** Import happens immediately, no extra steps. + +**Large dataset (exceeds row limit):** A filter dialog pops up automatically: + +``` +┌─ Import: orders_fact (1.2M rows) ──────────────────┐ +│ │ +│ This dataset exceeds the row limit (50,000). │ +│ Select columns and filters to narrow the data. 
│ +│ │ +│ Columns (12 available): │ +│ ☑ order_id ☑ customer_id ☑ amount │ +│ ☑ region ☐ internal_id ☐ updated_at │ +│ ☑ order_date ☐ raw_payload ☑ status │ +│ │ +│ Filters: │ +│ ┌─────────────┬────┬──────────────────────┐ │ +│ │ region │ = │ US, EU │ │ +│ │ order_date │ >= │ 2025-01-01 │ │ +│ │ │ │ [+ Add filter] │ │ +│ └─────────────┴────┴──────────────────────┘ │ +│ │ +│ Estimated rows after filter: ~38,000 │ +│ │ +│ [Cancel] [Import] │ +└─────────────────────────────────────────────────────┘ +``` + +**Design rationale:** +- **One button for everything** — no upfront decision about "raw vs filtered" +- **Zero friction for small data** — most imports are instant +- **Progressive disclosure** — filter UI only appears when actually needed +- **Column selection** — users can drop columns they don't need, reducing data size +- **Server-side filtering** — filters are applied as SQL WHERE clauses before download, so only the relevant subset crosses the wire +- Users can also right-click any dataset to force-open the filter dialog even for small datasets + +### 2.6 Views vs Tables + +Data sources can expose both **tables** (raw data) and **views** (pre-filtered/transformed data). The tree doesn't distinguish between them at the interaction level — both are leaf nodes with [⊕] to import. The difference is just metadata. 
+ +- A Superset dashboard's filtered dataset = a **view** +- A MySQL `CREATE VIEW` = a **view** +- A raw database table = a **table** + +The plugin labels each leaf node with its type, and optionally shows the view definition as a code snippet: + +``` +▾ 📂 Superset + ▾ 📊 Q3 Sales Dashboard + orders_fact (view) (150k rows) [⊕] + WHERE region IN ('US','EU') AND order_date >= '2025-01-01' + product_dim (table) (2k rows) [⊕] + +▾ 📂 MySQL — analytics-db + ▾ 📁 public + users (table) (500k rows) [⊕] + active_users (view) (50k rows) [⊕] + SELECT * FROM users WHERE status = 'active' +``` + +The `TreeNode` supports this simply: + +```typescript +interface TreeNode { + // ... existing fields ... + metadata?: { + rowCount?: number; + columnCount?: number; + nodeKind?: 'table' | 'view'; // Displayed as label + viewDefinition?: string; // Shown as code snippet if present + }; +} +``` + +Users click [⊕] on either — the import flow is identical. The view definition is informational so users understand what data they're getting. + +## 3. Starting Panel (Empty State) + +Before the user loads any data, the current landing page shows upload options + demo sessions. The question: **how should the data source panel appear here?** + +### Landing Page (Before Data is Loaded) + +The existing landing page is preserved — it shows quick-start actions, example sessions, and recent workspaces. The data source browser is embedded as a section within the landing page, giving users a preview of available sources and encouraging them to connect before entering the editor. 
+ +``` +┌─────────────────────────────────────────────────────────┐ +│ DATA FORMULATOR │ +│ AI-powered data visualization │ +│ │ +│ ┌─ Quick Start ───────┐ ┌─ Data Sources ──────────┐ │ +│ │ 📎 Upload CSV │ │ ▸ Superset (Connect →) │ │ +│ │ 📋 Paste data │ │ ▸ MySQL (Connect →) │ │ +│ │ 🔗 From URL │ │ ▸ Metabase (Connect →) │ │ +│ └─────────────────────┘ └─────────────────────────┘ │ +│ │ +│ ┌─ Examples ──────────────────────────────────────┐ │ +│ │ [Stock Prices] [Gas Prices] [Movies] [...] │ │ +│ └─────────────────────────────────────────────────┘ │ +│ │ +│ Recent Workspaces │ +│ ┌──────────┐ ┌──────────┐ ┌──────────┐ │ +│ │ Sales Q3 │ │ Customer │ │ Survey │ │ +│ │ 3 tables │ │ 5 tables │ │ 2 tables │ │ +│ └──────────┘ └──────────┘ └──────────┘ │ +└─────────────────────────────────────────────────────────┘ +``` + +- The "Data Sources" card lists configured plugins as collapsed entries +- Clicking "Connect →" opens the plugin's auth flow inline or in a dialog +- Once connected, the entry expands to show top-level groups/tables right on the landing page +- Clicking a dataset or uploading a file transitions into the **editor layout** + +### Editor Layout (After Data is Loaded) + +Once the user imports data, the UI transitions to the editor with the file-tree data source panel on the left: + +``` +┌─────────────────┬───────────────────────────────────────┐ +│ DATA SOURCES │ │ +│ │ Data Thread / Visualization │ +│ ▸ Upload Files │ │ +│ ▾ Superset │ (editor content) │ +│ ▾ Q3 Sales │ │ +│ orders_fact │ │ +│ product_dim │ │ +│ ▸ Analytics │ │ +│ ▸ MySQL │ │ +│ ───────────────│ │ +│ WORKSPACE │ │ +│ orders_fact ✓ │ │ +│ my_upload.csv │ │ +└─────────────────┴───────────────────────────────────────┘ +``` + +- The file-tree panel is collapsible to save space +- Already-imported tables show a ✓ badge in the source tree +- The WORKSPACE section below shows tables currently in the workspace + +## 4. Plugin-Provided Metadata Contract + +The UI is **entirely metadata-driven**. 
Plugins have **no custom frontend code** — there are no per-plugin React components, no `SupersetPanel.tsx` or `MySQLPanel.tsx`. Instead, the frontend reads structured metadata from the plugin's backend API and renders one generic tree component for all plugins. + +The flow: +1. Frontend calls `GET /api/plugins/` → gets list of registered plugins + their descriptors +2. Frontend calls `GET /api/plugins/{id}/children?parentId=...` → gets tree nodes +3. Frontend renders everything using the same generic tree component + +### 4.1 Plugin Descriptor (Static Metadata) + +Returned by the backend at plugin registration / discovery time. Tells the UI what this plugin looks like and what it can do: + +```typescript +interface DataSourcePluginDescriptor { + id: string; // e.g. "superset", "mysql" + displayName: string; // e.g. "Superset" + icon: string; // Icon identifier or URL + + // Authentication + requiresAuth: boolean; + authType?: 'sso' | 'credentials' | 'connection-string'; + + // Hierarchy declaration — tells the UI what levels to expect + hierarchy: HierarchyLevel[]; + + // Capabilities — tells the UI what actions to offer + capabilities: { + search?: boolean; // Can this plugin handle server-side search? + preview?: boolean; // Can tables be previewed before import? + serverSideFilter?: boolean; // Can the plugin apply WHERE clauses before download? + rowLimitOptions?: number[]; // e.g. [20000, 50000, 100000] + }; +} + +// Each level describes one tier of the tree +interface HierarchyLevel { + type: string; // e.g. "dashboard", "schema", "table" + label: string; // Display name for this level, e.g. "Dashboards" + icon?: string; // Default icon for nodes at this level + expandable: boolean; // Does this level have children? + isLeaf?: boolean; // Is this the importable data level? 
+} +``` + +**Example descriptors:** + +```typescript +// Superset: 2 levels (dashboard → dataset) +{ + id: 'superset', + hierarchy: [ + { type: 'dashboard', label: 'Dashboards', icon: '📊', expandable: true }, + { type: 'dataset', label: 'Datasets', icon: '📄', expandable: false, isLeaf: true } + ], + capabilities: { search: true, serverSideFilter: true, rowLimitOptions: [20000, 50000, 100000] } +} + +// MySQL: 2 levels (schema → table) +{ + id: 'mysql', + hierarchy: [ + { type: 'schema', label: 'Schemas', icon: '📁', expandable: true }, + { type: 'table', label: 'Tables', icon: '📄', expandable: false, isLeaf: true } + ], + capabilities: { search: true, preview: true, rowLimitOptions: [10000, 50000, 200000] } +} + +// File upload: flat (just tables) +{ + id: 'local-files', + hierarchy: [ + { type: 'file', label: 'Files', icon: '📄', expandable: false, isLeaf: true } + ], + capabilities: { search: false } +} +``` + +### 4.2 Backend API Endpoints (Dynamic Metadata) + +Each plugin backend exposes a standard set of REST endpoints. The frontend fetches tree content lazily as the user expands nodes — all through the same generic API shape: + +``` +# All plugins expose the same endpoint pattern: +GET /api/plugins/{plugin_id}/auth/status +POST /api/plugins/{plugin_id}/auth/login +GET /api/plugins/{plugin_id}/children?parentId= +POST /api/plugins/{plugin_id}/load +``` + +The backend plugin implements a standard Python interface: + +```python +class DataSourcePlugin: + descriptor: DataSourcePluginDescriptor + + def get_auth_status(self, session) -> AuthStatus: ... + def authenticate(self, session, credentials) -> AuthResult: ... + + # Tree content — generic node fetching + # parent_id=None → root-level nodes (dashboards, schemas, etc.) + # parent_id= → children of that node + def get_children(self, session, parent_id: str | None) -> list[TreeNode]: ... 
    # Import a leaf node into workspace
    # options may include column selection, filters, row limit (from the generic filter dialog)
    def load_table(self, session, node_id: str, options: LoadOptions) -> LoadResult: ...
```

The `TreeNode` payload returned by `get_children`:

```typescript
interface TreeNode {
  id: string;
  name: string;
  type: string;            // Matches a HierarchyLevel.type
  icon?: string;           // Override default icon
  metadata?: {             // Displayed as secondary info
    rowCount?: number;
    columnCount?: number;
    [key: string]: any;    // Plugin can add custom display fields
  };
  hasChildren: boolean;    // Whether expand arrow is shown
}
```

### 4.3 How the UI Uses This

The tree renderer is **one generic React tree component** shared across all plugins:

1. On startup, fetches `GET /api/plugins/` → gets all plugin descriptors
2. Reads `descriptor.hierarchy` to know what levels to expect and what icons/labels to use
3. Calls `GET /api/plugins/{id}/children?parentId=` to populate root nodes when the plugin is expanded
4. Calls the same endpoint with `parentId=` when a non-leaf node is expanded
5. Shows the [⊕] import button on leaf nodes (`isLeaf: true`)
6. On [⊕] click: if the dataset fits within the row limit, imports directly; if it exceeds the limit, opens the generic filter/column-selection dialog (§2.5)
7. If the plugin declares `serverSideFilter: true`, the filter dialog sends column/filter selections to the plugin backend for server-side execution

**No plugin-specific frontend code exists.** Adding a new data source means writing only a backend plugin that implements the standard Python interface — the UI picks it up automatically.

## 5. 
Migration from Current Design

| Current | New |
|---------|-----|
| SupersetPanel with 2 tabs | Single tree under "Superset" folder, datasets grouped by dashboard |
| DataLoadMenu (upload/paste/URL) | "Local Files" / "Upload" top-level folder in tree |
| Separate plugin panels | All plugins in one tree |
| Landing page with demos | Landing page preserved, with an embedded "Data Sources" card (§3) |

## 6. Design Decisions

1. **Search scope**: Global search across all plugins with source badges.
2. **Lazy loading**: Load on expand, cache aggressively.
3. **Workspace section in tree**: A "Recently Imported Tables" section appears in the tree, showing tables the user has previously imported across sessions. This makes it easy to reuse the same data for new analysis. For v1, this can simply display tables from existing sessions rather than maintaining a separate copy.
4. **Multi-instance plugins**: Supported. Users can connect multiple instances of the same plugin type (e.g., two MySQL databases). Each instance gets a unique plugin instance ID and appears as a separate top-level folder.
5. **Drag-and-drop**: Click-to-import only for v1. Drag-and-drop from the source tree to the data thread is a future enhancement.

## 7. Open Questions

(None — all resolved.)

## 8. Related Docs

- [1-data-source-plugin-architecture.md](1-data-source-plugin-architecture.md) — Plugin system design
- [1-sso-plugin-architecture.md](1-sso-plugin-architecture.md) — SSO authentication
- [2-external-dataloader-enhancements.md](2-external-dataloader-enhancements.md) — Data loading improvements
diff --git a/tests/superset/.env.superset b/tests/superset/.env.superset
new file mode 100644
index 00000000..a52561de
--- /dev/null
+++ b/tests/superset/.env.superset
@@ -0,0 +1,11 @@
+# Data Formulator env overrides for Superset plugin testing.
+# Source this file or pass it to the start script. 
+ +# -- Superset plugin config -- +PLG_SUPERSET_URL=http://localhost:8088 + +# -- Optional: credential vault (Fernet-encrypted SQLite) -- +# CREDENTIAL_VAULT=local + +# -- Optional: SSO login URL (not needed for password testing) -- +# PLG_SUPERSET_SSO_LOGIN_URL=http://localhost:8088/login/?next=/df-sso-bridge/ diff --git a/tests/superset/README.md b/tests/superset/README.md new file mode 100644 index 00000000..755d1d1a --- /dev/null +++ b/tests/superset/README.md @@ -0,0 +1,67 @@ +# Superset Plugin Test Setup + +Spin up a local Apache Superset instance with sample data and connect it to Data Formulator's Superset plugin. + +## Quick Start + +```bash +# Start both Superset and DF (Superset takes ~2 min on first run) +./tests/superset/start.sh + +# Or start them separately: +./tests/superset/start.sh superset # start Superset only +./tests/superset/start.sh df # start DF (assumes Superset is running) + +# Check status +./tests/superset/start.sh status + +# Stop +./tests/superset/start.sh stop +``` + +## What Gets Created + +| Component | URL | Credentials | +|-----------|-----|-------------| +| Superset | http://localhost:8088 | `admin` / `admin` | +| Data Formulator | http://localhost:5567 | — | + +### Sample Datasets + +| Table | Rows | Description | +|-------|------|-------------| +| `df_test_sales` | 100 | Sales data with date, region, product, quantity, price | +| `df_test_employees` | 30 | Employee directory with department, hire date, salary | +| `df_test_weather` | 365 | Daily weather readings for 3 cities | + +Plus Superset's built-in example datasets (if `load_examples` succeeds). + +## Testing the Plugin + +1. Start both services: `./tests/superset/start.sh` +2. Open http://localhost:5567 in your browser +3. Click **Add Data** (the upload button) +4. Under **Connect to Live Data**, you should see an **Apache Superset** card +5. Click it, then log in with `admin` / `admin` +6. 
Browse datasets and load one into Data Formulator + +## Manual Setup (without the script) + +```bash +# 1. Start Superset +docker compose -f tests/superset/docker-compose.yml up -d + +# 2. Wait for it to be healthy +docker logs -f df-test-superset + +# 3. Start DF with the plugin env var +PLG_SUPERSET_URL=http://localhost:8088 python -m data_formulator +``` + +## Troubleshooting + +- **Superset takes too long**: First startup downloads the image and runs migrations. Check `docker logs df-test-superset`. +- **Plugin tab not showing**: Verify `PLG_SUPERSET_URL` is set. Check `./tests/superset/start.sh status`. +- **Login fails**: Make sure Superset is healthy: `curl http://localhost:8088/health`. +- **Datasets not visible**: Log into Superset UI at http://localhost:8088 and check Data → Datasets. You may need to manually add the `df_test_*` tables if auto-registration failed. +- **Port conflict**: Edit `docker-compose.yml` to change the port mapping and update `.env.superset` accordingly. diff --git a/tests/superset/docker-compose.yml b/tests/superset/docker-compose.yml new file mode 100644 index 00000000..7c2a6bed --- /dev/null +++ b/tests/superset/docker-compose.yml @@ -0,0 +1,55 @@ +# Superset test instance for Data Formulator plugin testing. 
+# Usage: docker compose -f tests/superset/docker-compose.yml up -d +# +# Default credentials: admin / admin +# Superset UI: http://localhost:8088 + +services: + superset: + image: apache/superset:4.1.1 + container_name: df-test-superset + ports: + - "8088:8088" + environment: + # Required secret key + SUPERSET_SECRET_KEY: "df-test-secret-key-change-in-prod-1234567890" + # Disable CSP / CSRF for local testing + TALISMAN_ENABLED: "False" + # Allow CORS from DF dev server + SUPERSET_CORS_ENABLED: "true" + SUPERSET_CORS_ORIGINS: '["http://localhost:5567","http://127.0.0.1:5567","http://localhost:5173","http://127.0.0.1:5173"]' + # Enable public role access for guest browsing + PUBLIC_ROLE_LIKE: "Gamma" + volumes: + - superset-data:/app/superset_home + - ./init-superset.sh:/docker-entrypoint-initdb.d/init-superset.sh:ro + - ./sample_data.py:/tmp/sample_data.py:ro + # Override entrypoint to run init then start + entrypoint: ["/bin/bash", "-c"] + command: + - | + # Run DB migration and create admin user + superset db upgrade + superset fab create-admin \ + --username admin \ + --firstname Admin \ + --lastname User \ + --email admin@example.com \ + --password admin || true + superset init + + # Load sample data (bundled examples + our custom datasets) + superset load_examples --force || echo "Examples load failed (non-fatal)" + python /tmp/sample_data.py || echo "Custom sample data load failed (non-fatal)" + + # Start the server + superset run -h 0.0.0.0 -p 8088 --with-threads --reload + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:8088/health"] + interval: 15s + timeout: 10s + retries: 10 + start_period: 120s + +volumes: + superset-data: diff --git a/tests/superset/init-superset.sh b/tests/superset/init-superset.sh new file mode 100755 index 00000000..59b75c6e --- /dev/null +++ b/tests/superset/init-superset.sh @@ -0,0 +1,8 @@ +#!/usr/bin/env bash +# This script is mounted into the container but is NOT used as the entrypoint. 
+# The actual init sequence is in docker-compose.yml command. +# This file is kept as a reference if you need to customize init further. + +echo "[init-superset] Initialization is handled by docker-compose command." +echo "[init-superset] Admin credentials: admin / admin" +echo "[init-superset] Test datasets: df_test_sales, df_test_employees, df_test_weather" diff --git a/tests/superset/sample_data.py b/tests/superset/sample_data.py new file mode 100644 index 00000000..61055388 --- /dev/null +++ b/tests/superset/sample_data.py @@ -0,0 +1,179 @@ +#!/usr/bin/env python3 +"""Load sample datasets into the Superset test instance's default SQLite DB. + +This runs inside the Superset container after `superset init`. +It creates small, self-contained tables useful for testing the DF plugin: + - df_test_sales (100 rows, mixed types) + - df_test_employees (30 rows, names and departments) + - df_test_weather (365 rows, daily temps) +""" + +import random +import datetime +import sqlite3 +import os + +DB_PATH = os.path.expanduser("~/.superset/superset.db") +# Fallback: newer Superset images may use a different path +if not os.path.exists(DB_PATH): + DB_PATH = "/app/superset_home/superset.db" + + +def create_tables(conn: sqlite3.Connection) -> None: + cur = conn.cursor() + + # -- df_test_sales -- + cur.execute("DROP TABLE IF EXISTS df_test_sales") + cur.execute(""" + CREATE TABLE df_test_sales ( + id INTEGER PRIMARY KEY, + date TEXT, + region TEXT, + product TEXT, + quantity INTEGER, + unit_price REAL, + revenue REAL + ) + """) + + regions = ["North", "South", "East", "West"] + products = ["Widget A", "Widget B", "Gadget X", "Gadget Y", "Gizmo Z"] + rng = random.Random(42) + + rows = [] + base = datetime.date(2025, 1, 1) + for i in range(1, 101): + d = base + datetime.timedelta(days=rng.randint(0, 364)) + region = rng.choice(regions) + product = rng.choice(products) + qty = rng.randint(1, 50) + price = round(rng.uniform(5.0, 100.0), 2) + rows.append((i, d.isoformat(), region, 
product, qty, price, round(qty * price, 2))) + + cur.executemany( + "INSERT INTO df_test_sales VALUES (?, ?, ?, ?, ?, ?, ?)", rows + ) + + # -- df_test_employees -- + cur.execute("DROP TABLE IF EXISTS df_test_employees") + cur.execute(""" + CREATE TABLE df_test_employees ( + id INTEGER PRIMARY KEY, + name TEXT, + department TEXT, + hire_date TEXT, + salary REAL + ) + """) + + departments = ["Engineering", "Marketing", "Sales", "HR", "Finance"] + first_names = [ + "Alice", "Bob", "Carol", "David", "Eve", "Frank", "Grace", + "Heidi", "Ivan", "Judy", "Karl", "Laura", "Mike", "Nina", + "Oscar", "Pat", "Quinn", "Rose", "Steve", "Tina", + "Uma", "Vic", "Wendy", "Xander", "Yuki", "Zara", + "Amir", "Beth", "Chen", "Diana" + ] + + emp_rows = [] + for i, name in enumerate(first_names, start=1): + dept = departments[i % len(departments)] + hire = (datetime.date(2018, 1, 1) + datetime.timedelta(days=rng.randint(0, 2500))).isoformat() + salary = round(rng.uniform(50000, 150000), 2) + emp_rows.append((i, name, dept, hire, salary)) + + cur.executemany( + "INSERT INTO df_test_employees VALUES (?, ?, ?, ?, ?)", emp_rows + ) + + # -- df_test_weather -- + cur.execute("DROP TABLE IF EXISTS df_test_weather") + cur.execute(""" + CREATE TABLE df_test_weather ( + date TEXT PRIMARY KEY, + city TEXT, + temp_high REAL, + temp_low REAL, + precipitation REAL, + condition TEXT + ) + """) + + cities = ["Seattle", "New York", "Austin"] + conditions = ["Sunny", "Cloudy", "Rainy", "Snowy", "Partly Cloudy"] + + weather_rows = [] + for day_offset in range(365): + d = (datetime.date(2025, 1, 1) + datetime.timedelta(days=day_offset)).isoformat() + city = cities[day_offset % len(cities)] + month = (day_offset // 30) % 12 + # Rough seasonal variation + base_temp = 40 + 30 * (1 - abs(month - 6) / 6.0) + high = round(base_temp + rng.uniform(0, 15), 1) + low = round(base_temp - rng.uniform(5, 15), 1) + precip = round(max(0, rng.gauss(0.1, 0.3)), 2) + cond = rng.choice(conditions) + weather_rows.append((d, 
city, high, low, precip, cond)) + + cur.executemany( + "INSERT INTO df_test_weather VALUES (?, ?, ?, ?, ?, ?)", weather_rows + ) + + conn.commit() + print(f"[sample_data] Created df_test_sales (100), df_test_employees (30), df_test_weather (365)") + + +def register_datasets_in_superset() -> None: + """Register our tables as Superset datasets via the Superset Python API. + + This runs inside the Superset process context so we can use + superset's own SQLAlchemy models. + """ + try: + from superset.app import create_app + from superset.connectors.sqla.models import SqlaTable + from superset.extensions import db as superset_db + + app = create_app() + with app.app_context(): + # Find the default "examples" database + from superset.models.core import Database + examples_db = superset_db.session.query(Database).filter_by( + database_name="examples" + ).first() + + if not examples_db: + print("[sample_data] Warning: 'examples' database not found, skipping dataset registration") + return + + for table_name in ["df_test_sales", "df_test_employees", "df_test_weather"]: + existing = superset_db.session.query(SqlaTable).filter_by( + table_name=table_name, database_id=examples_db.id + ).first() + if existing: + print(f"[sample_data] Dataset '{table_name}' already registered") + continue + + dataset = SqlaTable( + table_name=table_name, + database_id=examples_db.id, + schema=None, + ) + superset_db.session.add(dataset) + print(f"[sample_data] Registered dataset '{table_name}'") + + superset_db.session.commit() + + except Exception as e: + print(f"[sample_data] Dataset registration failed (non-fatal): {e}") + print("[sample_data] Tables exist in SQLite but may need manual registration in Superset UI") + + +if __name__ == "__main__": + # Step 1: Create the tables in the examples SQLite database + conn = sqlite3.connect(DB_PATH) + create_tables(conn) + conn.close() + + # Step 2: Register as Superset datasets + register_datasets_in_superset() diff --git a/tests/superset/start.sh 
b/tests/superset/start.sh new file mode 100755 index 00000000..ae72eae0 --- /dev/null +++ b/tests/superset/start.sh @@ -0,0 +1,55 @@ +#!/usr/bin/env bash +# Start Superset + DF backend for plugin dev. Run `npx vite` yourself for frontend. +# +# Usage: +# ./tests/superset/start.sh # start Superset + DF backend +# ./tests/superset/start.sh stop # tear down Superset container + +set -e + +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +REPO_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)" +COMPOSE_FILE="$SCRIPT_DIR/docker-compose.yml" + +info() { echo -e "\033[0;32m[✓]\033[0m $1"; } +warn() { echo -e "\033[1;33m[!]\033[0m $1"; } + +# --- Stop mode --- +if [ "${1:-}" = "stop" ]; then + info "Stopping Superset..." + docker compose -f "$COMPOSE_FILE" down 2>/dev/null || true + info "Done" + exit 0 +fi + +# --- Load env --- +set -a +[ -f "$SCRIPT_DIR/.env.superset" ] && source "$SCRIPT_DIR/.env.superset" +[ -f "$REPO_ROOT/.env" ] && source "$REPO_ROOT/.env" +export PLG_SUPERSET_URL="${PLG_SUPERSET_URL:-http://localhost:8088}" +set +a + +# --- Start Superset --- +if docker ps --format '{{.Names}}' | grep -q "^df-test-superset$"; then + info "Superset already running" +else + info "Starting Superset (first run takes ~2 min)..." + docker compose -f "$COMPOSE_FILE" up -d + info "Waiting for Superset..." 
+ until curl -sf http://localhost:8088/health > /dev/null 2>&1; do sleep 3; done + info "Superset ready" +fi + +# --- Start DF backend --- +echo "" +info "Superset: http://localhost:8088 (admin / admin)" +info "DF backend: http://localhost:5567" +info "Run 'npx vite' in another terminal for frontend on http://localhost:5173" +echo "" + +cd "$REPO_ROOT" +if command -v uv &> /dev/null; then + exec uv run data_formulator --port 5567 --dev +else + exec python -m data_formulator --port 5567 --dev +fi From 2e10da77905bbd92b11c6fe764347aa522abe5ce Mon Sep 17 00:00:00 2001 From: Chenglong Wang Date: Tue, 14 Apr 2026 10:03:22 -0700 Subject: [PATCH 2/6] halfway --- .../9-generalized-data-source-plugins.md | 1693 +++++++++++++++++ py-src/data_formulator/app.py | 13 + py-src/data_formulator/connected_source.py | 632 ++++++ .../data_formulator/data_loader/__init__.py | 5 +- .../data_loader/athena_data_loader.py | 97 +- .../data_loader/azure_blob_data_loader.py | 87 +- .../data_loader/bigquery_data_loader.py | 94 +- .../data_loader/external_data_loader.py | 261 ++- .../data_loader/kusto_data_loader.py | 103 +- .../data_loader/mongodb_data_loader.py | 72 +- .../data_loader/mssql_data_loader.py | 214 ++- .../data_loader/mysql_data_loader.py | 179 +- .../data_loader/postgresql_data_loader.py | 234 ++- .../data_loader/s3_data_loader.py | 87 +- .../data_loader/superset_data_loader.py | 406 ++++ .../datalake/azure_blob_workspace.py | 26 +- py-src/data_formulator/datalake/workspace.py | 30 +- .../datalake/workspace_metadata.py | 4 + py-src/data_formulator/plugins/data_writer.py | 8 +- py-src/data_formulator/tables_routes.py | 21 +- src/app/dfSlice.tsx | 11 + src/app/tableThunks.ts | 35 +- src/app/utils.tsx | 18 + src/views/DBTableManager.tsx | 191 +- tests/backend/unit/test_plugin_data_writer.py | 8 +- .../test_bigquery/test_bigquery_loader.py | 20 +- .../test_mongodb/test_mongodb_loader.py | 14 +- tests/plugin/test_mysql/test_mysql_loader.py | 10 +- tests/plugin/test_mysql_datalake.py 
| 5 +- .../test_postgres/test_postgresql_loader.py | 10 +- 30 files changed, 4310 insertions(+), 278 deletions(-) create mode 100644 design-docs/9-generalized-data-source-plugins.md create mode 100644 py-src/data_formulator/connected_source.py create mode 100644 py-src/data_formulator/data_loader/superset_data_loader.py diff --git a/design-docs/9-generalized-data-source-plugins.md b/design-docs/9-generalized-data-source-plugins.md new file mode 100644 index 00000000..51cafcd1 --- /dev/null +++ b/design-docs/9-generalized-data-source-plugins.md @@ -0,0 +1,1693 @@ +# Generalized Data Source Plugins — Unifying DataLoader + Plugin into a Lifecycle-Managed Connection + +## Status: Phase 1 complete + +## 1. Problem + +We have **two separate abstractions** for loading external data: + +| Abstraction | Example | Auth | Catalog Browsing | Refresh | Session Lifecycle | +|-------------|---------|------|-------------------|---------|-------------------| +| **ExternalDataLoader** | MySQL, PostgreSQL, Kusto, BigQuery, S3 | One-shot (params in request) | `list_tables()` per request | Manual re-import | None — stateless | +| **DataSourcePlugin** | Superset | Full (login/session/vault) | Rich catalog with caching | Not implemented | Full — session, token refresh | + +This split causes problems: + +1. **No persistent connections for databases.** A user who connects to PostgreSQL to browse tables must re-send credentials every time. There's no "logged into Postgres" state. +2. **No refresh.** Once a table is imported from MySQL, there's no way to re-pull the latest data without manually re-entering connection details. +3. **The Superset plugin is over-specialized.** It hard-codes dashboard/dataset concepts. Meanwhile, Kusto, PostgreSQL, MySQL all need the same pattern (auth → browse catalog → filter → import → refresh) but don't have it. +4. 
**Plugin naming is BI-centric.** `DataSourcePlugin` was designed for BI platforms (Superset, Metabase), but the real need is broader: any system you can authenticate into and continuously pull data from. + +### The Key Insight + +A DataLoader already knows *how* to talk to a data source (connect, list tables, fetch data). A Plugin knows *how* to manage a session (login, persist auth, browse, present UI). **Combining them gives us a lifecycle-managed data connection** — which is what users actually want. + +## 2. Proposal: `ConnectedDataSource` — A Generalized Plugin Built from a DataLoader + +### 2.1 Core Idea + +Define a **generic plugin factory** that takes any `ExternalDataLoader` class and automatically wraps it with: + +- **Session management** — persistent connection state (logged in / not) +- **Catalog browsing** — `list_tables()` exposed as a browsable tree +- **Filtered import** — column selection + row limits +- **Refresh** — re-fetch a previously imported table with the same parameters +- **Auto-discovery** — same env-var gating as existing plugins + +This means: to add "PostgreSQL as a connected data source," you write **zero new plugin code**. The existing `PostgreSQLDataLoader` is automatically promoted to a full plugin with auth, catalog, refresh, and UI. 
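As a sketch of the factory mechanics, `from_loader` can be little more than a dynamically built subclass. The classes below are simplified stand-ins for illustration (the real `ConnectedDataSource` in §3.1 has a richer manifest and route generation); the stubbed `PostgreSQLDataLoader` and its params are hypothetical:

```python
# Simplified stand-ins for illustration — not the real Data Formulator classes.
class ExternalDataLoader:
    @staticmethod
    def list_params():
        return []

class DataSourcePlugin:
    pass

class ConnectedDataSource(DataSourcePlugin):
    LOADER_CLASS = ExternalDataLoader
    SOURCE_ID = ""
    SOURCE_NAME = ""

    @classmethod
    def manifest(cls) -> dict:
        # The manifest is derived entirely from the wrapped loader's metadata.
        return {
            "id": cls.SOURCE_ID,
            "name": cls.SOURCE_NAME,
            "params": cls.LOADER_CLASS.list_params(),
        }

    @classmethod
    def from_loader(cls, loader_class, source_id, source_name=None):
        """Promote any loader class to a full plugin — zero new plugin code."""
        return type(
            f"{source_id.title()}ConnectedSource",
            (cls,),
            {
                "LOADER_CLASS": loader_class,
                "SOURCE_ID": source_id,
                "SOURCE_NAME": source_name or source_id.title(),
            },
        )

# Promote a (stubbed) PostgreSQL loader to a connected data source.
class PostgreSQLDataLoader(ExternalDataLoader):
    @staticmethod
    def list_params():
        return [{"name": "host", "type": "string", "required": True}]

PgSource = ConnectedDataSource.from_loader(
    PostgreSQLDataLoader, "postgresql", "PostgreSQL"
)
print(PgSource.manifest()["id"])  # → postgresql
```

The point of the sketch: the loader is the single source of truth, and the plugin layer is pure derivation on top of it.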
+ +### 2.2 Architecture + +``` +┌─────────────────────────────────────────────────────────────┐ +│ ConnectedDataSource │ +│ (generic plugin framework) │ +│ │ +│ ┌──────────────┐ ┌──────────────┐ ┌───────────────────┐ │ +│ │ Auth Layer │ │ Catalog Layer│ │ Data Layer │ │ +│ │ │ │ │ │ │ │ +│ │ • login() │ │ • list() │ │ • load() │ │ +│ │ • logout() │ │ • detail() │ │ • refresh() │ │ +│ │ • status() │ │ • search() │ │ • preview() │ │ +│ │ • refresh() │ │ • tree() │ │ │ │ +│ └──────┬───────┘ └──────┬───────┘ └────────┬──────────┘ │ +│ │ │ │ │ +│ └─────────────────┼────────────────────┘ │ +│ │ │ +│ ┌───────▼────────┐ │ +│ │ ExternalData │ │ +│ │ Loader │ │ +│ │ (existing) │ │ +│ └────────────────┘ │ +└─────────────────────────────────────────────────────────────┘ +``` + +### 2.3 The Unification: Databases and BI Tools Are Both Hierarchical Data Sources + +From DF's perspective, **every external data source is the same thing**: an authenticated system with a hierarchical catalog whose leaf nodes are importable tables. The only difference is what the intermediate levels are called: + +| Source Type | Hierarchy | Leaf Node | +|-------------|-----------|----------| +| MySQL | `server → database → table` | table | +| PostgreSQL | `server → database → schema → table` | table / view | +| BigQuery | `project → dataset → table` | table / view | +| Kusto | `cluster → database → table` | table | +| S3 | `bucket → prefix → object` | CSV/Parquet file | +| **Superset** | `instance → dashboard → dataset` | dataset (= filtered table) | +| **Metabase** | `instance → collection → question` | question (= query result) | +| **Grafana** | `instance → datasource → query` | query result | + +The core user loop is always: **connect → browse tree → pick leaf → import → refresh.** + +This means we don't need separate abstractions for "BI plugin" vs. "database plugin." 
We unify them: + +| Component | Change | +|-----------|--------| +| `ExternalDataLoader` | **Evolves** into the universal data protocol. Gains `catalog_hierarchy()` + `ls()` + `effective_hierarchy()` for tree browsing with scope pinning. | +| `DataSourcePlugin` | **Stays** as the abstract base, but now primarily implemented via `ConnectedDataSource`. | +| **New: `ConnectedDataSource`** | Generic `DataSourcePlugin` subclass that wraps any `ExternalDataLoader`. Auto-generates auth/catalog/data routes. | +| **New: `ConnectedDataSourcePanel`** | Generic React component for all connected data sources (login → tree browser → import). | +| `SupersetPlugin` | **Migrates** to a `ConnectedDataSource` backed by a `SupersetLoader`. Dashboards are `"namespace"` nodes, datasets are `"table"` nodes — hierarchy labels provide the UI terminology. | + +## 3. API Design + +### 3.1 Backend: `ConnectedDataSource` Base + +```python +class ConnectedDataSource(DataSourcePlugin): + """A DataSourcePlugin auto-generated from an ExternalDataLoader. + + Provides lifecycle management: connection persistence, catalog browsing, + filtered import, and refresh — all driven by the underlying loader. 
+    """
+
+    # Subclass must set these (or override manifest())
+    LOADER_CLASS: type[ExternalDataLoader]  # e.g., PostgreSQLDataLoader
+    SOURCE_ID: str                          # e.g., "postgresql"
+    SOURCE_NAME: str                        # e.g., "PostgreSQL"
+
+    # ----- Auto-generated manifest from loader metadata -----
+
+    @classmethod
+    def manifest(cls) -> dict:
+        """Built from LOADER_CLASS.list_params() + SOURCE_ID."""
+        return {
+            "id": cls.SOURCE_ID,
+            "name": cls.SOURCE_NAME,
+            "env_prefix": f"PLG_{cls.SOURCE_ID.upper()}",
+            "required_env": [],  # DB plugins enabled by default (user provides creds at runtime)
+            "auth_modes": ["password"],
+            "capabilities": ["tables", "refresh"],
+        }
+
+    # ----- Auth Routes (auto-generated) -----
+    # POST /api/plugins/{id}/auth/connect     — validate & persist connection
+    # POST /api/plugins/{id}/auth/disconnect  — tear down connection
+    # GET  /api/plugins/{id}/auth/status      — is connection alive?
+
+    # ----- Catalog Routes (auto-generated) -----
+    # POST /api/plugins/{id}/catalog/ls       — list children at a path (lazy)
+    # POST /api/plugins/{id}/catalog/metadata — get metadata for one node
+
+    # ----- Data Routes (auto-generated) -----
+    # POST /api/plugins/{id}/data/import      — fetch & import to workspace
+    # POST /api/plugins/{id}/data/refresh    — re-import with stored params
+    # POST /api/plugins/{id}/data/preview    — fetch first N rows for preview
+```
+
+### 3.2 The Full API Surface
+
+#### 3.2.1 Auth / Connection Management
+
+```
+POST /api/plugins/{id}/auth/connect
+  Body: { params: { host, port, user, password, database, ... } }
+  Response: { status: "connected", user: "...", server: "...", database: "..." }
+  Side-effect: Validates connection, stores params in session (+ vault if available)
+
+POST /api/plugins/{id}/auth/disconnect
+  Response: { status: "disconnected" }
+  Side-effect: Clears session + vault
+
+GET /api/plugins/{id}/auth/status
+  Response: {
+    connected: true/false,
+    user: "...",
+    server: "...",
+    database: "...",
+    params_form: [...]  // list_params() for the login form if not connected
+  }
+  Side-effect: If session empty but vault has creds → auto-reconnect
+```
+
+**Note on auth diversity:** "Connecting" means different things for different sources. For traditional databases it's validating host/user/password (e.g., `SELECT 1`). For cloud databases it may be OAuth (Azure AD for Kusto, IAM for AWS RDS). For BI tools it's obtaining a JWT. The framework doesn't care — the loader's `list_params()` declares what it needs, and the `auth_mode()` (see §6.3) tells the framework whether to persist a connection object or a token. The generic connection form renders whatever params the loader declares (password fields, file pickers for service account keys, OAuth redirect buttons, etc.).
+
+#### 3.2.2 Catalog Browsing (Tree-Based)
+
+The catalog is a **lazy tree** that mirrors the data source's natural hierarchy (see §3.4 for the full design). Each expand in the UI triggers one API call.
+
+We use **POST** for catalog APIs (not GET) because:
+- `path` is structured data (a JSON array) that may contain special characters (dots, spaces in dashboard names)
+- The request body will grow as we add filters, pagination, and import context
+- Catalog results are not cacheable — the source data changes
+
+```
+POST /api/plugins/{id}/catalog/ls
+  Body: {
+    path: [],        // JSON array: [] = root, ["mydb"], ["mydb","public"]
+    filter: "...",   // optional name filter
+  }
+  Response: {
+    hierarchy: ["database", "schema", "table"],  // source's level keys (from catalog_hierarchy())
+    effective_hierarchy: ["schema", "table"],    // browsable levels (pinned levels removed)
+    path: [],
+    nodes: [
+      { name: "analytics", node_type: "namespace", path: ["analytics"],
+        metadata: { table_count: 42 } },
+      { name: "production", node_type: "namespace", path: ["production"],
+        metadata: { table_count: 15 } },
+      ...
+ ] + } + +POST /api/plugins/{id}/catalog/ls + Body: { path: ["production", "public"] } + Response: { + hierarchy: ["database", "schema", "table"], + effective_hierarchy: ["schema", "table"], + path: ["production", "public"], + nodes: [ + { name: "users", node_type: "table", path: ["production","public","users"], + metadata: { row_count: 150000, columns: [...] } }, + ... + ] + } + +POST /api/plugins/{id}/catalog/metadata + Body: { path: ["production", "public", "users"] } + Response: { + name: "users", + path: ["production", "public", "users"], + node_type: "table", + columns: [...], // full column detail + row_count: 150000, + sample_rows: [...], // first 5 rows for preview + description: "...", // table comment if available + } +``` + +**How this maps to `ExternalDataLoader`:** The `ls(path)` method (§3.4) drives every tree expansion. `ConnectedDataSource` adds caching (per-session, with TTL) on top. + +#### 3.2.3 Data Loading + Refresh + +``` +POST /api/plugins/{id}/data/import + Body: { + source_table: "public.users", + table_name: "users", // name in workspace (optional, auto-generated) + size: 50000, // row limit + sort_columns: ["created_at"], + sort_order: "desc", + columns: ["id", "email", "name"], // column selection (optional) + } + Response: { + table_id: "tbl_abc123", + table_name: "users", + row_count: 50000, + columns: [...], + refreshable: true, + refresh_params: { ... } // stored for later refresh + } + +POST /api/plugins/{id}/data/refresh + Body: { + table_id: "tbl_abc123", // workspace table to refresh + } + Response: { + table_id: "tbl_abc123", + row_count: 52000, // may differ from original + refreshed_at: "2026-04-13T10:30:00Z" + } + Side-effect: Re-runs the same fetch with stored params, overwrites parquet + +POST /api/plugins/{id}/data/preview + Body: { + source_table: "public.users", + columns: ["id", "email"], // optional column selection + size: 10 // small preview + } + Response: { + columns: [...], + rows: [...] 
// first N rows + } +``` + +### 3.3 Refresh Mechanism + +Refresh is a first-class concept. When a table is imported via a `ConnectedDataSource`, the workspace metadata stores: + +```python +{ + "table_id": "tbl_abc123", + "table_name": "users", + "source": { + "plugin_id": "postgresql", # which plugin + "source_table": "public.users", # what was fetched + "size": 50000, + "sort_columns": ["created_at"], + "sort_order": "desc", + "columns": ["id", "email", "name"], # column selection + "fetched_at": "2026-04-13T10:00:00Z" + }, + "refreshable": True +} +``` + +On refresh: +1. Check if the plugin connection is still alive (auto-reconnect via vault if needed) +2. Re-run `loader.fetch_data_as_arrow()` with stored params +3. Overwrite the parquet file in workspace +4. Update `fetched_at` timestamp +5. Notify frontend of updated row count / schema changes + +### 3.4 Hierarchical Catalog Exploration + +#### The Problem with Single-Database Loaders + +Current loaders are scoped to a single database at init time: + +| Loader | Init Scope | `list_tables()` Sees | Natural Full Hierarchy | +|--------|-----------|---------------------|------------------------| +| MySQL | `host + database` | Tables in that one DB | `server → database → table` | +| PostgreSQL | `host + database` | Schemas + tables in one DB | `server → database → schema → table` | +| MSSQL | `server + database` | Schemas + tables in one DB | `server → database → schema → table` | +| Kusto | `cluster + database` | Tables in that one DB | `cluster → database → table` | +| BigQuery | `project (+ dataset)` | Datasets + tables | `project → dataset → table` | +| MongoDB | `host + database` | Collections in one DB | `server → database → collection` | +| S3 | `bucket` | Keys in that bucket | `bucket → prefix → object` | + +This means a user exploring a MySQL server with 10 databases must disconnect and reconnect 10 times. That's friction we should eliminate. 
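
The refresh loop in §3.3 can be sketched end-to-end: import stores the fetch params alongside the data, and refresh simply replays them. A minimal in-memory sketch with a stubbed loader and a dict standing in for the workspace — all names here are hypothetical, and the real flow goes through `fetch_data_as_arrow()` and parquet files rather than Python lists:

```python
from datetime import datetime, timezone

class StubLoader:
    """Stands in for any ExternalDataLoader; holds fake source data."""
    def __init__(self):
        self.tables = {"public.users": [{"id": 1}, {"id": 2}]}

    def fetch(self, source_table, size):
        return self.tables[source_table][:size]

def import_table(workspace, loader, table_id, source_table, size):
    # Import stores the data *and* the params needed to replay the fetch.
    workspace[table_id] = {
        "rows": loader.fetch(source_table, size),
        "source": {
            "source_table": source_table,
            "size": size,
            "fetched_at": datetime.now(timezone.utc).isoformat(),
        },
        "refreshable": True,
    }

def refresh_table(workspace, loader, table_id):
    src = workspace[table_id]["source"]
    # Steps 1–2: re-run the same fetch with the stored params.
    rows = loader.fetch(src["source_table"], src["size"])
    # Steps 3–4: overwrite the stored data and bump the timestamp.
    workspace[table_id]["rows"] = rows
    src["fetched_at"] = datetime.now(timezone.utc).isoformat()
    # Step 5: report the (possibly changed) row count to the frontend.
    return len(rows)

ws, loader = {}, StubLoader()
import_table(ws, loader, "tbl_abc123", "public.users", size=10)
loader.tables["public.users"].append({"id": 3})   # source grew since import
print(refresh_table(ws, loader, "tbl_abc123"))    # → 3
```

Because the stored `source` record is self-contained, refresh needs nothing from the user beyond a live connection — which is exactly what the vault-backed auto-reconnect provides.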
+ +#### Proposed: Tree-Based Catalog Model + +Instead of the flat `list_tables()` → `[table, table, ...]` model, introduce a **tree-based catalog** where loaders declare their hierarchy and support lazy expansion at each level: + +```python +@dataclass +class CatalogNode: + """A node in the data source's catalog tree. + + Only two kinds of node: + - "namespace" — expandable container (database, schema, bucket, dashboard, …). + The hierarchy's "label" tells the UI what to call it. + - "table" — importable leaf (table, file, dataset, …). + + The *level name* (e.g. "Database", "Schema") comes from + catalog_hierarchy(), not from the node itself. + """ + name: str # Display name ("public", "users", "events") + node_type: str # "namespace" or "table" + path: list[str] # Full path from root: ["mydb", "public", "users"] + metadata: dict | None = None # Row count, column info, description, etc. +``` + +This follows the **Iceberg REST / Unity Catalog convention**: every container is a `namespace`, every importable unit is a `table`. The hierarchy labels (what to call each level in the UI) come from `catalog_hierarchy()`, keeping the node model itself minimal and universal. + +Each data source declares its hierarchy as a sequence of **level descriptors** — each with a type key and the display label users see: + +```python +class ExternalDataLoader(ABC): + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + """Declare the levels in this source's catalog tree. + + Returns ordered list from root to leaf. Each entry has: + - "key": internal identifier (used in params, APIs) + - "label": user-facing display name + + The last level is always the importable unit (table/file/dataset). 
+ + Examples: + MySQL: + [{"key": "database", "label": "Database"}, + {"key": "table", "label": "Table"}] + + PostgreSQL: + [{"key": "database", "label": "Database"}, + {"key": "schema", "label": "Schema"}, + {"key": "table", "label": "Table"}] + + BigQuery: + [{"key": "project", "label": "Project"}, + {"key": "dataset", "label": "Dataset"}, + {"key": "table", "label": "Table"}] + + Superset: + [{"key": "dashboard", "label": "Dashboard"}, + {"key": "dataset", "label": "Dataset"}] + + S3: + [{"key": "bucket", "label": "Bucket"}, + {"key": "prefix", "label": "Folder"}, + {"key": "object", "label": "File"}] + + Default: [{"key": "table", "label": "Table"}] (flat). + """ + return [{"key": "table", "label": "Table"}] +``` + +The keys serve double duty: they match the parameter names in `list_params()` (see §3.4.2 Scope Pinning), and the labels are what users see in the tree UI — so each source presents its own natural terminology. + +#### Lazy Expansion API + +Browsing happens **one level at a time**, like expanding directories in a file browser. The loader only fetches children when the user expands a node: + +```python +class ExternalDataLoader(ABC): + + def ls( + self, + path: list[str] | None = None, + filter: str | None = None, + ) -> list[CatalogNode]: + """List children at a catalog path (like `ls` in a filesystem). + + path is relative to the *effective* (unpinned) hierarchy. + + * path=[] — list nodes at the first browsable level. + * path=["public"] — expand that node one level deeper. + + Nodes are either "namespace" (expandable) or "table" (importable leaf). + The hierarchy's label tells the UI what to call each level. + + Args: + path: Path to list, as a list of names at each level. + None or [] = root level. + filter: Optional name filter (substring match). + + Returns: + Children at the given path. 
+ + Examples: + MySQL (database not pinned): + ls([]) → [CatalogNode("mydb", "namespace", ["mydb"])] + ls(["mydb"]) → [CatalogNode("users", "table", ["mydb","users"])] + + PostgreSQL (database not pinned): + ls([]) → [CatalogNode("analytics", "namespace", ["analytics"])] + ls(["analytics"]) → [CatalogNode("public", "namespace", ["analytics","public"])] + ls(["analytics","public"])→ [CatalogNode("users", "table", ["analytics","public","users"])] + + PostgreSQL (database="analytics" pinned → effective hierarchy is schema→table): + ls([]) → [CatalogNode("public", "namespace", ["public"])] + ls(["public"]) → [CatalogNode("users", "table", ["public","users"])] + + BigQuery (project pinned): + ls([]) → [CatalogNode("sales", "namespace", ["sales"])] + ls(["sales"]) → [CatalogNode("orders", "table", ["sales","orders"])] + """ + pass +``` + +#### Scope Pinning: Pre-Configuring the Starting Level + +Not every user should browse from the top. An admin might restrict a deployment to one database, or a user might only care about one schema. **Scope pinning** lets connection params fix one or more hierarchy levels, so the tree starts deeper: + +``` +Full hierarchy (MySQL): server → database → table +Pinned to database="mydb": server → table (user sees tables directly) + +Full hierarchy (PostgreSQL): server → database → schema → table +Pinned to database="prod": server → schema → table +Pinned to db+schema: server → table +``` + +This works naturally because hierarchy level keys match parameter names in `list_params()`. When a connection param matches a hierarchy level key, that level is pinned and hidden from browsing: + +```python +# MySQL — no pinning: user browses databases → tables +MySQLDataLoader({"host": "db.example.com", "user": "me", "password": "..."}) +# ls([]) → [CatalogNode("mydb", "namespace", ["mydb"]), CatalogNode("other", "namespace", ["other"])] +# ls(["mydb"]) → [CatalogNode("users", "table", ["mydb","users"]), ...] 
+ +# MySQL — database pinned: user sees tables directly +MySQLDataLoader({"host": "db.example.com", "user": "me", "password": "...", "database": "mydb"}) +# ls([]) → [CatalogNode("users", "table", ["users"]), ...] (database level skipped) + +# PostgreSQL — database pinned, schema free: user browses schemas → tables +PostgreSQLDataLoader({"host": "...", "user": "...", "password": "...", "database": "prod"}) +# ls([]) → [CatalogNode("public", "namespace", ["public"]), CatalogNode("analytics", "namespace", ["analytics"])] +# ls(["public"]) → [CatalogNode("users", "table", ["public","users"]), ...] + +# BigQuery — project pinned: user browses datasets → tables +BigQueryDataLoader({"project": "my-gcp-project"}) +# ls([]) → [CatalogNode("sales", "namespace", ["sales"]), ...] +``` + +The loader determines the **effective hierarchy** at connection time: + +```python +class ExternalDataLoader(ABC): + def effective_hierarchy(self) -> list[dict[str, str]]: + """Remove pinned levels from the catalog hierarchy. + + A level is pinned when the user provided a non-empty value for its + key in the connection params (e.g., database="prod" pins the database level). + """ + params = getattr(self, "params", {}) or {} + full = self.catalog_hierarchy() + return [level for level in full if not params.get(level["key"])] + + def pinned_scope(self) -> dict[str, str]: + """Return {level_key: value} for every pinned hierarchy level.""" + params = getattr(self, "params", {}) or {} + return { + level["key"]: params[level["key"]] + for level in self.catalog_hierarchy() + if params.get(level["key"]) + } +``` + +**How pinning is configured:** + +| Who | How | Example | +|-----|-----|---------| +| **Admin (env vars)** | Pre-fill params via `PLG_{ID}_{PARAM}` env vars. User never sees these fields. 
| `PLG_MYSQL_HOST=db.internal PLG_MYSQL_DATABASE=analytics` → users only see tables in `analytics` | +| **Admin (connection form)** | Mark params as `hidden` in `list_params()` when env var provides the value | Same as above, but the form shows remaining fields only | +| **User (connection form)** | Fill in or leave blank optional scope params | Leave `database` empty → browse all; fill it in → pinned to that DB | + +#### How `list_params()` Supports Scope Pinning + +```python +@staticmethod +def list_params() -> list[dict[str, Any]]: + return [ + {"name": "host", "type": "string", "required": True, "description": "Database host"}, + {"name": "port", "type": "number", "required": True, "default": 3306}, + {"name": "user", "type": "string", "required": True}, + {"name": "password", "type": "password", "required": True}, + # Scope params: match hierarchy level keys. Optional = user can browse that level. + {"name": "database", "type": "string", "required": False, + "scope_level": True, # <-- marks this as a hierarchy scope param + "description": "Database (leave empty to browse all databases)"}, + ] +``` + +The `scope_level: True` flag tells the framework this param corresponds to a catalog hierarchy level. When provided, it pins that level. When empty, the user browses it. + +#### Catalog API Endpoints (Revised) + +All catalog endpoints use **POST** with JSON body (see §3.2.2 for rationale): + +``` +POST /api/plugins/{id}/catalog/ls + Body: { path: ["mydb", "public"], filter: "..." } + Response: { + hierarchy: ["database", "schema", "table"], // from catalog_hierarchy() + effective_hierarchy: ["schema", "table"], // browsable levels (pinned removed) + path: ["mydb", "public"], + nodes: [ + { + name: "users", + node_type: "table", + path: ["mydb", "public", "users"], + metadata: { row_count: 150000, columns: [...] } + }, + { + name: "orders", + node_type: "table", + path: ["mydb", "public", "orders"], + metadata: { row_count: 1200000, columns: [...] 
} + } + ] + } +``` + +#### Tree Rendering with Scope Pinning + +The same source looks different depending on what's pinned: + +**Unpinned (user browses full hierarchy):** +``` +▾ 📂 MySQL — db.example.com (connected) + ▸ 📁 analytics ← database level + ▾ 📁 production ← database level (expanded) + users (150k rows) [⊕] ← table level (leaf) + orders (1.2M rows) [⊕] + products (5k rows) [⊕] + ▸ 📁 staging +``` + +**Pinned to `database=production` (admin or user pre-configured):** +``` +▾ 📂 MySQL — db.example.com / production (connected) + users (150k rows) [⊕] ← table level (leaf, top-level) + orders (1.2M rows) [⊕] + products (5k rows) [⊕] +``` + +**PostgreSQL — pinned to `database=reporting`, schema browsable:** +``` +▾ 📂 PostgreSQL — warehouse.corp / reporting (connected) + ▾ 📁 public ← schema level (now top-level) + monthly_revenue (3k rows) [⊕] + customer_ltv (50k rows) [⊕] + ▸ 📁 internal +``` + +**BigQuery — unpinned:** +``` +▾ 📂 BigQuery — my-gcp-project (connected) + ▾ 📁 sales_dataset ← dataset level + transactions (10M rows) [⊕] + returns (500k rows) [⊕] + ▸ 📁 analytics_dataset +``` + +**Superset — unpinned:** +``` +▾ 📂 Superset — bi.company.com (connected) + ▾ 📊 Q3 Sales Dashboard ← dashboard level + orders_fact (150k rows) [⊕] + product_dim (2k rows) [⊕] + ▸ 📊 Customer Analytics + ▸ 📁 Ungrouped Datasets +``` + +Each expand click triggers a lazy `ls(path)` call — no upfront loading of the entire catalog. The framework computes `effective_hierarchy()` at connection time to know how many levels to render. + +### 3.5 Revised `ExternalDataLoader` Interface + +The full loader interface after the redesign. The catalog API methods (`catalog_hierarchy`, `ls`, `get_metadata`, `test_connection`) have **default implementations** on the base class so loaders can be upgraded incrementally — un-upgraded loaders still work via fallback to `list_tables()`. + +```python +class ExternalDataLoader(ABC): + """Universal data source driver. 
+ + Required interface for all data sources (databases, BI tools, cloud storage). + """ + + # ----- Connection ----- + + @abstractmethod + def __init__(self, params: dict[str, Any]): + """Initialize with connection parameters.""" + pass + + def test_connection(self) -> bool: + """Validate the connection is alive. Used by auth/status. + Default: tries list_tables(). Subclasses should override with + something cheaper (e.g. SELECT 1).""" + ... + + def get_safe_params(self) -> dict[str, Any]: + """Connection params with secrets removed. For metadata storage.""" + ... # existing implementation + + # ----- Catalog (new — all have defaults for backward compat) ----- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + """Declare the *full* hierarchy of this data source. + + Each entry: {"key": "database", "label": "Database"} + Last level is always the importable leaf (table/dataset/file). + Default: [{"key": "table", "label": "Table"}] (flat). + """ + return [{"key": "table", "label": "Table"}] + + def effective_hierarchy(self) -> list[dict[str, str]]: + """Browsable hierarchy — full minus pinned levels. + A level is pinned when its key matches a non-empty connection param.""" + ... + + def pinned_scope(self) -> dict[str, str]: + """Return {level_key: value} for every pinned hierarchy level.""" + ... + + def ls( + self, + path: list[str] | None = None, + filter: str | None = None, + ) -> list[CatalogNode]: + """List children at a catalog path (like `ls` in a filesystem). + + path is relative to the effective (unpinned) hierarchy. + Returns CatalogNode with node_type "namespace" or "table". + Default: falls back to list_tables() at the root level. + """ + ... + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + """Get detailed metadata for a node (columns, row count, sample rows). + Default: finds the node via ls() and returns its metadata dict.""" + ... 
+ + # ----- Flat listing (always available) ----- + + @abstractmethod + def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: + """List all accessible tables within the pinned scope (flat/eager). + + The simple, complete way to see everything the user can access. + Potentially slow for large catalogs — ls() is the lazy alternative. + Both coexist permanently; ls() falls back to this by default.""" + pass + + # ----- Data Fetching ----- + + @abstractmethod + def fetch_data_as_arrow( + self, + source_table: str, + import_options: dict | None = None, + ) -> pa.Table: + """Fetch data from the external source as a PyArrow Table. + + import_options is a single extensible dict: + - size (int): row limit (default: 1000000) + - columns (list[str]): column projection + - sort_columns (list[str]): ordering + - sort_order (str): 'asc' or 'desc' + - filters (list[dict]): standard SPJ filters + - source_filters (dict): BI-tool-specific filters (from CatalogNode.metadata) + """ + pass + + def fetch_preview( + self, + source_table: str, + import_options: dict | None = None, + ) -> pa.Table: + """Fetch a small preview. Default: delegates to fetch_data_as_arrow. + + Loaders can override for efficiency (e.g., TABLESAMPLE). + """ + opts = {"size": 10, **(import_options or {})} + return self.fetch_data_as_arrow( + source_table=source_table, import_options=opts + ) + + def fetch_data_as_dataframe(self, source_table: str, import_options: dict | None = None) -> pd.DataFrame: + """Convenience wrapper. Calls fetch_data_as_arrow().to_pandas().""" + return self.fetch_data_as_arrow(source_table=source_table, import_options=import_options).to_pandas() + + def ingest_to_workspace(self, workspace, table_name, source_table, import_options=None): + """Fetch → Arrow → Parquet in workspace.""" + ... 
# existing implementation + + # ----- Metadata / Config ----- + + @staticmethod + @abstractmethod + def list_params() -> list[dict[str, Any]]: + """Connection parameters (for auto-generated connection form).""" + pass + + @staticmethod + @abstractmethod + def auth_instructions() -> str: + """Human-readable setup guide (markdown).""" + pass + + @staticmethod + def auth_mode() -> str: + """'connection' (default) or 'token'. See §6.3.""" + return "connection" + + @staticmethod + def rate_limit() -> dict | None: + """Optional rate limit hints. See §6.3.""" + return None + + @staticmethod + def import_options(table_metadata: dict) -> list[dict] | None: + """Optional import-time options for the import dialog. See §6.3.""" + return None +``` + +**Key design decisions:** +- **`CatalogNode.node_type`** uses `"namespace"` / `"table"` (following the Iceberg REST / Unity Catalog convention), not per-source types like `"database"`, `"schema"`. The hierarchy labels provide the per-source terminology. +- **`list_tables()` is kept permanently** as the flat/eager complement to `ls()`. It returns every importable table in the pinned scope — simple and complete, but potentially slow. `ls()` is the lazy/hierarchical alternative. The default `ls()` falls back to `list_tables()` for loaders that haven't implemented hierarchical browsing. +- **`effective_hierarchy()` and `pinned_scope()`** live on the loader itself (not on `ConnectedDataSource`), since the loader has access to its own `params`. +- **`test_connection()`** has a default implementation, but loaders should override with something lightweight. +- **`import_options`** is a single extensible dict replacing the old scattered `size`/`sort_columns`/`sort_order`/`columns`/`import_context` params. All data-shaping options go through one bag: `size`, `columns`, `sort_columns`, `sort_order`, `filters`, `source_filters`. Loaders extract what they need; unknown keys are ignored. + +## 4. 
Plugin Registration: Config-Driven, Zero Code + +### 4.1 The Insight + +Since every `ExternalDataLoader` is fully self-describing — `list_params()`, `catalog_hierarchy()`, `auth_instructions()`, `auth_mode()` — the framework can auto-register any installed loader as a plugin with **zero Python code**. Users and admins just need to say "enable this loader" and optionally pre-fill some connection params. + +No one should need to touch DF's source code to add a data source. + +### 4.2 Configuration Sources (Priority Order) + +The framework reads plugin config from multiple sources, merged in priority order (higher overrides lower): + +| Priority | Source | Who Uses It | Format | +|----------|--------|-------------|--------| +| 1 (highest) | **Environment variables** | Docker/K8s admins, CI | `DF_SOURCES__{id}__{key}=value` | +| 2 | **Config file** (`data-sources.yml`) | Admins, power users | YAML in project or `~/.data-formulator/` | +| 3 | **UI settings panel** | End users | Saved to workspace config | +| 4 (lowest) | **Auto-discovery** | Default | Any installed loader with deps available | + +### 4.3 Config File: `data-sources.yml` + +A single YAML file declares which data sources are available and how they're pre-configured: + +```yaml +# ~/.data-formulator/data-sources.yml (user-level) +# or ./data-sources.yml (project-level) +# or /etc/data-formulator/data-sources.yml (system-level) + +sources: + # Minimal: just enable a loader by its registry key + - type: postgresql + + # With pre-filled connection params (scope pinning) + - type: mysql + name: "Analytics DB" # custom display name (optional) + icon: mysql # icon key (optional, defaults from loader) + params: + host: db.internal.corp + port: 3306 + database: analytics # pinned — user only sees tables in this DB + + # Multiple instances of the same loader type + - type: postgresql + name: "Production Warehouse" + params: + host: warehouse.corp + port: 5432 + database: prod + + - type: postgresql + name: "Staging" 
+ params: + host: staging.corp + database: staging + + # BI tool + - type: superset + name: "Company Superset" + params: + url: https://bi.company.com + + # Cloud + - type: bigquery + params: + project: my-gcp-project + + # Kusto with Azure AD + - type: kusto + name: "Telemetry Cluster" + params: + kusto_cluster: https://telemetry.kusto.windows.net + +# Optional: disable auto-discovery (only show explicitly configured sources) +auto_discover: false +``` + +**Key design decisions:** +- `type` maps to the loader registry key (e.g., `"postgresql"` → `PostgreSQLDataLoader`) +- Same `type` can appear multiple times → solves the multi-instance problem (Q2) +- `params` pre-fills connection fields — the user only sees what's left +- Sensitive params (`password`, `token`) should use env var references: `password: ${PG_PASSWORD}` + +### 4.4 Environment Variables + +For Docker / CI / Kubernetes deployments where YAML isn't convenient: + +```bash +# Enable PostgreSQL with pre-configured host +DF_SOURCES__pg_prod__type=postgresql +DF_SOURCES__pg_prod__name="Production DB" +DF_SOURCES__pg_prod__params__host=db.internal.corp +DF_SOURCES__pg_prod__params__database=analytics +DF_SOURCES__pg_prod__params__port=5432 + +# Enable Superset +DF_SOURCES__superset__type=superset +DF_SOURCES__superset__params__url=https://bi.company.com + +# Disable auto-discovery +DF_AUTO_DISCOVER_SOURCES=false +``` + +Convention: `DF_SOURCES__{instance_id}__{key}` with `__` as separator (avoids conflict with dots/dashes in names). 
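
Parsing this convention is a small fold over the environment: split on `__`, take the first segment as the instance id, and descend for nested keys like `params__host`. A sketch (hypothetical helper, not the shipped parser; note that values stay strings here — type coercion would follow from the loader's `list_params()` types):

```python
def parse_source_env(environ: dict) -> dict:
    """Fold DF_SOURCES__{instance}__{key}[__{subkey}] vars into nested config."""
    prefix = "DF_SOURCES__"
    sources: dict = {}
    for name, value in environ.items():
        if not name.startswith(prefix):
            continue
        parts = name[len(prefix):].split("__")
        if len(parts) < 2:
            continue                       # need at least instance + key
        instance, *keys = parts
        node = sources.setdefault(instance, {})
        for key in keys[:-1]:              # descend (e.g. params -> host)
            node = node.setdefault(key, {})
        node[keys[-1]] = value             # values remain strings at this stage
    return sources

env = {
    "DF_SOURCES__pg_prod__type": "postgresql",
    "DF_SOURCES__pg_prod__name": "Production DB",
    "DF_SOURCES__pg_prod__params__host": "db.internal.corp",
    "DF_SOURCES__pg_prod__params__database": "analytics",
    "DF_SOURCES__superset__type": "superset",
    "PATH": "/usr/bin",                    # unrelated vars are ignored
}
cfg = parse_source_env(env)
print(cfg["pg_prod"]["params"]["host"])    # → db.internal.corp
print(sorted(cfg))                         # → ['pg_prod', 'superset']
```

The `__` separator is what makes this fold unambiguous — instance ids and param names may contain single underscores but never doubled ones.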
+ +### 4.5 Auto-Discovery (Default Behavior) + +When no config file or env vars are set, the framework **auto-discovers** all installed loaders: + +```python +def discover_sources(app): + """Auto-register every installed ExternalDataLoader as a ConnectedDataSource plugin.""" + for key, loader_class in DATA_LOADERS.items(): + # DATA_LOADERS is the existing registry from data_loader/__init__.py + # Only contains loaders whose pip dependencies are installed + plugin = ConnectedDataSource.from_loader(loader_class, source_id=key) + register_plugin(app, plugin) + + # Log disabled loaders (missing deps) + for key, reason in DISABLED_LOADERS.items(): + logger.info(f"Source '{key}' not available: {reason}") +``` + +With auto-discovery, a fresh DF install with `pymysql` installed automatically shows "MySQL" in the data source panel — no config needed. The user fills in host/user/password at connect time. + +### 4.6 Auth: Admin-Configured vs. User-Provided + +The config `params` and the loader's `list_params()` together determine what the user sees at connect time. Each param falls into one of three categories: + +| Category | Where it comes from | User sees it? | Example | +|----------|-------------------|--------------|---------| +| **Admin-fixed** | YAML `params` or env var | No — hidden, pre-filled | `host: db.internal.corp` | +| **Admin-defaulted** | YAML `params` with `user_editable: true` | Yes — pre-filled but editable | `port: 5432` | +| **User-provided** | Not in config; loader declares it in `list_params()` | Yes — empty, must fill in | `user`, `password` | + +#### Scenario 1: Admin provides infra, user provides credentials + +The most common enterprise setup. 
Admin locks down the server, user brings their own identity: + +```yaml +# data-sources.yml +sources: + - type: postgresql + name: "Analytics DB" + params: + host: warehouse.corp + port: 5432 + database: analytics +``` + +The user's connect form only shows what's **not** in config: + +``` +┌─ Connect to Analytics DB ──────────────────┐ +│ │ +│ ⓘ Server: warehouse.corp:5432/analytics │ ← info only, not editable +│ │ +│ Username: [ ] │ ← user fills in +│ Password: [•••••••• ] │ ← user fills in +│ │ +│ [Cancel] [Connect] │ +└─────────────────────────────────────────────┘ +``` + +#### Scenario 2: Admin provides everything (shared service account) + +For read-only dashboards or demo deployments. No user interaction needed: + +```yaml +sources: + - type: postgresql + name: "Analytics DB" + auto_connect: true # connect on first access, no form + params: + host: warehouse.corp + database: analytics + user: readonly_svc + password: ${ANALYTICS_DB_PASSWORD} # env var reference — not stored in YAML +``` + +The user clicks "Analytics DB" in the tree → auto-connects immediately. No connect form shown. The password is resolved from the `ANALYTICS_DB_PASSWORD` environment variable at startup. + +#### Scenario 3: User provides everything (auto-discovered) + +No config file. 
The user sees the full connection form: + +``` +┌─ Connect to PostgreSQL ────────────────────┐ +│ Host: [ ] │ +│ Port: [5432 ] │ +│ Username: [ ] │ +│ Password: [•••••••• ] │ +│ Database: [ ] (optional) │ +│ │ +│ [Cancel] [Connect] │ +└─────────────────────────────────────────────┘ +``` + +#### Scenario 4: Token / OAuth sources + +For Kusto (Azure AD), BigQuery (service account), Superset (JWT): + +```yaml +sources: + - type: kusto + name: "Telemetry Cluster" + params: + kusto_cluster: https://telemetry.kusto.windows.net + # No user/password — Kusto uses Azure AD +``` + +The connect form shows whatever the loader's `list_params()` declares — for Kusto that might be an "Authenticate with Azure AD" button that triggers an OAuth redirect. + +#### How `list_params()` drives the form + +The framework computes the connect form at startup: + +```python +def compute_connect_form(loader_class, config_params): + """Determine which params the user needs to fill in.""" + all_params = loader_class.list_params() + form_fields = [] + pinned = {} + + for param in all_params: + if param["name"] in config_params: + # Admin provided this — don't show in form + pinned[param["name"]] = config_params[param["name"]] + else: + # User must provide this + form_fields.append(param) + + return form_fields, pinned +``` + +The result goes into `/api/app-config`: +- `params_form` — fields the user fills in (rendered as the connect form) +- `pinned_params` — values the user can see (as info) but not edit + +#### Credential & Connection Persistence + +Two-level storage, no in-memory tricks: + +| Scope | Where | What | Who manages | +|-------|-------|------|-------------| +| **User connections** | Workspace directory (`workspace/connections/`) | Per-user saved connection params (encrypted) | User, via connect/disconnect | +| **Admin connections** | DF home (`~/.data-formulator/data-sources.yml` or `/etc/data-formulator/`) | Shared/pre-configured sources | Admin, via config file or env vars | + 
+**User connections live in the workspace.** When a user connects to a source, their params (host, user, encrypted password) are saved to `workspace/connections/{source_id}.json`. On next session, the framework reads this file → re-instantiates the loader → user is auto-connected. No vault service, no in-memory pool, no Flask sessions. + +``` +workspace/ + connections/ + pg_prod.json # {"type": "postgresql", "params": {"host": "...", "user": "...", "password": ""}} + superset.json # {"type": "superset", "params": {"url": "...", "username": "...", "token": ""}} + tables/ + users.parquet + orders.parquet + metadata.json +``` + +**Admin connections live in DF home.** The `data-sources.yml` file (§4.3) is read-only for users. Admin-provided params are merged with user-provided params at connect time. + +**Flow:** +1. User submits credentials via connect form +2. Framework validates (instantiate loader, call `test_connection()`) +3. On success: save encrypted params to `workspace/connections/{source_id}.json`, keep loader instance alive for the current process +4. On next session/restart: read saved connections → re-instantiate loaders on first access (lazy) +5. On disconnect: delete the connection file, close loader + +**Loader instances** are created on-demand and cached in-process for the duration of the server process — this is just normal Python object lifecycle, not a special pool. If the process restarts, the saved connection file lets us recreate the loader transparently. + +**Encryption:** Passwords and tokens are encrypted at rest using a per-workspace key (or a key derived from the user's session secret). The framework decrypts on read, never exposes in API responses. + +For **admin-provided credentials** (`auto_connect: true`), the connection file is pre-populated from config at startup — the user never needs to connect manually. 
+ +### 4.7 UI Settings Panel (Future) + +End users can add/remove sources from the DF UI: + +``` +┌─ Settings → Data Sources ───────────────────────────┐ +│ │ +│ Configured Sources: │ +│ ┌──────────────────────┬────────────┬─────────┐ │ +│ │ Name │ Type │ Status │ │ +│ ├──────────────────────┼────────────┼─────────┤ │ +│ │ Production DB │ PostgreSQL │ ● Ready │ │ +│ │ Company Superset │ Superset │ ● Ready │ │ +│ │ Telemetry Cluster │ Kusto │ ○ No dep│ │ +│ └──────────────────────┴────────────┴─────────┘ │ +│ │ +│ [+ Add Source] │ +│ │ +│ Available Source Types: │ +│ PostgreSQL, MySQL, BigQuery, Kusto, S3, MongoDB, │ +│ MSSQL, Azure Blob, Superset │ +│ │ +└──────────────────────────────────────────────────────┘ +``` + +### 4.8 How It Works Internally + +At startup, the framework: + +1. **Scan** `DATA_LOADERS` registry → all installed loader classes +2. **Read** config sources (env vars → YAML → UI settings) → merge +3. **For each configured source** (or auto-discovered loader): + - Resolve the `ExternalDataLoader` class from `type` + - Create a `ConnectedDataSource` instance with pre-filled `params` + - Generate Flask Blueprint with auth/catalog/data routes + - Register frontend module (generic `ConnectedSourcePanel`) +4. 
**Serve** `/api/app-config` with the list of enabled sources + +```python +# Internal — no user code needed +def register_sources(app): + config = load_source_config() # merge env + yaml + UI settings + + for source_spec in config.sources: + loader_class = DATA_LOADERS.get(source_spec.type) + if not loader_class: + logger.warn(f"Unknown source type: {source_spec.type}") + continue + + plugin = ConnectedDataSource.from_loader( + loader_class, + source_id=source_spec.id, # auto-generated or from config + display_name=source_spec.name, # optional custom name + default_params=source_spec.params, # pre-filled connection params + icon=source_spec.icon, + ) + register_plugin(app, plugin) +``` + +### 4.9 Frontend: No Per-Source Registration Needed + +Since all `ConnectedDataSource` plugins use the same generic `ConnectedSourcePanel`, the frontend doesn't need per-source modules either. The backend's `/api/app-config` tells the frontend what sources are available: + +```json +{ + "SOURCES": [ + { + "id": "pg_prod", + "type": "postgresql", + "name": "Production DB", + "icon": "postgresql", + "params_form": [ + {"name": "user", "type": "string", "required": true}, + {"name": "password", "type": "password", "required": true} + ], + "pinned_params": {"host": "db.internal.corp", "database": "analytics"}, + "hierarchy": [{"key": "schema", "label": "Schema"}, {"key": "table", "label": "Table"}] + }, + { + "id": "superset", + "type": "superset", + "name": "Company Superset", + "icon": "superset", + "params_form": [ + {"name": "username", "type": "string", "required": true}, + {"name": "password", "type": "password", "required": true} + ], + "pinned_params": {"url": "https://bi.company.com"}, + "hierarchy": [{"key": "dashboard", "label": "Dashboard"}, {"key": "dataset", "label": "Dataset"}] + } + ] +} +``` + +The frontend renders one `ConnectedSourcePanel` per source in the `SOURCES` list — each with its own connection form, tree hierarchy, and icon. 
**Zero frontend code per source.** + +## 5. Frontend: Generic `ConnectedSourcePanel` + +### 5.1 Shared UI for All Database-Type Sources + +Instead of writing a custom React panel per data source, `ConnectedDataSource` plugins share a single generic panel: + +```typescript +// src/plugins/_shared/ConnectedSourcePanel.tsx + +interface ConnectedSourcePanelProps { + pluginId: string; + config: PluginConfig; + callbacks: PluginHostCallbacks; +} + +function ConnectedSourcePanel({ pluginId, config, callbacks }: ConnectedSourcePanelProps) { + // State machine: disconnected → connecting → connected → browsing → importing + + // 1. If not connected: show connection form (auto-generated from list_params) + // 2. If connected: show table browser (tree view with groups/schemas) + // 3. On table select: show detail + preview + import button + // 4. On import: optional filter dialog (if large) → load → notify host +} +``` + +### 5.2 Auto-Generated Connection Form + +The connection form is generated from `ExternalDataLoader.list_params()`: + +```typescript +// list_params() returns: +[ + { name: "host", type: "string", required: true, default: "localhost", description: "Database host" }, + { name: "port", type: "number", required: true, default: 5432, description: "Port" }, + { name: "user", type: "string", required: true, description: "Username" }, + { name: "password", type: "password", required: true, description: "Password" }, + { name: "database", type: "string", required: true, description: "Database name" }, +] + +// Renders as: +┌─ Connect to PostgreSQL ────────────────┐ +│ Host: [localhost ] │ +│ Port: [5432 ] │ +│ User: [ ] │ +│ Password: [•••••••• ] │ +│ Database: [ ] │ +│ │ +│ [Cancel] [Connect] │ +└─────────────────────────────────────────┘ +``` + +### 5.3 Table Browser + +Once connected, the table browser uses the unified tree from [design-doc #8](8-unified-data-source-panel.md): + +``` +▾ 📂 PostgreSQL — analytics-db (connected) + ▾ 📁 public + users (150k rows) [⊕] 
[↻] + orders (1.2M rows) [⊕] [↻] + products (5k rows) [⊕] [↻] + ▸ 📁 staging + ▸ 📁 analytics +``` + +- **[⊕]** = Import to workspace +- **[↻]** = Refresh (only shown for already-imported tables) + +### 5.4 Frontend Plugin Registration + +No per-source frontend code needed. The backend's `/api/app-config` response (see §4.9) tells the frontend what sources exist and what their connection forms / hierarchy look like. One generic `ConnectedSourcePanel` handles all of them. + +The frontend factory is only needed once, in the shared module: + +```typescript +// src/plugins/_shared/ConnectedSourcePanel.tsx +// Handles ALL connected data sources — databases, BI tools, cloud storage +// Reads source config from /api/app-config → SOURCES[] +// Renders: connection form (from params_form) → tree browser (from hierarchy) → import +``` + +## 6. Full Unification: BI Tools as Data Loaders + +Since DF only **consumes** data, both databases and BI tools serve the same role: hierarchical sources of importable tables. We unify them under the same `ConnectedDataSource` model. 
+ +### 6.1 Architecture (Unified) + +``` + ConnectedDataSource (generic lifecycle wrapper) + | + ┌────────────┼────────────────┐ + │ │ │ + Database Loaders Cloud Loaders BI Tool Loaders + ┌────┬────┐ ┌────┬────┐ ┌─────────┬──────────┐ + MySQL PG MSSQL BQ Kusto S3 Superset Metabase Grafana +``` + +**Everything is a loader.** Superset becomes a `SupersetLoader(ExternalDataLoader)` that: +- Connects via JWT instead of host/password +- Exposes `catalog_hierarchy() → [{"key":"dashboard","label":"Dashboard"}, {"key":"dataset","label":"Dataset"}]` +- Returns `CatalogNode(node_type="namespace", ...)` for dashboards (expandable containers) +- Returns datasets as `CatalogNode(node_type="table", ...)` leaf nodes with optional pre-applied filters + +### 6.2 How Superset Migrates + +```python +class SupersetLoader(ExternalDataLoader): + """Treats Superset as a hierarchical data source.""" + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "dashboard", "label": "Dashboard"}, + {"key": "dataset", "label": "Dataset"}, + ] + + def ls(self, path=None, filter=None) -> list[CatalogNode]: + path = path or [] + if not path: # root → list dashboards + "Ungrouped Datasets" + dashboards = self.client.list_dashboards(self.token) + return [ + CatalogNode(name=d["title"], node_type="namespace", + path=[d["title"]]) + for d in dashboards + ] + [CatalogNode(name="Ungrouped Datasets", node_type="namespace", + path=["Ungrouped Datasets"])] + + if len(path) == 1: # dashboard → list its datasets + datasets = self.client.get_dashboard_datasets(self.token, path[0]) + return [ + CatalogNode( + name=ds["name"], node_type="table", + path=[path[0], ds["name"]], + metadata={"row_count": ds["count"], "filters": ds.get("filters")}, + ) + for ds in datasets + ] + return [] + + def fetch_data_as_arrow(self, source_table, size=100000, **kwargs) -> pa.Table: + # source_table = dataset ID; executes SQL via Superset's SQL Lab + return 
self.client.execute_sql_as_arrow(self.token, source_table, size) + + @staticmethod + def list_params() -> list[dict]: + return [ + {"name": "url", "type": "string", "required": True, "description": "Superset URL"}, + {"name": "username", "type": "string", "required": True}, + {"name": "password", "type": "password", "required": True}, + ] + +# Plugin registration — same one-liner as databases: +plugin_class = create_connected_data_source(SupersetLoader, "superset", "Superset", icon="superset") +``` + +The rich Superset-specific features (dashboard filters, column metadata, etc.) are expressed as **metadata on `CatalogNode`** rather than as a separate plugin architecture. + +### 6.3 Critical Differences to Be Aware Of + +Unification is the right call, but these differences must be handled in the `ConnectedDataSource` framework: + +#### 1. Auth Model Diversity + +| Source Type | Auth Mechanism | Token Lifecycle | +|-------------|---------------|------------------| +| MySQL, PG, MSSQL | Connection params (host/user/password) | Connection object — alive until closed | +| Kusto, BigQuery | OAuth / service account token | Expires, needs refresh | +| Superset, Metabase | JWT (username/password → token) | Expires, needs refresh | +| Grafana | API key | Long-lived, no refresh | + +**Solution:** The `ConnectedDataSource` auth layer must support both: +- **Persistent connection** mode (databases): store connection object in session, reconnect on failure +- **Token** mode (BI tools, cloud): store token in session, auto-refresh on expiry + +The loader declares which mode it uses: +```python +class ExternalDataLoader(ABC): + @staticmethod + def auth_mode() -> str: + """'connection' (default) or 'token'.""" + return "connection" +``` + +#### 2. Catalog Node Semantics: Import vs. Import-With-Context + +Database tables are **context-free** — `SELECT * FROM users` means the same thing regardless of how you navigated to it. 
But BI tool datasets can carry **context from their parent**: + +``` +Superset: + 📊 Q3 Sales Dashboard + orders_fact → import with dashboard's date filter pre-applied + 📁 Ungrouped Datasets + orders_fact → import raw, no filters +``` + +The same leaf ("orders_fact") means different things depending on which parent you expanded from. + +**Solution:** `CatalogNode.metadata` carries the context: +```python +@dataclass +class CatalogNode: + name: str + node_type: str # "namespace" or "table" + path: list[str] + metadata: dict | None = None # <-- includes import_context + # e.g., metadata = { + # "filters": [{"column": "date", "op": ">=", "value": "2025-07-01"}], + # "description": "Filtered by Q3 Sales Dashboard", + # } +``` + +When importing, the framework passes `metadata` to the loader, which can apply filters server-side. Databases ignore this (no filters in metadata). BI tools use it. + +#### 3. Data Freshness & Caching + +| Source | Data Freshness | Caching Behavior | +|--------|---------------|------------------| +| Database | Live — query returns current state | No source-side cache; DF caches catalog metadata only | +| BI tool | May have source-side cache (Superset caches query results) | Catalog may be stale; need cache-bust option | + +**Solution:** `CatalogNode.metadata` can include `cached_at` / `cache_ttl` hints. The tree UI shows a staleness indicator and offers a "refresh catalog" action per source. + +#### 4. Rate Limiting & Quotas + +BI tools often have API rate limits (Superset: N requests/minute). Databases have connection limits but no per-query throttling. + +**Solution:** Loaders can declare rate limit hints: +```python +class ExternalDataLoader(ABC): + @staticmethod + def rate_limit() -> dict | None: + """Optional rate limit hints. None = no limit.""" + return None # or {"requests_per_minute": 60, "concurrent": 5} +``` +The `ConnectedDataSource` framework uses this to throttle catalog expansion and data loads. + +#### 5. 
Import Filtering: Standard SPJ + Source-Defined Filters + +Large datasets need filtering before import. There are two layers: + +**Layer 1: Standard SPJ (Select-Project-Join) — all sources get this for free** + +The framework provides a built-in filter UI for every data source, regardless of type: + +``` +┌─ Import: orders (1.2M rows) ───────────────────────────┐ +│ │ +│ Columns (select): │ +│ ☑ order_id ☑ customer_id ☑ amount │ +│ ☑ region ☐ internal_id ☐ updated_at │ +│ │ +│ Filters (where): │ +│ ┌──────────────┬─────┬──────────────────────┐ │ +│ │ region │ IN │ [US, EU] │ │ +│ │ amount │ >= │ [100] │ │ +│ │ order_date │ >= │ [2025-01-01] │ │ +│ │ │ │ [+ Add filter] │ │ +│ └──────────────┴─────┴──────────────────────┘ │ +│ │ +│ Sort by: [order_date ▾] [desc ▾] │ +│ Row limit: [50000 ] │ +│ │ +│ Estimated rows: ~38,000 │ +│ │ +│ [Cancel] [Import] │ +└─────────────────────────────────────────────────────────┘ +``` + +This UI is **auto-generated from column metadata** (`get_metadata()` returns column names and types). The framework builds the SQL WHERE clause server-side via a safe, parameterized filter DSL — no raw SQL from the user. + +The filter DSL: +```python +# Sent in data/import request body +{ + "source_table": ["production", "public", "orders"], + "columns": ["order_id", "customer_id", "amount", "region", "order_date"], + "filters": [ + {"column": "region", "op": "in", "value": ["US", "EU"]}, + {"column": "amount", "op": ">=", "value": 100}, + {"column": "order_date", "op": ">=", "value": "2025-01-01"} + ], + "sort_columns": ["order_date"], + "sort_order": "desc", + "size": 50000 +} +``` + +The loader receives this in `import_context` and translates to the source's query language (SQL WHERE, Kusto where, S3 Select, etc.). Each loader handles its own dialect safely. 
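For a SQL-dialect loader, the translation can be sketched as below — the operator whitelist, identifier check, and qmark placeholders are illustrative choices, not a prescribed implementation:

```python
_ALLOWED_OPS = {"=", "!=", ">", ">=", "<", "<=", "in"}

def compile_filters(filters: list[dict]) -> tuple[str, list]:
    """Compile the filter DSL into a parameterized SQL WHERE clause.

    Column names are identifier-checked and operators whitelisted, so no
    user-controlled text is ever spliced into the SQL string; values travel
    separately as bind parameters (qmark style here)."""
    clauses, params = [], []
    for f in filters:
        col, op, value = f["column"], f["op"], f["value"]
        if not col.isidentifier():
            raise ValueError(f"invalid column name: {col!r}")
        if op not in _ALLOWED_OPS:
            raise ValueError(f"unsupported operator: {op!r}")
        if op == "in":
            placeholders = ", ".join("?" for _ in value)
            clauses.append(f'"{col}" IN ({placeholders})')
            params.extend(value)
        else:
            clauses.append(f'"{col}" {op} ?')
            params.append(value)
    where = " AND ".join(clauses) if clauses else "1=1"  # no filters → match all
    return where, params
```

A dialect-specific loader would only swap the placeholder style and quoting rules; the validation logic stays shared.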
+ +**Layer 2: Source-defined filters — for BI tools and curated datasets** + +Some sources provide **pre-defined filter sets** created by the data source owner (e.g., Superset dashboard native filters, Metabase question parameters). These appear as additional interactive controls above the standard SPJ filters: + +``` +┌─ Import: orders_fact (from Q3 Sales Dashboard) ────────┐ +│ │ +│ Dashboard Filters (pre-defined by source): │ +│ ┌──────────────────────────────────────────────┐ │ +│ │ Quarter: [Q3 2025 ▾] │ │ +│ │ Region: [☑ US ☑ EU ☐ APAC ☐ LATAM] │ │ +│ │ Product: [All ▾] │ │ +│ └──────────────────────────────────────────────┘ │ +│ │ +│ Additional Filters (standard): │ +│ ┌──────────────┬─────┬──────────────────────┐ │ +│ │ amount │ >= │ [100] │ │ +│ │ │ │ [+ Add filter] │ │ +│ └──────────────┴─────┴──────────────────────┘ │ +│ │ +│ Columns: ☑ order_id ☑ customer_id ☑ amount ... │ +│ Row limit: [50000] │ +│ │ +│ [Cancel] [Import] │ +└─────────────────────────────────────────────────────────┘ +``` + +Source-defined filters come from `CatalogNode.metadata`: +```python +# CatalogNode for "orders_fact" under "Q3 Sales Dashboard" +CatalogNode( + name="orders_fact", + node_type="table", + path=["Q3 Sales Dashboard", "orders_fact"], + metadata={ + "row_count": 150000, + "source_filters": [ + { + "name": "Quarter", "column": "quarter", + "type": "select", "options": ["Q1 2025", "Q2 2025", "Q3 2025", "Q4 2025"], + "default": "Q3 2025" + }, + { + "name": "Region", "column": "region", + "type": "multi_select", "options": ["US", "EU", "APAC", "LATAM"], + "default": ["US", "EU"] + }, + { + "name": "Product", "column": "product_category", + "type": "select", "options_endpoint": "/filter-values", # lazy-loaded + "default": "All" + } + ] + } +) +``` + +The import dialog renders both layers. 
The combined request: +```python +{ + "source_table": ["Q3 Sales Dashboard", "orders_fact"], + "columns": ["order_id", "customer_id", "amount"], + "filters": [ # standard SPJ filters (layer 1) + {"column": "amount", "op": ">=", "value": 100} + ], + "import_context": { # source-defined filters (layer 2) + "source_filters": [ + {"column": "quarter", "value": "Q3 2025"}, + {"column": "region", "value": ["US", "EU"]}, + {"column": "product_category", "value": "All"} + ] + }, + "size": 50000 +} +``` + +The loader applies both: source-defined filters first (they define the base dataset), then standard SPJ filters on top (user refinement). + +**How loaders declare filter support:** + +```python +class ExternalDataLoader(ABC): + @staticmethod + def supports_standard_filters() -> bool: + """Whether this loader can apply SPJ filters server-side. + + True → framework sends filters in import_context, loader builds WHERE clause + False → framework fetches all data, applies filters client-side (slower) + Default: True for SQL databases, loaders can override. + """ + return True +``` + +For sources that can't filter server-side (e.g., some REST APIs), the framework falls back to client-side filtering after fetch — less efficient but still works. + +## 7. Migration Plan + +### Phase 1: Core Framework + Loader Upgrade + +1. ✅ Add `CatalogNode` dataclass (`"namespace"` / `"table"`) and new base methods on `ExternalDataLoader`: `catalog_hierarchy()`, `effective_hierarchy()`, `pinned_scope()`, `ls()`, `get_metadata()`, `test_connection()`, `auth_mode()`, `rate_limit()` — all with default implementations so existing loaders keep working +2. 
✅ **Upgrade all 9 loaders** to override the new methods: + - MySQL, PostgreSQL, MSSQL: `catalog_hierarchy()`, `ls()`, `get_metadata()`, `test_connection()` — database param made optional for scope pinning + - BigQuery: project always pinned (required), dataset_id optional — 3-level hierarchy + - Kusto: kusto_database made optional — 2-level hierarchy + - Athena: database already optional — 2-level hierarchy + - MongoDB: database required, collection is scope param — 2-level hierarchy + - S3, Azure Blob: bucket/container required (can't list safely) — 2-level hierarchy +3. ✅ Unify `fetch_data_as_arrow()` signature: replace `size`/`sort_columns`/`sort_order` positional params with single `import_options: dict` — extensible for `columns`, `filters`, `source_filters`. All 9 loaders, callers, and tests updated. Renamed `loader_metadata` → `source_info`. Removed pandas from PG/MySQL/MSSQL query path (cursor + `pa.table()` directly). `import_options` stored in workspace metadata for refresh replay. +4. ✅ Implement `ConnectedDataSource` base class with generic auth/catalog/data routes — auto-registers all 10 loaders at startup (90 routes under `/api/sources/{id}/`), exposes `SOURCES` in `/api/app-config` +5. ✅ Implement `SupersetLoader(ExternalDataLoader)` — JWT-based auth (`auth_mode="token"`), dashboard→dataset hierarchy, SQL Lab data fetch. Registered as 10th loader, auto-wrapped by `ConnectedDataSource` with 9 routes. +6. ✅ Implement config-driven registration — `data-sources.yml` (searched in `DATA_FORMULATOR_HOME`, cwd, `~/.data-formulator/`, `/etc/`), env vars (`DF_SOURCES__id__key`), `${ENV_REF}` resolution, `auto_discover: false` to restrict to configured sources only. Multiple instances of same type supported. +7. ✅ Integrate `ConnectedDataSource` into frontend — `SOURCES` from `/api/app-config` rendered in `DBManagerPane` sidebar alongside legacy loaders. `DataLoaderForm` accepts optional `connectedSourceId` to route through `/api/sources/{id}/*`. 
`loadTable` thunk updated to support connected source import. Zero new components — reuses existing form/table UI.

### Phase 2: Integration Testing

8. Test database loaders end-to-end: PostgreSQL, MySQL via auto-discovery and `data-sources.yml` config
   - Connect → browse hierarchy → scope pinning → import with SPJ filters → refresh → disconnect → reconnect from saved credentials
9. Test `SupersetLoader` end-to-end: dashboard → dataset hierarchy, source-defined filters, SSO auth
10. Deprecate old hand-written `SupersetPlugin(DataSourcePlugin)`
11. Verify remaining loaders via auto-discovery: Kusto, BigQuery, MSSQL, MongoDB, S3, Azure Blob

### Phase 3: Cleanup + Unified Panel

12. Remove `DataSourcePlugin` base class, `plugins/` directory, and per-plugin `__init__.py` files
13. Integrate with unified data source panel ([doc #8](8-unified-data-source-panel.md))
14. Old `/api/db-manager/load-table` endpoint → deprecation path

### Phase 4: Advanced Features

15. Scheduled refresh (periodic re-fetch)
16. Incremental refresh (append-only for time-series data)
17. Connection sharing in team deployments (admin-managed connections)
18. Cross-database queries (join tables from different databases in tree)
19. Metabase / Grafana loaders

## 8. Open Questions

### Q1: What happens to `DataSourcePlugin` and the `plugins/` directory?

They go away after migration.
The auth and lifecycle components move into DF core: + +``` +py-src/data_formulator/ + auth/ ← NEW: DF's auth layer (extracted from plugins/) + __init__.py + credentials.py ← encrypt/decrypt passwords & tokens at rest + connection_store.py ← read/write workspace/connections/{id}.json + token_manager.py ← token refresh, expiry checking (for token-mode sources) + sso.py ← SSO/OIDC provider (AuthProvider, extracted from plugins/) + data_loader/ ← EXISTING: all ExternalDataLoader subclasses + external_data_loader.py ← revised interface (§3.5) + mysql_data_loader.py + postgresql_data_loader.py + ... + connected_source.py ← NEW: ConnectedDataSource framework + (route generation, form computation, lifecycle) + plugins/ ← REMOVED after Phase 3 +``` + +Post-migration architecture: + +``` +ExternalDataLoader (driver) ← each source type implements this + ↓ +ConnectedDataSource (framework) ← generic lifecycle wrapper, one implementation + ↓ uses auth/ for credentials, tokens, SSO +data-sources.yml / auto-discovery ← config, not code +``` + +There are no "plugins" anymore — just **loaders** (the driver layer) and the **framework** (the lifecycle layer). The `plugins/` directory, `DataSourcePlugin` base class, and per-plugin `__init__.py` files are all removed. + +**What's reused** from the current plugin system (relocated to `auth/`): +- Credential encryption patterns → `auth/credentials.py` +- Session helpers, token refresh logic → `auth/token_manager.py` +- SSO bridge patterns (for token passthrough) → `auth/sso.py` +- Workspace connection persistence → `auth/connection_store.py` (new) +- Error isolation (one source failure doesn't crash others) — stays in framework +- Frontend error boundaries — stays in frontend + +### Q2: Multiple connections to the same source type? 
**Solved by config.** Users list multiple entries with the same `type` in `data-sources.yml`:

```yaml
sources:
  - type: postgresql
    name: "Production"
    params: { host: prod.corp, database: prod }
  - type: postgresql
    name: "Staging"
    params: { host: staging.corp, database: staging }
```

Each becomes a separate entry in the data source tree. No code changes needed.

### Q3: How deep should hierarchical browsing go?

Different sources have different depths:

| Source | Levels | Example |
|--------|--------|---------|
| MySQL | 2 | `database → table` |
| PostgreSQL | 3 | `database → schema → table` |
| BigQuery | 3 | `project → dataset → table` |
| Kusto | 2 | `database → table` |
| S3 | 2+ | `bucket → prefix → ... → object` (variable depth) |

**Recommendation:** Each loader declares its hierarchy via `catalog_hierarchy()`. The tree UI renders whatever depth the loader declares. S3-style variable depth can be handled by repeating level types (e.g., `["bucket", "prefix", "prefix", "object"]` or a special "recursive" marker).

### Q4: How do column selection and filtering interact with the loader?

The original `fetch_data_as_arrow(source_table, size, ...)` signature supported neither column selection nor arbitrary WHERE clauses. Two extensions were needed:

- **Column selection:** Pass `columns` so loaders build `SELECT col1, col2 FROM ...`
- **Server-side filtering:** A filter DSL, since raw SQL passthrough is unsafe and dialects differ.

**Recommendation (resolved):** Both are covered by the unified `import_options` dict (Phase 1, item 3) and the parameterized filter DSL (§6.3, #5) — loaders receive `columns` and `filters` and translate them to their own dialect server-side; loaders that can't filter server-side fall back to client-side filtering after fetch.

### Q5: What about token-based auth (Kusto, BigQuery)?

Some data sources use OAuth/service accounts, not username/password. The `list_params()` already handles this — BigQuery asks for a service account JSON, Kusto uses Azure AD tokens.
+ +The `ConnectedDataSource` auth layer should support: +- **Password mode** (MySQL, PostgreSQL, MSSQL): user/password fields +- **Token/key mode** (BigQuery, Kusto): API key or token file +- **OAuth mode** (future): redirect-based auth flow + +`list_params()` already declares the param types — the generic connection form renders whatever the loader needs. + +### Q6: Should the old `db-manager` endpoints remain? + +The existing `POST /api/db-manager/load-table` is a stateless, one-shot endpoint. Once `ConnectedDataSource` plugins exist, it's redundant. But we should keep it for backward compatibility and deprecate it gradually. + +``` +Phase 1-2: Both endpoints work +Phase 3: /api/db-manager/* shows deprecation warning in logs +Phase 4: Remove (or keep as thin wrapper that delegates to plugin) +``` + +## 9. Summary + +**The generalized plugin library unifies databases and BI tools into one model:** + +``` +ExternalDataLoader (data protocol: how to connect, browse, fetch) + + +ConnectedDataSource (lifecycle mgmt: session, caching, refresh, UI) + = +A full plugin — for databases AND BI tools — for free +``` + +All data sources are **hierarchical trees of `namespace` → `table` nodes**: +- MySQL: `database (namespace) → table` +- PostgreSQL: `database (namespace) → schema (namespace) → table` +- Superset: `dashboard (namespace) → dataset (table)` +- S3: `bucket (namespace) → file (table)` + +The hierarchy labels (what to call each namespace level) come from `catalog_hierarchy()`. **Scope pinning** lets users skip levels they don't need to browse — if you provide `database="prod"` in your connection params, that level is hidden and browsing starts at the next level. + +The five critical differences between databases and BI tools (auth model, contextual import, caching, rate limits, import options) are handled as **optional capabilities** on `ExternalDataLoader` and `CatalogNode.metadata` — not as separate plugin architectures. 
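The scope-pinning rule is simple enough to state as code. A sketch of what `effective_hierarchy()` (named in the Phase 1 list) might compute — an illustration of the rule, not DF's actual method:

```python
def effective_hierarchy(full_hierarchy: list[dict], pinned_params: dict) -> list[dict]:
    """Hide leading hierarchy levels whose value is already pinned by
    connection params, so browsing starts at the first level the user
    still has to pick."""
    levels = list(full_hierarchy)
    while levels and levels[0]["key"] in pinned_params:
        levels.pop(0)
    return levels
```

With `database` pinned for PostgreSQL, the tree starts at `schema`; with nothing pinned, the full `database → schema → table` path is browsable.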
+ +**What plugin authors / admins write:** + +| Scenario | What to do | +|----------|------------| +| Enable an already-installed loader | Add one entry to `data-sources.yml` or set env var | +| Pre-configure a database for all users | Add entry with `params` (host, database, etc.) in YAML or env | +| Multiple connections to same DB type | Add multiple entries with same `type`, different `name` and `params` | +| New loader not yet in DF | Implement `ExternalDataLoader` subclass (~100 lines), `pip install` it | +| BI platform with custom hierarchy | Same as above, implement `ls()` with custom hierarchy (~200 lines) | + +**What users get:** + +- Log into PostgreSQL / Kusto / MySQL / BigQuery / **Superset / Metabase** once → browse hierarchy → import → refresh +- All data sources visible in one unified tree panel +- Consistent experience: same connect → browse → import → refresh loop everywhere +- No re-entering credentials for every data pull diff --git a/py-src/data_formulator/app.py b/py-src/data_formulator/app.py index 8dd6ac1a..e649d6c9 100644 --- a/py-src/data_formulator/app.py +++ b/py-src/data_formulator/app.py @@ -156,6 +156,11 @@ def _register_blueprints(): from data_formulator.plugins import discover_and_register discover_and_register(app) + # Auto-register all installed data loaders as ConnectedDataSource plugins + print(" Loading connected data sources...", flush=True) + from data_formulator.connected_source import register_connected_sources + register_connected_sources(app) + # Register blueprints at module level so WSGI servers (gunicorn) pick up all routes. # The guard inside _register_blueprints() prevents double registration when run via CLI. 
@@ -246,6 +251,14 @@ def get_app_config(): } config["PLUGINS"] = plugins_info + # Expose connected data sources to the frontend + from data_formulator.connected_source import CONNECTED_SOURCES + if CONNECTED_SOURCES: + sources_info: list[dict] = [] + for sid, src in CONNECTED_SOURCES.items(): + sources_info.append(src.get_frontend_config()) + config["SOURCES"] = sources_info + return flask.jsonify(config) diff --git a/py-src/data_formulator/connected_source.py b/py-src/data_formulator/connected_source.py new file mode 100644 index 00000000..b20057ed --- /dev/null +++ b/py-src/data_formulator/connected_source.py @@ -0,0 +1,632 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""ConnectedDataSource — generic lifecycle wrapper for ExternalDataLoader. + +Takes any ``ExternalDataLoader`` class and auto-generates a Flask Blueprint +with auth / catalog / data routes. No per-source code needed. + +Usage:: + + from data_formulator.connected_source import ConnectedDataSource + + plugin = ConnectedDataSource.from_loader( + PostgreSQLDataLoader, + source_id="pg_prod", + display_name="Production DB", + default_params={"host": "db.corp", "database": "prod"}, + ) + app.register_blueprint(plugin.create_blueprint()) +""" + +import dataclasses +import json as _json +import logging +from typing import Any + +from flask import Blueprint, Flask, jsonify, request + +from data_formulator.data_loader.external_data_loader import ( + CatalogNode, + ExternalDataLoader, +) +from data_formulator.plugins.base import DataSourcePlugin + +logger = logging.getLogger(__name__) + +# Registry of enabled ConnectedDataSource instances (populated at startup). 
+CONNECTED_SOURCES: dict[str, "ConnectedDataSource"] = {} + + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + +def _sanitize_error(error: Exception) -> tuple[str, int]: + """Return a safe error message + HTTP status code. + + Never leaks internal details to the client. + """ + logger.error("ConnectedDataSource error", exc_info=error) + msg = str(error).lower() + if "required" in msg or "invalid" in msg: + return "Invalid connection parameters", 400 + if "permission" in msg or "access" in msg: + return "Access denied", 403 + if "connect" in msg or "refused" in msg: + return "Connection failed", 502 + return "An unexpected error occurred", 500 + + +def _node_to_dict(node: CatalogNode) -> dict[str, Any]: + return { + "name": node.name, + "node_type": node.node_type, + "path": node.path, + "metadata": node.metadata, + } + + +def _hierarchy_dicts(levels: list[dict[str, str]]) -> list[dict[str, str]]: + return [{"key": l["key"], "label": l["label"]} for l in levels] + + +# --------------------------------------------------------------------------- +# ConnectedDataSource +# --------------------------------------------------------------------------- + +class ConnectedDataSource(DataSourcePlugin): + """A DataSourcePlugin auto-generated from an ExternalDataLoader. + + Provides: + - **Auth routes**: connect / disconnect / status + - **Catalog routes**: ls / metadata + - **Data routes**: import / refresh / preview + + All driven by the underlying loader's existing methods. 
+ """ + + def __init__( + self, + loader_class: type[ExternalDataLoader], + source_id: str, + display_name: str | None = None, + default_params: dict[str, Any] | None = None, + icon: str | None = None, + ) -> None: + self._loader_class = loader_class + self._source_id = source_id + self._display_name = display_name or source_id + self._default_params = default_params or {} + self._icon = icon or source_id + + # Per-identity loader instances: identity_id → ExternalDataLoader + # In-process cache; cleared on disconnect. + self._loaders: dict[str, ExternalDataLoader] = {} + + # -- Factory ----------------------------------------------------------- + + @classmethod + def from_loader( + cls, + loader_class: type[ExternalDataLoader], + source_id: str, + display_name: str | None = None, + default_params: dict[str, Any] | None = None, + icon: str | None = None, + ) -> "ConnectedDataSource": + return cls( + loader_class=loader_class, + source_id=source_id, + display_name=display_name, + default_params=default_params, + icon=icon, + ) + + # -- DataSourcePlugin interface ---------------------------------------- + + @staticmethod + def manifest() -> dict[str, Any]: + # Static stub; per-instance config is in _manifest(). 
+ return {} + + def _manifest(self) -> dict[str, Any]: + return { + "id": self._source_id, + "name": self._display_name, + "icon": self._icon, + "env_prefix": f"DF_{self._source_id.upper()}", + "required_env": [], + "auth_modes": [self._loader_class.auth_mode()], + "capabilities": ["tables", "catalog", "refresh"], + } + + def get_frontend_config(self) -> dict[str, Any]: + all_params = self._loader_class.list_params() + form_fields: list[dict] = [] + pinned_params: dict[str, Any] = {} + + for param in all_params: + if param["name"] in self._default_params: + pinned_params[param["name"]] = self._default_params[param["name"]] + else: + form_fields.append(param) + + full_hierarchy = self._loader_class.catalog_hierarchy() + effective = [ + level for level in full_hierarchy + if not self._default_params.get(level["key"]) + ] + + return { + "source_id": self._source_id, + "source_type": self._source_id, + "name": self._display_name, + "icon": self._icon, + "params_form": form_fields, + "pinned_params": pinned_params, + "hierarchy": _hierarchy_dicts(full_hierarchy), + "effective_hierarchy": _hierarchy_dicts(effective), + "auth_instructions": self._loader_class.auth_instructions(), + } + + def create_blueprint(self) -> Blueprint: + bp = Blueprint( + f"source_{self._source_id}", + __name__, + url_prefix=f"/api/sources/{self._source_id}", + ) + self._register_auth_routes(bp) + self._register_catalog_routes(bp) + self._register_data_routes(bp) + return bp + + def on_enable(self, app: Flask) -> None: + logger.info("ConnectedDataSource '%s' enabled", self._source_id) + + # -- Identity + Loader Management -------------------------------------- + + @staticmethod + def _get_identity() -> str: + from data_formulator.security.auth import get_identity_id + return get_identity_id() + + def _get_loader(self, identity: str | None = None) -> ExternalDataLoader | None: + identity = identity or self._get_identity() + return self._loaders.get(identity) + + def _connect(self, user_params: 
dict[str, Any]) -> ExternalDataLoader: + """Instantiate a loader with merged params (default + user).""" + merged = {**self._default_params, **user_params} + loader = self._loader_class(merged) + identity = self._get_identity() + self._loaders[identity] = loader + return loader + + def _disconnect(self) -> None: + identity = self._get_identity() + self._loaders.pop(identity, None) + + def _require_loader(self) -> ExternalDataLoader: + loader = self._get_loader() + if loader is None: + raise ValueError("Not connected. Please connect first.") + return loader + + # -- Auth Routes ------------------------------------------------------- + + def _register_auth_routes(self, bp: Blueprint) -> None: + source = self + + @bp.route("/auth/connect", methods=["POST"]) + def auth_connect(): + try: + data = request.get_json() or {} + user_params = data.get("params", {}) + loader = source._connect(user_params) + + if not loader.test_connection(): + source._disconnect() + return jsonify({"status": "error", "message": "Connection test failed"}), 400 + + safe = loader.get_safe_params() + return jsonify({ + "status": "connected", + "params": safe, + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "pinned_scope": loader.pinned_scope(), + }) + except Exception as e: + source._disconnect() + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + @bp.route("/auth/disconnect", methods=["POST"]) + def auth_disconnect(): + source._disconnect() + return jsonify({"status": "disconnected"}) + + @bp.route("/auth/status", methods=["GET"]) + def auth_status(): + loader = source._get_loader() + if loader is None: + return jsonify({ + "connected": False, + "params_form": source.get_frontend_config()["params_form"], + }) + try: + alive = loader.test_connection() + except Exception: + alive = False + if not alive: + source._disconnect() + return jsonify({ 
+ "connected": False, + "params_form": source.get_frontend_config()["params_form"], + }) + return jsonify({ + "connected": True, + "params": loader.get_safe_params(), + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "pinned_scope": loader.pinned_scope(), + }) + + # -- Catalog Routes ---------------------------------------------------- + + def _register_catalog_routes(self, bp: Blueprint) -> None: + source = self + + @bp.route("/catalog/ls", methods=["POST"]) + def catalog_ls(): + try: + loader = source._require_loader() + data = request.get_json() or {} + path = data.get("path", []) + name_filter = data.get("filter") + + nodes = loader.ls(path=path, filter=name_filter) + return jsonify({ + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "path": path, + "nodes": [_node_to_dict(n) for n in nodes], + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + @bp.route("/catalog/metadata", methods=["POST"]) + def catalog_metadata(): + try: + loader = source._require_loader() + data = request.get_json() or {} + path = data.get("path", []) + + metadata = loader.get_metadata(path) + return jsonify({"path": path, "metadata": metadata}) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + @bp.route("/catalog/list_tables", methods=["POST"]) + def catalog_list_tables(): + """Flat/eager listing of all tables in pinned scope.""" + try: + loader = source._require_loader() + data = request.get_json() or {} + table_filter = data.get("filter") + + tables = loader.list_tables(table_filter=table_filter) + return jsonify({"tables": tables}) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return 
jsonify({"status": "error", "message": safe_msg}), status_code + + # -- Data Routes ------------------------------------------------------- + + def _register_data_routes(self, bp: Blueprint) -> None: + source = self + + @bp.route("/data/import", methods=["POST"]) + def data_import(): + try: + loader = source._require_loader() + data = request.get_json() or {} + + source_table = data.get("source_table") + if not source_table: + return jsonify({"status": "error", "message": "source_table is required"}), 400 + + table_name = data.get("table_name") + import_options = data.get("import_options", {}) + + from data_formulator.security.auth import get_identity_id + from data_formulator.workspace_factory import get_workspace + from data_formulator.datalake.parquet_utils import sanitize_table_name + + workspace = get_workspace(get_identity_id()) + + if not table_name: + raw = source_table.split(".")[-1] if "." in source_table else source_table + table_name = raw + safe_name = sanitize_table_name(table_name) + + meta = loader.ingest_to_workspace( + workspace=workspace, + table_name=safe_name, + source_table=source_table, + import_options=import_options or None, + ) + return jsonify({ + "status": "success", + "table_name": meta.name, + "row_count": meta.row_count, + "refreshable": True, + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + @bp.route("/data/refresh", methods=["POST"]) + def data_refresh(): + try: + loader = source._require_loader() + data = request.get_json() or {} + table_name = data.get("table_name") + if not table_name: + return jsonify({"status": "error", "message": "table_name is required"}), 400 + + from data_formulator.security.auth import get_identity_id + from data_formulator.workspace_factory import get_workspace + + workspace = get_workspace(get_identity_id()) + meta = workspace.get_table_metadata(table_name) + if meta is None or not meta.source_table: + 
return jsonify({"status": "error", "message": f"No refreshable source for '{table_name}'"}), 400 + + arrow_table = loader.fetch_data_as_arrow( + source_table=meta.source_table, + import_options=meta.import_options, + ) + new_meta, data_changed = workspace.refresh_parquet_from_arrow(table_name, arrow_table) + return jsonify({ + "status": "success", + "table_name": table_name, + "row_count": new_meta.row_count, + "data_changed": data_changed, + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + @bp.route("/data/preview", methods=["POST"]) + def data_preview(): + try: + loader = source._require_loader() + data = request.get_json() or {} + source_table = data.get("source_table") + if not source_table: + return jsonify({"status": "error", "message": "source_table is required"}), 400 + + size = data.get("size", 10) + arrow_table = loader.fetch_data_as_arrow( + source_table=source_table, + import_options={"size": size}, + ) + df = arrow_table.to_pandas() + rows = _json.loads(df.to_json(orient="records", date_format="iso")) + columns = [{"name": col, "type": str(df[col].dtype)} for col in df.columns] + + return jsonify({ + "status": "success", + "columns": columns, + "rows": rows, + "row_count": len(rows), + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +# --------------------------------------------------------------------------- +# Configuration loading +# --------------------------------------------------------------------------- + +@dataclasses.dataclass +class SourceSpec: + """A single data source entry from config (YAML, env vars, or auto-discovery).""" + source_id: str + loader_type: str # registry key in DATA_LOADERS (e.g. 
"postgresql")
+    display_name: str
+    default_params: dict[str, Any] = dataclasses.field(default_factory=dict)
+    icon: str = ""
+    auto_connect: bool = False
+
+
+def _resolve_env_refs(params: dict[str, Any]) -> dict[str, Any]:
+    """Resolve ``${ENV_VAR}`` references in param values."""
+    import os
+    resolved = {}
+    for k, v in params.items():
+        if isinstance(v, str) and v.startswith("${") and v.endswith("}"):
+            env_name = v[2:-1]
+            resolved[k] = os.environ.get(env_name, "")
+        else:
+            resolved[k] = v
+    return resolved
+
+
+def _load_yaml_config() -> dict | None:
+    """Search for ``data-sources.yml`` in standard locations and return parsed content."""
+    import os
+    from pathlib import Path
+
+    search_paths = [
+        Path.cwd() / "data-sources.yml",
+        Path.home() / ".data-formulator" / "data-sources.yml",
+        Path("/etc/data-formulator/data-sources.yml"),
+    ]
+    # Also check DATA_FORMULATOR_HOME
+    df_home = os.environ.get("DATA_FORMULATOR_HOME")
+    if df_home:
+        search_paths.insert(0, Path(df_home) / "data-sources.yml")
+
+    for p in search_paths:
+        if p.is_file():
+            try:
+                import yaml
+                with open(p) as f:
+                    data = yaml.safe_load(f)
+                logger.info("Loaded data source config from %s", p)
+                return data
+            except Exception as e:
+                logger.warning("Failed to parse %s: %s", p, e)
+    return None
+
+
+def _parse_env_sources() -> list[SourceSpec]:
+    """Parse ``DF_SOURCES__{INSTANCE_ID}__{FIELD}={value}`` environment variables."""
+    import os
+    prefix = "DF_SOURCES__"
+    # Collect: {instance_id: {key: value}}
+    raw: dict[str, dict[str, str]] = {}
+    for env_key, env_val in os.environ.items():
+        if not env_key.startswith(prefix):
+            continue
+        rest = env_key[len(prefix):]
+        parts = rest.split("__", 1)
+        if len(parts) != 2:
+            continue
+        instance_id, field = parts[0], parts[1].lower()
+        raw.setdefault(instance_id, {})[field] = env_val
+
+    specs = []
+    for instance_id, fields in raw.items():
+        loader_type = fields.pop("type", "")
+        if not loader_type:
+            logger.warning("DF_SOURCES__%s has no 'type' field, skipping", 
instance_id) + continue + name = fields.pop("name", loader_type.replace("_", " ").title()) + icon = fields.pop("icon", "") + # Remaining fields with "params__" prefix → params dict + params: dict[str, str] = {} + other: dict[str, str] = {} + for k, v in fields.items(): + if k.startswith("params__"): + params[k[len("params__"):]] = v + else: + other[k] = v + # Also treat top-level non-reserved keys as params + params.update(other) + specs.append(SourceSpec( + source_id=instance_id, + loader_type=loader_type, + display_name=name, + default_params=params, + icon=icon, + )) + return specs + + +def _build_source_specs() -> tuple[list[SourceSpec], bool]: + """Build the list of source specs from config (env + YAML + auto-discovery). + + Returns ``(specs, auto_discover)`` where ``auto_discover`` indicates + whether unconfigured loaders should also be registered. + """ + import os + from data_formulator.data_loader import DATA_LOADERS + + # 1. Env vars (highest priority) + env_specs = _parse_env_sources() + + # 2. 
YAML config
+    yaml_config = _load_yaml_config()
+    yaml_specs: list[SourceSpec] = []
+    auto_discover = True
+    if yaml_config:
+        auto_discover = yaml_config.get("auto_discover", True)
+        for i, entry in enumerate(yaml_config.get("sources", [])):
+            loader_type = entry.get("type", "")
+            if not loader_type:
+                continue
+            # Explicit id always wins; otherwise the first entry uses the bare
+            # type name and later entries get a numeric suffix.
+            sid = entry.get("id") or (f"{loader_type}_{i}" if i > 0 else loader_type)
+            yaml_specs.append(SourceSpec(
+                source_id=sid,
+                loader_type=loader_type,
+                display_name=entry.get("name", loader_type.replace("_", " ").title()),
+                default_params=_resolve_env_refs(entry.get("params", {})),
+                icon=entry.get("icon", ""),
+                auto_connect=entry.get("auto_connect", False),
+            ))
+
+    # Also respect DF_AUTO_DISCOVER_SOURCES env var
+    if os.environ.get("DF_AUTO_DISCOVER_SOURCES", "").lower() == "false":
+        auto_discover = False
+
+    # Merge: env specs override yaml specs with same source_id
+    env_ids = {s.source_id for s in env_specs}
+    merged = list(env_specs) + [s for s in yaml_specs if s.source_id not in env_ids]
+
+    # 3. Auto-discovery: add any installed loader not already configured
+    if auto_discover:
+        configured_types = {s.loader_type for s in merged}
+        for key in DATA_LOADERS:
+            if key not in configured_types:
+                merged.append(SourceSpec(
+                    source_id=key,
+                    loader_type=key,
+                    display_name=key.replace("_", " ").title(),
+                ))
+
+    return merged, auto_discover
+
+
+# ---------------------------------------------------------------------------
+# Registration
+# ---------------------------------------------------------------------------
+
+def register_connected_sources(app: Flask) -> None:
+    """Register ConnectedDataSource plugins from config + auto-discovery.
+
+    Called from ``app.py`` during startup. 
+ """ + from data_formulator.data_loader import DATA_LOADERS, DISABLED_LOADERS + + specs, _auto_discover = _build_source_specs() + + for spec in specs: + loader_class = DATA_LOADERS.get(spec.loader_type) + if not loader_class: + if spec.loader_type in DISABLED_LOADERS: + logger.info( + "Source '%s' (type=%s) not available: %s", + spec.source_id, spec.loader_type, DISABLED_LOADERS[spec.loader_type], + ) + else: + logger.warning("Unknown source type '%s' for '%s'", spec.loader_type, spec.source_id) + continue + + source = ConnectedDataSource.from_loader( + loader_class, + source_id=spec.source_id, + display_name=spec.display_name, + default_params=spec.default_params, + icon=spec.icon or spec.loader_type, + ) + bp = source.create_blueprint() + app.register_blueprint(bp) + source.on_enable(app) + CONNECTED_SOURCES[spec.source_id] = source + logger.info( + "Registered ConnectedDataSource '%s' (type=%s%s)", + spec.source_id, + spec.loader_type, + f", pinned={list(spec.default_params.keys())}" if spec.default_params else "", + ) + + for key, reason in DISABLED_LOADERS.items(): + if key not in CONNECTED_SOURCES: + logger.info("Source '%s' not available: %s", key, reason) diff --git a/py-src/data_formulator/data_loader/__init__.py b/py-src/data_formulator/data_loader/__init__.py index f4c3dec5..d7577372 100644 --- a/py-src/data_formulator/data_loader/__init__.py +++ b/py-src/data_formulator/data_loader/__init__.py @@ -14,7 +14,7 @@ import importlib import logging -from data_formulator.data_loader.external_data_loader import ExternalDataLoader +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode _log = logging.getLogger(__name__) @@ -32,6 +32,7 @@ ("mongodb", "data_formulator.data_loader.mongodb_data_loader", "MongoDBDataLoader", "pymongo"), ("bigquery", "data_formulator.data_loader.bigquery_data_loader", "BigQueryDataLoader", "google-cloud-bigquery"), ("athena", "data_formulator.data_loader.athena_data_loader", "AthenaDataLoader", 
"boto3"), + ("superset", "data_formulator.data_loader.superset_data_loader", "SupersetLoader", "requests"), ] # --------------------------------------------------------------------------- @@ -53,4 +54,4 @@ _key, exc.name, _install_hint, ) -__all__ = ["ExternalDataLoader", "DATA_LOADERS", "DISABLED_LOADERS"] \ No newline at end of file +__all__ = ["ExternalDataLoader", "CatalogNode", "DATA_LOADERS", "DISABLED_LOADERS"] \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/athena_data_loader.py b/py-src/data_formulator/data_loader/athena_data_loader.py index 70d0fcc1..5e612ca1 100644 --- a/py-src/data_formulator/data_loader/athena_data_loader.py +++ b/py-src/data_formulator/data_loader/athena_data_loader.py @@ -7,7 +7,7 @@ import botocore.exceptions from pyarrow import fs as pa_fs -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, sanitize_table_name from typing import Any log = logging.getLogger(__name__) @@ -318,9 +318,7 @@ def _execute_query(self, query: str) -> str: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from Athena as a PyArrow Table. @@ -328,6 +326,11 @@ def fetch_data_as_arrow( Executes the query on Athena and reads the CSV results from S3 using PyArrow's S3 filesystem. 
""" + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -434,3 +437,89 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: log.info(f"Returning {len(results)} tables") return results + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "database": + try: + resp = self.athena_client.list_databases(CatalogName="AwsDataCatalog") + databases = resp.get("DatabaseList", []) + if self.database: + databases = [d for d in databases if d["Name"] == self.database] + except botocore.exceptions.ClientError: + databases = [] + nodes = [] + for db in databases: + name = db["Name"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="namespace", path=path + [name])) + return nodes + + if level_key == "table": + pinned = self.pinned_scope() + db_name = pinned.get("database") or (path[0] if path else None) + if not db_name: + return [] + try: + resp = self.athena_client.list_table_metadata( + CatalogName="AwsDataCatalog", DatabaseName=db_name, MaxResults=200, + ) + except botocore.exceptions.ClientError: + return [] + nodes = [] + for t in resp.get("TableMetadataList", []): + name = t["Name"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="table", path=path + [name])) + return nodes + + return [] + + def get_metadata(self, path: 
list[str]) -> dict[str, Any]: + if not path: + return {} + pinned = self.pinned_scope() + remaining = list(path) + db_name = pinned.get("database") + if not db_name: + if not remaining: + return {} + db_name = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + try: + resp = self.athena_client.get_table_metadata( + CatalogName="AwsDataCatalog", DatabaseName=db_name, TableName=table_name, + ) + t = resp.get("TableMetadata", {}) + columns = [{"name": c["Name"], "type": c.get("Type", "unknown")} for c in t.get("Columns", [])] + for c in t.get("PartitionKeys", []): + columns.append({"name": c["Name"], "type": c.get("Type", "unknown") + " (partition)"}) + return {"row_count": 0, "columns": columns, "sample_rows": []} + except Exception as e: + log.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + self.athena_client.list_databases(CatalogName="AwsDataCatalog", MaxResults=1) + return True + except Exception: + return False diff --git a/py-src/data_formulator/data_loader/azure_blob_data_loader.py b/py-src/data_formulator/data_loader/azure_blob_data_loader.py index fb078e4b..91e94242 100644 --- a/py-src/data_formulator/data_loader/azure_blob_data_loader.py +++ b/py-src/data_formulator/data_loader/azure_blob_data_loader.py @@ -8,7 +8,7 @@ from azure.identity import DefaultAzureCredential from pyarrow import fs as pa_fs -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, sanitize_table_name from typing import Any logger = logging.getLogger(__name__) @@ -102,15 +102,18 @@ def _read_sample(self, azure_url: str, limit: int) -> pd.DataFrame: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch 
data from Azure Blob as a PyArrow Table. For files (parquet, csv), reads directly using PyArrow's Azure filesystem. """ + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table (Azure blob URL) must be provided") @@ -281,4 +284,78 @@ def _estimate_by_row_sampling(self, azure_url: str, file_extension: str) -> int: return len(sample_df) except Exception as e: logger.debug("Row sampling failed for %s: %s", azure_url, e) - return 0 \ No newline at end of file + return 0 + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "container_name", "label": "Container"}, + {"key": "table", "label": "File"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "container_name": + return [CatalogNode(name=self.container_name, node_type="namespace", path=path + [self.container_name])] + + if level_key == "table": + from azure.storage.blob import BlobServiceClient as _BSC + if self.connection_string: + bsc = _BSC.from_connection_string(self.connection_string) + elif self.account_key: + bsc = _BSC(account_url=f"https://{self.account_name}.{self.endpoint}", credential=self.account_key) + else: + from azure.identity import DefaultAzureCredential + bsc = _BSC(account_url=f"https://{self.account_name}.{self.endpoint}", credential=DefaultAzureCredential()) + container_client = bsc.get_container_client(self.container_name) + nodes = [] + for blob in container_client.list_blobs(): + name = blob.name + if name.endswith("/") or not self._is_supported_file(name): + continue + if filter and filter.lower() not in name.lower(): + continue + 
nodes.append(CatalogNode( + name=name, node_type="table", path=path + [name], + metadata={"size_bytes": blob.size if hasattr(blob, "size") else 0}, + )) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + blob_name = path[-1] + azure_url = f"az://{self.account_name}.{self.endpoint}/{self.container_name}/{blob_name}" + try: + sample_df = self._read_sample(azure_url, 5) + columns = [{"name": c, "type": str(sample_df[c].dtype)} for c in sample_df.columns] + sample_rows = json.loads(sample_df.to_json(orient="records")) + row_count = self._estimate_row_count(azure_url) + return {"row_count": row_count, "columns": columns, "sample_rows": sample_rows} + except Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + from azure.storage.blob import BlobServiceClient as _BSC + if self.connection_string: + bsc = _BSC.from_connection_string(self.connection_string) + elif self.account_key: + bsc = _BSC(account_url=f"https://{self.account_name}.{self.endpoint}", credential=self.account_key) + else: + from azure.identity import DefaultAzureCredential + bsc = _BSC(account_url=f"https://{self.account_name}.{self.endpoint}", credential=DefaultAzureCredential()) + bsc.get_container_client(self.container_name).get_container_properties() + return True + except Exception: + return False \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/bigquery_data_loader.py b/py-src/data_formulator/data_loader/bigquery_data_loader.py index 687e95ae..f69fcc55 100644 --- a/py-src/data_formulator/data_loader/bigquery_data_loader.py +++ b/py-src/data_formulator/data_loader/bigquery_data_loader.py @@ -3,7 +3,7 @@ from typing import Any import pyarrow as pa -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, 
CatalogNode, sanitize_table_name from google.cloud import bigquery from google.oauth2 import service_account @@ -135,9 +135,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from BigQuery as a PyArrow Table using native Arrow support. @@ -145,6 +143,11 @@ def fetch_data_as_arrow( BigQuery's Python client provides .to_arrow() for efficient Arrow-native data transfer, avoiding pandas conversion overhead. """ + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -207,3 +210,86 @@ def process_field(field, parent_path: str = ""): process_field(field) return select_parts if select_parts else ["*"] + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "project_id", "label": "Project"}, + {"key": "dataset_id", "label": "Dataset"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "project_id": + # Project is always pinned (required param), but just in case + return [CatalogNode(name=self.project_id, node_type="namespace", path=path + [self.project_id])] + + if level_key == "dataset_id": + datasets = list(self.client.list_datasets(max_results=200)) + nodes = [] + for ds in datasets: + name = ds.dataset_id + if self.dataset_ids and name not in self.dataset_ids: + continue + if filter and filter.lower() not in 
name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="namespace", path=path + [name])) + return nodes + + if level_key == "table": + pinned = self.pinned_scope() + remaining = list(path) + # project is always pinned + dataset = pinned.get("dataset_id") + if not dataset: + if not remaining: + return [] + dataset = remaining.pop(0) + dataset_ref = f"{self.project_id}.{dataset}" + tables = list(self.client.list_tables(dataset_ref, max_results=500)) + nodes = [] + for t in tables: + name = t.table_id + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="table", path=path + [name])) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + pinned = self.pinned_scope() + remaining = list(path) + dataset = pinned.get("dataset_id") + if not dataset: + if not remaining: + return {} + dataset = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + full_table = f"{self.project_id}.{dataset}.{table_name}" + try: + table_ref = self.client.get_table(full_table) + columns = [{"name": f.name, "type": f.field_type} for f in table_ref.schema] + return {"row_count": table_ref.num_rows or 0, "columns": columns, "sample_rows": []} + except Exception as e: + log.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + list(self.client.list_datasets(max_results=1)) + return True + except Exception: + return False diff --git a/py-src/data_formulator/data_loader/external_data_loader.py b/py-src/data_formulator/data_loader/external_data_loader.py index 168f8617..754d20a1 100644 --- a/py-src/data_formulator/data_loader/external_data_loader.py +++ b/py-src/data_formulator/data_loader/external_data_loader.py @@ -1,4 +1,5 @@ from abc import ABC, abstractmethod +from dataclasses import dataclass, field from typing import Any, TYPE_CHECKING import pandas as pd import pyarrow as pa @@ 
-21,6 +22,30 @@ def sanitize_table_name(name_as: str) -> str: return sanitize_external_loader_table_name(name_as) +# --------------------------------------------------------------------------- +# Catalog tree model +# --------------------------------------------------------------------------- + +@dataclass +class CatalogNode: + """A node in the data source's catalog tree. + + Only two kinds of node: + + * ``"namespace"`` — expandable container (database, schema, bucket, …). + The hierarchy's ``label`` tells the UI what to call it. + * ``"table"`` — importable leaf (table, file, dataset, …). + + The *level name* (e.g. "Database", "Schema") comes from + :meth:`ExternalDataLoader.catalog_hierarchy`, not from the node itself. + """ + + name: str # Display name ("public", "users", …) + node_type: str # "namespace" or "table" + path: list[str] # Full path from root: ["mydb", "public", "users"] + metadata: dict[str, Any] | None = field(default=None) # row_count, columns, … + + class ExternalDataLoader(ABC): """ Abstract base class for external data loaders. @@ -54,9 +79,7 @@ def get_safe_params(self) -> dict[str, Any]: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from the external source as a PyArrow Table. 
@@ -68,46 +91,36 @@ def fetch_data_as_arrow( Args: source_table: Full table name (or table identifier) to fetch from - size: Maximum number of rows to fetch - sort_columns: Columns to sort by before limiting - sort_order: Sort direction ('asc' or 'desc') + import_options: Optional dict controlling what/how data is fetched: + - size (int): Maximum number of rows to fetch (default: 1000000) + - columns (list[str]): Column selection / projection + - sort_columns (list[str]): Columns to sort by before limiting + - sort_order (str): 'asc' or 'desc' + - filters (list[dict]): Standard SPJ filters + - source_filters (dict): Source-defined filters (BI tools) Returns: PyArrow Table with the fetched data Raises: ValueError: If source_table is not provided - NotImplementedError: If the loader doesn't support this method yet """ pass def fetch_data_as_dataframe( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pd.DataFrame: """ Fetch data from the external source as a pandas DataFrame. This method converts the Arrow table to pandas. For better performance, prefer using `fetch_data_as_arrow()` directly when possible. 
- - Args: - source_table: Full table name to fetch from - size: Maximum number of rows to fetch - sort_columns: Columns to sort by before limiting - sort_order: Sort direction ('asc' or 'desc') - - Returns: - pandas DataFrame with the fetched data """ arrow_table = self.fetch_data_as_arrow( source_table=source_table, - size=size, - sort_columns=sort_columns, - sort_order=sort_order, + import_options=import_options, ) return arrow_table.to_pandas() @@ -116,9 +129,7 @@ def ingest_to_workspace( workspace: "Workspace", table_name: str, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> "TableMetadata": """ Fetch data from external source and store as parquet in workspace. @@ -130,9 +141,7 @@ def ingest_to_workspace( workspace: The workspace to store data in table_name: Name for the table in the workspace source_table: Full table name to fetch from - size: Maximum number of rows to fetch - sort_columns: Columns to sort by before limiting - sort_order: Sort direction ('asc' or 'desc') + import_options: See fetch_data_as_arrow for details. 
Returns: TableMetadata for the created parquet file @@ -140,23 +149,22 @@ def ingest_to_workspace( # Fetch data as Arrow table (efficient, no pandas conversion) arrow_table = self.fetch_data_as_arrow( source_table=source_table, - size=size, - sort_columns=sort_columns, - sort_order=sort_order, + import_options=import_options, ) # Prepare loader metadata - loader_metadata = { + source_info = { "loader_type": self.__class__.__name__, "loader_params": self.get_safe_params(), "source_table": source_table, + "import_options": import_options, } # Write Arrow table directly to parquet (no pandas conversion) table_metadata = workspace.write_parquet_from_arrow( table=arrow_table, table_name=table_name, - loader_metadata=loader_metadata, + source_info=source_info, ) logger.info( @@ -190,10 +198,191 @@ def __init__(self, params: dict[str, Any]): @abstractmethod def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: - """ - List available tables (or files) from the data source. + """List all accessible tables within the current pinned scope. + + This is the **flat / eager** complement to :meth:`ls`: + + * ``list_tables()`` returns *every* importable table the user can + reach given the connection params (pinned scope). Simple and + complete, but potentially slow for large catalogs. + * ``ls(path)`` returns one level of the hierarchy at a time + (lazy). Better UX for large catalogs, but requires the loader + to implement hierarchical browsing. + + Both methods coexist permanently — ``list_tables`` is not legacy. + The default ``ls()`` falls back to ``list_tables()`` for loaders + that haven't implemented hierarchical browsing yet. Returns: - List of dicts with: name (table/file identifier), metadata (row_count, columns, sample_rows). + List of dicts with: name (table/file identifier), + metadata (row_count, columns, sample_rows). 
""" pass + + # ------------------------------------------------------------------ # + # Catalog tree API # + # ------------------------------------------------------------------ # + # # + # Every data source has a natural hierarchy whose leaf nodes are # + # importable tables (or files / datasets). ``catalog_hierarchy()`` # + # declares the *full* hierarchy; ``ls(path)`` lazily lists one level. # + # # + # ``list_tables()`` is the flat/eager alternative — it returns every # + # table in the pinned scope in one shot. Both coexist permanently. # + # # + # **Scope pinning** — when a connection param matches a hierarchy # + # level key (e.g. the user provides ``database="analytics"``), that # + # level is *pinned* and hidden from browsing. The helper # + # ``effective_hierarchy()`` computes the browsable levels. # + # ------------------------------------------------------------------ # + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + """Declare the *full* hierarchy of this data source. + + Returns an ordered list from root to leaf. Each entry: + + * ``"key"`` — internal identifier, matches a param name in + ``list_params()`` when the level is pinnable (e.g. ``"database"``). + * ``"label"`` — user-facing display name (e.g. ``"Database"``). + + The **last** entry is always the importable leaf (table / file / + dataset). + + Examples:: + + MySQL: [{"key":"database","label":"Database"}, + {"key":"table","label":"Table"}] + PostgreSQL: [{"key":"database","label":"Database"}, + {"key":"schema","label":"Schema"}, + {"key":"table","label":"Table"}] + BigQuery: [{"key":"project","label":"Project"}, + {"key":"dataset","label":"Dataset"}, + {"key":"table","label":"Table"}] + S3: [{"key":"bucket","label":"Bucket"}, + {"key":"object","label":"File"}] + + Default (flat): ``[{"key":"table","label":"Table"}]``. 
+ """ + return [{"key": "table", "label": "Table"}] + + def effective_hierarchy(self) -> list[dict[str, str]]: + """Return the *browsable* hierarchy — full hierarchy minus pinned levels. + + A level is **pinned** when: + + 1. Its ``key`` appears in the loader's ``list_params()`` with + ``scope_level=True`` (or when ``key`` matches a param name), AND + 2. The user provided a non-empty value for that param at connect time. + + The pinned value is used transparently by ``ls()`` so the user never + has to browse that level. + + Example — PostgreSQL with ``database="prod"`` provided:: + + full: database → schema → table + effective: schema → table (database is pinned to "prod") + + Example — PostgreSQL with *no* ``database`` provided:: + + full: database → schema → table + effective: database → schema → table (all levels browsable) + """ + params = getattr(self, "params", {}) or {} + full = self.catalog_hierarchy() + return [ + level for level in full + if not params.get(level["key"]) # empty / missing → browsable + ] + + def pinned_scope(self) -> dict[str, str]: + """Return ``{level_key: value}`` for every pinned hierarchy level. + + These are the levels that were fixed at connection time and are + hidden from tree browsing. + """ + params = getattr(self, "params", {}) or {} + return { + level["key"]: params[level["key"]] + for level in self.catalog_hierarchy() + if params.get(level["key"]) + } + + def ls( + self, + path: list[str] | None = None, + filter: str | None = None, + ) -> list[CatalogNode]: + """List children at a catalog path (like ``ls`` in a filesystem). + + This is the **lazy / hierarchical** complement to :meth:`list_tables`. + It returns one level of the catalog at a time, which is better for + large catalogs but requires the loader to implement hierarchical + browsing. + + ``path`` is relative to the **effective** (unpinned) hierarchy. + + * ``path=[]`` — list nodes at the first *browsable* level. 
+ * ``path=["public"]`` — expand that node one level deeper. + * The length of ``path`` must be ``< len(effective_hierarchy())``. + + The default implementation falls back to :meth:`list_tables` at the + root level. Subclasses should override for true hierarchical + browsing. + + Args: + path: List of names, one per effective hierarchy level. + filter: Optional substring filter on node names. + + Returns: + :class:`CatalogNode` objects representing children. + """ + if path: + return [] + tables = self.list_tables(table_filter=filter) + return [ + CatalogNode( + name=t["name"], + node_type="table", + path=[t["name"]], + metadata=t.get("metadata"), + ) + for t in tables + ] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + """Get detailed metadata for a single catalog node. + + For a table: columns, types, row count, sample rows. + Default: finds the node via ``ls`` and returns its metadata dict. + """ + if not path: + return {} + nodes = self.ls(path[:-1], filter=path[-1]) + for n in nodes: + if n.name == path[-1]: + return n.metadata or {} + return {} + + def test_connection(self) -> bool: + """Validate the connection is alive. + + Default: tries a lightweight ``list_tables`` call. + Subclasses should override with something cheaper + (e.g. ``SELECT 1``). + """ + try: + self.list_tables(table_filter="__ping__") + return True + except Exception: + return False + + @staticmethod + def auth_mode() -> str: + """Return ``'connection'`` (default) or ``'token'``.""" + return "connection" + + @staticmethod + def rate_limit() -> dict | None: + """Optional rate-limit hints. 
``None`` = no limit.""" + return None diff --git a/py-src/data_formulator/data_loader/kusto_data_loader.py b/py-src/data_formulator/data_loader/kusto_data_loader.py index ace6dbf6..c04cb9c0 100644 --- a/py-src/data_formulator/data_loader/kusto_data_loader.py +++ b/py-src/data_formulator/data_loader/kusto_data_loader.py @@ -4,7 +4,7 @@ import pandas as pd import pyarrow as pa -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, sanitize_table_name from azure.kusto.data import KustoClient, KustoConnectionStringBuilder from azure.kusto.data.helpers import dataframe_from_result_table @@ -17,7 +17,7 @@ class KustoDataLoader(ExternalDataLoader): def list_params() -> list[dict[str, Any]]: params_list = [ {"name": "kusto_cluster", "type": "string", "required": True, "description": "e.g., https://mycluster.region.kusto.windows.net"}, - {"name": "kusto_database", "type": "string", "required": True, "description": "database name"}, + {"name": "kusto_database", "type": "string", "required": False, "description": "Database name (leave empty to browse all databases)"}, {"name": "client_id", "type": "string", "required": False, "description": "only for App Key auth"}, {"name": "client_secret", "type": "string", "required": False, "description": "only for App Key auth"}, {"name": "tenant_id", "type": "string", "required": False, "description": "only for App Key auth"} @@ -130,9 +130,7 @@ def query(self, kql: str) -> pd.DataFrame: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from Kusto/Azure Data Explorer as a PyArrow Table. 
@@ -145,6 +143,11 @@ def fetch_data_as_arrow( sort_columns: Columns to sort by sort_order: Sort direction """ + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -211,4 +214,92 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: "metadata": table_metadata }) - return tables \ No newline at end of file + return tables + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "kusto_database", "label": "Database"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "kusto_database": + # List databases on the cluster + db_df = self.query(".show databases") + nodes = [] + for rec in db_df.to_dict(orient="records"): + name = rec["DatabaseName"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="namespace", path=path + [name])) + return nodes + + if level_key == "table": + pinned = self.pinned_scope() + db = pinned.get("kusto_database") or (path[0] if path else None) + if not db: + return [] + # Query tables in the specific database + old_db = self.kusto_database + self.kusto_database = db + try: + tables_df = self.query(".show tables") + finally: + self.kusto_database = old_db + nodes = [] + for rec in tables_df.to_dict(orient="records"): + name = rec["TableName"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="table", path=path + [name])) + return nodes + + return [] + + def get_metadata(self, path: 
list[str]) -> dict[str, Any]: + if not path: + return {} + pinned = self.pinned_scope() + remaining = list(path) + db = pinned.get("kusto_database") + if not db: + if not remaining: + return {} + db = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + old_db = self.kusto_database + self.kusto_database = db + try: + schema_result = self.query(f".show table ['{table_name}'] schema as json").to_dict(orient="records") + columns = [ + {"name": r["Name"], "type": r["Type"]} + for r in json.loads(schema_result[0]["Schema"])["OrderedColumns"] + ] + details = self.query(f".show table ['{table_name}'] details").to_dict(orient="records") + row_count = int(details[0]["TotalRowCount"]) + sample_df = self.query(f"['{table_name}'] | take 5") + sample_rows = json.loads(sample_df.to_json(orient="records", date_format="iso")) + return {"row_count": row_count, "columns": columns, "sample_rows": sample_rows} + except Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + finally: + self.kusto_database = old_db + + def test_connection(self) -> bool: + try: + self.query(".show databases | take 1") + return True + except Exception: + return False \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/mongodb_data_loader.py b/py-src/data_formulator/data_loader/mongodb_data_loader.py index 7d31327a..0ddddd3e 100644 --- a/py-src/data_formulator/data_loader/mongodb_data_loader.py +++ b/py-src/data_formulator/data_loader/mongodb_data_loader.py @@ -7,7 +7,7 @@ import pymongo from bson import ObjectId -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, sanitize_table_name from typing import Any logger = logging.getLogger(__name__) @@ -164,10 +164,12 @@ def _process_documents(self, documents: list[dict[str, Any]]) -> pd.DataFrame: def fetch_data_as_arrow( self, 
source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") """ Fetch data from MongoDB as a PyArrow Table. @@ -275,4 +277,64 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: logger.debug(f"Error listing collection {collection_name}: {e}") continue - return results \ No newline at end of file + return results + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "collection", "label": "Collection"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "database": + # database is required, so always pinned — but handle defensively + return [CatalogNode( + name=self.database_name, node_type="namespace", + path=path + [self.database_name], + )] + + if level_key == "collection": + collection_names = self.db.list_collection_names() + nodes = [] + for name in sorted(collection_names): + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="table", path=path + [name])) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + collection_name = path[-1] + try: + coll = self.db[collection_name] + row_count = coll.count_documents({}) + sample = list(coll.find().limit(5)) + if sample: + df = self._process_documents(sample) + columns = [{"name": c, "type": str(df[c].dtype)} for c in df.columns] + sample_rows = 
json.loads(df.to_json(orient="records")) + else: + columns, sample_rows = [], [] + return {"row_count": row_count, "columns": columns, "sample_rows": sample_rows} + except Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + self.mongo_client.admin.command("ping") + return True + except Exception: + return False \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/mssql_data_loader.py b/py-src/data_formulator/data_loader/mssql_data_loader.py index 080b4d65..b0305a10 100644 --- a/py-src/data_formulator/data_loader/mssql_data_loader.py +++ b/py-src/data_formulator/data_loader/mssql_data_loader.py @@ -1,16 +1,26 @@ import json import logging +import math from typing import Any -import pandas as pd import pyarrow as pa import pyodbc -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, sanitize_table_name +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, sanitize_table_name log = logging.getLogger(__name__) +def _is_nan(value) -> bool: + """Check if a value is NaN (works for float, int, None).""" + if value is None: + return True + try: + return math.isnan(float(value)) + except (TypeError, ValueError): + return False + + class MSSQLDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: @@ -25,9 +35,9 @@ def list_params() -> list[dict[str, Any]]: { "name": "database", "type": "string", - "required": True, - "default": "master", - "description": "Database name to connect to", + "required": False, + "default": "", + "description": "Database name (leave empty to browse all databases)", }, { "name": "user", @@ -102,7 +112,7 @@ def __init__(self, params: dict[str, Any]): self.params = params self.server = params.get("server", "localhost") - self.database = params.get("database", "master") + self.database = params.get("database", "") or "" self.user = 
params.get("user", "").strip() self.password = params.get("password", "").strip() self.port = params.get("port", "1433") @@ -111,11 +121,14 @@ def __init__(self, params: dict[str, Any]): self.trust_server_certificate = params.get("trust_server_certificate", "no") self.connection_timeout = params.get("connection_timeout", "30") + # When no database specified, connect to master for catalog browsing + connect_db = self.database or "master" + # Build ODBC connection string conn_str = ( f"DRIVER={{{self.driver}}};" f"SERVER={self.server},{self.port};" - f"DATABASE={self.database};" + f"DATABASE={connect_db};" f"Encrypt={self.encrypt};" f"TrustServerCertificate={self.trust_server_certificate};" f"Connection Timeout={self.connection_timeout};" @@ -166,9 +179,20 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: return "*" def _read_sql(self, query: str) -> pa.Table: - """Execute a query and return results as a PyArrow Table via pyodbc.""" - df = pd.read_sql(query, self._conn) - return pa.Table.from_pandas(df) + """Execute a query and return results as a PyArrow Table (no pandas).""" + cur = self._conn.cursor() + try: + cur.execute(query) + if cur.description is None: + return pa.table({}) + columns = [desc[0] for desc in cur.description] + rows = cur.fetchall() + if not rows: + return pa.table({col: pa.array([], type=pa.null()) for col in columns}) + col_data = {col: [row[i] for row in rows] for i, col in enumerate(columns)} + return pa.table(col_data) + finally: + cur.close() def _execute_query_raw(self, query: str) -> pa.Table: """Execute a query (no error wrapping).""" @@ -185,13 +209,16 @@ def _execute_query(self, query: str) -> pa.Table: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from SQL Server as a PyArrow Table. 
""" + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -277,7 +304,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: # Add length/precision info for relevant types with NaN handling if ( col_row["CHARACTER_MAXIMUM_LENGTH"] is not None - and not pd.isna(col_row["CHARACTER_MAXIMUM_LENGTH"]) + and not _is_nan(col_row["CHARACTER_MAXIMUM_LENGTH"]) ): try: col_info["max_length"] = int(col_row["CHARACTER_MAXIMUM_LENGTH"]) @@ -286,7 +313,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: if ( col_row["NUMERIC_PRECISION"] is not None - and not pd.isna(col_row["NUMERIC_PRECISION"]) + and not _is_nan(col_row["NUMERIC_PRECISION"]) ): try: col_info["precision"] = int(col_row["NUMERIC_PRECISION"]) @@ -295,7 +322,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: if ( col_row["NUMERIC_SCALE"] is not None - and not pd.isna(col_row["NUMERIC_SCALE"]) + and not _is_nan(col_row["NUMERIC_SCALE"]) ): try: col_info["scale"] = int(col_row["NUMERIC_SCALE"]) @@ -311,13 +338,17 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: sample_rows = [] sample_query = f"SELECT TOP 10 {col_list} FROM [{schema}].[{table_name}]" try: - sample_df = self._execute_query(sample_query).to_pandas() - sample_df_clean = sample_df.fillna(value=None) - sample_rows = json.loads( - sample_df_clean.to_json( - orient="records", date_format="iso", default_handler=str - ) - ) + sample_table = self._execute_query(sample_query) + sample_rows = sample_table.to_pydict() + # Convert to list-of-dicts format + if sample_table.num_rows > 0: + cols = sample_table.column_names + sample_rows = [ + {c: str(sample_table.column(c)[i].as_py()) if sample_table.column(c)[i].as_py() is not None else None for c in cols} + for i in 
range(sample_table.num_rows) + ] + else: + sample_rows = [] except Exception as e: log.warning( f"Failed to sample table {schema}.{table_name}: {e}" @@ -325,11 +356,11 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: # Get row count count_query = f"SELECT COUNT(*) as row_count FROM [{schema}].[{table_name}]" - count_df = self._execute_query(count_query).to_pandas() + count_table = self._execute_query(count_query) # Handle NaN values in row count - raw_count = count_df.iloc[0]["row_count"] - if pd.isna(raw_count): + raw_count = count_table.column("row_count")[0].as_py() + if _is_nan(raw_count): row_count = 0 log.warning( f"Row count for table {schema}.{table_name} returned NaN, using 0" @@ -372,3 +403,134 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results = [] return results + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "schema", "label": "Schema"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "database": + query = """ + SELECT name FROM sys.databases + WHERE name NOT IN ('master', 'tempdb', 'model', 'msdb') + AND state_desc = 'ONLINE' + ORDER BY name + """ + rows = self._execute_query(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["name"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="namespace", path=path + [name])) + return nodes + + if level_key == "schema": + pinned = self.pinned_scope() + db = pinned.get("database") or (path[0] if path else None) + if not db: + return [] + query = f""" + SELECT 
DISTINCT TABLE_SCHEMA + FROM [{db}].INFORMATION_SCHEMA.TABLES + WHERE TABLE_TYPE = 'BASE TABLE' + AND TABLE_SCHEMA NOT IN ('sys', 'INFORMATION_SCHEMA') + ORDER BY TABLE_SCHEMA + """ + rows = self._execute_query(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["TABLE_SCHEMA"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="namespace", path=path + [name])) + return nodes + + if level_key == "table": + pinned = self.pinned_scope() + remaining = list(path) + db = pinned.get("database") + if not db: + if not remaining: + return [] + db = remaining.pop(0) + schema = pinned.get("schema") + if not schema: + if not remaining: + return [] + schema = remaining.pop(0) + query = f""" + SELECT TABLE_NAME + FROM [{db}].INFORMATION_SCHEMA.TABLES + WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_SCHEMA = '{schema}' + ORDER BY TABLE_NAME + """ + rows = self._execute_query(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["TABLE_NAME"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode(name=name, node_type="table", path=path + [name])) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + pinned = self.pinned_scope() + remaining = list(path) + db = pinned.get("database") + if not db: + if not remaining: + return {} + db = remaining.pop(0) + schema = pinned.get("schema") + if not schema: + if not remaining: + return {} + schema = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + try: + cols_df = self._execute_query(f""" + SELECT COLUMN_NAME, DATA_TYPE + FROM [{db}].INFORMATION_SCHEMA.COLUMNS + WHERE TABLE_SCHEMA = '{schema}' AND TABLE_NAME = '{table_name}' + ORDER BY ORDINAL_POSITION + """).to_pandas() + columns = [{"name": r["COLUMN_NAME"], "type": r["DATA_TYPE"]} for _, r in cols_df.iterrows()] + count_df = self._execute_query( + f"SELECT 
COUNT(*) AS cnt FROM [{db}].[{schema}].[{table_name}]"
+            ).to_pandas()
+            row_count = int(count_df["cnt"].iloc[0])
+            col_list = self._safe_select_list(schema, table_name)
+            sample_df = self._execute_query(
+                f"SELECT TOP 5 {col_list} FROM [{db}].[{schema}].[{table_name}]"
+            ).to_pandas()
+            sample_rows = json.loads(sample_df.to_json(orient="records", date_format="iso", default_handler=str))
+            return {"row_count": row_count, "columns": columns, "sample_rows": sample_rows}
+        except Exception as e:
+            log.warning(f"get_metadata failed for {path}: {e}")
+            return {}
+
+    def test_connection(self) -> bool:
+        try:
+            self._execute_query("SELECT 1 AS ok")
+            return True
+        except Exception:
+            return False
diff --git a/py-src/data_formulator/data_loader/mysql_data_loader.py b/py-src/data_formulator/data_loader/mysql_data_loader.py
index b2c025db..f35a151b 100644
--- a/py-src/data_formulator/data_loader/mysql_data_loader.py
+++ b/py-src/data_formulator/data_loader/mysql_data_loader.py
@@ -2,11 +2,10 @@
 import logging
 from typing import Any
 
-import pandas as pd
 import pyarrow as pa
 import pymysql
 
-from data_formulator.data_loader.external_data_loader import ExternalDataLoader
+from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode
 
 logger = logging.getLogger(__name__)
 
@@ -20,7 +19,7 @@ def list_params() -> list[dict[str, Any]]:
         {"name": "password", "type": "string", "required": False, "default": "", "description": "leave blank for no password"},
         {"name": "host", "type": "string", "required": True, "default": "localhost", "description": "server address"},
         {"name": "port", "type": "int", "required": False, "default": 3306, "description": "server port"},
-        {"name": "database", "type": "string", "required": True, "default": "mysql", "description": "database name"}
+        {"name": "database", "type": "string", "required": False, "default": "", "description": "Database name (leave empty to browse all databases)"}
     ]
     return params_list
 
@@ 
-32,6 +31,8 @@ def auth_instructions() -> str:
 
 **Remote setup:** Get host, port, username, and password from your database administrator. Ensure the server allows remote connections and your IP is whitelisted.
 
+**Scope:** Leave *database* empty to browse all databases on the server, or fill it in to go straight to tables in that database.
+
 **Troubleshooting:** Test with `mysql -u <username> -p -h <host> -P <port>`"""
 
     def __init__(self, params: dict[str, Any]):
@@ -46,8 +47,6 @@ def __init__(self, params: dict[str, Any]):
             raise ValueError("MySQL host is required")
         if not self.user:
             raise ValueError("MySQL user is required")
-        if not self.database:
-            raise ValueError("MySQL database is required")
 
         port = self.params.get("port", "")
         if isinstance(port, str):
@@ -61,20 +60,23 @@ def __init__(self, params: dict[str, Any]):
         # Use 127.0.0.1 when host is localhost to force IPv4 TCP and avoid IPv6 ::1 connection issues.
         host_for_conn = "127.0.0.1" if (self.host or "").strip().lower() == "localhost" else self.host
 
-        self._sanitized_url = f"mysql://{self.user}:***@{self.host}:{self.port}/{self.database}"
+        self._sanitized_url = f"mysql://{self.user}:***@{self.host}:{self.port}/{self.database or '(all)'}"
+
+        # Connect — database is optional (can be None for server-level browsing)
+        connect_kwargs: dict[str, Any] = {
+            "host": host_for_conn,
+            "user": self.user,
+            "password": self.password or "",
+            "port": self.port,
+        }
+        if self.database:
+            connect_kwargs["database"] = self.database
 
-        # Test connection
         try:
-            self._conn = pymysql.connect(
-                host=host_for_conn,
-                user=self.user,
-                password=self.password or "",
-                database=self.database,
-                port=self.port,
-            )
+            self._conn = pymysql.connect(**connect_kwargs)
         except Exception as e:
             logger.error(f"Failed to connect to MySQL ({self._sanitized_url}): {e}")
-            raise ValueError(f"Failed to connect to MySQL database '{self.database}' on host '{self.host}': {e}") from e
+            raise ValueError(f"Failed to connect to MySQL on host '{self.host}': {e}") from e
 
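The optional-`database` handling here, together with the `pinned_scope()` checks in each loader's `ls`/`get_metadata` hunks, repeats one resolution pattern: take each hierarchy level from the pinned connection params when present, otherwise consume the next browse-path segment, and bail out if the path is too short. A minimal sketch of that shared logic (the `resolve_scope` helper is illustrative, not part of this patch):

```python
def resolve_scope(pinned, path, level_keys):
    """Resolve catalog levels from pinned params plus a browse path.

    Illustrative helper, not in the patch: each level comes from the
    pinned scope when set, otherwise it consumes the next path segment.
    Returns None when the path is too short to identify every level.
    """
    remaining = list(path)
    resolved = {}
    for key in level_keys:
        value = pinned.get(key)
        if not value:
            if not remaining:
                return None  # under-specified: caller returns [] / {}
            value = remaining.pop(0)
        resolved[key] = value
    resolved["rest"] = remaining  # leftover segments, e.g. the table name
    return resolved

# database pinned at connect time; schema and table come from the tree path
print(resolve_scope({"database": "sales"}, ["public", "orders"], ["database", "schema"]))
# {'database': 'sales', 'schema': 'public', 'rest': ['orders']}
```

If this pattern keeps spreading, hoisting something like it into `ExternalDataLoader` would remove the copy-pasted pop-or-bail blocks from each loader.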
logger.info(f"Successfully connected to MySQL: {self._sanitized_url}") # MySQL types that may need special handling @@ -85,9 +87,20 @@ def __init__(self, params: dict[str, Any]): _UNSUPPORTED_TYPES = _GEOMETRY_TYPES | _OTHER_UNSUPPORTED def _read_sql(self, query: str) -> pa.Table: - """Execute a query and return results as a PyArrow Table via pymysql.""" - df = pd.read_sql(query, self._conn) - return pa.Table.from_pandas(df) + """Execute a query and return results as a PyArrow Table (no pandas).""" + cur = self._conn.cursor() + try: + cur.execute(query) + if cur.description is None: + return pa.table({}) + columns = [desc[0] for desc in cur.description] + rows = cur.fetchall() + if not rows: + return pa.table({col: pa.array([], type=pa.null()) for col in columns}) + col_data = {col: [row[i] for row in rows] for i, col in enumerate(columns)} + return pa.table(col_data) + finally: + cur.close() def _safe_select_list(self, schema: str, table_name: str) -> str: """Build a SELECT column list that converts unsupported types to text. @@ -121,13 +134,16 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from MySQL as a PyArrow Table. 
""" + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -162,12 +178,17 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: return self._list_tables(table_filter) def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: - """List tables from MySQL database.""" + """List tables from MySQL database(s) within pinned scope.""" try: + # If database is pinned, list only that database; otherwise all user-accessible DBs + if self.database: + db_filter = f"TABLE_SCHEMA = '{self.database}'" + else: + db_filter = "TABLE_SCHEMA NOT IN ('information_schema', 'mysql', 'performance_schema', 'sys')" tables_query = f""" SELECT TABLE_SCHEMA, TABLE_NAME FROM information_schema.tables - WHERE TABLE_SCHEMA = '{self.database}' + WHERE {db_filter} AND TABLE_TYPE = 'BASE TABLE' """ tables_arrow = self._read_sql(tables_query) @@ -238,4 +259,114 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: except Exception as e: logger.error(f"Error listing tables: {e}") - return [] \ No newline at end of file + return [] + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + + if len(path) >= len(eff): + return [] + + level_key = eff[len(path)]["key"] + + if level_key == "database": + query = """ + SELECT SCHEMA_NAME + FROM information_schema.schemata + WHERE SCHEMA_NAME NOT IN ('information_schema', 'mysql', 'performance_schema', 'sys') + ORDER BY SCHEMA_NAME + """ + rows = 
self._read_sql(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["SCHEMA_NAME"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name=name, node_type="namespace", path=path + [name], + )) + return nodes + + if level_key == "table": + pinned = self.pinned_scope() + db = pinned.get("database") or (path[0] if path else None) + if not db: + return [] + query = f""" + SELECT TABLE_NAME + FROM information_schema.tables + WHERE TABLE_SCHEMA = '{db}' AND TABLE_TYPE = 'BASE TABLE' + ORDER BY TABLE_NAME + """ + rows = self._read_sql(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["TABLE_NAME"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name=name, node_type="table", path=path + [name], + )) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + pinned = self.pinned_scope() + remaining = list(path) + db = pinned.get("database") + if not db: + if not remaining: + return {} + db = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + try: + cols_query = f""" + SELECT COLUMN_NAME, DATA_TYPE + FROM information_schema.columns + WHERE TABLE_SCHEMA = '{db}' AND TABLE_NAME = '{table_name}' + ORDER BY ORDINAL_POSITION + """ + cols_df = self._read_sql(cols_query).to_pandas() + columns = [ + {"name": r["COLUMN_NAME"], "type": r["DATA_TYPE"]} + for _, r in cols_df.iterrows() + ] + count_df = self._read_sql( + f"SELECT COUNT(*) AS cnt FROM `{db}`.`{table_name}`" + ).to_pandas() + row_count = int(count_df["cnt"].iloc[0]) + col_list = self._safe_select_list(db, table_name) + sample_df = self._read_sql( + f"SELECT {col_list} FROM `{db}`.`{table_name}` LIMIT 5" + ).to_pandas() + sample_rows = json.loads(sample_df.to_json(orient="records", date_format="iso")) + return { + "row_count": row_count, + "columns": columns, + "sample_rows": sample_rows, + } + except 
Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + self._read_sql("SELECT 1") + return True + except Exception: + return False \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/postgresql_data_loader.py b/py-src/data_formulator/data_loader/postgresql_data_loader.py index 5b46d419..1c62d6eb 100644 --- a/py-src/data_formulator/data_loader/postgresql_data_loader.py +++ b/py-src/data_formulator/data_loader/postgresql_data_loader.py @@ -2,11 +2,10 @@ import logging from typing import Any -import pandas as pd import pyarrow as pa import psycopg2 -from data_formulator.data_loader.external_data_loader import ExternalDataLoader +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode logger = logging.getLogger(__name__) @@ -20,7 +19,7 @@ def list_params() -> list[dict[str, Any]]: {"name": "password", "type": "string", "required": False, "default": "", "description": "leave blank for no password"}, {"name": "host", "type": "string", "required": True, "default": "localhost", "description": "PostgreSQL host"}, {"name": "port", "type": "string", "required": False, "default": "5432", "description": "PostgreSQL port"}, - {"name": "database", "type": "string", "required": True, "default": "postgres", "description": "PostgreSQL database name"} + {"name": "database", "type": "string", "required": False, "default": "", "description": "Database name (leave empty to browse all databases)"} ] return params_list @@ -32,6 +31,8 @@ def auth_instructions() -> str: **Remote setup:** Get host, port, username, and password from your database administrator. The user must have SELECT permissions on the tables you want to access. +**Scope:** Leave *database* empty to browse all databases on the server, or fill it in to go straight to schemas/tables in that database. 
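The pandas-free `_read_sql` rewrites (in the MySQL hunks above and the PostgreSQL hunks below) share one core move: pivot DB-API row tuples into a column-oriented dict, which `pa.table(...)` accepts directly. A dependency-free sketch of that pivot using stdlib `sqlite3` (the `rows_to_columns` helper is illustrative; the pyarrow wrapping and the typed empty-column case are omitted to keep the sketch self-contained):

```python
import sqlite3

def rows_to_columns(cursor):
    """Pivot DB-API fetchall() rows into a column-oriented dict.

    Illustrative helper, not in the patch: the loaders pass a dict
    like this straight to pa.table(...).
    """
    if cursor.description is None:  # e.g. DDL statements return no result set
        return {}
    columns = [desc[0] for desc in cursor.description]
    rows = cursor.fetchall()
    # One list per column, in cursor column order (empty lists if no rows)
    return {col: [row[i] for row in rows] for i, col in enumerate(columns)}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
cur = conn.execute("SELECT id, name FROM t ORDER BY id")
print(rows_to_columns(cur))  # {'id': [1, 2], 'name': ['a', 'b']}
```

Building per-column lists this way avoids the pandas round trip the old `pd.read_sql` path paid for every query.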
+
 **Troubleshooting:** Test with `psql -U <username> -h <host> -p <port> -d <database>`"""
 
     def __init__(self, params: dict[str, Any]):
@@ -47,8 +48,10 @@ def __init__(self, params: dict[str, Any]):
             raise ValueError("PostgreSQL host is required")
         if not self.user:
             raise ValueError("PostgreSQL user is required")
-        if not self.database:
-            raise ValueError("PostgreSQL database is required")
+
+        # When no database is specified, connect to the default "postgres" DB
+        # for catalog browsing. The user can browse all databases via ls().
+        connect_db = self.database or "postgres"
 
         # Build psycopg2 connection
         # Use 127.0.0.1 when host is localhost to force IPv4 TCP and avoid IPv6 ::1 connection issues.
@@ -60,13 +63,13 @@ def __init__(self, params: dict[str, Any]):
                 port=int(self.port),
                 user=self.user,
                 password=self.password or "",
-                dbname=self.database,
+                dbname=connect_db,
             )
             self._conn.autocommit = True
         except Exception as e:
-            logger.error(f"Failed to connect to PostgreSQL (postgresql://{self.user}:***@{self.host}:{self.port}/{self.database}): {e}")
-            raise ValueError(f"Failed to connect to PostgreSQL database '{self.database}' on host '{self.host}': {e}") from e
-        logger.info(f"Successfully connected to PostgreSQL: postgresql://{self.user}:***@{self.host}:{self.port}/{self.database}")
+            logger.error(f"Failed to connect to PostgreSQL (postgresql://{self.user}:***@{self.host}:{self.port}/{connect_db}): {e}")
+            raise ValueError(f"Failed to connect to PostgreSQL database '{connect_db}' on host '{self.host}': {e}") from e
+        logger.info(f"Successfully connected to PostgreSQL: postgresql://{self.user}:***@{self.host}:{self.port}/{connect_db}")
 
     # PostgreSQL types that may need special handling
     _SPATIAL_TYPES = {'geometry', 'geography'}  # PostGIS types → ST_AsText()
@@ -75,9 +78,25 @@ def __init__(self, params: dict[str, Any]):
     _UNSUPPORTED_TYPES = _SPATIAL_TYPES | _OTHER_UNSUPPORTED
 
     def _read_sql(self, query: str) -> pa.Table:
-        """Execute a query and return results as a PyArrow Table via psycopg2."""
-        df = 
pd.read_sql(query, self._conn) - return pa.Table.from_pandas(df) + """Execute a query and return results as a PyArrow Table (no pandas).""" + return self._execute_on_conn(self._conn, query) + + @staticmethod + def _execute_on_conn(conn, query: str) -> pa.Table: + """Run *query* on *conn* and return a PyArrow Table.""" + cur = conn.cursor() + try: + cur.execute(query) + if cur.description is None: + return pa.table({}) + columns = [desc[0] for desc in cur.description] + rows = cur.fetchall() + if not rows: + return pa.table({col: pa.array([], type=pa.null()) for col in columns}) + col_data = {col: [row[i] for row in rows] for i, col in enumerate(columns)} + return pa.table(col_data) + finally: + cur.close() def _safe_select_list(self, schema: str, table_name: str) -> str: """Build a SELECT column list that converts unsupported types to text. @@ -111,13 +130,16 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: def fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from PostgreSQL as a PyArrow Table. 
""" + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table must be provided") @@ -239,3 +261,185 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: except Exception as e: logger.error(f"Error listing tables: {e}") return [] + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "schema", "label": "Schema"}, + {"key": "table", "label": "Table"}, + ] + + def _connect_to_db(self, dbname: str): + """Open a new connection to a specific database on the same server.""" + host_for_conn = "127.0.0.1" if (self.host or "").strip().lower() == "localhost" else self.host + conn = psycopg2.connect( + host=host_for_conn, + port=int(self.port), + user=self.user, + password=self.password or "", + dbname=dbname, + ) + conn.autocommit = True + return conn + + def _read_sql_on(self, query: str, dbname: str | None = None) -> pa.Table: + """Run a query, optionally on a different database.""" + if dbname and dbname != (self.database or "postgres"): + conn = self._connect_to_db(dbname) + try: + return self._execute_on_conn(conn, query) + finally: + conn.close() + else: + return self._execute_on_conn(self._conn, query) + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + + if len(path) >= len(eff): + return [] + + level_key = eff[len(path)]["key"] + + # --- database level --- + if level_key == "database": + query = """ + SELECT datname FROM pg_database + WHERE datistemplate = false AND datallowconn = true + ORDER BY datname + """ + rows = self._read_sql(query).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["datname"] + if filter and 
filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name=name, node_type="namespace", + path=path + [name], + )) + return nodes + + # --- schema level --- + if level_key == "schema": + # Determine which database to query + pinned = self.pinned_scope() + db = pinned.get("database") or (path[0] if path else None) + if not db: + return [] + query = """ + SELECT schema_name + FROM information_schema.schemata + WHERE schema_name NOT IN ('information_schema', 'pg_catalog', 'pg_toast') + AND schema_name NOT LIKE '%%_intern%%' + AND schema_name NOT LIKE '%%timescaledb%%' + ORDER BY schema_name + """ + rows = self._read_sql_on(query, db).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["schema_name"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name=name, node_type="namespace", + path=path + [name], + )) + return nodes + + # --- table level --- + if level_key == "table": + pinned = self.pinned_scope() + # Resolve database and schema from pinned values + path + remaining_path = list(path) + db = pinned.get("database") + if not db: + if not remaining_path: + return [] + db = remaining_path.pop(0) + schema = pinned.get("schema") + if not schema: + if not remaining_path: + return [] + schema = remaining_path.pop(0) + query = f""" + SELECT table_name + FROM information_schema.tables + WHERE table_schema = '{schema}' + AND table_type = 'BASE TABLE' + AND table_name NOT LIKE '%%/%%' + ORDER BY table_name + """ + rows = self._read_sql_on(query, db).to_pandas() + nodes = [] + for _, r in rows.iterrows(): + name = r["table_name"] + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name=name, node_type="table", + path=path + [name], + )) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + # Resolve the actual database.schema.table from effective path + pinned + pinned = 
self.pinned_scope() + remaining = list(path) + db = pinned.get("database") + if not db: + if not remaining: + return {} + db = remaining.pop(0) + schema = pinned.get("schema") + if not schema: + if not remaining: + return {} + schema = remaining.pop(0) + if not remaining: + return {} + table_name = remaining[0] + try: + cols_query = f""" + SELECT column_name, data_type + FROM information_schema.columns + WHERE table_schema = '{schema}' AND table_name = '{table_name}' + ORDER BY ordinal_position + """ + cols_df = self._read_sql_on(cols_query, db).to_pandas() + columns = [ + {"name": r["column_name"], "type": r["data_type"]} + for _, r in cols_df.iterrows() + ] + count_df = self._read_sql_on( + f'SELECT COUNT(*) AS cnt FROM "{schema}"."{table_name}"', db + ).to_pandas() + row_count = int(count_df["cnt"].iloc[0]) + col_list = self._safe_select_list(schema, table_name) + sample_df = self._read_sql_on( + f'SELECT {col_list} FROM "{schema}"."{table_name}" LIMIT 5', db + ).to_pandas() + sample_rows = json.loads(sample_df.to_json(orient="records")) + return { + "row_count": row_count, + "columns": columns, + "sample_rows": sample_rows, + } + except Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + self._read_sql("SELECT 1") + return True + except Exception: + return False diff --git a/py-src/data_formulator/data_loader/s3_data_loader.py b/py-src/data_formulator/data_loader/s3_data_loader.py index 82fa8eb1..30a461e7 100644 --- a/py-src/data_formulator/data_loader/s3_data_loader.py +++ b/py-src/data_formulator/data_loader/s3_data_loader.py @@ -9,7 +9,7 @@ import pyarrow.parquet as pq from pyarrow import fs as pa_fs -from data_formulator.data_loader.external_data_loader import ExternalDataLoader +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode logger = logging.getLogger(__name__) @@ -57,15 +57,18 @@ def __init__(self, params: dict[str, Any]): def 
fetch_data_as_arrow( self, source_table: str, - size: int = 1000000, - sort_columns: list[str] | None = None, - sort_order: str = 'asc' + import_options: dict[str, Any] | None = None, ) -> pa.Table: """ Fetch data from S3 as a PyArrow Table using PyArrow's native S3 filesystem. For files (parquet, csv), reads directly using PyArrow. """ + opts = import_options or {} + size = opts.get("size", 1000000) + sort_columns = opts.get("sort_columns") + sort_order = opts.get("sort_order", "asc") + if not source_table: raise ValueError("source_table (S3 URL) must be provided") @@ -197,4 +200,78 @@ def _estimate_row_count(self, s3_url: str) -> int: return 0 except Exception as e: logger.warning(f"Error estimating row count for {s3_url}: {e}") - return 0 \ No newline at end of file + return 0 + + # -- Catalog tree API -------------------------------------------------- + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "bucket", "label": "Bucket"}, + {"key": "table", "label": "File"}, + ] + + def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]: + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + + if level_key == "bucket": + # Bucket is always pinned (required) but handle defensively + return [CatalogNode(name=self.bucket, node_type="namespace", path=path + [self.bucket])] + + if level_key == "table": + s3_client = boto3.client( + "s3", + aws_access_key_id=self.aws_access_key_id, + aws_secret_access_key=self.aws_secret_access_key, + aws_session_token=self.aws_session_token if self.aws_session_token else None, + region_name=self.region_name, + ) + resp = s3_client.list_objects_v2(Bucket=self.bucket) + nodes = [] + for obj in resp.get("Contents", []): + key = obj["Key"] + if key.endswith("/") or not self._is_supported_file(key): + continue + if filter and filter.lower() not in key.lower(): + continue + 
nodes.append(CatalogNode( + name=key, node_type="table", path=path + [key], + metadata={"size_bytes": obj.get("Size", 0)}, + )) + return nodes + + return [] + + def get_metadata(self, path: list[str]) -> dict[str, Any]: + if not path: + return {} + key = path[-1] + s3_url = f"s3://{self.bucket}/{key}" + try: + sample = self._read_sample_arrow(s3_url, 5) + sample_df = sample.to_pandas() + columns = [{"name": c, "type": str(sample_df[c].dtype)} for c in sample_df.columns] + sample_rows = json.loads(sample_df.to_json(orient="records")) + row_count = self._estimate_row_count(s3_url) + return {"row_count": row_count, "columns": columns, "sample_rows": sample_rows} + except Exception as e: + logger.warning(f"get_metadata failed for {path}: {e}") + return {} + + def test_connection(self) -> bool: + try: + s3_client = boto3.client( + "s3", + aws_access_key_id=self.aws_access_key_id, + aws_secret_access_key=self.aws_secret_access_key, + aws_session_token=self.aws_session_token if self.aws_session_token else None, + region_name=self.region_name, + ) + s3_client.head_bucket(Bucket=self.bucket) + return True + except Exception: + return False \ No newline at end of file diff --git a/py-src/data_formulator/data_loader/superset_data_loader.py b/py-src/data_formulator/data_loader/superset_data_loader.py new file mode 100644 index 00000000..46736469 --- /dev/null +++ b/py-src/data_formulator/data_loader/superset_data_loader.py @@ -0,0 +1,406 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""SupersetLoader — ExternalDataLoader implementation for Apache Superset. + +Treats Superset as a hierarchical data source: + dashboard (namespace) → dataset (table) + +Authentication is JWT-based (``auth_mode() = "token"``). Data is fetched +via Superset's SQL Lab API, reusing the existing ``SupersetClient`` and +``SupersetAuthBridge`` from the legacy plugin. 
+""" + +import json +import logging +import re +from typing import Any + +import pyarrow as pa + +from data_formulator.data_loader.external_data_loader import ( + CatalogNode, + ExternalDataLoader, +) + +logger = logging.getLogger(__name__) + +# Lazy-imported Superset helpers (only if the plugin deps are available) +_SupersetClient = None +_SupersetAuthBridge = None + + +def _ensure_imports(): + global _SupersetClient, _SupersetAuthBridge + if _SupersetClient is None: + from data_formulator.plugins.superset.superset_client import SupersetClient + from data_formulator.plugins.superset.auth_bridge import SupersetAuthBridge + _SupersetClient = SupersetClient + _SupersetAuthBridge = SupersetAuthBridge + + +# --------------------------------------------------------------------------- +# SQL building helpers (extracted from plugins/superset/routes/data.py) +# --------------------------------------------------------------------------- + +def _quote_identifier(name: str) -> str: + escaped = (name or "").replace('"', '""') + return f'"{escaped}"' + + +def _sql_literal(value: Any) -> str: + if value is None: + return "NULL" + if isinstance(value, bool): + return "TRUE" if value else "FALSE" + if isinstance(value, (int, float)): + return str(value) + escaped = str(value).replace("'", "''") + return f"'{escaped}'" + + +def _build_dataset_sql(detail: dict) -> tuple[int, str, str]: + """Return (database_id, schema, base_select_sql) from a dataset detail.""" + db_id = detail["database"]["id"] + table_name = detail["table_name"] + schema = detail.get("schema", "") or "" + dataset_sql = (detail.get("sql") or "").strip() + dataset_kind = (detail.get("kind") or "").lower() + + if dataset_kind == "virtual" and dataset_sql: + return db_id, schema, f"SELECT * FROM ({dataset_sql.rstrip(';')}) AS _vds" + + prefix = f'"{schema}".' 
if schema else "" + return db_id, schema, f'SELECT * FROM {prefix}"{table_name}"' + + +# --------------------------------------------------------------------------- +# SupersetLoader +# --------------------------------------------------------------------------- + +class SupersetLoader(ExternalDataLoader): + """Treats a Superset instance as a hierarchical data source. + + Hierarchy: ``dashboard`` (namespace) → ``dataset`` (table). + Datasets not attached to any dashboard appear under a synthetic + "All Datasets" namespace at the root level. + """ + + @staticmethod + def list_params() -> list[dict[str, Any]]: + return [ + {"name": "url", "type": "string", "required": True, + "description": "Superset base URL (e.g. https://bi.company.com)"}, + {"name": "username", "type": "string", "required": True, + "description": "Superset username"}, + {"name": "password", "type": "password", "required": True, + "description": "Superset password"}, + ] + + @staticmethod + def auth_instructions() -> str: + return """**Example:** url: `https://bi.company.com` · username: `admin` · password: `***` + +**Setup:** Provide the base URL of your Superset instance and credentials for a user with at least **Gamma** role (read access to datasets). 
+ +**SSO:** If your Superset uses SSO, use the SSO bridge flow instead of password auth (configure via `PLG_SUPERSET_SSO_LOGIN_URL`).""" + + @staticmethod + def auth_mode() -> str: + return "token" + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "dashboard", "label": "Dashboard"}, + {"key": "dataset", "label": "Dataset"}, + ] + + def __init__(self, params: dict[str, Any]): + _ensure_imports() + self.params = params + + self.url = (params.get("url") or "").rstrip("/") + if self.url and not self.url.startswith(("http://", "https://")): + self.url = f"http://{self.url}" + self.username = params.get("username", "") + self.password = params.get("password", "") + + if not self.url: + raise ValueError("Superset URL is required") + + self._client = _SupersetClient(self.url) + self._bridge = _SupersetAuthBridge(self.url) + + # Authenticate immediately + self._access_token: str | None = None + self._refresh_token: str | None = None + if self.username and self.password: + self._do_login() + + def _do_login(self) -> None: + result = self._bridge.login(self.username, self.password) + self._access_token = result.get("access_token") + self._refresh_token = result.get("refresh_token") + if not self._access_token: + raise ValueError("Superset login failed: no access token returned") + + @staticmethod + def _is_token_expired(token: str, buffer_seconds: int = 60) -> bool: + """Check JWT exp claim without hitting Superset API.""" + import base64 + import time + try: + payload = token.split(".")[1] + payload += "=" * (-len(payload) % 4) + claims = json.loads(base64.urlsafe_b64decode(payload)) + return time.time() > claims.get("exp", 0) - buffer_seconds + except Exception: + return True # conservative: assume expired + + def _ensure_token(self) -> str: + """Return a valid access token, refreshing if needed. + + Uses JWT exp claim to detect expiry (no API call). + Tries refresh token first, then full re-login with password. 
+ SSO tokens that expire without a refresh token will raise. + """ + if not self._access_token: + raise ValueError("Not authenticated with Superset") + + if not self._is_token_expired(self._access_token): + return self._access_token + + # Token expired — try refresh + if self._refresh_token: + try: + result = self._bridge.refresh_token(self._refresh_token) + new_token = result.get("access_token") + if new_token: + self._access_token = new_token + return self._access_token + except Exception: + logger.debug("Token refresh failed", exc_info=True) + + # Refresh failed or unavailable — try password re-login + if self.username and self.password: + self._do_login() + return self._access_token + + raise ValueError("Superset token expired and cannot refresh (no password or refresh token available)") + + # -- test_connection --------------------------------------------------- + + def test_connection(self) -> bool: + try: + token = self._ensure_token() + # Try a lightweight API call — list datasets with page_size=1 + result = self._client.list_datasets(token, page=0, page_size=1) + return "result" in result + except Exception: + return False + + # -- list_tables (flat/eager) ------------------------------------------ + + def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: + """List all datasets the user can access (flat). + + Fetches detail per dataset to populate columns — may be slow for + large Superset instances. + """ + token = self._ensure_token() + all_datasets = self._fetch_all_datasets(token) + results = [] + for ds in all_datasets: + name = ds.get("table_name") or "" + if table_filter and table_filter.lower() not in name.lower(): + continue + + # The list endpoint doesn't include columns or row_count — + # fetch detail for each dataset. 
+            columns: list[dict] = []
+            row_count = ds.get("row_count")
+            sample_rows: list[dict] = []
+            try:
+                detail = self._client.get_dataset_detail(token, ds["id"])
+                columns = [
+                    {"name": c.get("column_name") or c.get("name") or "", "type": c.get("type") or ""}
+                    for c in (detail.get("columns") or [])
+                ]
+                row_count = detail.get("row_count") or row_count
+
+                # Fetch sample rows via SQL Lab
+                db_id, schema, base_sql = _build_dataset_sql(detail)
+                sql_session = self._client.create_sql_session(token)
+                result = self._client.execute_sql_with_session(
+                    sql_session, db_id, f"SELECT * FROM ({base_sql}) AS _src LIMIT 10", schema, 10,
+                )
+                sample_rows = result.get("data", []) or []
+            except Exception:
+                logger.debug("Failed to fetch detail for dataset %s", ds.get("id"))
+
+            results.append({
+                "name": f"{ds.get('id')}:{name}",
+                "metadata": {
+                    "dataset_id": ds["id"],
+                    "row_count": row_count,
+                    "columns": columns,
+                    "sample_rows": sample_rows,
+                    "schema": ds.get("schema", ""),
+                    "database": (ds.get("database") or {}).get("database_name", ""),
+                },
+            })
+        return results
+
+    # -- ls (lazy/hierarchical) --------------------------------------------
+
+    def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[CatalogNode]:
+        path = path or []
+        token = self._ensure_token()
+
+        if len(path) == 0:
+            # Root: list dashboards + "All Datasets"
+            raw = self._client.list_dashboards(token, page=0, page_size=500)
+            dashboards = raw.get("result", [])
+            nodes = []
+            for d in dashboards:
+                title = d.get("dashboard_title", f"Dashboard {d['id']}")
+                if filter and filter.lower() not in title.lower():
+                    continue
+                nodes.append(CatalogNode(
+                    name=title,
+                    node_type="namespace",
+                    path=[str(d["id"])],
+                    metadata={"dashboard_id": d["id"]},
+                ))
+            # Add synthetic "All Datasets" entry
+            if not filter or "all datasets" in (filter or "").lower():
+                nodes.append(CatalogNode(
+                    name="All Datasets",
+                    node_type="namespace",
+                    path=["__all__"],
+                ))
+            return nodes
+
+        if len(path) == 1:
+            # Expand a dashboard or "All Datasets"
+            parent_id = path[0]
+            if parent_id == "__all__":
+                datasets = self._fetch_all_datasets(token)
+            else:
+                try:
+                    raw = self._client.get_dashboard_datasets(token, int(parent_id))
+                    datasets = raw.get("result", [])
+                except Exception:
+                    datasets = []
+
+            nodes = []
+            for ds in datasets:
+                name = ds.get("table_name") or ds.get("name") or f"dataset_{ds.get('id', '?')}"
+                if filter and filter.lower() not in name.lower():
+                    continue
+                nodes.append(CatalogNode(
+                    name=name,
+                    node_type="table",
+                    path=[parent_id, str(ds["id"])],
+                    metadata={
+                        "dataset_id": ds["id"],
+                        "row_count": ds.get("row_count"),
+                        "schema": ds.get("schema", ""),
+                        "database": (ds.get("database") or {}).get("database_name", ""),
+                    },
+                ))
+            return nodes
+
+        return []
+
+    # -- get_metadata ------------------------------------------------------
+
+    def get_metadata(self, path: list[str]) -> dict[str, Any]:
+        if not path or len(path) < 2:
+            return {}
+        dataset_id_str = path[-1]
+        try:
+            dataset_id = int(dataset_id_str)
+        except ValueError:
+            return {}
+        token = self._ensure_token()
+        try:
+            detail = self._client.get_dataset_detail(token, dataset_id)
+            columns = [
+                {"name": c.get("column_name", ""), "type": c.get("type", "")}
+                for c in (detail.get("columns") or [])
+            ]
+            return {
+                "dataset_id": dataset_id,
+                "row_count": detail.get("row_count"),
+                "columns": columns,
+                "schema": detail.get("schema", ""),
+                "database": (detail.get("database") or {}).get("database_name", ""),
+                "description": detail.get("description", ""),
+            }
+        except Exception as e:
+            logger.warning("get_metadata failed for dataset %s: %s", dataset_id, e)
+            return {}
+
+    # -- fetch_data_as_arrow -----------------------------------------------
+
+    def fetch_data_as_arrow(
+        self,
+        source_table: str,
+        import_options: dict[str, Any] | None = None,
+    ) -> pa.Table:
+        """Fetch dataset data via Superset's SQL Lab API.
+
+        ``source_table`` is either:
+        - A dataset ID (int as string): ``"42"``
+        - A ``"dataset_id:table_name"`` pair: ``"42:orders_fact"``
+        """
+        opts = import_options or {}
+        size = opts.get("size", 100_000)
+
+        # Parse dataset_id from source_table
+        dataset_id_str = source_table.split(":")[0] if ":" in source_table else source_table
+        try:
+            dataset_id = int(dataset_id_str)
+        except ValueError:
+            raise ValueError(f"source_table must be a dataset ID (got: {source_table!r})")
+
+        token = self._ensure_token()
+        detail = self._client.get_dataset_detail(token, dataset_id)
+        db_id, schema, base_sql = _build_dataset_sql(detail)
+
+        # Build SQL
+        full_sql = f"SELECT * FROM ({base_sql}) AS _src LIMIT {size}"
+
+        # Execute via SQL Lab
+        sql_session = self._client.create_sql_session(token)
+        result = self._client.execute_sql_with_session(
+            sql_session, db_id, full_sql, schema, size,
+        )
+
+        rows = result.get("data", []) or []
+        if not rows:
+            return pa.table({})
+
+        # Convert list-of-dicts to Arrow table
+        columns = list(rows[0].keys())
+        col_data = {col: [row.get(col) for row in rows] for col in columns}
+        return pa.table(col_data)
+
+    # -- helpers -----------------------------------------------------------
+
+    def _fetch_all_datasets(self, token: str) -> list[dict]:
+        """Paginate through all datasets."""
+        all_results: list[dict] = []
+        page = 0
+        page_size = 100
+        while True:
+            raw = self._client.list_datasets(token, page=page, page_size=page_size)
+            batch = raw.get("result", [])
+            all_results.extend(batch)
+            total = raw.get("count", len(all_results))
+            if len(all_results) >= total or len(batch) < page_size:
+                break
+            page += 1
+        return all_results
diff --git a/py-src/data_formulator/datalake/azure_blob_workspace.py b/py-src/data_formulator/datalake/azure_blob_workspace.py
index a09a924a..8afa5e48 100644
--- a/py-src/data_formulator/datalake/azure_blob_workspace.py
+++ b/py-src/data_formulator/datalake/azure_blob_workspace.py
@@ -399,7 +399,7 @@ def
write_parquet_from_arrow(
         table: pa.Table,
         table_name: str,
         compression: str = DEFAULT_COMPRESSION,
-        loader_metadata: Optional[dict[str, Any]] = None,
+        source_info: Optional[dict[str, Any]] = None,
     ) -> TableMetadata:
         safe_name = sanitize_table_name(table_name)
         filename = f"{safe_name}.parquet"
@@ -431,11 +431,12 @@ def write_parquet_from_arrow(
             last_synced=now,
         )
 
-        if loader_metadata:
-            table_metadata.loader_type = loader_metadata.get("loader_type")
-            table_metadata.loader_params = loader_metadata.get("loader_params")
-            table_metadata.source_table = loader_metadata.get("source_table")
-            table_metadata.source_query = loader_metadata.get("source_query")
+        if source_info:
+            table_metadata.loader_type = source_info.get("loader_type")
+            table_metadata.loader_params = source_info.get("loader_params")
+            table_metadata.source_table = source_info.get("source_table")
+            table_metadata.source_query = source_info.get("source_query")
+            table_metadata.import_options = source_info.get("import_options")
 
         self.add_table_metadata(table_metadata)
         logger.info(
@@ -449,7 +450,7 @@ def write_parquet(
         df: pd.DataFrame,
         table_name: str,
         compression: str = DEFAULT_COMPRESSION,
-        loader_metadata: Optional[dict[str, Any]] = None,
+        source_info: Optional[dict[str, Any]] = None,
     ) -> TableMetadata:
         safe_name = sanitize_table_name(table_name)
         filename = f"{safe_name}.parquet"
@@ -482,11 +483,12 @@ def write_parquet(
             last_synced=now,
         )
 
-        if loader_metadata:
-            table_metadata.loader_type = loader_metadata.get("loader_type")
-            table_metadata.loader_params = loader_metadata.get("loader_params")
-            table_metadata.source_table = loader_metadata.get("source_table")
-            table_metadata.source_query = loader_metadata.get("source_query")
+        if source_info:
+            table_metadata.loader_type = source_info.get("loader_type")
+            table_metadata.loader_params = source_info.get("loader_params")
+            table_metadata.source_table = source_info.get("source_table")
+            table_metadata.source_query = source_info.get("source_query")
+            table_metadata.import_options = source_info.get("import_options")
 
         self.add_table_metadata(table_metadata)
         logger.info(
diff --git a/py-src/data_formulator/datalake/workspace.py b/py-src/data_formulator/datalake/workspace.py
index 9a5c9234..df6ce242 100644
--- a/py-src/data_formulator/datalake/workspace.py
+++ b/py-src/data_formulator/datalake/workspace.py
@@ -484,7 +484,7 @@ def write_parquet_from_arrow(
         table: pa.Table,
         table_name: str,
         compression: str = DEFAULT_COMPRESSION,
-        loader_metadata: Optional[dict[str, Any]] = None,
+        source_info: Optional[dict[str, Any]] = None,
     ) -> TableMetadata:
         """
         Write a PyArrow Table directly to parquet.
@@ -518,11 +518,12 @@ def write_parquet_from_arrow(
             last_synced=now,
         )
 
-        if loader_metadata:
-            table_metadata.loader_type = loader_metadata.get('loader_type')
-            table_metadata.loader_params = loader_metadata.get('loader_params')
-            table_metadata.source_table = loader_metadata.get('source_table')
-            table_metadata.source_query = loader_metadata.get('source_query')
+        if source_info:
+            table_metadata.loader_type = source_info.get('loader_type')
+            table_metadata.loader_params = source_info.get('loader_params')
+            table_metadata.source_table = source_info.get('source_table')
+            table_metadata.source_query = source_info.get('source_query')
+            table_metadata.import_options = source_info.get('import_options')
 
         self.add_table_metadata(table_metadata)
         logger.info(
@@ -537,7 +538,7 @@ def write_parquet(
         df: pd.DataFrame,
         table_name: str,
         compression: str = DEFAULT_COMPRESSION,
-        loader_metadata: Optional[dict[str, Any]] = None,
+        source_info: Optional[dict[str, Any]] = None,
     ) -> TableMetadata:
         """Write a pandas DataFrame to parquet."""
         safe_name = sanitize_table_name(table_name)
@@ -569,11 +570,12 @@ def write_parquet(
             last_synced=now,
         )
 
-        if loader_metadata:
-            table_metadata.loader_type = loader_metadata.get('loader_type')
-            table_metadata.loader_params = loader_metadata.get('loader_params')
-            table_metadata.source_table = loader_metadata.get('source_table')
-            table_metadata.source_query = loader_metadata.get('source_query')
+        if source_info:
+            table_metadata.loader_type = source_info.get('loader_type')
+            table_metadata.loader_params = source_info.get('loader_params')
+            table_metadata.source_table = source_info.get('source_table')
+            table_metadata.source_query = source_info.get('source_query')
+            table_metadata.import_options = source_info.get('import_options')
 
         self.add_table_metadata(table_metadata)
         logger.info(
@@ -667,7 +669,7 @@ def refresh_parquet_from_arrow(
             logger.info(f"Table {table_name} unchanged (hash: {new_hash[:8]}…)")
             return old_meta, False
 
-        loader_metadata = {
+        source_info = {
             'loader_type': old_meta.loader_type,
             'loader_params': old_meta.loader_params,
             'source_table': old_meta.source_table,
@@ -677,7 +679,7 @@
             table=table,
             table_name=table_name,
             compression=compression,
-            loader_metadata=loader_metadata,
+            source_info=source_info,
         )
         logger.info(f"Refreshed {table_name}: {old_meta.row_count} → {new_meta.row_count} rows")
         return new_meta, True
diff --git a/py-src/data_formulator/datalake/workspace_metadata.py b/py-src/data_formulator/datalake/workspace_metadata.py
index ff11f9bc..86c54eda 100644
--- a/py-src/data_formulator/datalake/workspace_metadata.py
+++ b/py-src/data_formulator/datalake/workspace_metadata.py
@@ -210,6 +210,7 @@ class TableMetadata:
     loader_params: dict | None = None
     source_table: str | None = None
    source_query: str | None = None
+    import_options: dict | None = None
     last_synced: datetime | None = None
     row_count: int | None = None
     columns: list[ColumnInfo] | None = None
@@ -237,6 +238,8 @@ def to_dict(self) -> dict:
             result["source_table"] = self.source_table
         if self.source_query is not None:
             result["source_query"] = self.source_query
+        if self.import_options is not None:
+            result["import_options"] = make_json_safe(self.import_options)
         if self.last_synced is not None:
             result["last_synced"] = self.last_synced.isoformat()
         if self.row_count is not None:
@@ -277,6 +280,7 @@ def from_dict(cls, name: str, data: dict) -> "TableMetadata":
             loader_params=data.get("loader_params"),
             source_table=data.get("source_table"),
             source_query=data.get("source_query"),
+            import_options=data.get("import_options"),
             last_synced=last_synced,
             row_count=data.get("row_count"),
             columns=columns,
diff --git a/py-src/data_formulator/plugins/data_writer.py b/py-src/data_formulator/plugins/data_writer.py
index 3060a322..ce99f073 100644
--- a/py-src/data_formulator/plugins/data_writer.py
+++ b/py-src/data_formulator/plugins/data_writer.py
@@ -8,7 +8,7 @@
 * Identity-scoped workspace resolution (via ``get_identity_id()``)
 * Table name sanitisation
-* Automatic ``loader_metadata`` stamping (``loader_type = "plugin:"``)
+* Automatic ``source_info`` stamping (``loader_type = "plugin:"``)
 * ``overwrite=False`` collision avoidance (auto-suffix ``_1``, ``_2``, …)
 """
@@ -70,12 +70,12 @@ def write_dataframe(
         if not overwrite:
             safe_name = self._unique_name(safe_name, workspace)
 
-        loader_metadata = self._build_loader_metadata(
+        source_info = self._build_source_info(
             safe_name,
             source_metadata,
         )
         table_meta = workspace.write_parquet(
-            df, safe_name, loader_metadata=loader_metadata,
+            df, safe_name, source_info=source_info,
         )
 
         is_renamed = safe_name != sanitize_table_name(table_name)
@@ -94,7 +94,7 @@
     # Internal helpers
     # ------------------------------------------------------------------
 
-    def _build_loader_metadata(
+    def _build_source_info(
         self,
         table_name: str,
         source_metadata: Optional[dict[str, Any]],
diff --git a/py-src/data_formulator/tables_routes.py b/py-src/data_formulator/tables_routes.py
index c1a122f6..1e2a57d2 100644
--- a/py-src/data_formulator/tables_routes.py
+++ b/py-src/data_formulator/tables_routes.py
@@ -869,9 +869,11 @@ def data_loader_ingest_data():
         workspace, safe_name,
         source_table=table_name,
-        size=row_limit,
-        sort_columns=sort_columns,
-        sort_order=sort_order,
+
        import_options={
+            "size": row_limit,
+            "sort_columns": sort_columns,
+            "sort_order": sort_order,
+        },
     )
     return jsonify({
         "status": "success",
@@ -936,9 +938,11 @@ def data_loader_fetch_data():
     # Fetch data as DataFrame (not Arrow, since we need JSON output not parquet)
     df = data_loader.fetch_data_as_dataframe(
         source_table=table_name,
-        size=row_limit,
-        sort_columns=sort_columns,
-        sort_order=sort_order,
+        import_options={
+            "size": row_limit,
+            "sort_columns": sort_columns,
+            "sort_order": sort_order,
+        },
     )
     total_row_count = len(df)
 
@@ -998,7 +1002,10 @@ def data_loader_refresh_table():
     data_loader = DATA_LOADERS[data_loader_type](data_loader_params)
 
     if meta.source_table:
-        arrow_table = data_loader.fetch_data_as_arrow(source_table=meta.source_table)
+        arrow_table = data_loader.fetch_data_as_arrow(
+            source_table=meta.source_table,
+            import_options=meta.import_options,
+        )
     else:
         return jsonify({
             "status": "error",
diff --git a/src/app/dfSlice.tsx b/src/app/dfSlice.tsx
index 756d334f..7a51e2d9 100644
--- a/src/app/dfSlice.tsx
+++ b/src/app/dfSlice.tsx
@@ -55,6 +55,17 @@ export interface ServerConfig {
         [key: string]: unknown;
     };
     PLUGINS?: Record;
+    SOURCES?: Array<{
+        source_id: string;
+        source_type: string;
+        name: string;
+        icon: string;
+        params_form: Array<{name: string; type: string; required: boolean; default?: string; description?: string}>;
+        pinned_params: Record;
+        hierarchy: Array<{key: string; label: string}>;
+        effective_hierarchy: Array<{key: string; label: string}>;
+        auth_instructions: string;
+    }>;
 }
 
 export interface ModelConfig {
diff --git a/src/app/tableThunks.ts b/src/app/tableThunks.ts
index 60acc242..c32a62cb 100644
--- a/src/app/tableThunks.ts
+++ b/src/app/tableThunks.ts
@@ -16,7 +16,7 @@ import { createAsyncThunk } from '@reduxjs/toolkit';
 import { DataSourceConfig, DictTable } from '../components/ComponentType';
 import { Type } from '../data/types';
 import { inferTypeFromValueArray } from '../data/utils';
-import { fetchWithIdentity,
getUrls, computeContentHash } from './utils'; +import { fetchWithIdentity, getUrls, getSourceUrls, computeContentHash } from './utils'; import { DataFormulatorState, dfActions, fetchFieldSemanticType } from './dfSlice'; import { tableDataDB } from './workspaceDB'; @@ -44,6 +44,8 @@ export interface LoadTablePayload { dataLoaderType?: string; dataLoaderParams?: Record; sourceTableName?: string; + // For connected data sources (new /api/sources/{id}/ routes): + connectedSourceId?: string; importOptions?: { rowLimit?: number; sortColumns?: string[]; @@ -74,7 +76,7 @@ export const loadTable = createAsyncThunk< >( 'dataFormulator/loadTable', async (payload, { dispatch, getState }) => { - const { table, file, replaceSource, dataLoaderType, dataLoaderParams, sourceTableName, importOptions } = payload; + const { table, file, replaceSource, dataLoaderType, dataLoaderParams, sourceTableName, connectedSourceId, importOptions } = payload; const state = getState(); const frontendRowLimit = state.config?.frontendRowLimit ?? 
50000; const existingTables = state.tables; @@ -129,18 +131,33 @@ export const loadTable = createAsyncThunk< if (storeOnServer) { // === STORE ON SERVER PATH === - if (sourceType === 'database' && dataLoaderType && sourceTableName) { - // Database source: ingest to workspace via data loader + if (sourceType === 'database' && sourceTableName && (dataLoaderType || connectedSourceId)) { + // Database source: ingest to workspace via data loader or connected source try { - const response = await fetchWithIdentity(getUrls().DATA_LOADER_INGEST_DATA, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ + let ingestUrl: string; + let ingestBody: any; + if (connectedSourceId) { + // Connected source route + ingestUrl = getSourceUrls(connectedSourceId).DATA_IMPORT; + ingestBody = { + source_table: sourceTableName, + table_name: sourceTableName, + import_options: importOptions || {}, + }; + } else { + // Legacy data loader route + ingestUrl = getUrls().DATA_LOADER_INGEST_DATA; + ingestBody = { data_loader_type: dataLoaderType, data_loader_params: dataLoaderParams, table_name: sourceTableName, import_options: importOptions || {}, - }), + }; + } + const response = await fetchWithIdentity(ingestUrl, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify(ingestBody), }); const data = await response.json(); if (data.status === 'success') { diff --git a/src/app/utils.tsx b/src/app/utils.tsx index 0aba3110..6595a892 100644 --- a/src/app/utils.tsx +++ b/src/app/utils.tsx @@ -81,6 +81,24 @@ export function getUrls() { }; } +/** + * Build API URLs for a ConnectedDataSource by source_id. 
+ */ +export function getSourceUrls(sourceId: string) { + const base = `/api/sources/${sourceId}`; + return { + AUTH_CONNECT: `${base}/auth/connect`, + AUTH_DISCONNECT: `${base}/auth/disconnect`, + AUTH_STATUS: `${base}/auth/status`, + CATALOG_LS: `${base}/catalog/ls`, + CATALOG_METADATA: `${base}/catalog/metadata`, + CATALOG_LIST_TABLES: `${base}/catalog/list_tables`, + DATA_IMPORT: `${base}/data/import`, + DATA_REFRESH: `${base}/data/refresh`, + DATA_PREVIEW: `${base}/data/preview`, + }; +} + /** * Get the current namespaced identity from the Redux store, or fall back to browser ID. * Returns identity in "type:id" format (e.g., "user:alice@example.com" or "browser:550e8400-...") diff --git a/src/views/DBTableManager.tsx b/src/views/DBTableManager.tsx index 27f26ea3..43d5d6fb 100644 --- a/src/views/DBTableManager.tsx +++ b/src/views/DBTableManager.tsx @@ -33,7 +33,7 @@ import SearchIcon from '@mui/icons-material/Search'; import Autocomplete from '@mui/material/Autocomplete'; -import { getUrls, fetchWithIdentity } from '../app/utils'; +import { getUrls, getSourceUrls, fetchWithIdentity } from '../app/utils'; import { borderColor } from '../app/tokens'; import { CustomReactTable } from './ReactTable'; import { DataSourceConfig, DictTable } from '../components/ComponentType'; @@ -297,6 +297,35 @@ export const DBManagerPane: React.FC<{ ))} + + {/* Connected data sources from /api/app-config SOURCES */} + {(serverConfig.SOURCES ?? []).length > 0 && ( + + )} + {(serverConfig.SOURCES ?? []).map((source) => ( + + ))} let dataConnectorView = @@ -334,6 +363,32 @@ export const DBManagerPane: React.FC<{ ) ))} + + {/* Connected data source forms */} + {(serverConfig.SOURCES ?? 
[]).map((source) => ( + selectedDataLoader === `source:${source.source_id}` && ( + + { + setIsUploading(true); + }} + onFinish={(status, message, importedTables) => { + setIsUploading(false); + if (status === "success") { + setSystemMessage(message, "success"); + } else { + setSystemMessage(message, "error"); + } + }} + /> + + ) + ))} ; let mainContent = @@ -405,11 +460,12 @@ export const DBManagerPane: React.FC<{ export const DataLoaderForm: React.FC<{ dataLoaderType: string, - paramDefs: {name: string, default: string, type: string, required: boolean, description: string}[], + paramDefs: {name: string, default?: string, type: string, required: boolean, description?: string}[], authInstructions: string, + connectedSourceId?: string, onImport: () => void, onFinish: (status: "success" | "error", message: string, importedTables?: string[]) => void -}> = ({dataLoaderType, paramDefs, authInstructions, onImport, onFinish}) => { +}> = ({dataLoaderType, paramDefs, authInstructions, connectedSourceId, onImport, onFinish}) => { const { t } = useTranslation(); const dispatch = useDispatch(); const theme = useTheme(); @@ -447,6 +503,63 @@ export const DataLoaderForm: React.FC<{ let [isConnecting, setIsConnecting] = useState(false); + // Helper: connect and list tables — branches based on connectedSourceId + const connectAndListTables = useCallback(async (filter?: string) => { + setIsConnecting(true); + try { + if (connectedSourceId) { + // Connected source: first connect, then list tables + const urls = getSourceUrls(connectedSourceId); + const connectResp = await fetchWithIdentity(urls.AUTH_CONNECT, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ params: params }), + }); + const connectData = await connectResp.json(); + if (connectData.status === 'error') { + throw new Error(connectData.message || 'Connection failed'); + } + // List tables + const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { + method: 'POST', 
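Stepping back from the JSX, the handshake that `connectAndListTables` performs against the new `/api/sources/{id}` routes is a two-step call: POST `auth/connect` with the form params, then POST `catalog/list_tables` with an optional filter, collecting the result into a `{table_name: metadata}` map. A language-neutral sketch in Python (the `post` transport and stub responses are hypothetical stand-ins for `fetchWithIdentity`; the route paths and error shapes follow this patch):

```python
def connect_and_list_tables(post, source_id, params, table_filter=None):
    """POST auth/connect with form params, then catalog/list_tables."""
    base = f"/api/sources/{source_id}"
    connect = post(f"{base}/auth/connect", {"params": params})
    if connect.get("status") == "error":
        raise RuntimeError(connect.get("message", "Connection failed"))
    listing = post(f"{base}/catalog/list_tables", {"filter": table_filter})
    if "tables" not in listing:
        raise RuntimeError(listing.get("message", "Failed to list tables"))
    # Mirror the UI state shape: {table_name: metadata}
    return {t["name"]: t["metadata"] for t in listing["tables"]}

# Hypothetical stub transport standing in for fetchWithIdentity:
def fake_post(url, body):
    if url.endswith("/auth/connect"):
        return {"status": "success"}
    return {"tables": [{"name": "orders_fact", "metadata": {"row_count": 150_000}}]}

tables = connect_and_list_tables(fake_post, "superset", {"host": "http://localhost:8088"})
```

Keeping connect and list as two calls (rather than the legacy single list-tables request) lets the UI surface authentication failures separately from catalog failures.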
+ headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ filter: filter?.trim() || null }), + }); + const listData = await listResp.json(); + if (listData.tables) { + setTableMetadata(Object.fromEntries( + listData.tables.map((t: any) => [t.name, t.metadata]) + )); + } else if (listData.status === 'error') { + throw new Error(listData.message || 'Failed to list tables'); + } + } else { + // Legacy data loader: single list-tables call + const resp = await fetchWithIdentity(getUrls().DATA_LOADER_LIST_TABLES, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ + data_loader_type: dataLoaderType, + data_loader_params: params, + table_filter: filter?.trim() || null, + }), + }); + const data = await resp.json(); + if (data.status === 'success') { + setTableMetadata(Object.fromEntries( + data.tables.map((t: any) => [t.name, t.metadata]) + )); + } else { + throw new Error(data.message || 'Failed to list tables'); + } + } + } catch (error: any) { + onFinish("error", error.message || 'Failed to connect'); + } finally { + setIsConnecting(false); + } + }, [connectedSourceId, dataLoaderType, params, onFinish]); + // Auto-select first table for preview when metadata loads useEffect(() => { const tableNames = Object.keys(tableMetadata); @@ -725,8 +838,9 @@ export const DataLoaderForm: React.FC<{ onImport(); dispatch(loadTable({ table: tableObj, - dataLoaderType, - dataLoaderParams: params, + dataLoaderType: connectedSourceId ? undefined : dataLoaderType, + dataLoaderParams: connectedSourceId ? undefined : params, + connectedSourceId, sourceTableName: tableName, importOptions: Object.keys(importOptions).length > 0 ? 
importOptions : undefined, })).unwrap() @@ -813,36 +927,7 @@ export const DataLoaderForm: React.FC<{ variant="outlined" size="small" sx={{textTransform: "none", height: 30, fontSize: 12}} - onClick={() => { - setIsConnecting(true); - fetchWithIdentity(getUrls().DATA_LOADER_LIST_TABLES, { - method: 'POST', - headers: { - 'Content-Type': 'application/json', - }, - body: JSON.stringify({ - data_loader_type: dataLoaderType, - data_loader_params: params, - table_filter: tableFilter.trim() || null - }) - }).then((response: Response) => response.json()) - .then((data: any) => { - if (data.status === "success") { - console.log(data.tables); - setTableMetadata(Object.fromEntries(data.tables.map((table: any) => { - return [table.name, table.metadata]; - }))); - } else { - console.error('Failed to fetch data loader tables: {}', data.message); - onFinish("error", t('db.failedFetchLoaderTables', { message: data.message })); - } - setIsConnecting(false); - }) - .catch((error: any) => { - onFinish("error", t('db.failedFetchLoaderTablesServer')); - setIsConnecting(false); - }); - }} + onClick={() => connectAndListTables(tableFilter)} > {t('db.refresh')} @@ -851,6 +936,11 @@ export const DataLoaderForm: React.FC<{ size="small" sx={{textTransform: "none", height: 30, fontSize: 12}} onClick={() => { + if (connectedSourceId) { + fetchWithIdentity(getSourceUrls(connectedSourceId).AUTH_DISCONNECT, { + method: 'POST', + }).catch(() => {}); + } setTableMetadata({}); setTableFilter(""); }} @@ -940,36 +1030,7 @@ export const DataLoaderForm: React.FC<{ color="primary" size="small" sx={{textTransform: "none", minWidth: 100, height: 30}} - onClick={() => { - setIsConnecting(true); - fetchWithIdentity(getUrls().DATA_LOADER_LIST_TABLES, { - method: 'POST', - headers: { - 'Content-Type': 'application/json', - }, - body: JSON.stringify({ - data_loader_type: dataLoaderType, - data_loader_params: params, - table_filter: tableFilter.trim() || null - }) - }).then((response: Response) => 
response.json()) - .then((data: any) => { - if (data.status === "success") { - console.log(data.tables); - setTableMetadata(Object.fromEntries(data.tables.map((table: any) => { - return [table.name, table.metadata]; - }))); - } else { - console.error('Failed to fetch data loader tables: {}', data.message); - onFinish("error", t('db.failedFetchLoaderTables', { message: data.message })); - } - setIsConnecting(false); - }) - .catch((error: any) => { - onFinish("error", t('db.failedFetchLoaderTablesServer')); - setIsConnecting(false); - }); - }}> + onClick={() => connectAndListTables(tableFilter)}> {t('db.connect', { suffix: tableFilter.trim() ? t('db.withFilter') : '' })} } diff --git a/tests/backend/unit/test_plugin_data_writer.py b/tests/backend/unit/test_plugin_data_writer.py index 6832d4d5..a3924fe7 100644 --- a/tests/backend/unit/test_plugin_data_writer.py +++ b/tests/backend/unit/test_plugin_data_writer.py @@ -55,7 +55,7 @@ def write_parquet( table_name: str, *, compression: str = "snappy", - loader_metadata: dict[str, Any] | None = None, + source_info: dict[str, Any] | None = None, ) -> _FakeTableMetadata: self._tables.add(table_name) return _FakeTableMetadata( @@ -101,7 +101,7 @@ def test_write_returns_expected_shape(self, mock_get_ws, _mock_id, writer, sampl @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:bob") @patch("data_formulator.plugins.data_writer.get_workspace") - def test_loader_metadata_stamped(self, mock_get_ws, _mock_id, writer, sample_df): + def test_source_info_stamped(self, mock_get_ws, _mock_id, writer, sample_df): ws = MagicMock(spec=_FakeWorkspace) ws.write_parquet.return_value = _FakeTableMetadata( name="sales", row_count=2, columns=[] @@ -111,7 +111,7 @@ def test_loader_metadata_stamped(self, mock_get_ws, _mock_id, writer, sample_df) writer.write_dataframe(sample_df, "sales") _, kwargs = ws.write_parquet.call_args - meta = kwargs["loader_metadata"] + meta = kwargs["source_info"] assert meta["loader_type"] 
== "plugin:superset" assert meta["source_table"] == "sales" @@ -129,7 +129,7 @@ def test_source_metadata_forwarded(self, mock_get_ws, _mock_id, writer, sample_d ) _, kwargs = ws.write_parquet.call_args - assert kwargs["loader_metadata"]["loader_params"] == {"dashboard_id": 42} + assert kwargs["source_info"]["loader_params"] == {"dashboard_id": 42} # ------------------------------------------------------------------ diff --git a/tests/plugin/test_bigquery/test_bigquery_loader.py b/tests/plugin/test_bigquery/test_bigquery_loader.py index 0d596440..a6327b61 100644 --- a/tests/plugin/test_bigquery/test_bigquery_loader.py +++ b/tests/plugin/test_bigquery/test_bigquery_loader.py @@ -153,7 +153,7 @@ def test_fetch_data_as_arrow_from_table(self) -> None: config = get_test_config() loader = create_loader_for_emulator(config) source_table = "test-project.sample.products" - table = loader.fetch_data_as_arrow(source_table=source_table, size=20) + table = loader.fetch_data_as_arrow(source_table=source_table, import_options={"size": 20}) self.assertIsNotNone(table) self.assertGreater(table.num_rows, 0) @@ -166,7 +166,7 @@ def test_fetch_data_respects_size(self) -> None: config = get_test_config() loader = create_loader_for_emulator(config) table = loader.fetch_data_as_arrow( - source_table="test-project.sample.products", size=5 + source_table="test-project.sample.products", import_options={"size": 5} ) self.assertLessEqual(table.num_rows, 5) @@ -177,7 +177,7 @@ def test_fetch_data_invalid_table_raises(self) -> None: with self.assertRaises(Exception): loader.fetch_data_as_arrow( source_table="test-project.sample.nonexistent_table_xyz", - size=10, + import_options={"size": 10}, ) def test_ingest_table_to_workspace(self) -> None: @@ -188,7 +188,7 @@ def test_ingest_table_to_workspace(self) -> None: target_name = "customers_test" meta = loader.ingest_to_workspace( - workspace, target_name, source_table=table_name, size=100 + workspace, target_name, source_table=table_name, 
import_options={"size": 100} ) self.assertEqual(meta.name, "customers_test") @@ -209,7 +209,7 @@ def test_ingest_table_auto_name(self) -> None: table_name = "test-project.sample.customers" meta = loader.ingest_to_workspace( - workspace, "customers", source_table=table_name, size=100 + workspace, "customers", source_table=table_name, import_options={"size": 100} ) self.assertEqual(meta.name, "customers") @@ -224,7 +224,7 @@ def test_ingest_products_table(self) -> None: source_table = "test-project.sample.products" meta = loader.ingest_to_workspace( - workspace, target_name, source_table=source_table, size=1000 + workspace, target_name, source_table=source_table, import_options={"size": 1000} ) self.assertGreater(meta.row_count, 0) @@ -240,7 +240,7 @@ def test_ingest_orders_table(self) -> None: loader = create_loader_for_emulator(config) workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "order_details", source_table="test-project.sample.orders", size=1000 + workspace, "order_details", source_table="test-project.sample.orders", import_options={"size": 1000} ) self.assertGreater(meta.row_count, 0) @@ -257,8 +257,7 @@ def test_ingest_sanitizes_table_name(self) -> None: meta = loader.ingest_to_workspace( workspace, "test-table-with-dashes", - source_table="test-project.sample.customers", - size=10, + source_table="test-project.sample.customers", import_options={"size": 10}, ) # sanitize_table_name produces lowercase with underscores @@ -274,8 +273,7 @@ def test_get_table_info_from_datalake(self) -> None: loader.ingest_to_workspace( workspace, "my_table", - source_table="test-project.sample.customers", - size=5_000, + source_table="test-project.sample.customers", import_options={"size": 5_000}, ) self.assertIn("my_table", workspace.list_tables()) diff --git a/tests/plugin/test_mongodb/test_mongodb_loader.py b/tests/plugin/test_mongodb/test_mongodb_loader.py index 69751aa1..0d553b73 100644 --- 
a/tests/plugin/test_mongodb/test_mongodb_loader.py +++ b/tests/plugin/test_mongodb/test_mongodb_loader.py @@ -128,7 +128,7 @@ def test_list_tables_row_count(self) -> None: def test_fetch_data_as_arrow(self) -> None: loader = self._get_loader() - table = loader.fetch_data_as_arrow(source_table="products", size=20) + table = loader.fetch_data_as_arrow(source_table="products", import_options={"size": 20}) self.assertIsNotNone(table) self.assertGreater(table.num_rows, 0) self.assertIn("name", table.column_names) @@ -137,14 +137,14 @@ def test_fetch_data_as_arrow(self) -> None: def test_fetch_data_respects_size(self) -> None: loader = self._get_loader() - table = loader.fetch_data_as_arrow(source_table="products", size=5) + table = loader.fetch_data_as_arrow(source_table="products", import_options={"size": 5}) self.assertLessEqual(table.num_rows, 5) def test_ingest_table_to_workspace(self) -> None: loader = self._get_loader() workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "products_test", source_table="products", size=100 + workspace, "products_test", source_table="products", import_options={"size": 100} ) self.assertEqual(meta.name, "products_test") @@ -162,7 +162,7 @@ def test_ingest_nested_documents_flattened(self) -> None: loader = self._get_loader() workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "products_nested", source_table="products", size=100 + workspace, "products_nested", source_table="products", import_options={"size": 100} ) df = workspace.read_data_as_df(meta.name) spec_cols = [c for c in df.columns if c.startswith("specs_")] @@ -172,7 +172,7 @@ def test_ingest_with_arrays_flattened(self) -> None: loader = self._get_loader() workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "products_arrays", source_table="products", size=100 + workspace, "products_arrays", source_table="products", import_options={"size": 100} ) df = workspace.read_data_as_df(meta.name) 
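The test updates in this patch all funnel the old `size` keyword through the new `import_options` dict. A loader that wants to accept both call styles during migration can normalize them into one dict; this is an illustrative sketch (the helper name and defaults are assumptions, not code from this patch):

```python
from typing import Any, Optional

def normalize_import_options(
    import_options: Optional[dict[str, Any]] = None,
    *,
    size: Optional[int] = None,
    sort_columns: Optional[list] = None,
    sort_order: Optional[str] = None,
    default_size: int = 100_000,
) -> dict[str, Any]:
    """Fold legacy keyword arguments into a single import_options dict.

    Keys given explicitly in import_options win over the legacy keywords.
    """
    opts: dict[str, Any] = {"size": size if size is not None else default_size}
    if sort_columns:
        opts["sort_columns"] = sort_columns
        opts["sort_order"] = sort_order or "asc"
    opts.update(import_options or {})
    return opts

# Old and new call styles converge on the same options dict:
assert normalize_import_options(size=100) == normalize_import_options({"size": 100})
```

One advantage of the dict form is that it can be persisted as-is on `TableMetadata.import_options` and replayed verbatim on refresh, which is exactly what `data_loader_refresh_table` does above.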
tag_cols = [c for c in df.columns if c.startswith("tags_")] @@ -182,7 +182,7 @@ def test_ingest_sanitizes_table_name(self) -> None: loader = self._get_loader() workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "test-table-with-dashes", source_table="products", size=10 + workspace, "test-table-with-dashes", source_table="products", import_options={"size": 10} ) self.assertIn(meta.name, workspace.list_tables()) df = workspace.read_data_as_df(meta.name) @@ -191,7 +191,7 @@ def test_ingest_sanitizes_table_name(self) -> None: def test_get_table_info_from_datalake(self) -> None: loader = self._get_loader() workspace = self._get_workspace() - loader.ingest_to_workspace(workspace, "my_table", source_table="customers", size=5_000) + loader.ingest_to_workspace(workspace, "my_table", source_table="customers", import_options={"size": 5_000}) self.assertIn("my_table", workspace.list_tables()) meta = workspace.get_table_metadata("my_table") diff --git a/tests/plugin/test_mysql/test_mysql_loader.py b/tests/plugin/test_mysql/test_mysql_loader.py index 7265b8a1..6a9d9a38 100644 --- a/tests/plugin/test_mysql/test_mysql_loader.py +++ b/tests/plugin/test_mysql/test_mysql_loader.py @@ -106,7 +106,7 @@ def test_list_tables_with_filter(self) -> None: def test_fetch_data_as_arrow_from_table(self) -> None: loader = MySQLDataLoader(get_test_config()) # Table name from list_tables is database.table (e.g. 
testdb.products) - table = loader.fetch_data_as_arrow(source_table="testdb.products", size=20) + table = loader.fetch_data_as_arrow(source_table="testdb.products", import_options={"size": 20}) self.assertIsNotNone(table) self.assertGreater(table.num_rows, 0) @@ -116,14 +116,14 @@ def test_fetch_data_as_arrow_from_table(self) -> None: def test_fetch_data_respects_size(self) -> None: loader = MySQLDataLoader(get_test_config()) - table = loader.fetch_data_as_arrow(source_table="testdb.products", size=5) + table = loader.fetch_data_as_arrow(source_table="testdb.products", import_options={"size": 5}) self.assertLessEqual(table.num_rows, 5) def test_ingest_table_to_workspace(self) -> None: loader = MySQLDataLoader(get_test_config()) workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "products_test", source_table="testdb.products", size=100 + workspace, "products_test", source_table="testdb.products", import_options={"size": 100} ) self.assertEqual(meta.name, "products_test") @@ -140,7 +140,7 @@ def test_ingest_products_table(self) -> None: loader = MySQLDataLoader(get_test_config()) workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "electronics_products", source_table="testdb.products", size=1000 + workspace, "electronics_products", source_table="testdb.products", import_options={"size": 1000} ) self.assertGreater(meta.row_count, 0) @@ -155,7 +155,7 @@ def test_get_table_info_from_datalake(self) -> None: loader = MySQLDataLoader(get_test_config()) workspace = self._get_workspace() loader.ingest_to_workspace( - workspace, "my_table", source_table="testdb.customers", size=5_000 + workspace, "my_table", source_table="testdb.customers", import_options={"size": 5_000} ) self.assertIn("my_table", workspace.list_tables()) diff --git a/tests/plugin/test_mysql_datalake.py b/tests/plugin/test_mysql_datalake.py index c4ef0c27..6720ad3e 100644 --- a/tests/plugin/test_mysql_datalake.py +++ 
b/tests/plugin/test_mysql_datalake.py @@ -92,8 +92,7 @@ def test_ingest_table_into_datalake(self) -> None: meta = loader.ingest_to_workspace( workspace, "ingested_table", - source_table=table_name, - size=10_000, + source_table=table_name, import_options={"size": 10_000}, ) self.assertEqual(meta.name, "ingested_table") @@ -114,7 +113,7 @@ def test_get_table_info_from_datalake(self) -> None: workspace = Workspace("test-identity-mysql-info", root_dir=self._workspace_root) source_table = tables[0]["name"] - loader.ingest_to_workspace(workspace, "my_table", source_table=source_table, size=5_000) + loader.ingest_to_workspace(workspace, "my_table", source_table=source_table, import_options={"size": 5_000}) # List tables in workspace names = workspace.list_tables() diff --git a/tests/plugin/test_postgres/test_postgresql_loader.py b/tests/plugin/test_postgres/test_postgresql_loader.py index e98f1b1a..3957fba2 100644 --- a/tests/plugin/test_postgres/test_postgresql_loader.py +++ b/tests/plugin/test_postgres/test_postgresql_loader.py @@ -106,7 +106,7 @@ def test_list_tables_with_filter(self) -> None: def test_fetch_data_as_arrow_from_table(self) -> None: loader = PostgreSQLDataLoader(get_test_config()) # Table name from list_tables is schema.table (e.g. 
sample.products) - table = loader.fetch_data_as_arrow(source_table="sample.products", size=20) + table = loader.fetch_data_as_arrow(source_table="sample.products", import_options={"size": 20}) self.assertIsNotNone(table) self.assertGreater(table.num_rows, 0) @@ -116,14 +116,14 @@ def test_fetch_data_as_arrow_from_table(self) -> None: def test_fetch_data_respects_size(self) -> None: loader = PostgreSQLDataLoader(get_test_config()) - table = loader.fetch_data_as_arrow(source_table="sample.products", size=5) + table = loader.fetch_data_as_arrow(source_table="sample.products", import_options={"size": 5}) self.assertLessEqual(table.num_rows, 5) def test_ingest_table_to_workspace(self) -> None: loader = PostgreSQLDataLoader(get_test_config()) workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "products_test", source_table="sample.products", size=100 + workspace, "products_test", source_table="sample.products", import_options={"size": 100} ) self.assertEqual(meta.name, "products_test") @@ -140,7 +140,7 @@ def test_ingest_products_table(self) -> None: loader = PostgreSQLDataLoader(get_test_config()) workspace = self._get_workspace() meta = loader.ingest_to_workspace( - workspace, "electronics_products", source_table="sample.products", size=1000 + workspace, "electronics_products", source_table="sample.products", import_options={"size": 1000} ) self.assertGreater(meta.row_count, 0) @@ -155,7 +155,7 @@ def test_get_table_info_from_datalake(self) -> None: loader = PostgreSQLDataLoader(get_test_config()) workspace = self._get_workspace() loader.ingest_to_workspace( - workspace, "my_table", source_table="sample.customers", size=5_000 + workspace, "my_table", source_table="sample.customers", import_options={"size": 5_000} ) self.assertIn("my_table", workspace.list_tables()) From 86a6f6d9fa250cdf6af468a6b56a9c22ebd4f150 Mon Sep 17 00:00:00 2001 From: Chenglong Wang Date: Tue, 14 Apr 2026 18:49:16 -0700 Subject: [PATCH 3/6] redesign data connector 
--- .env.template | 33 +- DEVELOPMENT.md | 264 +- .../9-generalized-data-source-plugins.md | 112 +- .../9.1-data-source-connection-model.md | 315 + package.json | 1 + py-src/data_formulator/app.py | 103 +- .../credential_vault/__init__.py | 15 +- ...{connected_source.py => data_connector.py} | 269 +- .../data_loader/athena_data_loader.py | 18 +- .../data_loader/azure_blob_data_loader.py | 14 +- .../data_loader/bigquery_data_loader.py | 8 +- .../data_loader/external_data_loader.py | 32 +- .../data_loader/kusto_data_loader.py | 38 +- .../data_loader/mongodb_data_loader.py | 14 +- .../data_loader/mssql_data_loader.py | 10 + .../data_loader/mysql_data_loader.py | 10 +- .../data_loader/postgresql_data_loader.py | 10 +- .../data_loader/s3_data_loader.py | 10 +- .../data_loader/superset_data_loader.py | 32 +- .../plugins/superset/__init__.py | 28 +- py-src/data_formulator/security/auth.py | 46 +- py-src/data_formulator/tables_routes.py | 271 +- src/app/App.tsx | 73 +- src/app/dfSlice.tsx | 20 +- src/app/identity.ts | 2 +- src/app/tableThunks.ts | 62 +- src/app/useDataRefresh.tsx | 21 +- src/app/utils.tsx | 17 +- src/components/ComponentType.tsx | 3 + src/i18n/locales/en/common.json | 15 +- src/i18n/locales/zh/common.json | 8 +- src/views/DBTableManager.tsx | 870 +- src/views/RefreshDataDialog.tsx | 22 - src/views/UnifiedDataUploadDialog.tsx | 79 +- .../integration/test_plugin_app_config.py | 2 - .../test_superset_data_connector.py | 427 + .../unit/test_all_loader_verification.py | 226 + .../unit/test_data_connector_config.py | 413 + .../unit/test_data_connector_framework.py | 659 + .../backend/unit/test_data_connector_vault.py | 450 + .../test_mysql/test_mysql_data_connector.py | 263 + .../test_postgresql_data_connector.py | 430 + tests/superset/.env.superset | 4 +- tests/superset/README.md | 13 + tests/superset/docker-compose.yml | 1 + tests/superset/start.sh | 6 +- tests/superset/superset_config.py | 78 + tests/test_plan.md | 6 +- yarn.lock | 12575 ++++++++-------- 49 
files changed, 10851 insertions(+), 7547 deletions(-)
create mode 100644 design-docs/9.1-data-source-connection-model.md
rename py-src/data_formulator/{connected_source.py => data_connector.py} (68%)
create mode 100644 tests/backend/integration/test_superset_data_connector.py
create mode 100644 tests/backend/unit/test_all_loader_verification.py
create mode 100644 tests/backend/unit/test_data_connector_config.py
create mode 100644 tests/backend/unit/test_data_connector_framework.py
create mode 100644 tests/backend/unit/test_data_connector_vault.py
create mode 100644 tests/plugin/test_mysql/test_mysql_data_connector.py
create mode 100644 tests/plugin/test_postgres/test_postgresql_data_connector.py
create mode 100644 tests/superset/superset_config.py

diff --git a/.env.template b/.env.template
index d4f77721..89a59751 100644
--- a/.env.template
+++ b/.env.template
@@ -11,6 +11,15 @@ DISABLE_DISPLAY_KEYS=false # if true, API keys will not be shown in the frontend
SANDBOX=local # code execution backend: 'local' (default) or 'docker'
# LOG_LEVEL=INFO # logging level for data_formulator modules (DEBUG, INFO, WARNING, ERROR)
+# --- Feature gates ---
+# Disable external data connectors (MySQL, PostgreSQL, etc.).
+# Recommended for multi-user anonymous deployments to prevent credential exposure.
+# DISABLE_DATA_CONNECTORS=false
+
+# Prevent users from adding custom LLM endpoints via the UI.
+# Only server-configured models (below) will be available.
+# DISABLE_CUSTOM_MODELS=false
+
# Flask session secret key — used to sign cookies and encrypt session data.
# Required for SSO and plugin auth (Superset, etc.). Generate one with:
# python -c "import secrets; print(secrets.token_hex(32))"
@@ -191,4 +200,28 @@ OLLAMA_MODELS=qwen3:32b # models with good code generation capabilities recommended
# Superset-side setup:
# The Superset instance needs a small bridge endpoint at /df-sso-bridge/
# that converts a Superset session into a JWT and posts it back to DF.
-# See: superset-sso-bridge-setup.md \ No newline at end of file +# See: superset-sso-bridge-setup.md + +# ------------------------------------------------------------------- +# Deployment profiles (quick-start presets) +# ------------------------------------------------------------------- +# See DEVELOPMENT.md "Deployment Profiles" for full documentation. +# +# Profile 1 — Single-user local (default, no changes needed): +# Just run: data_formulator +# +# Profile 2 — Multi-user anonymous demo: +# WORKSPACE_BACKEND=ephemeral +# DISABLE_DATA_CONNECTORS=true +# DISABLE_CUSTOM_MODELS=true +# DISABLE_DISPLAY_KEYS=true +# (or simply: DISABLE_DATABASE=true as shortcut) +# +# Profile 3 — Multi-user authenticated (enterprise): +# AUTH_PROVIDER=oidc +# OIDC_ISSUER_URL=https://your-idp.example.com/realms/main +# OIDC_CLIENT_ID=data-formulator +# ALLOW_ANONYMOUS=false +# DISABLE_CUSTOM_MODELS=true +# WORKSPACE_BACKEND=azure_blob +# FLASK_SECRET_KEY= \ No newline at end of file diff --git a/DEVELOPMENT.md b/DEVELOPMENT.md index d5815480..543751d8 100644 --- a/DEVELOPMENT.md +++ b/DEVELOPMENT.md @@ -323,68 +323,190 @@ data-formulator/ ← container | `--azure-blob-container` | `AZURE_BLOB_CONTAINER` | `data-formulator` | Blob container name | -## Security Considerations for Production Deployment +## Deployment Profiles -⚠️ **IMPORTANT SECURITY WARNING FOR PRODUCTION DEPLOYMENT** +Data Formulator supports three deployment configurations. **All defaults are optimized for Profile 1 (single-user local)** — you only need to set flags when deploying as multi-user. -When deploying Data Formulator to production, please be aware of the following security considerations: +### Profile 1: Single-User Local (default) -### Data Storage +A personal instance running on `localhost`. No login required, full feature access. 
-Data Formulator supports three workspace backends: +```bash +# Everything uses defaults — just run it: +data_formulator -| Backend | Flag | Storage | Persistence | -|---------|------|---------|-------------| -| **local** (default) | `--workspace-backend local` | `~/.data_formulator/users//workspaces//` | Server filesystem | -| **azure_blob** | `--workspace-backend azure_blob` | Azure Blob container | Cloud | -| **ephemeral** | `--workspace-backend ephemeral` | Browser IndexedDB (frontend) + temp dirs (backend) | Browser session only | +# Or equivalently: +data_formulator \ + --workspace-backend local \ + --sandbox local +``` -Each workspace contains: -- `workspace.yaml` — table metadata -- `session_state.json` — auto-persisted frontend state -- `data/` — table data as parquet files +| Setting | Value | Why | +|---------|-------|-----| +| `AUTH_PROVIDER` | *(unset)* | Single user, no login needed | +| `WORKSPACE_BACKEND` | `local` | Persist workspaces to `~/.data_formulator/` | +| `DISABLE_DATA_CONNECTORS` | `false` | Full access to MySQL, PostgreSQL, etc. | +| `DISABLE_CUSTOM_MODELS` | `false` | User can add any LLM endpoint | +| `DISABLE_DISPLAY_KEYS` | `false` | User can see/manage their own API keys | +| Credential vault | auto-enabled | Remembers DB credentials across restarts | +| Identity | `local:` | Fixed, OS-derived — survives localStorage clear | -### Identity and Data Isolation +**Security notes:** In single-user localhost mode, the server ignores the `X-Identity-Id` header entirely and uses a fixed identity derived from the OS username (e.g., `local:alice`). This means vault credentials and workspaces are tied to your OS account, not a random browser UUID — clearing localStorage won't orphan your data. 
-- Each user's data is isolated by a namespaced identity key (e.g., `user:alice@example.com` or `browser:550e8400-...`) -- Anonymous users get a browser-based UUID stored in localStorage -- Authenticated users get their verified user ID from the auth provider -- In multi-tenant deployments, ensure workspace directories are isolated and access-controlled +### Profile 2: Multi-User Anonymous (demo / public hosting) -### Recommended Security Measures +A shared server (e.g., for demos, workshops, public access). No login, no server-side state, no sensitive features. -For production deployment, consider: +```bash +data_formulator \ + --workspace-backend ephemeral \ + --disable-data-connectors \ + --disable-custom-models \ + --disable-display-keys +``` -1. **Use `--workspace-backend ephemeral`** for stateless public hosting (no server-side persistence; data lives only in the user's browser) -2. **Set `DF_ALLOWED_API_BASES`** to restrict which LLM endpoints users can target from the UI, preventing SSRF attacks (e.g. `DF_ALLOWED_API_BASES=https://api.openai.com*,https://*.openai.azure.com/*`). See `.env.template` for details. -3. **Implement proper authentication, authorization, and other security measures** as needed for your specific use case, for example: - - User authentication (OAuth, JWT tokens, etc.) - - Role-based access control - - API rate limiting - - HTTPS/TLS encryption - - Input validation and sanitization +> **Shortcut:** `--disable-database` (or `DISABLE_DATABASE=true`) bundles all of the above into a single flag. + +Or via environment variables: + +```env +WORKSPACE_BACKEND=ephemeral +DISABLE_DATA_CONNECTORS=true +DISABLE_CUSTOM_MODELS=true +DISABLE_DISPLAY_KEYS=true +# Pre-configure the LLM models users can access: +OPENAI_ENABLED=true +OPENAI_API_KEY=sk-... 
+OPENAI_MODELS=gpt-4.1 +``` -### Configuration for Production +| Setting | Value | Why | +|---------|-------|-----| +| `AUTH_PROVIDER` | *(unset)* | Anonymous access for demos | +| `WORKSPACE_BACKEND` | `ephemeral` | No server-side persistence — data lives only in browser IndexedDB | +| `DISABLE_DATA_CONNECTORS` | `true` | **Critical** — prevents DB credential exposure via identity spoofing | +| `DISABLE_CUSTOM_MODELS` | `true` | Prevents users from adding arbitrary LLM endpoints (SSRF risk) | +| `DISABLE_DISPLAY_KEYS` | `true` | Hides server-configured API keys from UI | +| Credential vault | N/A | No connectors → no credentials to store | +| Identity | anonymous (`browser:`) | Acceptable — no sensitive server-side state to protect | + +**Security notes:** With data connectors disabled, the anonymous identity spoofing risk is eliminated — there are no DB credentials or persistent workspaces on the server to access. Each user's data lives entirely in their browser. The only server-side resource is the LLM proxy, which is locked down by `DF_ALLOWED_API_BASES`. + +### Profile 3: Multi-User Authenticated (enterprise / team) + +A shared server with SSO login. Full features, proper identity isolation. 
```bash
-# For stateless deployment (recommended for public hosting)
-data_formulator --workspace-backend ephemeral
+data_formulator \
+  --workspace-backend azure_blob \
+  --disable-display-keys
```

+```env
+AUTH_PROVIDER=oidc
+OIDC_ISSUER_URL=https://your-idp.example.com/realms/main
+OIDC_CLIENT_ID=data-formulator
+ALLOW_ANONYMOUS=false
+WORKSPACE_BACKEND=azure_blob
+AZURE_BLOB_ACCOUNT_URL=https://<account>.blob.core.windows.net
+DISABLE_DISPLAY_KEYS=true
+DISABLE_CUSTOM_MODELS=true
+FLASK_SECRET_KEY=<generated-secret>
+```
+
+| Setting | Value | Why |
+|---------|-------|-----|
+| `AUTH_PROVIDER` | `oidc` / `github` / `azure_easyauth` | Verified identity from SSO |
+| `ALLOW_ANONYMOUS` | `false` | Login required — no anonymous fallback |
+| `WORKSPACE_BACKEND` | `azure_blob` or `local` | Persistent per-user workspaces |
+| `DISABLE_DATA_CONNECTORS` | `false` | Safe — identity comes from auth provider, not spoofable |
+| `DISABLE_CUSTOM_MODELS` | `true` | Users only use server-configured models |
+| `DISABLE_DISPLAY_KEYS` | `true` | Hide server keys; users add their own |
+| `FLASK_SECRET_KEY` | set explicitly | Required for stable sessions across server restarts |
+| Credential vault | auto-enabled | DB credentials scoped to verified `user:<user-id>` |
+| Identity | `user:<user-id>` from auth provider | Server-verified, cannot be spoofed |
+
+**Security notes:** With an auth provider, `get_identity_id()` returns `user:<user-id>` from the IdP token — the `X-Identity-Id` header is ignored entirely. Workspaces, vault credentials, and DB connections are all scoped to the verified identity. Set `ALLOW_ANONYMOUS=false` to prevent unauthenticated access.
+ +### Profile Comparison + +| Feature | Profile 1 (Local) | Profile 2 (Demo) | Profile 3 (Enterprise) | +|---------|:-:|:-:|:-:| +| Login required | No | No | Yes | +| Data connectors (DB) | Yes | **No** | Yes | +| Custom LLM endpoints | Yes | **No** | Operator choice | +| Credential vault | Yes | N/A | Yes | +| Workspace persistence | Local disk | Browser only | Cloud / disk | +| Identity | `local:` (fixed) | `browser:` (client) | `user:` (SSO) | + +### CLI Flags Reference (complete) + +| Flag | Env var | Default | Description | +|------|---------|---------|-------------| +| `--workspace-backend` | `WORKSPACE_BACKEND` | `local` | `local`, `azure_blob`, or `ephemeral` | +| `--sandbox` | `SANDBOX` | `local` | Code execution backend: `local` or `docker` | +| `--disable-database` | `DISABLE_DATABASE` | `false` | **Multi-user anonymous preset**: bundles ephemeral + no connectors + no custom models + hide keys | +| `--disable-display-keys` | `DISABLE_DISPLAY_KEYS` | `false` | Hide API keys in frontend UI | +| `--disable-data-connectors` | `DISABLE_DATA_CONNECTORS` | `false` | Disable external DB connectors | +| `--disable-custom-models` | `DISABLE_CUSTOM_MODELS` | `false` | Prevent users from adding custom LLM endpoints | +| `--max-display-rows` | `MAX_DISPLAY_ROWS` | `10000` | Max rows sent to frontend | +| `--data-dir` | `DATA_FORMULATOR_HOME` | `~/.data_formulator` | Data directory | +| `--host` | `HOST` | `127.0.0.1` | Network interface to bind | +| `-p`, `--port` | — | `5567` | Port number | +| `--dev` | `DEV_MODE` | `false` | Development mode (no auto-open browser) | +| — | `AUTH_PROVIDER` | *(unset)* | `oidc`, `github`, `azure_easyauth`, or unset for anonymous | +| — | `ALLOW_ANONYMOUS` | `true` | Allow unauthenticated access when auth provider is set | +| — | `DF_ALLOWED_API_BASES` | *(unset, all allowed)* | Comma-separated URL globs for LLM endpoint allowlist | +| — | `FLASK_SECRET_KEY` | auto-generated | Session signing key (set explicitly for production) | 
+| `--azure-blob-connection-string` | `AZURE_BLOB_CONNECTION_STRING` | — | Azure Blob shared-key connection string |
+| `--azure-blob-account-url` | `AZURE_BLOB_ACCOUNT_URL` | — | Azure Blob account URL for Entra ID auth |
+| `--azure-blob-container` | `AZURE_BLOB_CONTAINER` | `data-formulator` | Azure Blob container name |
+
+
+## Security Considerations for Production Deployment
+
+⚠️ **IMPORTANT SECURITY WARNING FOR PRODUCTION DEPLOYMENT**
+
+### Identity System
+
+Data Formulator uses a **namespaced identity** system with three tiers:
+- **Local mode** (`127.0.0.1`, no auth provider): Identity is `local:<os-username>`, determined by the server. The `X-Identity-Id` header is ignored. Vault and workspaces are tied to the OS user.
+- **Anonymous mode** (multi-user, no auth provider): Identity is `browser:<uuid>` where the UUID is generated in the browser's `localStorage`. The server trusts the client-provided `X-Identity-Id` header, but always forces the `browser:` prefix.
+- **Authenticated mode** (auth provider configured): Identity is `user:<user-id>` from the auth provider. The `X-Identity-Id` header is ignored entirely.
+
+**Key security principle**: An attacker sending `X-Identity-Id: user:alice@...` gets `browser:alice@...` — completely separate from the real `user:alice@...` that only authenticated Alice can access.
+
+**Anonymous spoofing risk**: In anonymous mode, if an attacker knows another user's browser UUID, they can impersonate them via the `X-Identity-Id` header. This is why **Profile 2 disables data connectors** (no DB credentials to steal) and **Profile 3 requires authentication** (header is ignored).
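The three-tier resolution described in the Identity System section can be sketched as follows — a simplified, illustrative version of the priority logic (the real implementation lives in `security/auth.py`; the function name and parameters here are assumptions, not the actual signature):

```python
import getpass
import uuid
from typing import Optional

def resolve_identity(auth_user: Optional[str], is_localhost: bool,
                     header_identity: Optional[str]) -> str:
    """Illustrative sketch of namespaced identity resolution with forced prefixing."""
    # Priority 1: verified identity from the auth provider — client header ignored.
    if auth_user:
        return f"user:{auth_user}"
    # Priority 2: single-user localhost mode — fixed OS-derived identity, header ignored.
    if is_localhost:
        return f"local:{getpass.getuser()}"
    # Priority 3: client-supplied header — strip any namespace prefix and force
    # "browser:" so a forged "user:..." can never collide with a real user key.
    raw = header_identity or str(uuid.uuid4())
    if ":" in raw:
        raw = raw.split(":", 1)[1]
    return f"browser:{raw}"
```

Under this sketch, `resolve_identity(None, False, "user:alice@example.com")` yields `"browser:alice@example.com"` — the spoofing scenario described above resolves to a storage key disjoint from authenticated Alice's.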
+ +### Data Storage + +| Backend | Flag | Storage | Persistence | +|---------|------|---------|-------------| +| **local** (default) | `--workspace-backend local` | `~/.data_formulator/users//workspaces/` | Server filesystem | +| **azure_blob** | `--workspace-backend azure_blob` | Azure Blob container | Cloud | +| **ephemeral** | `--workspace-backend ephemeral` | Browser IndexedDB (frontend) + temp dirs (backend) | Browser session only | + +### Recommended Security Measures + +1. **Multi-user anonymous (demos)**: Use Profile 2 — `--workspace-backend ephemeral --disable-data-connectors --disable-custom-models --disable-display-keys` (or `--disable-database` as shortcut) +2. **Multi-user authenticated**: Use Profile 3 — set `AUTH_PROVIDER`, `ALLOW_ANONYMOUS=false`, and `DISABLE_CUSTOM_MODELS=true` +3. **HTTPS**: Use a reverse proxy (nginx, Azure App Gateway) with TLS termination +4. **`FLASK_SECRET_KEY`**: Set explicitly for production (auto-generated key changes on restart) + ## Authentication Architecture -Data Formulator supports a **hybrid identity system** that supports both anonymous and authenticated users. +Data Formulator supports a **hybrid identity system** with anonymous and authenticated modes. +See **Deployment Profiles** above for which mode to use in each scenario. -### Identity Flow Overview +### Identity Flow ``` ┌─────────────────────────────────────────────────────────────────────┐ │ Frontend Request │ ├─────────────────────────────────────────────────────────────────────┤ │ Headers: │ -│ X-Identity-Id: "browser:550e8400-..." (namespace sent by client) │ -│ Authorization: Bearer (if custom auth implemented) │ -│ (Azure also adds X-MS-CLIENT-PRINCIPAL-ID automatically) │ +│ X-Identity-Id: "local:alice" / "browser:550e8400-..." / ... 
│ +│ Authorization: Bearer (if auth provider configured) │ └─────────────────────────────────────────────────────────────────────┘ │ ▼ @@ -392,69 +514,33 @@ Data Formulator supports a **hybrid identity system** that supports both anonymo │ Backend Identity Resolution │ │ (auth.py: get_identity_id) │ ├─────────────────────────────────────────────────────────────────────┤ -│ Priority 1: Azure X-MS-CLIENT-PRINCIPAL-ID → "user:" │ -│ Priority 2: JWT Bearer token (if implemented) → "user:" │ -│ Priority 3: X-Identity-Id header → ALWAYS "browser:" │ -│ (client-provided namespace is IGNORED for security) │ +│ Priority 1: Auth provider (OIDC/GitHub/EasyAuth) → "user:" │ +│ Priority 2: Localhost mode (127.0.0.1) → "local:" │ +│ (ignores X-Identity-Id header) │ +│ Priority 3: X-Identity-Id header → "browser:" │ +│ (client-provided namespace prefix is IGNORED) │ └─────────────────────────────────────────────────────────────────────┘ │ ▼ ┌─────────────────────────────────────────────────────────────────────┐ │ Storage Isolation │ ├─────────────────────────────────────────────────────────────────────┤ -│ "user:alice@example.com" → alice's workspace dir (ONLY via auth) │ -│ "browser:550e8400-..." → anonymous user's workspace dir │ +│ "user:alice@example.com" → alice's workspace (ONLY via auth) │ +│ "local:alice" → localhost user's workspace (fixed) │ +│ "browser:550e8400-..." → anonymous user's workspace │ └─────────────────────────────────────────────────────────────────────┘ ``` -### Security Model +### Auth Provider Setup -**Critical Security Rule:** The backend NEVER trusts the namespace prefix from the client-provided `X-Identity-Id` header. Even if a client sends `X-Identity-Id: "user:alice@..."`, the backend strips the prefix and forces `browser:alice@...`. Only verified authentication (Azure headers or JWT) can result in a `user:` prefixed identity. 
- -The key security principle is **namespaced isolation with forced prefixing**: - -| Scenario | X-Identity-Id Sent | Backend Resolution | Storage Key | -|----------|-------------------|-------------------|-------------| -| Anonymous user | `browser:550e8400-...` | Strips prefix, forces `browser:` | `browser:550e8400-...` | -| Azure logged-in user | `browser:550e8400-...` | Uses Azure header (priority 1) | `user:alice@...` | -| Attacker spoofing | `user:alice@...` (forged) | No valid auth, strips & forces `browser:` | `browser:alice@...` | - -**Why this is secure:** An attacker sending `X-Identity-Id: user:alice@...` gets `browser:alice@...` as their storage key, which is completely separate from the real `user:alice@...` that only authenticated Alice can access. - -### Implementing Custom Authentication - -To add JWT-based authentication: - -1. **Backend** (`security/auth.py`): Uncomment and configure the JWT verification code in `get_identity_id()` -2. **Frontend** (`utils.tsx`): Implement `getAuthToken()` to retrieve the JWT from your auth context -3. **Add JWT secret** to Flask config: `current_app.config['JWT_SECRET']` - -### Azure App Service Authentication - -When deployed to Azure with EasyAuth enabled: -- Azure automatically adds `X-MS-CLIENT-PRINCIPAL-ID` header to authenticated requests -- The backend reads this header first (highest priority) -- No frontend changes needed - Azure handles the auth flow - -### Frontend Identity Management - -The frontend (`src/app/identity.ts`) manages identity as follows: - -```typescript -// Identity is always initialized with browser ID -identity: { type: 'browser', id: getBrowserId() } - -// If user logs in (e.g., via Azure), it's updated to: -identity: { type: 'user', id: userInfo.userId } - -// All API requests send namespaced identity: -// X-Identity-Id: "browser:550e8400-..." or "user:alice@..." -``` +See the `AUTH_PROVIDER` section in `.env.template` for configuration details. -This ensures: -1. 
**Anonymous users**: Work immediately with localStorage-based browser ID -2. **Logged-in users**: Get their verified user ID from the auth provider -3. **Cross-tab consistency**: Browser ID is shared via localStorage across all tabs +| Provider | `AUTH_PROVIDER` | Setup | +|----------|----------------|-------| +| OIDC / OAuth2 | `oidc` | Set `OIDC_ISSUER_URL` + `OIDC_CLIENT_ID` | +| GitHub | `github` | Set `GITHUB_CLIENT_ID` + `GITHUB_CLIENT_SECRET` | +| Azure EasyAuth | `azure_easyauth` | Enable in Azure App Service (no extra env vars) | +| Anonymous only | *(unset)* | Default — no login, `browser:` identity | ## Usage See the [Usage section on the README.md page](README.md#usage). diff --git a/design-docs/9-generalized-data-source-plugins.md b/design-docs/9-generalized-data-source-plugins.md index 51cafcd1..00de048c 100644 --- a/design-docs/9-generalized-data-source-plugins.md +++ b/design-docs/9-generalized-data-source-plugins.md @@ -1,6 +1,6 @@ # Generalized Data Source Plugins — Unifying DataLoader + Plugin into a Lifecycle-Managed Connection -## Status: Phase 1 complete +## Status: Phase 3 complete (legacy data-loader endpoints removed) ## 1. Problem @@ -22,7 +22,7 @@ This split causes problems: A DataLoader already knows *how* to talk to a data source (connect, list tables, fetch data). A Plugin knows *how* to manage a session (login, persist auth, browse, present UI). **Combining them gives us a lifecycle-managed data connection** — which is what users actually want. -## 2. Proposal: `ConnectedDataSource` — A Generalized Plugin Built from a DataLoader +## 2. 
Proposal: `DataConnector` — A Generalized Plugin Built from a DataLoader ### 2.1 Core Idea @@ -40,7 +40,7 @@ This means: to add "PostgreSQL as a connected data source," you write **zero new ``` ┌─────────────────────────────────────────────────────────────┐ -│ ConnectedDataSource │ +│ DataConnector │ │ (generic plugin framework) │ │ │ │ ┌──────────────┐ ┌──────────────┐ ┌───────────────────┐ │ @@ -84,17 +84,17 @@ This means we don't need separate abstractions for "BI plugin" vs. "database plu | Component | Change | |-----------|--------| | `ExternalDataLoader` | **Evolves** into the universal data protocol. Gains `catalog_hierarchy()` + `ls()` + `effective_hierarchy()` for tree browsing with scope pinning. | -| `DataSourcePlugin` | **Stays** as the abstract base, but now primarily implemented via `ConnectedDataSource`. | -| **New: `ConnectedDataSource`** | Generic `DataSourcePlugin` subclass that wraps any `ExternalDataLoader`. Auto-generates auth/catalog/data routes. | -| **New: `ConnectedDataSourcePanel`** | Generic React component for all connected data sources (login → tree browser → import). | -| `SupersetPlugin` | **Migrates** to a `ConnectedDataSource` backed by a `SupersetLoader`. Dashboards are `"namespace"` nodes, datasets are `"table"` nodes — hierarchy labels provide the UI terminology. | +| `DataSourcePlugin` | **Stays** as the abstract base, but now primarily implemented via `DataConnector`. | +| **New: `DataConnector`** | Generic `DataSourcePlugin` subclass that wraps any `ExternalDataLoader`. Auto-generates auth/catalog/data routes. | +| **New: `DataConnectorPanel`** | Generic React component for all connected data sources (login → tree browser → import). | +| `SupersetPlugin` | **Migrates** to a `DataConnector` backed by a `SupersetLoader`. Dashboards are `"namespace"` nodes, datasets are `"table"` nodes — hierarchy labels provide the UI terminology. | ## 3. 
API Design -### 3.1 Backend: `ConnectedDataSource` Base +### 3.1 Backend: `DataConnector` Base ```python -class ConnectedDataSource(DataSourcePlugin): +class DataConnector(DataSourcePlugin): """A DataSourcePlugin auto-generated from an ExternalDataLoader. Provides lifecycle management: connection persistence, catalog browsing, @@ -216,7 +216,7 @@ POST /api/plugins/{id}/catalog/metadata } ``` -**How this maps to `ExternalDataLoader`:** The `ls(path)` method (§3.4) drives every tree expansion. `ConnectedDataSource` adds caching (per-session, with TTL) on top. +**How this maps to `ExternalDataLoader`:** The `ls(path)` method (§3.4) drives every tree expansion. `DataConnector` adds caching (per-session, with TTL) on top. #### 3.2.3 Data Loading + Refresh @@ -264,7 +264,7 @@ POST /api/plugins/{id}/data/preview ### 3.3 Refresh Mechanism -Refresh is a first-class concept. When a table is imported via a `ConnectedDataSource`, the workspace metadata stores: +Refresh is a first-class concept. When a table is imported via a `DataConnector`, the workspace metadata stores: ```python { @@ -750,7 +750,7 @@ class ExternalDataLoader(ABC): **Key design decisions:** - **`CatalogNode.node_type`** uses `"namespace"` / `"table"` (following the Iceberg REST / Unity Catalog convention), not per-source types like `"database"`, `"schema"`. The hierarchy labels provide the per-source terminology. - **`list_tables()` is kept permanently** as the flat/eager complement to `ls()`. It returns every importable table in the pinned scope — simple and complete, but potentially slow. `ls()` is the lazy/hierarchical alternative. The default `ls()` falls back to `list_tables()` for loaders that haven't implemented hierarchical browsing. -- **`effective_hierarchy()` and `pinned_scope()`** live on the loader itself (not on `ConnectedDataSource`), since the loader has access to its own `params`. 
+- **`effective_hierarchy()` and `pinned_scope()`** live on the loader itself (not on `DataConnector`), since the loader has access to its own `params`. - **`test_connection()`** has a default implementation, but loaders should override with something lightweight. - **`import_options`** is a single extensible dict replacing the old scattered `size`/`sort_columns`/`sort_order`/`columns`/`import_context` params. All data-shaping options go through one bag: `size`, `columns`, `sort_columns`, `sort_order`, `filters`, `source_filters`. Loaders extract what they need; unknown keys are ignored. @@ -864,11 +864,11 @@ When no config file or env vars are set, the framework **auto-discovers** all in ```python def discover_sources(app): - """Auto-register every installed ExternalDataLoader as a ConnectedDataSource plugin.""" + """Auto-register every installed ExternalDataLoader as a DataConnector plugin.""" for key, loader_class in DATA_LOADERS.items(): # DATA_LOADERS is the existing registry from data_loader/__init__.py # Only contains loaders whose pip dependencies are installed - plugin = ConnectedDataSource.from_loader(loader_class, source_id=key) + plugin = DataConnector.from_loader(loader_class, source_id=key) register_plugin(app, plugin) # Log disabled loaders (missing deps) @@ -1062,9 +1062,9 @@ At startup, the framework: 2. **Read** config sources (env vars → YAML → UI settings) → merge 3. **For each configured source** (or auto-discovered loader): - Resolve the `ExternalDataLoader` class from `type` - - Create a `ConnectedDataSource` instance with pre-filled `params` + - Create a `DataConnector` instance with pre-filled `params` - Generate Flask Blueprint with auth/catalog/data routes - - Register frontend module (generic `ConnectedSourcePanel`) + - Register frontend module (generic `DataConnectorPanel`) 4. 
**Serve** `/api/app-config` with the list of enabled sources

 ```python
@@ -1078,7 +1078,7 @@ def register_sources(app):
             logger.warn(f"Unknown source type: {source_spec.type}")
             continue

-        plugin = ConnectedDataSource.from_loader(
+        plugin = DataConnector.from_loader(
             loader_class,
             source_id=source_spec.id,        # auto-generated or from config
             display_name=source_spec.name,   # optional custom name
@@ -1090,11 +1090,11 @@ def register_sources(app):

 ### 4.9 Frontend: No Per-Source Registration Needed

-Since all `ConnectedDataSource` plugins use the same generic `ConnectedSourcePanel`, the frontend doesn't need per-source modules either. The backend's `/api/app-config` tells the frontend what sources are available:
+Since all `DataConnector` plugins use the same generic `DataConnectorPanel`, the frontend doesn't need per-source modules either. The backend's `/api/app-config` tells the frontend what sources are available:

 ```json
 {
-  "SOURCES": [
+  "CONNECTORS": [
     {
       "id": "pg_prod",
       "type": "postgresql",
@@ -1123,24 +1123,24 @@ Since all `ConnectedDataSource` plugins use the same generic `ConnectedSourcePan
 }
 ```

-The frontend renders one `ConnectedSourcePanel` per source in the `SOURCES` list — each with its own connection form, tree hierarchy, and icon. **Zero frontend code per source.**
+The frontend renders one `DataConnectorPanel` per source in the `CONNECTORS` list — each with its own connection form, tree hierarchy, and icon. **Zero frontend code per source.**

-## 5. Frontend: Generic `ConnectedSourcePanel`
+## 5.
Frontend: Generic `DataConnectorPanel` ### 5.1 Shared UI for All Database-Type Sources -Instead of writing a custom React panel per data source, `ConnectedDataSource` plugins share a single generic panel: +Instead of writing a custom React panel per data source, `DataConnector` plugins share a single generic panel: ```typescript -// src/plugins/_shared/ConnectedSourcePanel.tsx +// src/plugins/_shared/DataConnectorPanel.tsx -interface ConnectedSourcePanelProps { +interface DataConnectorPanelProps { pluginId: string; config: PluginConfig; callbacks: PluginHostCallbacks; } -function ConnectedSourcePanel({ pluginId, config, callbacks }: ConnectedSourcePanelProps) { +function DataConnectorPanel({ pluginId, config, callbacks }: DataConnectorPanelProps) { // State machine: disconnected → connecting → connected → browsing → importing // 1. If not connected: show connection form (auto-generated from list_params) @@ -1195,12 +1195,12 @@ Once connected, the table browser uses the unified tree from [design-doc #8](8-u ### 5.4 Frontend Plugin Registration -No per-source frontend code needed. The backend's `/api/app-config` response (see §4.9) tells the frontend what sources exist and what their connection forms / hierarchy look like. One generic `ConnectedSourcePanel` handles all of them. +No per-source frontend code needed. The backend's `/api/app-config` response (see §4.9) tells the frontend what sources exist and what their connection forms / hierarchy look like. One generic `DataConnectorPanel` handles all of them. 
The frontend factory is only needed once, in the shared module: ```typescript -// src/plugins/_shared/ConnectedSourcePanel.tsx +// src/plugins/_shared/DataConnectorPanel.tsx // Handles ALL connected data sources — databases, BI tools, cloud storage // Reads source config from /api/app-config → SOURCES[] // Renders: connection form (from params_form) → tree browser (from hierarchy) → import @@ -1208,12 +1208,12 @@ The frontend factory is only needed once, in the shared module: ## 6. Full Unification: BI Tools as Data Loaders -Since DF only **consumes** data, both databases and BI tools serve the same role: hierarchical sources of importable tables. We unify them under the same `ConnectedDataSource` model. +Since DF only **consumes** data, both databases and BI tools serve the same role: hierarchical sources of importable tables. We unify them under the same `DataConnector` model. ### 6.1 Architecture (Unified) ``` - ConnectedDataSource (generic lifecycle wrapper) + DataConnector (generic lifecycle wrapper) | ┌────────────┼────────────────┐ │ │ │ @@ -1284,7 +1284,7 @@ The rich Superset-specific features (dashboard filters, column metadata, etc.) a ### 6.3 Critical Differences to Be Aware Of -Unification is the right call, but these differences must be handled in the `ConnectedDataSource` framework: +Unification is the right call, but these differences must be handled in the `DataConnector` framework: #### 1. 
Auth Model Diversity @@ -1295,7 +1295,7 @@ Unification is the right call, but these differences must be handled in the `Con | Superset, Metabase | JWT (username/password → token) | Expires, needs refresh | | Grafana | API key | Long-lived, no refresh | -**Solution:** The `ConnectedDataSource` auth layer must support both: +**Solution:** The `DataConnector` auth layer must support both: - **Persistent connection** mode (databases): store connection object in session, reconnect on failure - **Token** mode (BI tools, cloud): store token in session, auto-refresh on expiry @@ -1359,7 +1359,7 @@ class ExternalDataLoader(ABC): """Optional rate limit hints. None = no limit.""" return None # or {"requests_per_minute": 60, "concurrent": 5} ``` -The `ConnectedDataSource` framework uses this to throttle catalog expansion and data loads. +The `DataConnector` framework uses this to throttle catalog expansion and data loads. #### 5. Import Filtering: Standard SPJ + Source-Defined Filters @@ -1522,24 +1522,36 @@ For sources that can't filter server-side (e.g., some REST APIs), the framework - MongoDB: database required, collection is scope param — 2-level hierarchy - S3, Azure Blob: bucket/container required (can't list safely) — 2-level hierarchy 3. ✅ Unify `fetch_data_as_arrow()` signature: replace `size`/`sort_columns`/`sort_order` positional params with single `import_options: dict` — extensible for `columns`, `filters`, `source_filters`. All 9 loaders, callers, and tests updated. Renamed `loader_metadata` → `source_info`. Removed pandas from PG/MySQL/MSSQL query path (cursor + `pa.table()` directly). `import_options` stored in workspace metadata for refresh replay. -4. ✅ Implement `ConnectedDataSource` base class with generic auth/catalog/data routes — auto-registers all 10 loaders at startup (90 routes under `/api/sources/{id}/`), exposes `SOURCES` in `/api/app-config` -5. 
✅ Implement `SupersetLoader(ExternalDataLoader)` — JWT-based auth (`auth_mode="token"`), dashboard→dataset hierarchy, SQL Lab data fetch. Registered as 10th loader, auto-wrapped by `ConnectedDataSource` with 9 routes.
+4. ✅ Implement `DataConnector` base class with generic auth/catalog/data routes — auto-registers all 10 loaders at startup (90 routes under `/api/connectors/{id}/`), exposes `CONNECTORS` in `/api/app-config`
+5. ✅ Implement `SupersetLoader(ExternalDataLoader)` — JWT-based auth (`auth_mode="token"`), dashboard→dataset hierarchy, SQL Lab data fetch. Registered as 10th loader, auto-wrapped by `DataConnector` with 9 routes.
 6. ✅ Implement config-driven registration — `data-sources.yml` (searched in `DATA_FORMULATOR_HOME`, cwd, `~/.data-formulator/`, `/etc/`), env vars (`DF_SOURCES__id__key`), `${ENV_REF}` resolution, `auto_discover: false` to restrict to configured sources only. Multiple instances of same type supported.
-7. ✅ Integrate `ConnectedDataSource` into frontend — `SOURCES` from `/api/app-config` rendered in `DBManagerPane` sidebar alongside legacy loaders. `DataLoaderForm` accepts optional `connectedSourceId` to route through `/api/sources/{id}/*`. `loadTable` thunk updated to support connected source import. Zero new components — reuses existing form/table UI.
+7. ✅ Integrate `DataConnector` into frontend — `CONNECTORS` from `/api/app-config` rendered in `DBManagerPane` sidebar alongside legacy loaders. `DataLoaderForm` accepts optional `connectorId` to route through `/api/connectors/{id}/*`. `loadTable` thunk updated to support connected source import. Zero new components — reuses existing form/table UI.

 ### Phase 2: Integration Testing

-7. Test database loaders end-to-end: PostgreSQL, MySQL via auto-discovery and `data-sources.yml` config
+7.
✅ Test database loaders end-to-end: PostgreSQL, MySQL via auto-discovery and `data-sources.yml` config - Connect → browse hierarchy → scope pinning → import with SPJ filters → refresh → disconnect → reconnect from saved credentials -8. Test `SupersetLoader` end-to-end: dashboard → dataset hierarchy, source-defined filters, SSO auth -9. Deprecate old hand-written `SupersetPlugin(DataSourcePlugin)` -10. Verify remaining loaders via auto-discovery: Kusto, BigQuery, MSSQL, MongoDB, S3, Azure Blob - -### Phase 3: Cleanup + Unified Panel - -11. Remove `DataSourcePlugin` base class, `plugins/` directory, and per-plugin `__init__.py` files -12. Integrate with unified data source panel ([doc #8](8-unified-data-source-panel.md)) -13. Old `/api/db-manager/load-table` endpoint → deprecation path + - 40 unit tests for DataConnector framework (mock loader), 17 config tests, E2E route tests for PG + MySQL (Docker-gated) +8. ✅ Test `SupersetLoader` end-to-end: dashboard → dataset hierarchy, source-defined filters, SSO auth + - 16 integration tests with mocked Superset API (JWT auth, catalog browsing, data preview/import, token refresh) +9. ✅ Deprecate old hand-written `SupersetPlugin(DataSourcePlugin)` — deprecation warnings added, docstrings updated +10. 
✅ Verify remaining loaders via auto-discovery: Kusto, BigQuery, MSSQL, MongoDB, S3, Azure Blob
+    - 16 verification tests confirm catalog_hierarchy, effective_hierarchy, scope pinning, auth_mode, list_params, blueprint generation for all 10 loaders
+    - Also found and fixed operator-precedence bug in `_build_source_specs` YAML ID assignment
+
+### Phase 3: Cleanup + Unified Panel ✅ (partial)
+
+- ✅ Removed 8 legacy `/api/tables/data-loader/*` backend routes from `tables_routes.py`
+- ✅ Removed 9 `DATA_LOADER_*` URL constants from frontend `utils.tsx`
+- ✅ `DBTableManager` now uses only `serverConfig.CONNECTORS` (DataConnector) for data source discovery
+- ✅ `DataLoaderForm` uses only connected source auth/catalog/import routes (no legacy branches)
+- ✅ `loadTable` thunk uses only connected source routes for both store-on-server and ephemeral paths
+- ✅ `useDataRefresh` uses connected source `DATA_REFRESH` endpoint (requires active connection)
+- ✅ Added `connectorId` to `DataSourceConfig` so tables remember their source
+- ✅ Added `DISABLED_SOURCES` to app-config for greyed-out UI entries
+- ✅ Enhanced `data/preview` route to support full `import_options` (sort, limit)
+- [ ] Remove `DataSourcePlugin` base class, `plugins/` directory, and per-plugin `__init__.py` files
+- [ ] Integrate with unified data source panel ([doc #8](8-unified-data-source-panel.md))

 ### Phase 4: Advanced Features

@@ -1568,7 +1580,7 @@ py-src/data_formulator/
     mysql_data_loader.py
     postgresql_data_loader.py
     ...
- connected_source.py ← NEW: ConnectedDataSource framework + data_connector.py ← NEW: DataConnector framework (route generation, form computation, lifecycle) plugins/ ← REMOVED after Phase 3 ``` @@ -1578,7 +1590,7 @@ Post-migration architecture: ``` ExternalDataLoader (driver) ← each source type implements this ↓ -ConnectedDataSource (framework) ← generic lifecycle wrapper, one implementation +DataConnector (framework) ← generic lifecycle wrapper, one implementation ↓ uses auth/ for credentials, tokens, SSO data-sources.yml / auto-discovery ← config, not code ``` @@ -1636,7 +1648,7 @@ The current `fetch_data_as_arrow(source_table, size, ...)` doesn't support colum Some data sources use OAuth/service accounts, not username/password. The `list_params()` already handles this — BigQuery asks for a service account JSON, Kusto uses Azure AD tokens. -The `ConnectedDataSource` auth layer should support: +The `DataConnector` auth layer should support: - **Password mode** (MySQL, PostgreSQL, MSSQL): user/password fields - **Token/key mode** (BigQuery, Kusto): API key or token file - **OAuth mode** (future): redirect-based auth flow @@ -1645,7 +1657,7 @@ The `ConnectedDataSource` auth layer should support: ### Q6: Should the old `db-manager` endpoints remain? -The existing `POST /api/db-manager/load-table` is a stateless, one-shot endpoint. Once `ConnectedDataSource` plugins exist, it's redundant. But we should keep it for backward compatibility and deprecate it gradually. +The existing `POST /api/db-manager/load-table` is a stateless, one-shot endpoint. Once `DataConnector` plugins exist, it's redundant. But we should keep it for backward compatibility and deprecate it gradually. 
``` Phase 1-2: Both endpoints work @@ -1660,7 +1672,7 @@ Phase 4: Remove (or keep as thin wrapper that delegates to plugin) ``` ExternalDataLoader (data protocol: how to connect, browse, fetch) + -ConnectedDataSource (lifecycle mgmt: session, caching, refresh, UI) +DataConnector (lifecycle mgmt: session, caching, refresh, UI) = A full plugin — for databases AND BI tools — for free ``` diff --git a/design-docs/9.1-data-source-connection-model.md b/design-docs/9.1-data-source-connection-model.md new file mode 100644 index 00000000..8fa04487 --- /dev/null +++ b/design-docs/9.1-data-source-connection-model.md @@ -0,0 +1,315 @@ +# Data Source Connection Model — Auth, Persistence, and Multi-User Isolation + +## Status: Complete (Phase A + B done, Phase C deferred to doc 9 Phase 4) + +Parent: [9-generalized-data-source-plugins.md](9-generalized-data-source-plugins.md) + +## 1. Problem + +After Phase 3 of the generalized plugin migration, all external data sources flow through `DataConnector`. But the **connection lifecycle** has gaps: + +1. **Connections are ephemeral.** `DataConnector._loaders` is an in-memory dict. Server restart = all connections lost. Users must re-enter credentials every session. +2. **No "already connected" state.** The data loader panel shows all sources as "Available" with a connect form. There's no way to show "you're already connected to Kusto — here are your tables." +3. **Credential storage exists but isn't wired.** `CredentialVault` (Fernet-encrypted SQLite) exists and works for the Superset plugin, but `DataConnector` doesn't use it. +4. **Multi-user isolation works but has no persistence.** Two users hitting `/api/connectors/kusto/auth/connect` get separate loaders (keyed by identity), but neither survives a restart. + +## 2. 
Desired UX + +The data loader panel should present two categories: + +### 2.1 Connected Sources (user has active/stored credentials) + +``` +┌──────────────────────────────────┐ +│ ● PostgreSQL (prod) Connected │ ← vault has credentials +│ ● Kusto (corp) Connected │ ← vault has credentials +│ ○ BigQuery (analytics) Session │ ← in-memory only, this session +└──────────────────────────────────┘ +``` + +**Behavior:** User clicks → jumps directly to catalog/table browser. No credential form needed. + +- **Vault-backed (●):** Credentials encrypted in `credentials.db`. Auto-reconnect on server restart. +- **Session-only (○):** In-memory only. Connected this session but credentials not persisted. Lost on restart. + +### 2.2 Available Sources (registered but no credentials) + +``` +┌──────────────────────────────────┐ +│ MySQL │ ← installed, no connection yet +│ S3 │ ← installed, no connection yet +│ MongoDB │ ← installed, no connection yet +│ ───────────────────────────── │ +│ Athena (install) │ ← missing deps +│ MSSQL (install) │ ← missing deps +└──────────────────────────────────┘ +``` + +**Behavior:** User clicks → shown credential form → connect → source moves to "Connected" category. + +### 2.3 Multi-User Isolation + +Same route path, different state per identity: + +``` +Route: /api/connectors/kusto/auth/connect +Alice → _loaders["user:alice@corp.com"] = KustoLoader(cluster="alice-cluster") +Bob → _loaders["user:bob@corp.com"] = KustoLoader(cluster="bob-cluster") +``` + +The admin can also pin shared params via config: + +```yaml +# data-sources.yml +sources: + - type: kusto + id: kusto_corp + name: "Corp Kusto" + params: + cluster: "https://corp.kusto.windows.net" # pinned — hidden from user form + # Users only see: database, token +``` + +In this scenario, both Alice and Bob connect to the same cluster but provide their own database and token. Their loaders are still separate. + +## 3. 
Credential Persistence Design + +### 3.1 Existing Infrastructure + +| Component | Location | Status | +|-----------|----------|--------| +| `CredentialVault` (abstract) | `credential_vault/base.py` | ✅ Working | +| `LocalCredentialVault` (Fernet + SQLite) | `credential_vault/local_vault.py` | ✅ Working | +| Key auto-generation | `credential_vault/__init__.py` | ✅ Zero-config for local mode | +| API endpoints | `credential_routes.py` | ✅ `/api/credentials/store\|list\|delete` | +| Vault integration | `plugins/superset/` only | ⚠️ Only wired for Superset | + +### 3.2 What Needs to Happen + +Wire `DataConnector` into `CredentialVault`: + +``` +Connect flow: + 1. User submits params via /auth/connect + 2. DataConnector._connect() creates loader, tests connection + 3. If success AND vault available: + → vault.store(identity, source_id, {user_params + safe metadata}) + 4. Loader cached in _loaders[identity] + +Auto-reconnect flow (on /auth/status or first catalog/data call): + 1. _loaders[identity] is empty + 2. Check vault.retrieve(identity, source_id) + 3. If credentials found → _connect(stored_params) → test connection + 4. If test fails → delete stale vault entry, return "not connected" + 5. If test succeeds → loader ready, return "connected" + +Disconnect flow: + 1. User calls /auth/disconnect + 2. _loaders.pop(identity) + 3. vault.delete(identity, source_id) +``` + +### 3.3 Storage Architecture: Centralized Vault + +**Decision: Single centralized `credentials.db` at `DATA_FORMULATOR_HOME/`.** All users' credentials in one Fernet-encrypted SQLite file, keyed by `(user_id, source_key)`. + +Considered and rejected: per-user storage at `users/{id}/credentials.db`. 
+ +**Rationale:** + +| Concern | Centralized | Per-user dirs | +|---------|-------------|---------------| +| Security boundary | Server process holds the Fernet key and can decrypt all entries regardless of file layout | Same — server still needs all keys | +| Operational simplicity | One file, one volume mount, one backup | N directories, must manage creation/cleanup/permissions | +| User data deletion (GDPR) | `DELETE WHERE user_id = ?` | Delete user dir | +| Concurrent access | SQLite handles fine (rare writes) | No contention but N DB connections | +| Backend swap (e.g., Azure Key Vault) | One interface to replace | N stores to replace | + +The logical separation is in the composite key `(user_id, source_key)`, not the physical file layout. Admin-configured credentials don't go in the vault at all — they live in `data-sources.yml` with `auto_connect: true`. + +### 3.4 What Gets Stored in the Vault + +```json +{ + "user_params": { + "host": "db.corp.com", + "port": "5432", + "database": "analytics", + "password": "hunter2" + }, + "connected_at": "2026-04-14T10:30:00Z", + "source_id": "postgresql" +} +``` + +The vault encrypts the **entire blob** with Fernet (AES-128-CBC + HMAC-SHA256). The encryption key: +- **Local mode:** Auto-generated, stored at `DATA_FORMULATOR_HOME/.vault_key` +- **Server mode:** Set via `CREDENTIAL_VAULT_KEY` env var + +### 3.5 What Gets Stored in Workspace Metadata (Unchanged) + +Workspace YAML only stores **non-sensitive** params (via `get_safe_params()`). This is already the case — passwords, tokens, and secrets are filtered out. No change needed. + +### 3.6 Connection State Summary + +| Scenario | _loaders dict | Vault | Survives restart? 
| +|----------|--------------|-------|-------------------| +| Just connected | ✅ has loader | ✅ encrypted | Yes | +| Reconnected from vault | ✅ has loader | ✅ encrypted | Yes | +| Vault disabled / not available | ✅ has loader | ❌ nothing | No | +| Disconnected | ❌ removed | ❌ deleted | — | +| Server restarted, vault has creds | ❌ empty | ✅ encrypted | Yes (auto-reconnect on next access) | + +## 4. Deployment Scenarios + +### 4.1 Local Mode (single user, `WORKSPACE_BACKEND=local`) + +- User IS the admin +- All auto-discovered sources appear as "Available" +- User connects → credentials stored in vault (zero-config, key auto-generated) +- Server restart → auto-reconnect from vault +- No multi-user concerns + +### 4.2 Centrally Managed (multi-user, auth provider configured) + +- Admin configures shared sources in `data-sources.yml` with pinned params +- Each user provides their own credentials (password/token) for the unpinned params +- Vault keyed by `(user_identity, source_id)` — full isolation +- Two users connecting to the same source_id with different params = two separate vault entries, two separate loaders + +Example: + +```yaml +# Admin config: data-sources.yml +sources: + - type: kusto + id: kusto_corp + name: "Corp Kusto" + params: + cluster: "https://corp.kusto.windows.net" +``` + +``` +Alice connects: vault["user:alice", "kusto_corp"] = {database: "sales", token: "aaa"} +Bob connects: vault["user:bob", "kusto_corp"] = {database: "eng", token: "bbb"} + +Same route: /api/connectors/kusto_corp/auth/connect +Different credentials, different catalog results. 
+``` + +### 4.3 SSO / Token Forwarding (future) + +When the app's auth provider (OIDC/Azure) issues tokens that the data source also accepts: + +``` +User logs in via OIDC → gets access_token +DataConnector sees auth_mode = "token_forward" + → auto-connect using the user's OIDC token (no credential form) + → no vault storage needed (token comes from auth session) +``` + +This is how the Superset SSO bridge already works. Generalizing it to DataConnector is a future enhancement. + +### 4.4 Ephemeral Mode (`WORKSPACE_BACKEND=ephemeral`) + +- No vault (no persistent storage) +- Connections are session-only (in-memory `_loaders` dict) +- Credentials typed each time +- This is fine — ephemeral mode is for demos/public instances where no state should persist + +## 5. Frontend Changes + +### 5.1 `/api/app-config` Enhancement + +Add `CONNECTED_CONNECTORS` to the config response — the list of source_ids where the current user has vault credentials: + +```json +{ + "CONNECTORS": [...], + "DISABLED_SOURCES": {...}, + "CONNECTED_CONNECTORS": ["postgresql", "kusto_corp"] +} +``` + +This lets the frontend immediately render the "Connected / Available" split on mount without calling `/auth/status` for each source. + +### 5.2 Data Loader Panel States + +```typescript +// Derived from serverConfig.CONNECTORS + serverConfig.CONNECTED_CONNECTORS +const connectedSources = sources.filter(s => connectedIds.includes(s.source_id)); +const availableSources = sources.filter(s => !connectedIds.includes(s.source_id)); +``` + +**Connected source row:** +``` +[●] PostgreSQL (prod) [Browse Tables] [Disconnect] +``` + +**Available source row:** +``` +[ ] MySQL [Connect...] +``` + +### 5.3 Connect Flow UI Change + +After successful connect, the source moves from "Available" to "Connected": +1. Frontend sends `{ params, persist }` to `/auth/connect` (30s AbortController timeout) +2. Backend creates loader → tests connection → persists if requested +3. 
Backend returns `{ status: "connected", persisted: true/false }` +4. Frontend checks `status === "connected"` before calling `onConnected()` +5. Source re-renders in "Connected" category with catalog browser +6. If timeout or error → source stays in "Available", error message shown + +### 5.4 Persist Credentials Toggle + +The connect form includes a "Remember credentials" checkbox (default: checked). +When unchecked, `persist: false` is sent to the backend, and credentials are +session-only (in-memory). The toggle is only shown when there are param fields. + +## 6. Implementation Plan + +### Phase A: Vault Integration in DataConnector + +1. ✅ Add `_vault_store()`, `_vault_retrieve()`, `_vault_delete()`, `_persist_credentials()` helpers +2. ✅ `_connect()` creates loader in-memory; vault persistence is separate via `_persist_credentials()` +3. ✅ Wire into `_disconnect()` → delete from vault +4. ✅ Add auto-reconnect in `_require_loader()` → try vault before raising +5. ✅ Add `CONNECTED_CONNECTORS` to `/api/app-config` +6. ✅ Tests: vault store/retrieve/disconnect/auto-reconnect/persist-flag (21 tests) + +### Phase B: Frontend Two-Panel UX + +7. ✅ Parse `CONNECTED_CONNECTORS` from server config +8. ✅ Split data loader panel into Connected / Available sections +9. ✅ Auto-open catalog browser for connected sources (auto-reconnect from vault) +10. ✅ "Remember credentials" checkbox (default: on), sends `persist` flag to backend +11. ✅ Connection timeout (30s AbortController), verified `status === "connected"` before state transition + +### Phase C: Token Forwarding (deferred) + +12. Add `auth_mode: "token_forward"` to DataConnector +13. Auto-connect using the user's auth session token +14. No credential form needed — just catalog browser + +## 7. Design Decisions (Resolved) + +### D1: Credential persistence — opt-out (default: persist) + +Local users expect "remember me" behavior. They can disconnect to clear. 
Server admins can disable the vault entirely by not setting `CREDENTIAL_VAULT_KEY` (though local mode auto-generates a key, so it's always available unless explicitly blocked). + +### D2: Credential rotation / expiry — lazy invalidation + +Vault entries don't expire. Auto-reconnect tests the connection — if the password has changed, the stale entry is deleted and the user is prompted to reconnect. Token-based connections (OAuth) would need refresh token support (Phase C). + +### D3: Vault scope — global, not per-workspace + +A user who connects to PostgreSQL in workspace A should see it connected in workspace B too. The vault key is `(user_id, source_key)` with no workspace dimension. + +### D4: Admin-provided credentials — config file, not vault + +Use `auto_connect: true` in `data-sources.yml`. The admin provides full credentials (with `${ENV_VAR}` refs), and all users auto-connect without entering anything. These never enter the per-user vault. + +### D5: Storage architecture — single centralized vault + +One `credentials.db` at `DATA_FORMULATOR_HOME/`, keyed by `(user_id, source_key)`. Not per-user files. The trust boundary is the server process (which holds the Fernet key), so physical file separation adds operational complexity without security benefit. Admin credentials stay in config; user credentials stay in vault. User data deletion is `DELETE WHERE user_id = ?`. 
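
The decisions above (D3, D5) reduce to a small storage contract. A minimal sketch of that contract, where `MiniVault`, `delete_user`, and the table layout are illustrative names rather than the real `LocalCredentialVault` API, and the Fernet encryption step is deliberately elided:

```python
import json
import sqlite3


class MiniVault:
    """One centralized credentials store keyed by (user_id, source_key)."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        # Logical separation lives in the composite primary key,
        # not in per-user files (decision D5).
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS credentials ("
            "user_id TEXT, source_key TEXT, blob TEXT, "
            "PRIMARY KEY (user_id, source_key))"
        )

    def store(self, user_id, source_key, params):
        # The real vault Fernet-encrypts this JSON blob before writing it.
        self.db.execute(
            "INSERT OR REPLACE INTO credentials VALUES (?, ?, ?)",
            (user_id, source_key, json.dumps(params)),
        )

    def retrieve(self, user_id, source_key):
        row = self.db.execute(
            "SELECT blob FROM credentials WHERE user_id = ? AND source_key = ?",
            (user_id, source_key),
        ).fetchone()
        return json.loads(row[0]) if row else None

    def delete_user(self, user_id):
        # User data deletion is a single statement, per the D5 rationale.
        self.db.execute("DELETE FROM credentials WHERE user_id = ?", (user_id,))


vault = MiniVault()
vault.store("user:alice", "kusto_corp", {"database": "sales", "token": "aaa"})
vault.store("user:bob", "kusto_corp", {"database": "eng", "token": "bbb"})
assert vault.retrieve("user:alice", "kusto_corp")["database"] == "sales"
vault.delete_user("user:alice")
assert vault.retrieve("user:alice", "kusto_corp") is None
```

Note how two users connected to the same `source_id` hold fully separate entries, matching the multi-user isolation example in §2.3; the vault itself has no workspace dimension (D3).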
diff --git a/package.json b/package.json index 96101464..61f1d7a0 100644 --- a/package.json +++ b/package.json @@ -4,6 +4,7 @@ "version": "0.1.0", "private": true, "dependencies": { + "@azure/msal-browser": "^5.6.3", "@emotion/react": "^11.14.0", "@emotion/styled": "^11.14.0", "@fontsource/roboto": "^4.5.5", diff --git a/py-src/data_formulator/app.py b/py-src/data_formulator/app.py index e649d6c9..d1c21f6b 100644 --- a/py-src/data_formulator/app.py +++ b/py-src/data_formulator/app.py @@ -53,14 +53,18 @@ def default(self, obj): app.json_encoder = CustomJSONEncoder # Default config from env (can be overridden by CLI args) -# Legacy: DISABLE_DATABASE=true → workspace_backend='ephemeral' +# DISABLE_DATABASE=true is a convenience preset for multi-user anonymous deployments. +# It bundles: ephemeral workspace + no data connectors + no custom models + hide keys. +_disable_database = os.environ.get('DISABLE_DATABASE', 'false').lower() == 'true' _default_ws_backend = os.environ.get('WORKSPACE_BACKEND', 'local') -if os.environ.get('DISABLE_DATABASE', 'false').lower() == 'true' and _default_ws_backend == 'local': +if _disable_database and _default_ws_backend == 'local': _default_ws_backend = 'ephemeral' app.config['CLI_ARGS'] = { + 'host': os.environ.get('HOST', '127.0.0.1'), 'sandbox': os.environ.get('SANDBOX', 'local'), - 'disable_display_keys': os.environ.get('DISABLE_DISPLAY_KEYS', 'false').lower() == 'true', - 'disable_file_upload': os.environ.get('DISABLE_FILE_UPLOAD', 'false').lower() == 'true', + 'disable_display_keys': _disable_database or os.environ.get('DISABLE_DISPLAY_KEYS', 'false').lower() == 'true', + 'disable_data_connectors': _disable_database or os.environ.get('DISABLE_DATA_CONNECTORS', 'false').lower() == 'true', + 'disable_custom_models': _disable_database or os.environ.get('DISABLE_CUSTOM_MODELS', 'false').lower() == 'true', 'project_front_page': os.environ.get('PROJECT_FRONT_PAGE', 'false').lower() == 'true', 'max_display_rows': 
int(os.environ.get('MAX_DISPLAY_ROWS', '10000')), 'data_dir': os.environ.get('DATA_FORMULATOR_HOME', None), @@ -156,10 +160,13 @@ def _register_blueprints(): from data_formulator.plugins import discover_and_register discover_and_register(app) - # Auto-register all installed data loaders as ConnectedDataSource plugins - print(" Loading connected data sources...", flush=True) - from data_formulator.connected_source import register_connected_sources - register_connected_sources(app) + # Auto-register all installed data loaders as DataConnector instances + if not app.config['CLI_ARGS'].get('disable_data_connectors'): + print(" Loading data connectors...", flush=True) + from data_formulator.data_connector import register_data_connectors + register_data_connectors(app) + else: + print(" Data connectors disabled (DISABLE_DATA_CONNECTORS=true)", flush=True) # Register blueprints at module level so WSGI servers (gunicorn) pick up all routes. @@ -211,7 +218,8 @@ def get_app_config(): config = { "SANDBOX": args['sandbox'], "DISABLE_DISPLAY_KEYS": args['disable_display_keys'], - "DISABLE_FILE_UPLOAD": args['disable_file_upload'], + "DISABLE_DATA_CONNECTORS": args.get('disable_data_connectors', False), + "DISABLE_CUSTOM_MODELS": args.get('disable_custom_models', False), "PROJECT_FRONT_PAGE": args['project_front_page'], "MAX_DISPLAY_ROWS": args['max_display_rows'], "DEV_MODE": args.get('dev', False), @@ -229,6 +237,17 @@ def get_app_config(): config["AUTH_PROVIDER"] = provider.name config["AUTH_INFO"] = provider.get_auth_info() + # Return the server-assigned identity so the frontend can use it. + # For localhost mode this is the fixed local: identity; + # for anonymous mode the server echoes back the browser-provided UUID. + try: + from data_formulator.security.auth import get_identity_id + identity = get_identity_id() + id_type, _, id_value = identity.partition(':') + config["IDENTITY"] = {"type": id_type, "id": id_value} + except Exception: + pass # No identity available (e.g. 
during startup) + # Expose credential vault availability to the frontend from data_formulator.credential_vault import get_credential_vault config["CREDENTIAL_VAULT_ENABLED"] = get_credential_vault() is not None @@ -251,13 +270,35 @@ def get_app_config(): } config["PLUGINS"] = plugins_info - # Expose connected data sources to the frontend - from data_formulator.connected_source import CONNECTED_SOURCES - if CONNECTED_SOURCES: - sources_info: list[dict] = [] - for sid, src in CONNECTED_SOURCES.items(): - sources_info.append(src.get_frontend_config()) - config["SOURCES"] = sources_info + # Expose data connectors to the frontend + from data_formulator.data_connector import DATA_CONNECTORS + if DATA_CONNECTORS: + connectors_info: list[dict] = [] + for sid, src in DATA_CONNECTORS.items(): + connectors_info.append(src.get_frontend_config()) + config["CONNECTORS"] = connectors_info + + # Tell the frontend which connectors the current user has vault credentials for + # so it can render "Connected" vs "Available" without N status calls. + try: + from data_formulator.security.auth import get_identity_id + identity = get_identity_id() + connected_ids: list[str] = [] + for sid, src in DATA_CONNECTORS.items(): + if src.has_stored_credentials(identity) or src._get_loader(identity) is not None: + connected_ids.append(sid) + if connected_ids: + config["CONNECTED_CONNECTORS"] = connected_ids + except Exception: + pass # No identity available (e.g. 
during startup); skip + + # Expose disabled data sources (missing deps) so UI can show greyed-out entries + from data_formulator.data_loader import DISABLED_LOADERS + if DISABLED_LOADERS: + config["DISABLED_SOURCES"] = { + name: {"install_hint": hint} + for name, hint in DISABLED_LOADERS.items() + } return flask.jsonify(config) @@ -275,10 +316,15 @@ def parse_args() -> argparse.Namespace: parser.add_argument("--disable-display-keys", action='store_true', default=False, help="Whether disable displaying keys in the frontend UI, recommended to turn on if you host the app not just for yourself.") parser.add_argument("--disable-database", action='store_true', default=False, - help="Deprecated: use --workspace-backend=ephemeral instead. " - "Sets workspace backend to 'ephemeral' for backward compatibility.") - parser.add_argument("--disable-file-upload", action='store_true', default=False, - help="Disable file upload functionality. This prevents the app from uploading files to the server.") + help="Multi-user anonymous preset: enables ephemeral workspace, disables data connectors, " + "disables custom LLM endpoints, and hides API keys. Equivalent to setting " + "--workspace-backend=ephemeral --disable-data-connectors --disable-custom-models --disable-display-keys.") + parser.add_argument("--disable-data-connectors", action='store_true', default=False, + help="Disable external data connectors (MySQL, PostgreSQL, etc.). " + "Recommended for multi-user anonymous deployments to prevent credential exposure.") + parser.add_argument("--disable-custom-models", action='store_true', default=False, + help="Prevent users from adding custom LLM endpoints via the UI. 
" + "Only server-configured models will be available.") parser.add_argument("--project-front-page", action='store_true', default=False, help="Project the front page as the main page instead of the app.") parser.add_argument("--max-display-rows", type=int, @@ -312,16 +358,25 @@ def run_app(): configure_logging() args = parse_args() - # Legacy: --disable-database → workspace_backend='ephemeral' + # --disable-database is a convenience preset for multi-user anonymous deployments. + # It bundles: ephemeral workspace + no data connectors + no custom models + hide keys. workspace_backend = args.workspace_backend - if args.disable_database and workspace_backend == 'local': - workspace_backend = 'ephemeral' + if args.disable_database: + if workspace_backend == 'local': + workspace_backend = 'ephemeral' + args.disable_data_connectors = True + args.disable_custom_models = True + args.disable_display_keys = True + print(" Multi-user anonymous mode (--disable-database): " + "ephemeral workspace, no connectors, no custom models, keys hidden", flush=True) # Override config from CLI args app.config['CLI_ARGS'] = { + 'host': args.host, 'sandbox': args.sandbox, 'disable_display_keys': args.disable_display_keys, - 'disable_file_upload': args.disable_file_upload, + 'disable_data_connectors': args.disable_data_connectors, + 'disable_custom_models': args.disable_custom_models, 'project_front_page': args.project_front_page, 'max_display_rows': args.max_display_rows, 'data_dir': args.data_dir, diff --git a/py-src/data_formulator/credential_vault/__init__.py b/py-src/data_formulator/credential_vault/__init__.py index 2db8abdb..f9f792bd 100644 --- a/py-src/data_formulator/credential_vault/__init__.py +++ b/py-src/data_formulator/credential_vault/__init__.py @@ -69,8 +69,9 @@ def _resolve_key(home: Path) -> Optional[str]: def get_credential_vault() -> Optional[CredentialVault]: """Return the global :class:`CredentialVault` singleton. 
- Returns ``None`` only when key resolution fails (should not happen - in normal local-mode operation). + Returns ``None`` when: + - Data connectors are disabled (nothing needs credentials) + - Key resolution fails """ global _vault, _initialized if _initialized: @@ -78,6 +79,16 @@ def get_credential_vault() -> Optional[CredentialVault]: _initialized = True + # Skip vault creation when data connectors are disabled (e.g. ephemeral + # demo deployments). No connectors → no credentials to store. + try: + from flask import current_app + if current_app.config.get('CLI_ARGS', {}).get('disable_data_connectors'): + logger.info("Credential vault skipped (data connectors disabled)") + return None + except RuntimeError: + pass # Outside Flask request context — continue normally + home = get_data_formulator_home() key = _resolve_key(home) if not key: diff --git a/py-src/data_formulator/connected_source.py b/py-src/data_formulator/data_connector.py similarity index 68% rename from py-src/data_formulator/connected_source.py rename to py-src/data_formulator/data_connector.py index b20057ed..589de6b7 100644 --- a/py-src/data_formulator/connected_source.py +++ b/py-src/data_formulator/data_connector.py @@ -1,22 +1,22 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. -"""ConnectedDataSource — generic lifecycle wrapper for ExternalDataLoader. +"""DataConnector — generic lifecycle wrapper for ExternalDataLoader. Takes any ``ExternalDataLoader`` class and auto-generates a Flask Blueprint -with auth / catalog / data routes. No per-source code needed. +with auth / catalog / data routes. No per-connector code needed. 
Usage:: - from data_formulator.connected_source import ConnectedDataSource + from data_formulator.data_connector import DataConnector - plugin = ConnectedDataSource.from_loader( + connector = DataConnector.from_loader( PostgreSQLDataLoader, source_id="pg_prod", display_name="Production DB", default_params={"host": "db.corp", "database": "prod"}, ) - app.register_blueprint(plugin.create_blueprint()) + app.register_blueprint(connector.create_blueprint()) """ import dataclasses @@ -34,8 +34,8 @@ logger = logging.getLogger(__name__) -# Registry of enabled ConnectedDataSource instances (populated at startup). -CONNECTED_SOURCES: dict[str, "ConnectedDataSource"] = {} +# Registry of enabled DataConnector instances (populated at startup). +DATA_CONNECTORS: dict[str, "DataConnector"] = {} # --------------------------------------------------------------------------- @@ -47,7 +47,7 @@ def _sanitize_error(error: Exception) -> tuple[str, int]: Never leaks internal details to the client. """ - logger.error("ConnectedDataSource error", exc_info=error) + logger.error("DataConnector error", exc_info=error) msg = str(error).lower() if "required" in msg or "invalid" in msg: return "Invalid connection parameters", 400 @@ -72,10 +72,10 @@ def _hierarchy_dicts(levels: list[dict[str, str]]) -> list[dict[str, str]]: # --------------------------------------------------------------------------- -# ConnectedDataSource +# DataConnector # --------------------------------------------------------------------------- -class ConnectedDataSource(DataSourcePlugin): +class DataConnector(DataSourcePlugin): """A DataSourcePlugin auto-generated from an ExternalDataLoader. 
     Provides:
 
@@ -114,7 +114,7 @@ def from_loader(
         display_name: str | None = None,
         default_params: dict[str, Any] | None = None,
         icon: str | None = None,
-    ) -> "ConnectedDataSource":
+    ) -> "DataConnector":
         return cls(
             loader_class=loader_class,
             source_id=source_id,
@@ -141,6 +141,16 @@ def _manifest(self) -> dict[str, Any]:
             "capabilities": ["tables", "catalog", "refresh"],
         }
 
+    # Common filter-tier param appended to every loader's form
+    _TABLE_FILTER_PARAM = {
+        "name": "table_filter",
+        "type": "string",
+        "required": False,
+        "default": "",
+        "tier": "filter",
+        "description": "Filter tables by keyword (e.g. 'sales')",
+    }
+
     def get_frontend_config(self) -> dict[str, Any]:
         all_params = self._loader_class.list_params()
         form_fields: list[dict] = []
@@ -152,6 +162,9 @@ def get_frontend_config(self) -> dict[str, Any]:
             else:
                 form_fields.append(param)
 
+        # Append common table_filter param
+        form_fields.append(self._TABLE_FILTER_PARAM)
+
         full_hierarchy = self._loader_class.catalog_hierarchy()
         effective = [
             level for level in full_hierarchy
@@ -168,21 +181,36 @@
             "hierarchy": _hierarchy_dicts(full_hierarchy),
             "effective_hierarchy": _hierarchy_dicts(effective),
             "auth_instructions": self._loader_class.auth_instructions(),
+            "auth_mode": self._loader_class.auth_mode(),
+            "delegated_login": self._resolve_delegated_login(),
         }
 
+    def _resolve_delegated_login(self) -> dict[str, Any] | None:
+        """Resolve delegated login config, converting relative URLs to absolute."""
+        raw = self._loader_class.delegated_login_config()
+        if raw is None:
+            return None
+        login_url = raw.get("login_url", "")
+        # Resolve relative URLs to the connector's API prefix
+        if login_url and not login_url.startswith("http"):
+            login_url = f"/api/connectors/{self._source_id}/{login_url}"
+        # Only send safe fields to the frontend
+        return {"login_url": login_url, "label": raw.get("label", "")}
+
     def create_blueprint(self) -> Blueprint:
         bp = Blueprint(
-
f"source_{self._source_id}", + f"connector_{self._source_id}", __name__, - url_prefix=f"/api/sources/{self._source_id}", + url_prefix=f"/api/connectors/{self._source_id}", ) self._register_auth_routes(bp) self._register_catalog_routes(bp) self._register_data_routes(bp) + return bp def on_enable(self, app: Flask) -> None: - logger.info("ConnectedDataSource '%s' enabled", self._source_id) + logger.info("DataConnector '%s' enabled", self._source_id) # -- Identity + Loader Management -------------------------------------- @@ -191,27 +219,128 @@ def _get_identity() -> str: from data_formulator.security.auth import get_identity_id return get_identity_id() + @staticmethod + def _get_vault(): + """Return the credential vault (or None if unavailable).""" + from data_formulator.credential_vault import get_credential_vault + return get_credential_vault() + + def _vault_store(self, identity: str, user_params: dict[str, Any]) -> bool: + """Encrypt and persist user_params for this source. Returns True on success.""" + vault = self._get_vault() + if vault is None: + return False + try: + vault.store(identity, self._source_id, { + "user_params": user_params, + "source_id": self._source_id, + }) + return True + except Exception as exc: + logger.warning("Failed to store credentials for %s/%s: %s", + identity[:16], self._source_id, exc) + return False + + def _vault_retrieve(self, identity: str) -> dict[str, Any] | None: + """Retrieve stored user_params from the vault. 
Returns None if absent."""
+        vault = self._get_vault()
+        if vault is None:
+            return None
+        try:
+            data = vault.retrieve(identity, self._source_id)
+            if data and "user_params" in data:
+                return data["user_params"]
+            return None
+        except Exception as exc:
+            logger.warning("Failed to retrieve credentials for %s/%s: %s",
+                           identity[:16], self._source_id, exc)
+            return None
+
+    def _vault_delete(self, identity: str) -> None:
+        """Delete stored credentials from the vault."""
+        vault = self._get_vault()
+        if vault is None:
+            return
+        try:
+            vault.delete(identity, self._source_id)
+        except Exception as exc:
+            logger.warning("Failed to delete credentials for %s/%s: %s",
+                           identity[:16], self._source_id, exc)
+
+    def has_stored_credentials(self, identity: str) -> bool:
+        """Check if the vault has credentials for this identity+source."""
+        vault = self._get_vault()
+        if vault is None:
+            return False
+        try:
+            return self._source_id in vault.list_sources(identity)
+        except Exception:
+            return False
+
     def _get_loader(self, identity: str | None = None) -> ExternalDataLoader | None:
         identity = identity or self._get_identity()
         return self._loaders.get(identity)
 
-    def _connect(self, user_params: dict[str, Any]) -> ExternalDataLoader:
-        """Instantiate a loader with merged params (default + user)."""
+    def _connect(self, user_params: dict[str, Any]) -> ExternalDataLoader:
+        """Instantiate a loader with merged params (default + user).
+
+        Note: This only creates the loader and caches it in-memory.
+        Vault persistence is handled separately by the caller after
+        connection verification succeeds.
+ """ merged = {**self._default_params, **user_params} loader = self._loader_class(merged) identity = self._get_identity() self._loaders[identity] = loader return loader + def _persist_credentials(self, user_params: dict[str, Any]) -> bool: + """Store credentials in the vault for the current identity.""" + identity = self._get_identity() + return self._vault_store(identity, user_params) + def _disconnect(self) -> None: identity = self._get_identity() self._loaders.pop(identity, None) + self._vault_delete(identity) + + def _try_auto_reconnect(self, identity: str) -> ExternalDataLoader | None: + """Attempt to restore a connection from vault credentials. + + Returns the loader on success, or None (and cleans up stale vault + entry) on failure. + """ + stored_params = self._vault_retrieve(identity) + if stored_params is None: + return None + try: + merged = {**self._default_params, **stored_params} + loader = self._loader_class(merged) + if loader.test_connection(): + self._loaders[identity] = loader + logger.info("Auto-reconnected '%s' for %s", self._source_id, identity[:16]) + return loader + else: + logger.info("Auto-reconnect test failed for '%s'/%s, clearing stale credentials", + self._source_id, identity[:16]) + self._vault_delete(identity) + return None + except Exception as exc: + logger.warning("Auto-reconnect failed for '%s'/%s: %s", + self._source_id, identity[:16], exc) + self._vault_delete(identity) + return None def _require_loader(self) -> ExternalDataLoader: - loader = self._get_loader() - if loader is None: - raise ValueError("Not connected. Please connect first.") - return loader + identity = self._get_identity() + loader = self._loaders.get(identity) + if loader is not None: + return loader + # Try auto-reconnect from vault + loader = self._try_auto_reconnect(identity) + if loader is not None: + return loader + raise ValueError("Not connected. 
Please connect first.") # -- Auth Routes ------------------------------------------------------- @@ -223,15 +352,26 @@ def auth_connect(): try: data = request.get_json() or {} user_params = data.get("params", {}) + persist = data.get("persist", True) loader = source._connect(user_params) if not loader.test_connection(): source._disconnect() return jsonify({"status": "error", "message": "Connection test failed"}), 400 + # Only persist to vault after connection is verified + persisted = False + if persist: + persisted = source._persist_credentials(user_params) + else: + # User opted out — clear any previously stored credentials + identity = source._get_identity() + source._vault_delete(identity) + safe = loader.get_safe_params() return jsonify({ "status": "connected", + "persisted": persisted, "params": safe, "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), @@ -247,12 +387,73 @@ def auth_disconnect(): source._disconnect() return jsonify({"status": "disconnected"}) + @bp.route("/auth/token-connect", methods=["POST"]) + def auth_token_connect(): + """Accept tokens from a delegated (popup) login flow and create a connection. + + Expected JSON body:: + + { + "access_token": "eyJ...", + "refresh_token": "eyJ...", // optional + "user": {...}, // optional user info + "params": {"url": "..."}, // extra params (e.g. 
Superset base URL) + "persist": true + } + """ + try: + data = request.get_json() or {} + access_token = data.get("access_token") + if not access_token: + return jsonify({"status": "error", "message": "Missing access_token"}), 400 + + extra_params = data.get("params", {}) + persist = data.get("persist", True) + + # Build loader params: merge default + extra + tokens + user_params = { + **extra_params, + "access_token": access_token, + "refresh_token": data.get("refresh_token", ""), + } + + loader = source._connect(user_params) + + if not loader.test_connection(): + source._disconnect() + return jsonify({"status": "error", "message": "Token connection test failed"}), 400 + + persisted = False + if persist: + persisted = source._persist_credentials(user_params) + + safe = loader.get_safe_params() + return jsonify({ + "status": "connected", + "persisted": persisted, + "params": safe, + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "pinned_scope": loader.pinned_scope(), + "user": data.get("user", {}), + }) + except Exception as e: + source._disconnect() + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + @bp.route("/auth/status", methods=["GET"]) def auth_status(): - loader = source._get_loader() + identity = source._get_identity() + loader = source._get_loader(identity) + # Try auto-reconnect from vault if no in-memory loader + if loader is None: + loader = source._try_auto_reconnect(identity) if loader is None: + has_stored = source.has_stored_credentials(identity) return jsonify({ "connected": False, + "has_stored_credentials": has_stored, "params_form": source.get_frontend_config()["params_form"], }) try: @@ -263,10 +464,12 @@ def auth_status(): source._disconnect() return jsonify({ "connected": False, + "has_stored_credentials": False, "params_form": source.get_frontend_config()["params_form"], }) return 
jsonify({
             "connected": True,
+            "persisted": source.has_stored_credentials(identity),
             "params": loader.get_safe_params(),
             "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()),
             "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()),
@@ -410,10 +613,15 @@ def data_preview():
             if not source_table:
                 return jsonify({"status": "error", "message": "source_table is required"}), 400
 
-            size = data.get("size", 10)
+            import_options = data.get("import_options", {})
+            if not import_options:
+                # Legacy: accept top-level size/row_limit params
+                size = data.get("size") or data.get("row_limit", 10)
+                import_options = {"size": size}
+
             arrow_table = loader.fetch_data_as_arrow(
                 source_table=source_table,
-                import_options={"size": size},
+                import_options=import_options,
             )
             df = arrow_table.to_pandas()
             rows = _json.loads(df.to_json(orient="records", date_format="iso"))
@@ -424,6 +632,7 @@ def data_preview():
                 "columns": columns,
                 "rows": rows,
                 "row_count": len(rows),
+                "total_row_count": len(rows),
             })
         except Exception as e:
             safe_msg, status_code = _sanitize_error(e)
@@ -552,7 +761,7 @@ def _build_source_specs() -> tuple[list[SourceSpec], bool]:
             loader_type = entry.get("type", "")
             if not loader_type:
                 continue
-            sid = entry.get("id") or f"{loader_type}_{i}" if i > 0 else loader_type
+            sid = entry.get("id") or (f"{loader_type}_{i}" if i > 0 else loader_type)
             yaml_specs.append(SourceSpec(
                 source_id=sid,
                 loader_type=loader_type,
@@ -588,8 +797,8 @@ def _build_source_specs() -> tuple[list[SourceSpec], bool]:
 # Registration
 # ---------------------------------------------------------------------------
 
-def register_connected_sources(app: Flask) -> None:
-    """Register ConnectedDataSource plugins from config + auto-discovery.
+def register_data_connectors(app: Flask) -> None:
+    """Register DataConnector instances from config + auto-discovery.
 
     Called from ``app.py`` during startup.
""" @@ -609,7 +818,7 @@ def register_connected_sources(app: Flask) -> None: logger.warning("Unknown source type '%s' for '%s'", spec.loader_type, spec.source_id) continue - source = ConnectedDataSource.from_loader( + source = DataConnector.from_loader( loader_class, source_id=spec.source_id, display_name=spec.display_name, @@ -619,14 +828,14 @@ def register_connected_sources(app: Flask) -> None: bp = source.create_blueprint() app.register_blueprint(bp) source.on_enable(app) - CONNECTED_SOURCES[spec.source_id] = source + DATA_CONNECTORS[spec.source_id] = source logger.info( - "Registered ConnectedDataSource '%s' (type=%s%s)", + "Registered DataConnector '%s' (type=%s%s)", spec.source_id, spec.loader_type, f", pinned={list(spec.default_params.keys())}" if spec.default_params else "", ) for key, reason in DISABLED_LOADERS.items(): - if key not in CONNECTED_SOURCES: + if key not in DATA_CONNECTORS: logger.info("Source '%s' not available: %s", key, reason) diff --git a/py-src/data_formulator/data_loader/athena_data_loader.py b/py-src/data_formulator/data_loader/athena_data_loader.py index 5e612ca1..0a2fc126 100644 --- a/py-src/data_formulator/data_loader/athena_data_loader.py +++ b/py-src/data_formulator/data_loader/athena_data_loader.py @@ -60,15 +60,15 @@ class AthenaDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "aws_profile", "type": "string", "required": False, "default": "", "description": "AWS profile name from ~/.aws/credentials (if set, access key and secret are not required)"}, - {"name": "aws_access_key_id", "type": "string", "required": False, "default": "", "description": "AWS access key ID (not required if using aws_profile)"}, - {"name": "aws_secret_access_key", "type": "string", "required": False, "default": "", "description": "AWS secret access key (not required if using aws_profile)"}, - {"name": "aws_session_token", "type": "string", "required": False, "default": "", "description": 
"AWS session token (required for temporary credentials)"}, - {"name": "region_name", "type": "string", "required": True, "default": "us-east-1", "description": "AWS region name"}, - {"name": "workgroup", "type": "string", "required": False, "default": "primary", "description": "Athena workgroup name (output location is fetched from workgroup configuration)"}, - {"name": "output_location", "type": "string", "required": False, "default": "", "description": "S3 output location for query results (e.g., s3://bucket/path/). If empty, uses workgroup configuration."}, - {"name": "database", "type": "string", "required": False, "default": "", "description": "Default database/catalog to use for queries"}, - {"name": "query_timeout", "type": "number", "required": False, "default": 300, "description": "Query execution timeout in seconds (default: 300 = 5 minutes)"} + {"name": "aws_profile", "type": "string", "required": False, "default": "", "tier": "auth", "description": "AWS profile name from ~/.aws/credentials (if set, access key and secret are not required)"}, + {"name": "aws_access_key_id", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "AWS access key ID (not required if using aws_profile)"}, + {"name": "aws_secret_access_key", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "AWS secret access key (not required if using aws_profile)"}, + {"name": "aws_session_token", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "AWS session token (required for temporary credentials)"}, + {"name": "region_name", "type": "string", "required": True, "default": "us-east-1", "tier": "connection", "description": "AWS region name"}, + {"name": "workgroup", "type": "string", "required": False, "default": "primary", "tier": "connection", "description": "Athena workgroup name (output location is fetched from workgroup 
configuration)"}, + {"name": "output_location", "type": "string", "required": False, "default": "", "tier": "connection", "description": "S3 output location for query results (e.g., s3://bucket/path/). If empty, uses workgroup configuration."}, + {"name": "database", "type": "string", "required": False, "default": "", "tier": "filter", "description": "Default database/catalog to use for queries"}, + {"name": "query_timeout", "type": "number", "required": False, "default": 300, "tier": "connection", "description": "Query execution timeout in seconds (default: 300 = 5 minutes)"} ] return params_list diff --git a/py-src/data_formulator/data_loader/azure_blob_data_loader.py b/py-src/data_formulator/data_loader/azure_blob_data_loader.py index 91e94242..56f38c61 100644 --- a/py-src/data_formulator/data_loader/azure_blob_data_loader.py +++ b/py-src/data_formulator/data_loader/azure_blob_data_loader.py @@ -18,13 +18,13 @@ class AzureBlobDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "account_name", "type": "string", "required": True, "default": "", "description": "Azure storage account name"}, - {"name": "container_name", "type": "string", "required": True, "default": "", "description": "Azure blob container name"}, - {"name": "connection_string", "type": "string", "required": False, "default": "", "description": "Azure storage connection string (alternative to account_name + credentials)"}, - {"name": "credential_chain", "type": "string", "required": False, "default": "cli;managed_identity;env", "description": "Ordered list of Azure credential providers (cli;managed_identity;env)"}, - {"name": "account_key", "type": "string", "required": False, "default": "", "description": "Azure storage account key"}, - {"name": "sas_token", "type": "string", "required": False, "default": "", "description": "Azure SAS token"}, - {"name": "endpoint", "type": "string", "required": False, "default": 
"blob.core.windows.net", "description": "Azure endpoint override"} + {"name": "account_name", "type": "string", "required": True, "default": "", "tier": "connection", "description": "Azure storage account name"}, + {"name": "container_name", "type": "string", "required": True, "default": "", "tier": "connection", "description": "Azure blob container name"}, + {"name": "connection_string", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "Azure storage connection string (alternative to account_name + credentials)"}, + {"name": "credential_chain", "type": "string", "required": False, "default": "cli;managed_identity;env", "tier": "auth", "description": "Ordered list of Azure credential providers (cli;managed_identity;env)"}, + {"name": "account_key", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "Azure storage account key"}, + {"name": "sas_token", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "Azure SAS token"}, + {"name": "endpoint", "type": "string", "required": False, "default": "blob.core.windows.net", "tier": "connection", "description": "Azure endpoint override"} ] return params_list diff --git a/py-src/data_formulator/data_loader/bigquery_data_loader.py b/py-src/data_formulator/data_loader/bigquery_data_loader.py index f69fcc55..9aa59d92 100644 --- a/py-src/data_formulator/data_loader/bigquery_data_loader.py +++ b/py-src/data_formulator/data_loader/bigquery_data_loader.py @@ -16,10 +16,10 @@ class BigQueryDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: return [ - {"name": "project_id", "type": "text", "required": True, "description": "Google Cloud Project ID", "default": ""}, - {"name": "dataset_id", "type": "text", "required": False, "description": "Dataset ID(s) - leave empty for all, or specify one (e.g., 'billing') or multiple separated by commas (e.g., 
'billing,enterprise_collected,ga_api')", "default": ""}, - {"name": "credentials_path", "type": "text", "required": False, "description": "Path to service account JSON file (optional)", "default": ""}, - {"name": "location", "type": "text", "required": False, "description": "BigQuery location (default: US)", "default": "US"} + {"name": "project_id", "type": "text", "required": True, "tier": "connection", "description": "Google Cloud Project ID", "default": ""}, + {"name": "dataset_id", "type": "text", "required": False, "tier": "filter", "description": "Dataset ID(s) - leave empty for all, or specify one (e.g., 'billing') or multiple separated by commas (e.g., 'billing,enterprise_collected,ga_api')", "default": ""}, + {"name": "credentials_path", "type": "text", "required": False, "tier": "auth", "description": "Path to service account JSON file (optional)", "default": ""}, + {"name": "location", "type": "text", "required": False, "tier": "connection", "description": "BigQuery location (default: US)", "default": "US"} ] @staticmethod diff --git a/py-src/data_formulator/data_loader/external_data_loader.py b/py-src/data_formulator/data_loader/external_data_loader.py index 754d20a1..c98b6ab8 100644 --- a/py-src/data_formulator/data_loader/external_data_loader.py +++ b/py-src/data_formulator/data_loader/external_data_loader.py @@ -14,7 +14,7 @@ logger = logging.getLogger(__name__) # Sensitive parameter names that should be excluded from stored metadata -SENSITIVE_PARAMS = {'password', 'api_key', 'secret', 'token', 'access_key', 'secret_key'} +SENSITIVE_PARAMS = {'password', 'api_key', 'secret', 'token', 'access_token', 'refresh_token', 'access_key', 'secret_key'} def sanitize_table_name(name_as: str) -> str: @@ -64,15 +64,25 @@ def get_safe_params(self) -> dict[str, Any]: """ Get connection parameters with sensitive values removed. 
+ Uses the ``sensitive`` flag from :meth:`list_params` as the primary + source of truth, falling back to the ``SENSITIVE_PARAMS`` name set + for params not declared in ``list_params``. + Returns: Dictionary of parameters safe to store in metadata """ if not hasattr(self, 'params'): return {} + # Build set of sensitive names from list_params declarations + declared_sensitive = { + p["name"] for p in self.list_params() + if p.get("sensitive") or p.get("type") == "password" + } + return { k: v for k, v in self.params.items() - if k.lower() not in SENSITIVE_PARAMS + if k not in declared_sensitive and k.lower() not in SENSITIVE_PARAMS } @abstractmethod @@ -186,6 +196,24 @@ def auth_instructions() -> str: """Return human-readable authentication instructions.""" pass + @staticmethod + def delegated_login_config() -> dict[str, Any] | None: + """Return config for delegated (popup-based) token login, or None. + + When a loader supports logging in via the external system's own + login page (e.g. Superset's token bridge), return a dict with: + + * ``"login_url"`` — URL to open in a popup. + * ``"label"`` — button label shown in the UI (e.g. "Login via Superset"). + + The popup is expected to post a ``df-sso-auth`` message back via + ``postMessage`` containing ``access_token``, ``refresh_token``, + and ``user``. + + Returns ``None`` by default (not supported). 
+ """ + return None + @abstractmethod def __init__(self, params: dict[str, Any]): """ diff --git a/py-src/data_formulator/data_loader/kusto_data_loader.py b/py-src/data_formulator/data_loader/kusto_data_loader.py index c04cb9c0..4babff89 100644 --- a/py-src/data_formulator/data_loader/kusto_data_loader.py +++ b/py-src/data_formulator/data_loader/kusto_data_loader.py @@ -16,25 +16,19 @@ class KustoDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "kusto_cluster", "type": "string", "required": True, "description": "e.g., https://mycluster.region.kusto.windows.net"}, - {"name": "kusto_database", "type": "string", "required": False, "description": "Database name (leave empty to browse all databases)"}, - {"name": "client_id", "type": "string", "required": False, "description": "only for App Key auth"}, - {"name": "client_secret", "type": "string", "required": False, "description": "only for App Key auth"}, - {"name": "tenant_id", "type": "string", "required": False, "description": "only for App Key auth"} + {"name": "kusto_cluster", "type": "string", "required": True, "tier": "connection", "description": "e.g., https://mycluster.region.kusto.windows.net"}, + {"name": "kusto_database", "type": "string", "required": False, "tier": "filter", "description": "Database name (leave empty to browse all databases)"}, + {"name": "client_id", "type": "string", "required": False, "tier": "auth", "description": "Service principal only"}, + {"name": "client_secret", "type": "string", "required": False, "sensitive": True, "tier": "auth", "description": "Service principal only"}, + {"name": "tenant_id", "type": "string", "required": False, "tier": "auth", "description": "Service principal only"} ] return params_list - + @staticmethod def auth_instructions() -> str: - return """**Example (CLI):** kusto_cluster: `https://mycluster.westus.kusto.windows.net` · kusto_database: `mydb` - -**Example (App Key):** kusto_cluster: 
`https://mycluster.westus.kusto.windows.net` · kusto_database: `mydb` · client_id: `abc-123...` · client_secret: `xyz...` · tenant_id: `def-456...` - -**Option 1 — Azure CLI (recommended):** -Run `az login` in your terminal. Leave `client_id`, `client_secret`, and `tenant_id` empty. + return """**Option 1 — Azure Default Identity (easiest):** Leave auth fields empty. DF will automatically use your Azure CLI login (`az login`), Managed Identity, VS Code credentials, or environment variables — whichever is available. -**Option 2 — App Key Authentication:** -Register an Azure AD application, generate a client secret, and grant it access to your Kusto cluster (e.g., "AllDatabasesViewer" role via Azure Portal → Kusto cluster → Permissions). Provide `client_id`, `client_secret`, and `tenant_id`.""" +**Option 2 — Service Principal:** Provide `client_id`, `client_secret`, and `tenant_id` for a service principal with cluster access.""" def __init__(self, params: dict[str, Any]): self.params = params @@ -47,18 +41,24 @@ def __init__(self, params: dict[str, Any]): try: if self.client_id and self.client_secret and self.tenant_id: + # Service principal auth self.client = KustoClient(KustoConnectionStringBuilder.with_aad_application_key_authentication( self.kusto_cluster, self.client_id, self.client_secret, self.tenant_id)) + logger.info("Using service principal authentication for Kusto client.") else: - cluster_url = KustoConnectionStringBuilder.with_az_cli_authentication(self.kusto_cluster) - logger.info(f"Connecting to Kusto cluster: {self.kusto_cluster}") - self.client = KustoClient(cluster_url) - logger.info("Using Azure CLI authentication for Kusto client.") + # DefaultAzureCredential: tries az login, Managed Identity, VS Code, env vars, etc. 
+ from azure.identity import DefaultAzureCredential + credential = DefaultAzureCredential() + kcsb = KustoConnectionStringBuilder.with_azure_token_credential( + self.kusto_cluster, credential) + self.client = KustoClient(kcsb) + logger.info("Using DefaultAzureCredential for Kusto client (az login / Managed Identity / etc.).") except Exception as e: logger.error(f"Error creating Kusto client: {e}") raise RuntimeError( f"Error creating Kusto client: {e}. " - "Please authenticate with Azure CLI (az login) when starting the app." + "If running locally, run 'az login' or provide service principal credentials. " + "If running on Azure, ensure a Managed Identity is assigned to the host." ) from e def _convert_kusto_datetime_columns(self, df: pd.DataFrame) -> pd.DataFrame: diff --git a/py-src/data_formulator/data_loader/mongodb_data_loader.py b/py-src/data_formulator/data_loader/mongodb_data_loader.py index 0ddddd3e..cf61c070 100644 --- a/py-src/data_formulator/data_loader/mongodb_data_loader.py +++ b/py-src/data_formulator/data_loader/mongodb_data_loader.py @@ -18,13 +18,13 @@ class MongoDBDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "host", "type": "string", "required": True, "default": "localhost", "description": "server address"}, - {"name": "port", "type": "int", "required": False, "default": 27017, "description": "server port"}, - {"name": "username", "type": "string", "required": False, "default": "", "description": "leave blank if no auth"}, - {"name": "password", "type": "string", "required": False, "default": "", "description": "leave blank if no auth"}, - {"name": "database", "type": "string", "required": True, "default": "", "description": "database name"}, - {"name": "collection", "type": "string", "required": False, "default": "", "description": "leave empty to list all collections"}, - {"name": "authSource", "type": "string", "required": False, "default": "", "description": "auth database 
(defaults to target database)"} + {"name": "host", "type": "string", "required": True, "default": "localhost", "tier": "connection", "description": "server address"}, + {"name": "port", "type": "int", "required": False, "default": 27017, "tier": "connection", "description": "server port"}, + {"name": "username", "type": "string", "required": False, "default": "", "tier": "auth", "description": "leave blank if no auth"}, + {"name": "password", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "leave blank if no auth"}, + {"name": "database", "type": "string", "required": True, "default": "", "tier": "connection", "description": "database name"}, + {"name": "collection", "type": "string", "required": False, "default": "", "tier": "filter", "description": "leave empty to list all collections"}, + {"name": "authSource", "type": "string", "required": False, "default": "", "tier": "auth", "description": "auth database (defaults to target database)"} ] return params_list diff --git a/py-src/data_formulator/data_loader/mssql_data_loader.py b/py-src/data_formulator/data_loader/mssql_data_loader.py index b0305a10..4587cebd 100644 --- a/py-src/data_formulator/data_loader/mssql_data_loader.py +++ b/py-src/data_formulator/data_loader/mssql_data_loader.py @@ -30,6 +30,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": True, "default": "localhost", + "tier": "connection", "description": "SQL Server host address or instance name", }, { @@ -37,6 +38,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "", + "tier": "filter", "description": "Database name (leave empty to browse all databases)", }, { @@ -44,6 +46,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "", + "tier": "auth", "description": "Username (leave empty for Windows Authentication)", }, { @@ -51,6 +54,8 @@ def list_params() -> list[dict[str, Any]]: 
"type": "string", "required": False, "default": "", + "sensitive": True, + "tier": "auth", "description": "Password (leave empty for Windows Authentication)", }, { @@ -58,6 +63,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "1433", + "tier": "connection", "description": "SQL Server port (default: 1433)", }, { @@ -65,6 +71,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "ODBC Driver 17 for SQL Server", + "tier": "connection", "description": "ODBC driver name", }, { @@ -72,6 +79,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "yes", + "tier": "connection", "description": "Enable encryption (yes/no)", }, { @@ -79,6 +87,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "no", + "tier": "connection", "description": "Trust server certificate (yes/no)", }, { @@ -86,6 +95,7 @@ def list_params() -> list[dict[str, Any]]: "type": "string", "required": False, "default": "30", + "tier": "connection", "description": "Connection timeout in seconds", }, ] diff --git a/py-src/data_formulator/data_loader/mysql_data_loader.py b/py-src/data_formulator/data_loader/mysql_data_loader.py index f35a151b..dfde9026 100644 --- a/py-src/data_formulator/data_loader/mysql_data_loader.py +++ b/py-src/data_formulator/data_loader/mysql_data_loader.py @@ -15,11 +15,11 @@ class MySQLDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "user", "type": "string", "required": True, "default": "root", "description": "MySQL username"}, - {"name": "password", "type": "string", "required": False, "default": "", "description": "leave blank for no password"}, - {"name": "host", "type": "string", "required": True, "default": "localhost", "description": "server address"}, - {"name": "port", "type": "int", "required": False, "default": 3306, "description": "server 
port"}, - {"name": "database", "type": "string", "required": False, "default": "", "description": "Database name (leave empty to browse all databases)"} + {"name": "user", "type": "string", "required": True, "default": "root", "tier": "auth", "description": "MySQL username"}, + {"name": "password", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "leave blank for no password"}, + {"name": "host", "type": "string", "required": True, "default": "localhost", "tier": "connection", "description": "server address"}, + {"name": "port", "type": "int", "required": False, "default": 3306, "tier": "connection", "description": "server port"}, + {"name": "database", "type": "string", "required": False, "default": "", "tier": "filter", "description": "Database name (leave empty to browse all databases)"} ] return params_list diff --git a/py-src/data_formulator/data_loader/postgresql_data_loader.py b/py-src/data_formulator/data_loader/postgresql_data_loader.py index 1c62d6eb..05f49691 100644 --- a/py-src/data_formulator/data_loader/postgresql_data_loader.py +++ b/py-src/data_formulator/data_loader/postgresql_data_loader.py @@ -15,11 +15,11 @@ class PostgreSQLDataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "user", "type": "string", "required": True, "default": "postgres", "description": "PostgreSQL username"}, - {"name": "password", "type": "string", "required": False, "default": "", "description": "leave blank for no password"}, - {"name": "host", "type": "string", "required": True, "default": "localhost", "description": "PostgreSQL host"}, - {"name": "port", "type": "string", "required": False, "default": "5432", "description": "PostgreSQL port"}, - {"name": "database", "type": "string", "required": False, "default": "", "description": "Database name (leave empty to browse all databases)"} + {"name": "user", "type": "string", "required": True, "default": 
"postgres", "tier": "auth", "description": "PostgreSQL username"}, + {"name": "password", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "leave blank for no password"}, + {"name": "host", "type": "string", "required": True, "default": "localhost", "tier": "connection", "description": "PostgreSQL host"}, + {"name": "port", "type": "string", "required": False, "default": "5432", "tier": "connection", "description": "PostgreSQL port"}, + {"name": "database", "type": "string", "required": False, "default": "", "tier": "filter", "description": "Database name (leave empty to browse all databases)"} ] return params_list diff --git a/py-src/data_formulator/data_loader/s3_data_loader.py b/py-src/data_formulator/data_loader/s3_data_loader.py index 30a461e7..d8c3be2b 100644 --- a/py-src/data_formulator/data_loader/s3_data_loader.py +++ b/py-src/data_formulator/data_loader/s3_data_loader.py @@ -19,11 +19,11 @@ class S3DataLoader(ExternalDataLoader): @staticmethod def list_params() -> list[dict[str, Any]]: params_list = [ - {"name": "aws_access_key_id", "type": "string", "required": True, "default": "", "description": "AWS access key ID"}, - {"name": "aws_secret_access_key", "type": "string", "required": True, "default": "", "description": "AWS secret access key"}, - {"name": "aws_session_token", "type": "string", "required": False, "default": "", "description": "AWS session token (required for temporary credentials)"}, - {"name": "region_name", "type": "string", "required": True, "default": "us-east-1", "description": "AWS region name"}, - {"name": "bucket", "type": "string", "required": True, "default": "", "description": "S3 bucket name"} + {"name": "aws_access_key_id", "type": "string", "required": True, "default": "", "sensitive": True, "tier": "auth", "description": "AWS access key ID"}, + {"name": "aws_secret_access_key", "type": "string", "required": True, "default": "", "sensitive": True, "tier": "auth", 
"description": "AWS secret access key"}, + {"name": "aws_session_token", "type": "string", "required": False, "default": "", "sensitive": True, "tier": "auth", "description": "AWS session token (required for temporary credentials)"}, + {"name": "region_name", "type": "string", "required": True, "default": "us-east-1", "tier": "connection", "description": "AWS region name"}, + {"name": "bucket", "type": "string", "required": True, "default": "", "tier": "connection", "description": "S3 bucket name"} ] return params_list diff --git a/py-src/data_formulator/data_loader/superset_data_loader.py b/py-src/data_formulator/data_loader/superset_data_loader.py index 46736469..ffe30f87 100644 --- a/py-src/data_formulator/data_loader/superset_data_loader.py +++ b/py-src/data_formulator/data_loader/superset_data_loader.py @@ -90,11 +90,14 @@ class SupersetLoader(ExternalDataLoader): def list_params() -> list[dict[str, Any]]: return [ {"name": "url", "type": "string", "required": True, + "tier": "connection", "description": "Superset base URL (e.g. 
https://bi.company.com)"}, - {"name": "username", "type": "string", "required": True, - "description": "Superset username"}, - {"name": "password", "type": "password", "required": True, - "description": "Superset password"}, + {"name": "username", "type": "string", "required": False, + "tier": "auth", + "description": "Superset username (optional if using SSO)"}, + {"name": "password", "type": "password", "required": False, "sensitive": True, + "tier": "auth", + "description": "Superset password (optional if using SSO)"}, ] @staticmethod @@ -109,6 +112,19 @@ def auth_instructions() -> str: def auth_mode() -> str: return "token" + @staticmethod + def delegated_login_config() -> dict[str, Any] | None: + """Return popup-based login config if PLG_SUPERSET_URL is set.""" + import os + superset_url = os.environ.get("PLG_SUPERSET_URL", "") + if not superset_url: + return None + login_url = os.environ.get( + "PLG_SUPERSET_SSO_LOGIN_URL", + f"{superset_url.rstrip('/')}/df-sso-bridge/", + ) + return {"login_url": login_url, "label": "Login via Superset"} + @staticmethod def catalog_hierarchy() -> list[dict[str, str]]: return [ @@ -133,10 +149,12 @@ def __init__(self, params: dict[str, Any]): self._bridge = _SupersetAuthBridge(self.url) # Authenticate immediately - self._access_token: str | None = None - self._refresh_token: str | None = None - if self.username and self.password: + self._access_token: str | None = params.get("access_token") + self._refresh_token: str | None = params.get("refresh_token") + if not self._access_token and self.username and self.password: self._do_login() + elif not self._access_token: + raise ValueError("Superset requires either username/password or an SSO access token") def _do_login(self) -> None: result = self._bridge.login(self.username, self.password) diff --git a/py-src/data_formulator/plugins/superset/__init__.py b/py-src/data_formulator/plugins/superset/__init__.py index a9db8a27..a40ee461 100644 --- 
a/py-src/data_formulator/plugins/superset/__init__.py +++ b/py-src/data_formulator/plugins/superset/__init__.py @@ -3,6 +3,13 @@ """Superset data source plugin for Data Formulator. +.. deprecated:: + This legacy plugin is superseded by ``SupersetLoader`` registered as a + ``DataConnector`` (see ``data_loader/superset_data_loader.py``). + It will be removed in Phase 3 of the generalized-plugin migration. + New deployments should use the DataConnector route at + ``/api/connectors/superset/`` instead of ``/api/plugins/superset/``. + Provides: - Password / SSO authentication against a Superset instance - Dataset & dashboard catalog browsing with native filter support @@ -13,7 +20,9 @@ from __future__ import annotations +import logging import os +import warnings from typing import Any from flask import Blueprint, Flask @@ -25,7 +34,13 @@ class SupersetPlugin(DataSourcePlugin): - """Concrete ``DataSourcePlugin`` for Apache Superset.""" + """Concrete ``DataSourcePlugin`` for Apache Superset. + + .. deprecated:: + Superseded by ``SupersetLoader`` + ``DataConnector``. + Routes at ``/api/plugins/superset/`` will be removed in Phase 3. + Use ``/api/connectors/superset/`` instead. + """ @staticmethod def manifest() -> dict[str, Any]: @@ -71,6 +86,17 @@ def get_frontend_config(self) -> dict[str, Any]: def on_enable(self, app: Flask) -> None: """Create shared service objects and store them as Flask extensions.""" + logger = logging.getLogger(__name__) + logger.warning( + "SupersetPlugin (legacy) is deprecated. " + "Use the DataConnector at /api/connectors/superset/ instead. " + "This plugin will be removed in Phase 3." 
+ ) + warnings.warn( + "SupersetPlugin is deprecated; use SupersetLoader via DataConnector", + DeprecationWarning, + stacklevel=2, + ) superset_url = os.environ["PLG_SUPERSET_URL"].rstrip("/") cache_ttl = int(os.environ.get("PLG_SUPERSET_CACHE_TTL", "300")) diff --git a/py-src/data_formulator/security/auth.py b/py-src/data_formulator/security/auth.py index f0b58ce3..e31f1cc7 100644 --- a/py-src/data_formulator/security/auth.py +++ b/py-src/data_formulator/security/auth.py @@ -7,14 +7,17 @@ AUTH_PROVIDER=oidc → OIDCProvider → user:<id> AUTH_PROVIDER=azure_easyauth → AzureEasyAuth → user:<id> - (not set) → anonymous only → browser:<uuid> + (not set, localhost) → single-user → local:<username> + (not set, 0.0.0.0) → anonymous only → browser:<uuid> Security Model: +- Local users: Fixed OS-derived identity (single-user localhost only) - Anonymous users: Browser UUID from X-Identity-Id header (prefixed with "browser:") - Authenticated users: Verified identity from a configured AuthProvider (prefixed with "user:") - Namespacing ensures authenticated user data cannot be accessed by spoofing headers """ +import getpass import logging import os import re @@ -48,6 +51,10 @@ # Whether unauthenticated requests may fall back to browser UUID identity. _allow_anonymous: bool = True +# Single-user localhost mode: use fixed OS-derived identity instead of +# trusting the client-provided X-Identity-Id header. +_localhost_identity: Optional[str] = None + def _validate_identity_value(value: str, source: str) -> str: """Validate and return a trimmed identity value. @@ -76,8 +83,13 @@ def init_auth(app: Flask) -> None: Reads ``AUTH_PROVIDER`` to select a provider and ``ALLOW_ANONYMOUS`` to control whether unauthenticated requests are permitted. + + When no provider is configured and the server is bound to a + loopback address (``127.0.0.1`` / ``localhost``), enables + single-user localhost mode with a fixed ``local:<username>`` + identity. 
""" - global _provider, _allow_anonymous + global _provider, _allow_anonymous, _localhost_identity _allow_anonymous = os.environ.get( "ALLOW_ANONYMOUS", "true" @@ -86,7 +98,24 @@ def init_auth(app: Flask) -> None: provider_name = os.environ.get("AUTH_PROVIDER", "").strip().lower() if not provider_name or provider_name == "anonymous": - logger.info("Auth mode: anonymous only (no AUTH_PROVIDER configured)") + # Determine if single-user localhost mode applies. + host = app.config.get('CLI_ARGS', {}).get('host', os.environ.get('HOST', '127.0.0.1')) + if host in ('127.0.0.1', 'localhost', '::1'): + try: + username = getpass.getuser() + validated = _validate_identity_value(username, "os_username") + _localhost_identity = f"local:{validated}" + logger.info( + "Auth mode: single-user localhost (identity=%s)", + _localhost_identity, + ) + except Exception: + logger.warning( + "Could not determine OS username; falling back to anonymous mode" + ) + logger.info("Auth mode: anonymous only (no AUTH_PROVIDER configured)") + else: + logger.info("Auth mode: anonymous only (no AUTH_PROVIDER configured)") return provider_cls = get_provider_class(provider_name) @@ -129,11 +158,12 @@ def get_identity_id() -> str: Resolution order: 1. Active AuthProvider → ``user:`` - 2. Anonymous fallback (``ALLOW_ANONYMOUS=true``) → ``browser:`` - 3. Neither → ``ValueError`` + 2. Single-user localhost → ``local:`` + 3. Anonymous fallback (``ALLOW_ANONYMOUS=true``) → ``browser:`` + 4. Neither → ``ValueError`` Returns: - ``"user:"`` or ``"browser:"`` + ``"user:"``, ``"local:"``, or ``"browser:"`` Raises: ValueError: when no identity can be determined. 
@@ -157,6 +187,10 @@ def get_identity_id() -> str: ) raise ValueError(f"Authentication failed: {e}") + # --- single-user localhost ----------------------------------------- + if _localhost_identity: + return _localhost_identity + # --- anonymous fallback -------------------------------------------- if _allow_anonymous: client_identity = request.headers.get("X-Identity-Id") diff --git a/py-src/data_formulator/tables_routes.py b/py-src/data_formulator/tables_routes.py index 1e2a57d2..d24b1208 100644 --- a/py-src/data_formulator/tables_routes.py +++ b/py-src/data_formulator/tables_routes.py @@ -13,7 +13,6 @@ from flask import request, jsonify, Blueprint, Response import pandas as pd from pathlib import Path -from data_formulator.data_loader import DATA_LOADERS, DISABLED_LOADERS from data_formulator.security.auth import get_identity_id from data_formulator.datalake.workspace import Workspace from data_formulator.workspace_factory import get_workspace as _create_workspace @@ -793,272 +792,4 @@ def sanitize_db_error_message(error: Exception) -> tuple[str, int]: if re.search(pattern, error_msg, re.IGNORECASE): return safe_msg, status_code - return "An unexpected error occurred", 500 - - -@tables_bp.route('/data-loader/list-data-loaders', methods=['GET']) -def data_loader_list_data_loaders(): - """List all available data loaders and disabled ones with install hints.""" - - try: - return jsonify({ - "status": "success", - "data_loaders": { - name: { - "params": data_loader.list_params(), - "auth_instructions": data_loader.auth_instructions() - } - for name, data_loader in DATA_LOADERS.items() - }, - "disabled_loaders": { - name: {"install_hint": hint} - for name, hint in DISABLED_LOADERS.items() - } - }) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({ - "status": "error", - "message": safe_msg - }), status_code - -@tables_bp.route('/data-loader/list-tables', methods=['POST']) -def data_loader_list_tables(): - """List tables 
from a data loader (no workspace needed).""" - try: - data = request.get_json() - data_loader_type = data.get('data_loader_type') - data_loader_params = data.get('data_loader_params') - table_filter = data.get('table_filter', None) - - if data_loader_type not in DATA_LOADERS: - return jsonify({"status": "error", "message": f"Invalid data loader type. Must be one of: {', '.join(DATA_LOADERS.keys())}"}), 400 - - data_loader = DATA_LOADERS[data_loader_type](data_loader_params) - if hasattr(data_loader, 'list_tables') and 'table_filter' in data_loader.list_tables.__code__.co_varnames: - tables = data_loader.list_tables(table_filter=table_filter) - else: - tables = data_loader.list_tables() - - return jsonify({"status": "success", "tables": tables}) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/ingest-data', methods=['POST']) -def data_loader_ingest_data(): - """Ingest data from a data loader into the workspace as parquet.""" - try: - data = request.get_json() - data_loader_type = data.get('data_loader_type') - data_loader_params = data.get('data_loader_params') - table_name = data.get('table_name') - import_options = data.get('import_options', {}) or {} - row_limit = import_options.get('row_limit', 1000000) - sort_columns = import_options.get('sort_columns') - sort_order = import_options.get('sort_order', 'asc') - - if data_loader_type not in DATA_LOADERS: - return jsonify({"status": "error", "message": f"Invalid data loader type. Must be one of: {', '.join(DATA_LOADERS.keys())}"}), 400 - - workspace = _get_workspace() - data_loader = DATA_LOADERS[data_loader_type](data_loader_params) - safe_name = parquet_sanitize_table_name(table_name.split('.')[-1] if '.' 
in table_name else table_name) - meta = data_loader.ingest_to_workspace( - workspace, - safe_name, - source_table=table_name, - import_options={ - "size": row_limit, - "sort_columns": sort_columns, - "sort_order": sort_order, - }, - ) - return jsonify({ - "status": "success", - "message": "Successfully ingested data from data loader", - "table_name": meta.name, - }) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/view-query-sample', methods=['POST']) -def data_loader_view_query_sample(): - """View a sample of data from a query (fetches from external source, no workspace).""" - try: - data = request.get_json() - data_loader_type = data.get('data_loader_type') - data_loader_params = data.get('data_loader_params') - query = data.get('query') - - if data_loader_type not in DATA_LOADERS: - return jsonify({"status": "error", "message": f"Invalid data loader type. Must be one of: {', '.join(DATA_LOADERS.keys())}"}), 400 - - data_loader = DATA_LOADERS[data_loader_type](data_loader_params) - if hasattr(data_loader, 'view_query_sample') and callable(getattr(data_loader, 'view_query_sample')): - sample = data_loader.view_query_sample(query) - else: - return jsonify({ - "status": "error", - "message": "Query sample is only supported for loaders that implement view_query_sample. Use a source table to fetch data.", - }), 400 - return jsonify({"status": "success", "sample": sample, "message": "Successfully retrieved query sample"}) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "sample": [], "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/fetch-data', methods=['POST']) -def data_loader_fetch_data(): - """Fetch data from an external data loader and return as JSON rows WITHOUT saving to workspace. 
- - This is used when storeOnServer=false (local-only / incognito mode). - The data is returned directly to the frontend without being persisted as parquet. - """ - try: - data = request.get_json() - data_loader_type = data.get('data_loader_type') - data_loader_params = data.get('data_loader_params') - table_name = data.get('table_name') - row_limit = data.get('row_limit', 10000) - sort_columns = data.get('sort_columns') - sort_order = data.get('sort_order', 'asc') - - if not data_loader_type or not table_name: - return jsonify({"status": "error", "message": "data_loader_type and table_name are required"}), 400 - - if data_loader_type not in DATA_LOADERS: - return jsonify({"status": "error", "message": f"Invalid data loader type. Must be one of: {', '.join(DATA_LOADERS.keys())}"}), 400 - - data_loader = DATA_LOADERS[data_loader_type](data_loader_params) - - # Fetch data as DataFrame (not Arrow, since we need JSON output not parquet) - df = data_loader.fetch_data_as_dataframe( - source_table=table_name, - import_options={ - "size": row_limit, - "sort_columns": sort_columns, - "sort_order": sort_order, - }, - ) - - total_row_count = len(df) - # Apply row limit - if len(df) > row_limit: - df = df.head(row_limit) - - df = _dedup_dataframe_columns(df) - rows = json.loads(df.to_json(orient='records', date_format='iso')) - columns = [{"name": col, "type": str(df[col].dtype)} for col in df.columns] - - return jsonify({ - "status": "success", - "rows": rows, - "columns": columns, - "total_row_count": total_row_count, - "row_limit_applied": row_limit, - }) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/ingest-data-from-query', methods=['POST']) -def data_loader_ingest_data_from_query(): - """Ingest data from a query into the workspace as parquet.""" - return jsonify({ - "status": "error", - "message": "Ingestion from custom query is not 
supported. Please select a source table to ingest.", - }), 400 - - -@tables_bp.route('/data-loader/refresh-table', methods=['POST']) -def data_loader_refresh_table(): - """Refresh a table by re-fetching from its source and updating parquet in the workspace.""" - try: - data = request.get_json() - table_name = data.get('table_name') - updated_params = data.get('data_loader_params', {}) - - if not table_name: - return jsonify({"status": "error", "message": "table_name is required"}), 400 - - workspace = _get_workspace() - meta = workspace.get_table_metadata(table_name) - if meta is None: - return jsonify({"status": "error", "message": f"No table '{table_name}' found. Cannot refresh."}), 400 - if not meta.loader_type: - return jsonify({"status": "error", "message": f"No source metadata for table '{table_name}'. Cannot refresh."}), 400 - - old_content_hash = meta.content_hash - data_loader_type = meta.loader_type - data_loader_params = {**(meta.loader_params or {}), **updated_params} - - if data_loader_type not in DATA_LOADERS: - return jsonify({"status": "error", "message": f"Unknown data loader type: {data_loader_type}"}), 400 - - data_loader = DATA_LOADERS[data_loader_type](data_loader_params) - if meta.source_table: - arrow_table = data_loader.fetch_data_as_arrow( - source_table=meta.source_table, - import_options=meta.import_options, - ) - else: - return jsonify({ - "status": "error", - "message": "Refresh is not supported for tables ingested from a query. 
Only table-based sources can be refreshed.", - }), 400 - - new_meta, data_changed = workspace.refresh_parquet_from_arrow(table_name, arrow_table) - return jsonify({ - "status": "success", - "message": f"Successfully refreshed table '{table_name}'", - "row_count": new_meta.row_count, - "content_hash": new_meta.content_hash, - "data_changed": data_changed, - }) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/get-table-metadata', methods=['POST']) -def data_loader_get_table_metadata(): - """Get source metadata for a specific table from workspace.""" - try: - data = request.get_json() - table_name = data.get('table_name') - if not table_name: - return jsonify({"status": "error", "message": "table_name is required"}), 400 - - workspace = _get_workspace() - meta = workspace.get_table_metadata(table_name) - metadata = _table_metadata_to_source_metadata(meta) if meta else None - return jsonify({ - "status": "success", - "metadata": metadata, - "message": f"No metadata found for table '{table_name}'" if metadata is None else None, - }) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - -@tables_bp.route('/data-loader/list-table-metadata', methods=['GET']) -def data_loader_list_table_metadata(): - """Get source metadata for all tables in the workspace.""" - try: - workspace = _get_workspace() - metadata_list = [] - for name in workspace.list_tables(): - meta = workspace.get_table_metadata(name) - m = _table_metadata_to_source_metadata(meta) if meta else None - if m: - metadata_list.append(m) - return jsonify({"status": "success", "metadata": metadata_list}) - except Exception as e: - safe_msg, status_code = sanitize_db_error_message(e) - return jsonify({"status": "error", "message": safe_msg}), status_code \ No newline at end of file 
+ return "An unexpected error occurred", 500 \ No newline at end of file diff --git a/src/app/App.tsx b/src/app/App.tsx index 92a9670f..50f69a35 100644 --- a/src/app/App.tsx +++ b/src/app/App.tsx @@ -942,50 +942,61 @@ export const AppFC: FC = function AppFC(appProps) { } }, [configLoaded]); - // Unified auth initialisation — driven by /api/auth/info + // Unified auth initialisation — driven by /api/auth/info and server IDENTITY const [authChecked, setAuthChecked] = useState(false); const [migrationBrowserId, setMigrationBrowserId] = useState(null); + const serverConfig = useSelector((state: DataFormulatorState) => state.serverConfig); useEffect(() => { + if (!configLoaded) return; + (async () => { const prevType = localStorage.getItem('df_identity_type'); const prevBrowserId = localStorage.getItem('df_browser_id'); - let resolvedIdentity: { type: 'user' | 'browser'; id: string; displayName?: string } | null = null; + let resolvedIdentity: { type: 'user' | 'browser' | 'local'; id: string; displayName?: string } | null = null; - try { - const info: AuthInfo | null = await getAuthInfo(); + // Check if the server assigned a fixed identity (e.g. localhost mode) + const serverIdentity = serverConfig?.IDENTITY; + if (serverIdentity?.type === 'local' && serverIdentity?.id) { + resolvedIdentity = { type: 'local', id: serverIdentity.id }; + } - if (info?.action === 'frontend') { - // OIDC PKCE — check for an existing session - const user = await getOidcUser(); - if (user && !user.expired) { - resolvedIdentity = { - type: 'user', - id: user.profile.sub, - displayName: user.profile.name ?? 
undefined, - }; - } - } else if (info?.action === 'transparent') { - // Azure App Service EasyAuth — headers injected by Azure - try { - const resp = await fetch('/.auth/me'); - const result = await resp.json(); - if (Array.isArray(result) && result.length > 0) { - const authData = result[0]; - const name = authData['user_claims']?.find((item: any) => item.typ === 'name')?.val || ''; - const userId = authData['user_id']; - if (userId) { - resolvedIdentity = { type: 'user', id: userId, displayName: name }; + if (!resolvedIdentity) { + try { + const info: AuthInfo | null = await getAuthInfo(); + + if (info?.action === 'frontend') { + // OIDC PKCE — check for an existing session + const user = await getOidcUser(); + if (user && !user.expired) { + resolvedIdentity = { + type: 'user', + id: user.profile.sub, + displayName: user.profile.name ?? undefined, + }; + } + } else if (info?.action === 'transparent') { + // Azure App Service EasyAuth — headers injected by Azure + try { + const resp = await fetch('/.auth/me'); + const result = await resp.json(); + if (Array.isArray(result) && result.length > 0) { + const authData = result[0]; + const name = authData['user_claims']?.find((item: any) => item.typ === 'name')?.val || ''; + const userId = authData['user_id']; + if (userId) { + resolvedIdentity = { type: 'user', id: userId, displayName: name }; + } } + } catch { + // fall through to browser identity } - } catch { - // fall through to browser identity } + // 'redirect' and 'none' → browser identity (resolvedIdentity stays null) + } catch { + // fall through to browser identity } - // 'redirect' and 'none' → browser identity (resolvedIdentity stays null) - } catch { - // fall through to browser identity } if (!resolvedIdentity) { @@ -1011,7 +1022,7 @@ export const AppFC: FC = function AppFC(appProps) { setAuthChecked(true); })(); - }, []); + }, [configLoaded]); useEffect(() => { document.title = toolName; diff --git a/src/app/dfSlice.tsx b/src/app/dfSlice.tsx index 
7a51e2d9..430e1d0a 100644 --- a/src/app/dfSlice.tsx +++ b/src/app/dfSlice.tsx @@ -41,7 +41,8 @@ export interface SSEMessage { // Add interface for app configuration export interface ServerConfig { DISABLE_DISPLAY_KEYS: boolean; - DISABLE_FILE_UPLOAD: boolean; + DISABLE_DATA_CONNECTORS: boolean; + DISABLE_CUSTOM_MODELS: boolean; PROJECT_FRONT_PAGE: boolean; MAX_DISPLAY_ROWS: number; AVAILABLE_LANGUAGES: string[]; @@ -55,17 +56,23 @@ export interface ServerConfig { [key: string]: unknown; }; PLUGINS?: Record; - SOURCES?: Array<{ + CONNECTORS?: Array<{ source_id: string; source_type: string; name: string; icon: string; - params_form: Array<{name: string; type: string; required: boolean; default?: string; description?: string}>; + params_form: Array<{name: string; type: string; required: boolean; default?: string; description?: string; sensitive?: boolean; tier?: 'connection' | 'auth' | 'filter'}>; pinned_params: Record; hierarchy: Array<{key: string; label: string}>; effective_hierarchy: Array<{key: string; label: string}>; auth_instructions: string; + auth_mode?: string; + delegated_login?: { login_url: string; label?: string } | null; }>; + DISABLED_SOURCES?: Record; + CONNECTED_CONNECTORS?: string[]; + IDENTITY?: { type: string; id: string }; + CREDENTIAL_VAULT_ENABLED?: boolean; } export interface ModelConfig { @@ -116,8 +123,8 @@ export interface DataFormulatorState { exploration: string; }; - // Identity management: user identity (if logged in) or browser identity (localStorage-based) - // Always initialized with browser identity, updated to user identity if logged in + // Identity management: local (localhost), user (SSO), or browser (anonymous multi-user) + // Initialized with browser identity, then updated from server config or auth provider identity: Identity; /** * Server-managed global models loaded from the backend on every app start. 
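The auth changes above resolve identity in a fixed order: a server-assigned `local` identity (localhost mode) short-circuits everything, then a logged-in user (OIDC or EasyAuth), then the anonymous browser id. A sketch of that precedence as a pure function — the function name and argument shapes are illustrative, only the ordering comes from the patch:

```typescript
type IdentityType = 'user' | 'browser' | 'local';
interface Identity { type: IdentityType; id: string; displayName?: string }

// Same precedence as the App.tsx effect:
// server 'local' identity → logged-in user → anonymous browser id.
function resolveIdentity(
  serverIdentity: { type: string; id: string } | undefined,
  loggedInUser: { id: string; name?: string } | null,
  browserId: string,
): Identity {
  if (serverIdentity?.type === 'local' && serverIdentity.id) {
    return { type: 'local', id: serverIdentity.id };
  }
  if (loggedInUser) {
    return { type: 'user', id: loggedInUser.id, displayName: loggedInUser.name };
  }
  return { type: 'browser', id: browserId };
}
```

This is why the effect now waits on `configLoaded`: the server config must be present before the `local` branch can be evaluated, otherwise a localhost deployment would briefly fall through to a browser identity.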
@@ -204,7 +211,8 @@ const initialState: DataFormulatorState = { serverConfig: { DISABLE_DISPLAY_KEYS: false, - DISABLE_FILE_UPLOAD: false, + DISABLE_DATA_CONNECTORS: false, + DISABLE_CUSTOM_MODELS: false, PROJECT_FRONT_PAGE: false, MAX_DISPLAY_ROWS: 10000, AVAILABLE_LANGUAGES: ['en', 'zh'], diff --git a/src/app/identity.ts b/src/app/identity.ts index e92a91ed..b7316bf1 100644 --- a/src/app/identity.ts +++ b/src/app/identity.ts @@ -14,7 +14,7 @@ const BROWSER_ID_KEY = 'df_browser_id'; -export type IdentityType = 'user' | 'browser'; +export type IdentityType = 'user' | 'browser' | 'local'; export interface Identity { type: IdentityType; diff --git a/src/app/tableThunks.ts b/src/app/tableThunks.ts index c32a62cb..266b1574 100644 --- a/src/app/tableThunks.ts +++ b/src/app/tableThunks.ts @@ -16,7 +16,7 @@ import { createAsyncThunk } from '@reduxjs/toolkit'; import { DataSourceConfig, DictTable } from '../components/ComponentType'; import { Type } from '../data/types'; import { inferTypeFromValueArray } from '../data/utils'; -import { fetchWithIdentity, getUrls, getSourceUrls, computeContentHash } from './utils'; +import { fetchWithIdentity, getUrls, getConnectorUrls, computeContentHash } from './utils'; import { DataFormulatorState, dfActions, fetchFieldSemanticType } from './dfSlice'; import { tableDataDB } from './workspaceDB'; @@ -40,12 +40,9 @@ export interface LoadTablePayload { // orphaned sheets when re-uploading a file. 
replaceSource?: boolean; - // For database sources loaded via external data loader: - dataLoaderType?: string; - dataLoaderParams?: Record; + // For database sources loaded via data connector: sourceTableName?: string; - // For connected data sources (new /api/sources/{id}/ routes): - connectedSourceId?: string; + connectorId?: string; importOptions?: { rowLimit?: number; sortColumns?: string[]; @@ -76,7 +73,7 @@ export const loadTable = createAsyncThunk< >( 'dataFormulator/loadTable', async (payload, { dispatch, getState }) => { - const { table, file, replaceSource, dataLoaderType, dataLoaderParams, sourceTableName, connectedSourceId, importOptions } = payload; + const { table, file, replaceSource, sourceTableName, connectorId, importOptions } = payload; const state = getState(); const frontendRowLimit = state.config?.frontendRowLimit ?? 50000; const existingTables = state.tables; @@ -131,29 +128,15 @@ export const loadTable = createAsyncThunk< if (storeOnServer) { // === STORE ON SERVER PATH === - if (sourceType === 'database' && sourceTableName && (dataLoaderType || connectedSourceId)) { - // Database source: ingest to workspace via data loader or connected source + if (sourceType === 'database' && sourceTableName && connectorId) { + // Database source: ingest to workspace via data connector try { - let ingestUrl: string; - let ingestBody: any; - if (connectedSourceId) { - // Connected source route - ingestUrl = getSourceUrls(connectedSourceId).DATA_IMPORT; - ingestBody = { - source_table: sourceTableName, - table_name: sourceTableName, - import_options: importOptions || {}, - }; - } else { - // Legacy data loader route - ingestUrl = getUrls().DATA_LOADER_INGEST_DATA; - ingestBody = { - data_loader_type: dataLoaderType, - data_loader_params: dataLoaderParams, - table_name: sourceTableName, - import_options: importOptions || {}, - }; - } + const ingestUrl = getConnectorUrls(connectorId).DATA_IMPORT; + const ingestBody = { + source_table: sourceTableName, + 
table_name: sourceTableName, + import_options: importOptions || {}, + }; const response = await fetchWithIdentity(ingestUrl, { method: 'POST', headers: { 'Content-Type': 'application/json' }, @@ -244,26 +227,26 @@ export const loadTable = createAsyncThunk< } } else { // === LOCAL ONLY PATH (storeOnServer = false) === - if (sourceType === 'database' && dataLoaderType && dataLoaderParams && sourceTableName) { - // Database source: fetch data without saving to workspace + if (sourceType === 'database' && connectorId && sourceTableName) { + // Database source: fetch data via data connector preview (no workspace save) try { - const response = await fetchWithIdentity(getUrls().DATA_LOADER_FETCH_DATA, { + const response = await fetchWithIdentity(getConnectorUrls(connectorId).DATA_PREVIEW, { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ - data_loader_type: dataLoaderType, - data_loader_params: dataLoaderParams, - table_name: sourceTableName, - row_limit: frontendRowLimit, - sort_columns: importOptions?.sortColumns, - sort_order: importOptions?.sortOrder, + source_table: sourceTableName, + import_options: { + size: frontendRowLimit, + sort_columns: importOptions?.sortColumns, + sort_order: importOptions?.sortOrder, + }, }), }); const data = await response.json(); if (data.status === 'success') { const rows = data.rows; const names = rows.length > 0 ? Object.keys(rows[0]) : []; - const totalCount: number = data.total_row_count ?? rows.length; + const totalCount: number = data.total_row_count ?? table.virtual?.rowCount ?? 
rows.length; originalRowCount = totalCount; truncated = rows.length < totalCount; @@ -282,7 +265,6 @@ export const loadTable = createAsyncThunk< levels: [] } }), {}), - // No virtual field = local-only (not stored on server) anchored: true, }; } else { diff --git a/src/app/useDataRefresh.tsx b/src/app/useDataRefresh.tsx index 87b3c552..41ee25ef 100644 --- a/src/app/useDataRefresh.tsx +++ b/src/app/useDataRefresh.tsx @@ -7,7 +7,7 @@ import { DataFormulatorState, dfActions, selectRefreshConfigs } from './dfSlice' import { AppDispatch } from './store'; import { DictTable } from '../components/ComponentType'; import { createTableFromText } from '../data/utils'; -import { fetchWithIdentity, getUrls, computeContentHash } from './utils'; +import { fetchWithIdentity, getUrls, getConnectorUrls, computeContentHash } from './utils'; /** Gzip-compress a string into a Blob using the browser's CompressionStream API. */ async function compressBlob(data: string): Promise { @@ -105,9 +105,8 @@ export function useDataRefresh() { /** * Refreshes a virtual table from its original database source. - * Backend stores connection info and knows how to refresh - frontend just triggers it. + * Uses the connected source refresh endpoint when available. * Backend returns data_changed flag - if false, skip resampling to avoid unnecessary work. - * DuckDB views that depend on this table will auto-recalculate only if data changed. */ const refreshDatabaseTable = useCallback(async (table: DictTable): Promise => { if (!table.virtual?.tableId) { @@ -115,12 +114,16 @@ export function useDataRefresh() { } const tableName = table.virtual.tableId; + const connectorId = table.source?.connectorId; + + if (!connectorId) { + return { tableId: table.id, success: false, message: 'No connector for this table. Please reconnect to the data source.' 
}; + } try { - // Tell backend to refresh the table - it has the connection info stored - console.log(`[DataRefresh] Requesting backend to refresh "${tableName}" from external source...`); + console.log(`[DataRefresh] Requesting connector '${connectorId}' to refresh "${tableName}"...`); - const refreshResponse = await fetchWithIdentity(getUrls().DATA_LOADER_REFRESH_TABLE, { + const refreshResponse = await fetchWithIdentity(getConnectorUrls(connectorId).DATA_REFRESH, { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ table_name: tableName }) @@ -129,16 +132,15 @@ export function useDataRefresh() { const refreshData = await refreshResponse.json(); if (refreshData.status !== 'success') { - // Backend doesn't have connection info for this table console.log(`[DataRefresh] Cannot refresh "${tableName}": ${refreshData.message}`); return { tableId: table.id, success: false, - message: refreshData.message || 'No connection info stored for this table' + message: refreshData.message || 'Refresh failed. You may need to reconnect to the data source.' 
}; } - console.log(`[DataRefresh] Backend refreshed "${tableName}" (${refreshData.row_count} rows, data_changed=${refreshData.data_changed}, hash=${refreshData.content_hash?.slice(0, 8)})`); + console.log(`[DataRefresh] Backend refreshed "${tableName}" (${refreshData.row_count} rows, data_changed=${refreshData.data_changed})`); // If data hasn't changed, skip resampling - no need to update frontend if (!refreshData.data_changed) { @@ -147,7 +149,6 @@ export function useDataRefresh() { tableId: table.id, success: true, message: `Data unchanged (${refreshData.row_count} rows)`, - // Don't include newRows - signals no update needed }; } diff --git a/src/app/utils.tsx b/src/app/utils.tsx index 6595a892..e0374f77 100644 --- a/src/app/utils.tsx +++ b/src/app/utils.tsx @@ -49,16 +49,6 @@ export function getUrls() { SYNC_TABLE_DATA: `/api/tables/sync-table-data`, EXPORT_TABLE_CSV: `/api/tables/export-table-csv`, - DATA_LOADER_LIST_DATA_LOADERS: `/api/tables/data-loader/list-data-loaders`, - DATA_LOADER_LIST_TABLES: `/api/tables/data-loader/list-tables`, - DATA_LOADER_INGEST_DATA: `/api/tables/data-loader/ingest-data`, - DATA_LOADER_VIEW_QUERY_SAMPLE: `/api/tables/data-loader/view-query-sample`, - DATA_LOADER_INGEST_DATA_FROM_QUERY: `/api/tables/data-loader/ingest-data-from-query`, - DATA_LOADER_REFRESH_TABLE: `/api/tables/data-loader/refresh-table`, - DATA_LOADER_FETCH_DATA: `/api/tables/data-loader/fetch-data`, - DATA_LOADER_GET_TABLE_METADATA: `/api/tables/data-loader/get-table-metadata`, - DATA_LOADER_LIST_TABLE_METADATA: `/api/tables/data-loader/list-table-metadata`, - GET_RECOMMENDATION_QUESTIONS: `/api/agent/get-recommendation-questions`, GENERATE_REPORT_STREAM: `/api/agent/generate-report-stream`, @@ -82,14 +72,15 @@ export function getUrls() { } /** - * Build API URLs for a ConnectedDataSource by source_id. + * Build API URLs for a DataConnector by connector ID. 
*/ -export function getSourceUrls(sourceId: string) { - const base = `/api/sources/${sourceId}`; +export function getConnectorUrls(connectorId: string) { + const base = `/api/connectors/${connectorId}`; return { AUTH_CONNECT: `${base}/auth/connect`, AUTH_DISCONNECT: `${base}/auth/disconnect`, AUTH_STATUS: `${base}/auth/status`, + AUTH_TOKEN_CONNECT: `${base}/auth/token-connect`, CATALOG_LS: `${base}/catalog/ls`, CATALOG_METADATA: `${base}/catalog/metadata`, CATALOG_LIST_TABLES: `${base}/catalog/list_tables`, diff --git a/src/components/ComponentType.tsx b/src/components/ComponentType.tsx index 4e972739..f136f149 100644 --- a/src/components/ComponentType.tsx +++ b/src/components/ComponentType.tsx @@ -133,6 +133,9 @@ export interface DataSourceConfig { // Whether this table can be refreshed (backend has connection info) canRefresh?: boolean; + // Connector ID (for tables loaded via a DataConnector) + connectorId?: string; + // The original table name before backend sanitization (e.g. "Sales Report 2024") originalTableName?: string; } diff --git a/src/i18n/locales/en/common.json b/src/i18n/locales/en/common.json index abda974d..a09448c1 100644 --- a/src/i18n/locales/en/common.json +++ b/src/i18n/locales/en/common.json @@ -298,7 +298,9 @@ "failedIngestData": "Failed to ingest data: {{error}}", "emptyValue": "(empty)", "notInstalledHint": "Not installed. Run: {{hint}}", - "selectDataLoader": "Select a data loader from the left panel", + "selectDataLoader": "Select a data source from the left panel", + "connectedSection": "Connected", + "availableSection": "Available", "uploadingData": "Uploading data...", "rowsCount": "{{count}} rows", "sampleRowsCount": "{{count}} sample rows", @@ -307,7 +309,16 @@ "subsetLoaded": "Subset loaded", "unload": "Unload", "loadTableSubset": "Load Table Subset", - "loadTableBtn": "Load Table" + "loadTableBtn": "Load Table", + "rememberCredentials": "Remember credentials", + "connectionTimeout": "Connection timed out. 
Please check your credentials and try again.", + "delegatedLogin": "Login via service", + "popupBlocked": "Popup was blocked. Please allow popups and try again.", + "tierConnection": "Connection", + "tierAuth": "Sign in", + "tierFilter": "Scope", + "tierAuthOr": "or", + "tierAuthManual": "Enter credentials manually" }, "dataThread": { "title": "Data Threads", diff --git a/src/i18n/locales/zh/common.json b/src/i18n/locales/zh/common.json index f4765caf..ffb88758 100644 --- a/src/i18n/locales/zh/common.json +++ b/src/i18n/locales/zh/common.json @@ -298,7 +298,9 @@ "failedIngestData": "写入数据失败:{{error}}", "emptyValue": "(空)", "notInstalledHint": "未安装。请运行:{{hint}}", - "selectDataLoader": "请从左侧面板选择数据加载器", + "selectDataLoader": "请从左侧面板选择数据源", + "connectedSection": "已连接", + "availableSection": "可用", "uploadingData": "正在上传数据...", "rowsCount": "{{count}} 行", "sampleRowsCount": "{{count}} 行(样本)", @@ -307,7 +309,9 @@ "subsetLoaded": "子集已加载", "unload": "卸载", "loadTableSubset": "加载表子集", - "loadTableBtn": "加载表" + "loadTableBtn": "加载表", + "rememberCredentials": "记住凭据", + "connectionTimeout": "连接超时。请检查凭据后重试。" }, "dataThread": { "title": "数据线程", diff --git a/src/views/DBTableManager.tsx b/src/views/DBTableManager.tsx index 43d5d6fb..8b95c33b 100644 --- a/src/views/DBTableManager.tsx +++ b/src/views/DBTableManager.tsx @@ -33,7 +33,7 @@ import SearchIcon from '@mui/icons-material/Search'; import Autocomplete from '@mui/material/Autocomplete'; -import { getUrls, getSourceUrls, fetchWithIdentity } from '../app/utils'; +import { getUrls, getConnectorUrls, fetchWithIdentity } from '../app/utils'; import { borderColor } from '../app/tokens'; import { CustomReactTable } from './ReactTable'; import { DataSourceConfig, DictTable } from '../components/ComponentType'; @@ -53,6 +53,7 @@ import UploadFileIcon from '@mui/icons-material/UploadFile'; import DownloadIcon from '@mui/icons-material/Download'; import RestartAltIcon from '@mui/icons-material/RestartAlt'; import CloudUploadIcon from 
'@mui/icons-material/CloudUpload'; +import ClearIcon from '@mui/icons-material/Clear'; export const handleDBDownload = async (identityId: string) => { @@ -136,14 +137,18 @@ export const DBManagerPane: React.FC<{ const serverConfig = useSelector((state: DataFormulatorState) => state.serverConfig); const dataLoaderConnectParams = useSelector((state: DataFormulatorState) => state.dataLoaderConnectParams); + // Disabled data sources (missing deps) from app-config + const disabledSources = serverConfig.DISABLED_SOURCES ?? {}; - // maps data loader type to list of param defs - const [dataLoaderMetadata, setDataLoaderMetadata] = useState>({}); + // Sources with vault credentials or active in-memory loaders + const [connectedIds, setConnectedIds] = useState>( + new Set(serverConfig.CONNECTED_CONNECTORS ?? []) + ); - // loaders whose deps are missing on the server, keyed by name -> install hint - const [disabledLoaders, setDisabledLoaders] = useState>({}); + // Split sources into connected vs available + const allSources = serverConfig.CONNECTORS ?? []; + const connectedSources = allSources.filter(s => connectedIds.has(s.source_id)); + const availableSources = allSources.filter(s => !connectedIds.has(s.source_id)); const [dbTables, setDbTables] = useState([]); const [selectedTabKey, setSelectedTabKey] = useState(""); @@ -160,10 +165,6 @@ export const DBManagerPane: React.FC<{ })); } - useEffect(() => { - fetchDataLoaders(); - }, []); - useEffect(() => { if (selectedDataLoader === "") { if (dbTables.length == 0) { @@ -201,70 +202,78 @@ export const DBManagerPane: React.FC<{ return undefined; }; - const fetchDataLoaders = async () => { - fetchWithIdentity(getUrls().DATA_LOADER_LIST_DATA_LOADERS, { - method: 'GET', - headers: { - 'Content-Type': 'application/json', - }, - }) - .then(response => response.json()) - .then(data => { - if (data.status === "success") { - setDataLoaderMetadata(data.data_loaders); - setDisabledLoaders(data.disabled_loaders ?? 
{}); - } else { - console.error('Failed to fetch data loader params:', data.error); - } - }) - .catch(error => { - console.error('Failed to fetch data loader params:', error); - }); - } - useEffect(() => { fetchTables(); }, []); + const sourceButtonSx = (sourceId: string) => ({ + fontSize: 12, + textTransform: "none" as const, + width: '100%', + justifyContent: 'flex-start', + textAlign: 'left' as const, + borderRadius: 0, + py: 1, + px: 2, + color: selectedDataLoader === sourceId ? 'primary.main' : 'text.secondary', + borderRight: selectedDataLoader === sourceId ? 2 : 0, + borderColor: 'primary.main', + }); + let tableSelectionPanel = - {/* Active data loaders */} - {Object.keys(dataLoaderMetadata ?? {}).map((dataLoaderType) => ( + {/* Connected sources — user has stored or active credentials */} + {connectedSources.length > 0 && ( + + {t('db.connectedSection')} + + )} + {connectedSources.map((source) => ( ))} - {/* Disabled loaders (missing deps) — greyed out with install hint */} - {Object.keys(disabledLoaders).length > 0 && ( + {/* Available sources — registered but no credentials */} + {availableSources.length > 0 && ( + + {t('db.availableSection')} + + )} + {availableSources.map((source) => ( + + ))} + + {/* Disabled sources (missing deps) — greyed out with install hint */} + {Object.keys(disabledSources).length > 0 && ( )} - {Object.entries(disabledLoaders).map(([loaderName, { install_hint }]) => ( + {Object.entries(disabledSources).map(([sourceName, { install_hint }]) => ( - {loaderName} + {sourceName} ))} - - {/* Connected data sources from /api/app-config SOURCES */} - {(serverConfig.SOURCES ?? []).length > 0 && ( - - )} - {(serverConfig.SOURCES ?? 
[]).map((source) => ( - - ))} let dataConnectorView = - {/* Empty state when no loader selected */} + {/* Empty state when no source selected */} {selectedDataLoader === '' && ( @@ -338,42 +318,20 @@ export const DBManagerPane: React.FC<{ )} - - {/* Data loader forms */} - {dataLoaderMetadata && Object.entries(dataLoaderMetadata).map(([dataLoaderType, metadata]) => ( - selectedDataLoader === dataLoaderType && ( - - { - setIsUploading(true); - }} - onFinish={(status, message, importedTables) => { - setIsUploading(false); - if (status === "success") { - setSystemMessage(message, "success"); - } else { - setSystemMessage(message, "error"); - } - }} - /> - - ) - ))} - {/* Connected data source forms */} - {(serverConfig.SOURCES ?? []).map((source) => ( - selectedDataLoader === `source:${source.source_id}` && ( + {/* Data source forms (connected + available) */} + {allSources.map((source) => ( + selectedDataLoader === source.source_id && ( { setIsUploading(true); }} @@ -384,7 +342,17 @@ export const DBManagerPane: React.FC<{ } else { setSystemMessage(message, "error"); } - }} + }} + onConnected={() => { + setConnectedIds(prev => new Set([...prev, source.source_id])); + }} + onDisconnected={() => { + setConnectedIds(prev => { + const next = new Set(prev); + next.delete(source.source_id); + return next; + }); + }} /> ) @@ -460,12 +428,17 @@ export const DBManagerPane: React.FC<{ export const DataLoaderForm: React.FC<{ dataLoaderType: string, - paramDefs: {name: string, default?: string, type: string, required: boolean, description?: string}[], + paramDefs: {name: string, default?: string, type: string, required: boolean, description?: string, sensitive?: boolean, tier?: 'connection' | 'auth' | 'filter'}[], authInstructions: string, - connectedSourceId?: string, + connectorId?: string, + autoConnect?: boolean, + delegatedLogin?: { login_url: string; label?: string } | null, + authMode?: string, onImport: () => void, - onFinish: (status: "success" | "error", message: 
string, importedTables?: string[]) => void -}> = ({dataLoaderType, paramDefs, authInstructions, connectedSourceId, onImport, onFinish}) => { + onFinish: (status: "success" | "error", message: string, importedTables?: string[]) => void, + onConnected?: () => void, + onDisconnected?: () => void, +}> = ({dataLoaderType, paramDefs, authInstructions, connectorId, autoConnect, delegatedLogin, authMode, onImport, onFinish, onConnected, onDisconnected}) => { const { t } = useTranslation(); const dispatch = useDispatch(); const theme = useTheme(); @@ -475,7 +448,6 @@ export const DataLoaderForm: React.FC<{ const [tableMetadata, setTableMetadata] = useState>({}); const [selectedPreviewTable, setSelectedPreviewTable] = useState(null); - let [tableFilter, setTableFilter] = useState(""); // Import mode for the currently selected table const [importMode, setImportMode] = useState<'full' | 'subset'>('full'); const [subsetConfig, setSubsetConfig] = useState<{ rowLimit: number; sortColumns: string[]; sortOrder: 'asc' | 'desc' }>({ rowLimit: 1000, sortColumns: [], sortOrder: 'asc' }); @@ -502,63 +474,205 @@ export const DataLoaderForm: React.FC<{ }, [workspaceLoadedTables, loadedTables]); let [isConnecting, setIsConnecting] = useState(false); + const [persistCredentials, setPersistCredentials] = useState(true); + + // Sensitive params (passwords, tokens, secrets) live in component state only — + // never persisted to Redux / localStorage. + // Sensitivity is declared by the loader via `sensitive: true` or `type: "password"`. 
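The sensitive-params split introduced here keeps secrets out of persisted state: non-sensitive values flow through Redux (and survive reloads), sensitive ones live only in component state, and outgoing requests see the merge. The selection and merge rules can be isolated as pure functions (param defs below are illustrative):

```typescript
interface ParamDef { name: string; type: string; sensitive?: boolean }

// A param is sensitive if the loader flags it, or if its type is "password".
function sensitiveNames(defs: ParamDef[]): Set<string> {
  return new Set(defs.filter(p => p.sensitive || p.type === 'password').map(p => p.name));
}

// Merge persisted (non-sensitive) params with in-memory sensitive ones;
// on a key collision the in-memory sensitive value wins.
function mergeParams(
  persisted: Record<string, string>,
  sensitive: Record<string, string>,
): Record<string, string> {
  return { ...persisted, ...sensitive };
}
```

The spread order matters: putting `sensitive` last guarantees a stale, accidentally persisted value can never override the secret the user just typed.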
+ const sensitiveParamNames = useMemo( + () => new Set(paramDefs.filter(p => p.sensitive || p.type === 'password').map(p => p.name)), + [paramDefs] + ); + const [sensitiveParams, setSensitiveParams] = useState>({}); - // Helper: connect and list tables — branches based on connectedSourceId + // Merged params: Redux (non-sensitive) + component state (sensitive) + const mergedParams = useMemo( + () => ({ ...params, ...sensitiveParams }), + [params, sensitiveParams] + ); + + // Ref for the connected-state table filter input (uncontrolled for performance) + const filterInputRef = useRef(null); + + // Connection timeout in milliseconds (30 seconds) + const CONNECTION_TIMEOUT_MS = 30_000; + + // Helper: connect and list tables via data connector const connectAndListTables = useCallback(async (filter?: string) => { setIsConnecting(true); + const controller = new AbortController(); + const timeoutId = setTimeout(() => controller.abort(), CONNECTION_TIMEOUT_MS); try { - if (connectedSourceId) { - // Connected source: first connect, then list tables - const urls = getSourceUrls(connectedSourceId); - const connectResp = await fetchWithIdentity(urls.AUTH_CONNECT, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ params: params }), - }); - const connectData = await connectResp.json(); - if (connectData.status === 'error') { - throw new Error(connectData.message || 'Connection failed'); - } - // List tables - const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ filter: filter?.trim() || null }), - }); - const listData = await listResp.json(); - if (listData.tables) { - setTableMetadata(Object.fromEntries( - listData.tables.map((t: any) => [t.name, t.metadata]) - )); - } else if (listData.status === 'error') { - throw new Error(listData.message || 'Failed to list tables'); - } - } else { - // Legacy data loader: single 
list-tables call - const resp = await fetchWithIdentity(getUrls().DATA_LOADER_LIST_TABLES, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ - data_loader_type: dataLoaderType, - data_loader_params: params, - table_filter: filter?.trim() || null, - }), - }); - const data = await resp.json(); - if (data.status === 'success') { - setTableMetadata(Object.fromEntries( - data.tables.map((t: any) => [t.name, t.metadata]) - )); - } else { - throw new Error(data.message || 'Failed to list tables'); - } + const sourceId = connectorId!; + const urls = getConnectorUrls(sourceId); + // Strip table_filter from params sent to connect (it's for catalog browsing, not connection) + const { table_filter: _tf, ...connectParams } = mergedParams as Record; + const connectResp = await fetchWithIdentity(urls.AUTH_CONNECT, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ params: connectParams, persist: persistCredentials }), + signal: controller.signal, + }); + clearTimeout(timeoutId); + const connectData = await connectResp.json(); + if (connectData.status !== 'connected') { + throw new Error(connectData.message || 'Connection failed'); } + // List tables before promoting to "connected" state + const tableFilterValue = filter ?? (mergedParams as Record).table_filter ?? 
''; + const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ filter: tableFilterValue?.trim() || null }), + }); + const listData = await listResp.json(); + if (listData.tables) { + setTableMetadata(Object.fromEntries( + listData.tables.map((t: any) => [t.name, t.metadata]) + )); + } else if (listData.status === 'error') { + throw new Error(listData.message || 'Failed to list tables'); + } + // Only promote to "connected" after tables are loaded + onConnected?.(); } catch (error: any) { - onFinish("error", error.message || 'Failed to connect'); + clearTimeout(timeoutId); + if (error.name === 'AbortError') { + onFinish("error", t('db.connectionTimeout')); + } else { + onFinish("error", error.message || 'Failed to connect'); + } } finally { setIsConnecting(false); } - }, [connectedSourceId, dataLoaderType, params, onFinish]); + }, [connectorId, mergedParams, persistCredentials, onFinish, onConnected, t]); + + // Delegated (popup-based) login flow for token-based connectors + const popupRef = useRef(null); + const pollTimerRef = useRef | null>(null); + + const handleDelegatedLogin = useCallback(() => { + if (!delegatedLogin?.login_url || !connectorId) return; + setIsConnecting(true); + + const url = new URL(delegatedLogin.login_url, window.location.origin); + url.searchParams.set('df_origin', window.location.origin); + // Pass auth-tier form params (e.g. 
client_id, tenant_id) to the login endpoint + for (const p of paramDefs) { + if (p.tier === 'auth' && !p.sensitive && p.type !== 'password' && mergedParams[p.name]) { + url.searchParams.set(p.name, mergedParams[p.name]); + } + } + + const width = 600; + const height = 700; + const left = window.screenX + (window.outerWidth - width) / 2; + const top = window.screenY + (window.outerHeight - height) / 2; + const popup = window.open( + url.toString(), + 'df-sso-login', + `width=${width},height=${height},left=${left},top=${top},toolbar=no,menubar=no`, + ); + + if (!popup) { + onFinish("error", t('db.popupBlocked') || 'Popup was blocked. Please allow popups and try again.'); + setIsConnecting(false); + return; + } + popupRef.current = popup; + + const handler = async (event: MessageEvent) => { + if (event.data?.type !== 'df-sso-auth') return; + window.removeEventListener('message', handler); + if (pollTimerRef.current) { clearInterval(pollTimerRef.current); pollTimerRef.current = null; } + popup.close(); + + const { access_token, refresh_token, user } = event.data; + if (access_token) { + try { + const urls = getConnectorUrls(connectorId); + // Send tokens to backend token-connect endpoint + const connectResp = await fetchWithIdentity(urls.AUTH_TOKEN_CONNECT, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ + access_token, + refresh_token, + user, + params: mergedParams, // include any filled-in params (e.g. 
url) + persist: persistCredentials, + }), + }); + const connectData = await connectResp.json(); + if (connectData.status !== 'connected') { + throw new Error(connectData.message || 'Token connection failed'); + } + // List tables + const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ filter: null }), + }); + const listData = await listResp.json(); + if (listData.tables) { + setTableMetadata(Object.fromEntries( + listData.tables.map((t: any) => [t.name, t.metadata]) + )); + } + onConnected?.(); + } catch (err: any) { + onFinish("error", err.message || 'Login failed'); + } + } + setIsConnecting(false); + }; + + window.addEventListener('message', handler); + + pollTimerRef.current = setInterval(() => { + if (popup.closed) { + if (pollTimerRef.current) { clearInterval(pollTimerRef.current); pollTimerRef.current = null; } + window.removeEventListener('message', handler); + setIsConnecting(false); + } + }, 1000); + }, [delegatedLogin, connectorId, mergedParams, paramDefs, persistCredentials, onFinish, onConnected, t]); + + // Auto-connect on mount if this source has stored vault credentials. + // Uses auth/status which auto-reconnects from vault, then lists tables.
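The popup handler above only acts on messages tagged `df-sso-auth` that carry an access token; everything else is ignored. That guard can be sketched as a standalone function (payload shape taken from the handler; any validation beyond what the patch does is an assumption):

```typescript
interface SsoAuthPayload { access_token: string; refresh_token?: string; user?: unknown }

// Accept only messages shaped like the popup's postMessage:
// { type: 'df-sso-auth', access_token, refresh_token?, user? }.
function parseSsoMessage(data: any): SsoAuthPayload | null {
  if (data?.type !== 'df-sso-auth') return null;
  if (typeof data.access_token !== 'string' || data.access_token.length === 0) return null;
  return { access_token: data.access_token, refresh_token: data.refresh_token, user: data.user };
}
```

Note that the handler as written filters on the message `type` tag only; a hardened version would also compare `event.origin` against the expected login origin before trusting the tokens.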
+ const autoConnectTriggered = useRef(false); + useEffect(() => { + if (autoConnect && connectorId && !autoConnectTriggered.current && Object.keys(tableMetadata).length === 0) { + autoConnectTriggered.current = true; + (async () => { + setIsConnecting(true); + try { + const urls = getConnectorUrls(connectorId); + // auth/status triggers auto-reconnect from vault + const statusResp = await fetchWithIdentity(urls.AUTH_STATUS, { method: 'GET' }); + const statusData = await statusResp.json(); + if (statusData.connected) { + // Already connected / reconnected from vault — list tables + const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ filter: null }), + }); + const listData = await listResp.json(); + if (listData.tables) { + setTableMetadata(Object.fromEntries( + listData.tables.map((t: any) => [t.name, t.metadata]) + )); + } + } + } catch (err) { + console.warn('Auto-connect failed for', connectorId, err); + } finally { + setIsConnecting(false); + } + })(); + } + }, [autoConnect, connectorId]); // Auto-select first table for preview when metadata loads useEffect(() => { @@ -832,15 +946,14 @@ export const DataLoaderForm: React.FC<{ databaseTable: tableName, canRefresh: true, lastRefreshed: Date.now(), + connectorId: connectorId, }, }; onImport(); dispatch(loadTable({ table: tableObj, - dataLoaderType: connectedSourceId ? undefined : dataLoaderType, - dataLoaderParams: connectedSourceId ? undefined : params, - connectedSourceId, + connectorId, sourceTableName: tableName, importOptions: Object.keys(importOptions).length > 0 ? importOptions : undefined, })).unwrap() @@ -876,78 +989,77 @@ export const DataLoaderForm: React.FC<{ } {isConnected ? 
( - // Connected state: show connection parameters and disconnect button - - - - - {dataLoaderType} + // Connected state: show connection info + table browser + + {/* Header: source name · connection params · disconnect */} + + + {dataLoaderType} + + {paramDefs.filter(p => params[p.name] && !sensitiveParamNames.has(p.name) && p.tier !== 'auth').map((paramDef) => ( + + {paramDef.name}: {params[paramDef.name]} - {paramDefs.filter((paramDef) => params[paramDef.name]).length > 0 && ( - · - )} - {paramDefs.filter((paramDef) => params[paramDef.name]).map((paramDef, index) => ( - - - {paramDef.name}: - - - {params[paramDef.name] || t('db.emptyValue')} - - {index < paramDefs.filter((p) => params[p.name]).length - 1 && ( - · - )} - - ))} - - - - - {t('db.tableFilter')} - - setTableFilter(event.target.value)} - /> - - - - + ))} + + + + {/* Search bar: filter + refresh in a pill-shaped container */} + + + { + if (e.key === 'Enter') { + const val = (e.target as HTMLInputElement).value; + dispatch(dfActions.updateDataLoaderConnectParam({dataLoaderType, paramName: 'table_filter', paramValue: val})); + connectAndListTables(val); + } + }} + inputRef={filterInputRef} + /> + + {tableMetadataBox} @@ -958,82 +1070,208 @@ export const DataLoaderForm: React.FC<{ {dataLoaderType} - - {paramDefs.map((paramDef) => ( - - - {paramDef.name} - {paramDef.required && *} - - { + const hasTiers = paramDefs.some(p => p.tier); + // Section wrapper: subtle background, rounded, with label + const sectionSx = { mt: 1, px: 1.5, pt: 0.75, pb: 1.5, borderRadius: 1, backgroundColor: 'rgba(0,0,0,0.025)' }; + // Shared input style: standard variant (underline), label always shrunk so placeholder is visible + const inputSx = { + '& .MuiInput-underline:before': { borderBottomColor: 'rgba(0,0,0,0.15)' }, + '& .MuiInputBase-root': { fontSize: 12, mt: 1.5 }, + '& .MuiInputBase-input': { fontSize: 12, py: 0.5, px: 0 }, + '& .MuiInputBase-input::placeholder': { fontSize: 11, opacity: 0.45 }, + '& 
.MuiInputLabel-root': { fontSize: 11, color: 'text.secondary', fontWeight: 500 }, + '& .MuiInputLabel-root.Mui-focused': { color: 'primary.main' }, + }; + const shrinkProps = { shrink: true }; + // Pick 2 or 3 columns to minimise orphan fields on the last row + const balancedCols = (n: number) => { + if (n <= 2) return 2; + if (n % 3 === 0) return 3; // 3,6,9 → perfect 3-col rows + if (n % 2 === 0) return 2; // 4,8 → perfect 2-col rows + return 3; // 5,7 → 3 cols (3+2, 3+3+1) is acceptable + }; + if (!hasTiers) { + // Legacy: no tier field, render flat grid + const cols = balancedCols(paramDefs.length); + return ( + + {paramDefs.map((paramDef) => ( + { + if (sensitiveParamNames.has(paramDef.name)) { + setSensitiveParams(prev => ({ ...prev, [paramDef.name]: event.target.value })); + } else { + dispatch(dfActions.updateDataLoaderConnectParam({ dataLoaderType, paramName: paramDef.name, paramValue: event.target.value })); + } + }} + /> + ))} + + ); + } + + const renderParamGrid = (tierParams: typeof paramDefs) => { + const cols = balancedCols(tierParams.length); + return ( + + {tierParams.map((paramDef) => ( + { + if (sensitiveParamNames.has(paramDef.name)) { + setSensitiveParams(prev => ({ ...prev, [paramDef.name]: event.target.value })); + } else { + dispatch(dfActions.updateDataLoaderConnectParam({ dataLoaderType, paramName: paramDef.name, paramValue: event.target.value })); + } + }} + /> + ))} + + ); + }; + + const connectionParams = paramDefs.filter(p => p.tier === 'connection'); + const filterParams = paramDefs.filter(p => p.tier === 'filter'); + const authParams = paramDefs.filter(p => p.tier === 'auth'); + const hasDelegated = !!delegatedLogin?.login_url; + + return ( + <> + {/* Tier 1: Connection */} + {connectionParams.length > 0 && ( + + + {t('db.tierConnection')} + + {renderParamGrid(connectionParams)} + + )} + + {/* Tier 2: Scope */} + {filterParams.length > 0 && ( + + + {t('db.tierFilter')} + + {renderParamGrid(filterParams)} + + )} + + {/* Tier 3: Sign 
in — Connect lives here */} + + + {t('db.tierAuth')} + + + {hasDelegated && authParams.length > 0 ? ( + /* Left/right split: delegated | or | credentials + connect */ + + {/* Left: delegated login */} + + + + {/* Right: credential fields + connect */} + + + {authParams.map((paramDef) => ( + { + if (sensitiveParamNames.has(paramDef.name)) { + setSensitiveParams(prev => ({ ...prev, [paramDef.name]: event.target.value })); + } else { + dispatch(dfActions.updateDataLoaderConnectParam({ dataLoaderType, paramName: paramDef.name, paramValue: event.target.value })); + } + }} + /> + ))} + + + + + ) : hasDelegated ? ( + /* Delegated only */ + + ) : ( + /* Manual credentials only + connect */ + <> + {renderParamGrid(authParams)} + + + )} + + + ); + })()} + {paramDefs.length > 0 && ( + { - dispatch(dfActions.updateDataLoaderConnectParam({ - dataLoaderType, paramName: paramDef.name, - paramValue: event.target.value})); - }} + checked={persistCredentials} + onChange={(e) => setPersistCredentials(e.target.checked)} + sx={{ p: 0.5 }} /> - - ))} - - - - - {t('db.tableFilter')} - - setTableFilter(event.target.value)} - /> - - {paramDefs.length > 0 && - } - + } + label={ + + {t('db.rememberCredentials')} + + } + /> + )} {authInstructions.trim() && ( ({ mt: 2, px: 1.5, py: 1, diff --git a/src/views/RefreshDataDialog.tsx b/src/views/RefreshDataDialog.tsx index 307e3d13..3419c63f 100644 --- a/src/views/RefreshDataDialog.tsx +++ b/src/views/RefreshDataDialog.tsx @@ -23,8 +23,6 @@ import { } from '@mui/material'; import CloseIcon from '@mui/icons-material/Close'; import UploadFileIcon from '@mui/icons-material/UploadFile'; -import { useSelector } from 'react-redux'; -import { DataFormulatorState } from '../app/dfSlice'; import { DictTable } from '../components/ComponentType'; import { createTableFromText, loadTextDataWrapper, loadBinaryDataWrapper, readFileText } from '../data/utils'; import { useTranslation } from 'react-i18next'; @@ -71,7 +69,6 @@ export const RefreshDataDialog: 
React.FC = ({ const [isLoading, setIsLoading] = useState(false); const [error, setError] = useState(null); const fileInputRef = useRef(null); - const serverConfig = useSelector((state: DataFormulatorState) => state.serverConfig); // Constants for content size limits const MAX_DISPLAY_LINES = 20; @@ -458,24 +455,6 @@ export const RefreshDataDialog: React.FC = ({ - {serverConfig.DISABLE_FILE_UPLOAD ? ( - - - {t('upload.fileUploadDisabled')} - - - {t('refresh.installLocallyForUpload')}
- - https://github.com/microsoft/data-formulator - -
-
- ) : ( = ({
- )} diff --git a/src/views/UnifiedDataUploadDialog.tsx b/src/views/UnifiedDataUploadDialog.tsx index 668f64f8..fded3363 100644 --- a/src/views/UnifiedDataUploadDialog.tsx +++ b/src/views/UnifiedDataUploadDialog.tsx @@ -1186,52 +1186,41 @@ export const UnifiedDataUploadDialog: React.FC = ( onChange={handleFileInputChange} /> - {/* File Upload Section - only show drop zone when file upload is enabled */} - {!serverConfig.DISABLE_FILE_UPLOAD ? ( - fileInputRef.current?.click()} - onDrop={handleFileDrop} - onDragOver={handleDragOver} - onDragEnter={handleDragEnter} - onDragLeave={handleDragLeave} - > - - - {t('upload.dragDrop')} - - - {t('upload.or')} {t('upload.browse')} - - {!showFilePreview && ( - - {t('upload.supportedFormats')} - - )} - - ) : ( - - - {t('upload.fileUploadDisabled')} - - - {t('upload.useLoadFromUrl')} + {/* File Upload Section */} + fileInputRef.current?.click()} + onDrop={handleFileDrop} + onDragOver={handleDragOver} + onDragEnter={handleDragEnter} + onDragLeave={handleDragLeave} + > + + + {t('upload.dragDrop')} + + + {t('upload.or')} {t('upload.browse')} + + {!showFilePreview && ( + + {t('upload.supportedFormats')} - - )} + )} +
{showFilePreview && ( diff --git a/tests/backend/integration/test_plugin_app_config.py b/tests/backend/integration/test_plugin_app_config.py index a0bad17a..749869a8 100644 --- a/tests/backend/integration/test_plugin_app_config.py +++ b/tests/backend/integration/test_plugin_app_config.py @@ -61,7 +61,6 @@ def app(): _app.config["CLI_ARGS"] = { "sandbox": "local", "disable_display_keys": False, - "disable_file_upload": False, "project_front_page": False, "max_display_rows": 10000, "dev": False, @@ -75,7 +74,6 @@ def get_app_config(): config = { "SANDBOX": args["sandbox"], "DISABLE_DISPLAY_KEYS": args["disable_display_keys"], - "DISABLE_FILE_UPLOAD": args["disable_file_upload"], "PROJECT_FRONT_PAGE": args["project_front_page"], "MAX_DISPLAY_ROWS": args["max_display_rows"], "DEV_MODE": args.get("dev", False), diff --git a/tests/backend/integration/test_superset_data_connector.py b/tests/backend/integration/test_superset_data_connector.py new file mode 100644 index 00000000..647769c8 --- /dev/null +++ b/tests/backend/integration/test_superset_data_connector.py @@ -0,0 +1,427 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""Tests for SupersetLoader via DataConnector routes. + +All Superset API calls are mocked — no real Superset instance needed. 
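+The fake JWTs built by the helpers below are three base64url segments
+(header.payload.signature); only the ``exp`` claim in the payload matters to the
+loader. Illustratively:
+
+    _make_jwt()                        # exp ~ now + 1h, treated as valid
+    _make_jwt(exp=time.time() - 100)   # expired, exercises the refresh path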
+ +Covers: +- JWT-based auth (token mode): connect / disconnect / status +- Dashboard → dataset hierarchy browsing +- "All Datasets" synthetic namespace +- Dataset metadata retrieval +- Data fetch via SQL Lab (mocked) +- Token refresh / re-login flow +- Frontend config for Superset sources +""" +from __future__ import annotations + +import base64 +import json +import time +from typing import Any +from unittest.mock import MagicMock, patch + +import flask +import pyarrow as pa +import pytest + +from data_formulator.data_connector import DataConnector +from data_formulator.data_loader.external_data_loader import CatalogNode + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + + +# ------------------------------------------------------------------ +# JWT helpers +# ------------------------------------------------------------------ + +def _make_jwt(exp: float | None = None, sub: str = "admin") -> str: + """Build a fake JWT with a valid exp claim.""" + if exp is None: + exp = time.time() + 3600 # 1 hour from now + header = base64.urlsafe_b64encode(json.dumps({"alg": "HS256"}).encode()).rstrip(b"=").decode() + payload = base64.urlsafe_b64encode( + json.dumps({"sub": sub, "exp": exp}).encode() + ).rstrip(b"=").decode() + sig = base64.urlsafe_b64encode(b"fake-signature").rstrip(b"=").decode() + return f"{header}.{payload}.{sig}" + + +def _expired_jwt() -> str: + return _make_jwt(exp=time.time() - 100) + + +# ------------------------------------------------------------------ +# Mock Superset API +# ------------------------------------------------------------------ + +class MockSupersetClient: + """Simulates SupersetClient API responses.""" + + def __init__(self, url): + self.url = url + + def list_datasets(self, token, page=0, page_size=100): + datasets = [ + {"id": 1, "table_name": "orders_fact", "schema": "public", + "database": {"id": 1, "database_name": "analytics"}, "row_count": 50000}, + {"id": 2, "table_name": "users_dim", "schema": "public", + "database": 
{"id": 1, "database_name": "analytics"}, "row_count": 10000}, + ] + start = page * page_size + batch = datasets[start:start + page_size] + return {"result": batch, "count": len(datasets)} + + def list_dashboards(self, token, page=0, page_size=500): + return { + "result": [ + {"id": 10, "dashboard_title": "Sales Dashboard"}, + {"id": 20, "dashboard_title": "User Analytics"}, + ], + "count": 2, + } + + def get_dashboard_datasets(self, token, dashboard_id): + if dashboard_id == 10: + return { + "result": [ + {"id": 1, "table_name": "orders_fact", "schema": "public", + "database": {"id": 1, "database_name": "analytics"}, "row_count": 50000}, + ] + } + if dashboard_id == 20: + return { + "result": [ + {"id": 2, "table_name": "users_dim", "schema": "public", + "database": {"id": 1, "database_name": "analytics"}, "row_count": 10000}, + ] + } + return {"result": []} + + def get_dataset_detail(self, token, dataset_id): + datasets = { + 1: { + "id": 1, "table_name": "orders_fact", "schema": "public", + "database": {"id": 1, "database_name": "analytics"}, + "columns": [ + {"column_name": "order_id", "type": "INT"}, + {"column_name": "customer_id", "type": "INT"}, + {"column_name": "amount", "type": "DECIMAL(10,2)"}, + {"column_name": "order_date", "type": "TIMESTAMP"}, + ], + "row_count": 50000, + "description": "Main orders fact table", + "kind": "physical", + }, + 2: { + "id": 2, "table_name": "users_dim", "schema": "public", + "database": {"id": 1, "database_name": "analytics"}, + "columns": [ + {"column_name": "user_id", "type": "INT"}, + {"column_name": "name", "type": "VARCHAR"}, + {"column_name": "email", "type": "VARCHAR"}, + ], + "row_count": 10000, + "kind": "physical", + }, + } + return datasets.get(dataset_id, {}) + + def create_sql_session(self, token): + return {"session_id": "mock-session-123"} + + def execute_sql_with_session(self, session, db_id, sql, schema, limit): + return { + "data": [ + {"order_id": 1, "customer_id": 100, "amount": 99.99, "order_date": 
"2025-01-01"}, + {"order_id": 2, "customer_id": 101, "amount": 150.00, "order_date": "2025-01-02"}, + {"order_id": 3, "customer_id": 100, "amount": 75.50, "order_date": "2025-01-03"}, + ] + } + + +class MockAuthBridge: + def __init__(self, url): + self.url = url + + def login(self, username, password): + if username == "admin" and password == "admin": + return {"access_token": _make_jwt(), "refresh_token": _make_jwt()} + raise ValueError("Invalid credentials") + + def refresh_token(self, refresh_token): + return {"access_token": _make_jwt()} + + +# ------------------------------------------------------------------ +# Fixtures +# ------------------------------------------------------------------ + +@pytest.fixture(autouse=True) +def _mock_superset_imports(): + """Patch the lazy-imported Superset helpers.""" + import data_formulator.data_loader.superset_data_loader as sdl + old_client, old_bridge = sdl._SupersetClient, sdl._SupersetAuthBridge + sdl._SupersetClient = MockSupersetClient + sdl._SupersetAuthBridge = MockAuthBridge + yield + sdl._SupersetClient, sdl._SupersetAuthBridge = old_client, old_bridge + + +@pytest.fixture +def app(): + _app = flask.Flask(__name__) + _app.config["TESTING"] = True + _app.secret_key = "test" + return _app + + +@pytest.fixture +def source(): + from data_formulator.data_loader.superset_data_loader import SupersetLoader + return DataConnector.from_loader( + SupersetLoader, + source_id="superset", + display_name="Test Superset", + ) + + +@pytest.fixture +def client(app, source): + app.register_blueprint(source.create_blueprint()) + return app.test_client() + + +@pytest.fixture +def connected_client(client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/superset/auth/connect", json={ + "params": {"url": "https://bi.example.com", "username": "admin", "password": "admin"}, + }) + assert resp.status_code == 200 + yield client + + +# 
================================================================== +# Tests: Auth (JWT token mode) +# ================================================================== + +class TestSupersetAuth: + + def test_connect_success(self, client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/superset/auth/connect", json={ + "params": {"url": "https://bi.example.com", "username": "admin", "password": "admin"}, + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "connected" + # Hierarchy: dashboard → dataset + keys = [h["key"] for h in data["hierarchy"]] + assert keys == ["dashboard", "dataset"] + + def test_connect_bad_credentials(self, client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/superset/auth/connect", json={ + "params": {"url": "https://bi.example.com", "username": "admin", "password": "wrong"}, + }) + assert resp.status_code in (400, 500) + data = resp.get_json() + assert data["status"] == "error" + # Must not leak the password + assert "wrong" not in json.dumps(data) + + def test_auth_mode_is_token(self): + from data_formulator.data_loader.superset_data_loader import SupersetLoader + assert SupersetLoader.auth_mode() == "token" + + def test_disconnect_and_status(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/auth/disconnect") + assert resp.get_json()["status"] == "disconnected" + + resp = connected_client.get("/api/connectors/superset/auth/status") + assert resp.get_json()["connected"] is False + + +# ================================================================== +# Tests: Catalog browsing (dashboard → dataset hierarchy) +# ================================================================== + +class TestSupersetCatalog: + + def 
test_ls_root_lists_dashboards_and_all_datasets(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/ls", json={"path": []}) + data = resp.get_json() + assert resp.status_code == 200 + + names = [n["name"] for n in data["nodes"]] + assert "Sales Dashboard" in names + assert "User Analytics" in names + assert "All Datasets" in names + + # All root nodes should be namespace + for node in data["nodes"]: + assert node["node_type"] == "namespace" + + def test_ls_dashboard_lists_its_datasets(self, connected_client): + """Expand Sales Dashboard → should see orders_fact.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/ls", json={ + "path": ["10"], # Sales Dashboard ID + }) + data = resp.get_json() + assert resp.status_code == 200 + assert len(data["nodes"]) >= 1 + names = [n["name"] for n in data["nodes"]] + assert "orders_fact" in names + for n in data["nodes"]: + assert n["node_type"] == "table" + + def test_ls_all_datasets(self, connected_client): + """Expand 'All Datasets' → should see both datasets.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/ls", json={ + "path": ["__all__"], + }) + data = resp.get_json() + names = [n["name"] for n in data["nodes"]] + assert "orders_fact" in names + assert "users_dim" in names + + def test_ls_with_filter(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/ls", json={ + "path": ["__all__"], + "filter": "orders", + }) + nodes = resp.get_json()["nodes"] + assert len(nodes) == 1 + assert nodes[0]["name"] == "orders_fact" + + def test_catalog_metadata(self, connected_client): + """Get metadata for a specific 
dataset.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/metadata", json={ + "path": ["10", "1"], # dashboard_id, dataset_id + }) + data = resp.get_json() + assert resp.status_code == 200 + meta = data["metadata"] + assert meta["dataset_id"] == 1 + assert meta["row_count"] == 50000 + col_names = [c["name"] for c in meta["columns"]] + assert "order_id" in col_names + assert "amount" in col_names + + def test_list_tables_flat(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/catalog/list_tables", json={}) + data = resp.get_json() + assert len(data["tables"]) == 2 + names = [t["name"] for t in data["tables"]] + assert any("orders_fact" in n for n in names) + assert any("users_dim" in n for n in names) + + +# ================================================================== +# Tests: Data routes +# ================================================================== + +class TestSupersetData: + + def test_preview(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/superset/data/preview", json={ + "source_table": "1:orders_fact", + "size": 3, + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "success" + assert data["row_count"] > 0 + col_names = {c["name"] for c in data["columns"]} + assert "order_id" in col_names + + def test_import(self, connected_client): + mock_meta = MagicMock() + mock_meta.name = "orders" + mock_meta.row_count = 3 + + with patch.object(DataConnector, "_get_identity", return_value="test-user"), \ + patch("data_formulator.security.auth.get_identity_id", return_value="test-user"), \ + patch("data_formulator.workspace_factory.get_workspace") as mock_ws: + + from 
data_formulator.data_loader.superset_data_loader import SupersetLoader + with patch.object(SupersetLoader, "ingest_to_workspace", return_value=mock_meta): + resp = connected_client.post("/api/connectors/superset/data/import", json={ + "source_table": "1:orders_fact", + "table_name": "orders", + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "success" + assert data["table_name"] == "orders" + assert data["refreshable"] is True + + +# ================================================================== +# Tests: Token refresh +# ================================================================== + +class TestSupersetTokenRefresh: + + def test_connect_with_expired_token_triggers_refresh(self, client): + """If token expires between connect and catalog call, refresh should work.""" + from data_formulator.data_loader.superset_data_loader import SupersetLoader + + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + # Connect + resp = client.post("/api/connectors/superset/auth/connect", json={ + "params": {"url": "https://bi.example.com", "username": "admin", "password": "admin"}, + }) + assert resp.status_code == 200 + + # Now artificially expire the token + # The mock _ensure_token will handle refresh via MockAuthBridge + resp = client.post("/api/connectors/superset/catalog/ls", json={"path": []}) + assert resp.status_code == 200 + assert len(resp.get_json()["nodes"]) > 0 + + +# ================================================================== +# Tests: Frontend Config +# ================================================================== + +class TestSupersetFrontendConfig: + + def test_config_structure(self, source): + cfg = source.get_frontend_config() + assert cfg["source_id"] == "superset" + assert cfg["name"] == "Test Superset" + # All params should be in form (nothing pinned) + form_names = {f["name"] for f in cfg["params_form"]} + assert "url" in form_names + assert "username" in form_names + assert 
"password" in form_names + + def test_pinned_url(self): + from data_formulator.data_loader.superset_data_loader import SupersetLoader + source = DataConnector.from_loader( + SupersetLoader, + source_id="superset_corp", + display_name="Corp Superset", + default_params={"url": "https://bi.corp.com"}, + ) + cfg = source.get_frontend_config() + assert cfg["pinned_params"]["url"] == "https://bi.corp.com" + form_names = {f["name"] for f in cfg["params_form"]} + assert "url" not in form_names + assert "username" in form_names + + def test_hierarchy_is_dashboard_dataset(self, source): + cfg = source.get_frontend_config() + keys = [h["key"] for h in cfg["hierarchy"]] + assert keys == ["dashboard", "dataset"] diff --git a/tests/backend/unit/test_all_loader_verification.py b/tests/backend/unit/test_all_loader_verification.py new file mode 100644 index 00000000..c50fc348 --- /dev/null +++ b/tests/backend/unit/test_all_loader_verification.py @@ -0,0 +1,226 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""Verification tests for all data loader catalog hierarchies and static methods. + +Ensures every registered loader correctly implements: +- catalog_hierarchy() with valid level keys/labels +- effective_hierarchy() with scope pinning +- pinned_scope() +- auth_mode() +- list_params() / auth_instructions() + +No database connections required — tests static/class methods only. 
+""" +from __future__ import annotations + +import pytest + +from data_formulator.data_loader.external_data_loader import ExternalDataLoader + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + + +# ------------------------------------------------------------------ +# Test data: expected hierarchies per loader type +# ------------------------------------------------------------------ + +_EXPECTED_HIERARCHIES = { + "mysql": ["database", "table"], + "postgresql": ["database", "schema", "table"], + "mssql": ["database", "schema", "table"], + "bigquery": ["project_id", "dataset_id", "table"], + "kusto": ["kusto_database", "table"], + "athena": ["database", "table"], + "mongodb": ["database", "collection"], + "s3": ["bucket", "table"], + "azure_blob": ["container_name", "table"], + "superset": ["dashboard", "dataset"], +} + + +def _get_available_loaders() -> dict[str, type[ExternalDataLoader]]: + """Import the DATA_LOADERS registry, which only includes loaders whose deps are installed.""" + from data_formulator.data_loader import DATA_LOADERS + return DATA_LOADERS + + +# ================================================================== +# Tests: catalog_hierarchy +# ================================================================== + +class TestAllLoaderCatalogHierarchies: + + def test_all_loaders_have_hierarchy(self): + """Every registered loader must implement catalog_hierarchy().""" + for key, cls in _get_available_loaders().items(): + h = cls.catalog_hierarchy() + assert isinstance(h, list), f"{key}: catalog_hierarchy() must return a list" + assert len(h) >= 1, f"{key}: hierarchy must have at least one level" + for level in h: + assert "key" in level, f"{key}: each level must have 'key'" + assert "label" in level, f"{key}: each level must have 'label'" + + def test_hierarchies_match_expected(self): + """Verify expected hierarchy keys for all available loaders.""" + for key, cls in _get_available_loaders().items(): + expected = _EXPECTED_HIERARCHIES.get(key) 
+ if expected is None: + continue # unknown loader, skip + actual = [level["key"] for level in cls.catalog_hierarchy()] + assert actual == expected, f"{key}: expected {expected}, got {actual}" + + def test_last_level_is_importable(self): + """The last hierarchy level should be the importable leaf (table/file/dataset/etc.).""" + importable_keys = {"table", "collection", "dataset", "object", "blob"} + for key, cls in _get_available_loaders().items(): + h = cls.catalog_hierarchy() + last_key = h[-1]["key"] + assert last_key in importable_keys, ( + f"{key}: last level '{last_key}' not in expected importable: {importable_keys}" + ) + + +# ================================================================== +# Tests: effective_hierarchy and pinned_scope +# ================================================================== + +class TestScopePinningAllLoaders: + + @pytest.mark.parametrize("loader_key,pin_param,expected_removed", [ + ("mysql", "database", "database"), + ("postgresql", "database", "database"), + ("mssql", "database", "database"), + ("athena", "database", "database"), + ]) + def test_pinning_removes_level(self, loader_key, pin_param, expected_removed): + """When a hierarchy-level param is provided, that level is pinned out.""" + loaders = _get_available_loaders() + cls = loaders.get(loader_key) + if cls is None: + pytest.skip(f"{loader_key} not available") + + # Create a minimal stub that has params but doesn't connect + # We need to test effective_hierarchy, which uses self.params + class MockInstance: + params = {pin_param: "test_value"} + catalog_hierarchy = staticmethod(cls.catalog_hierarchy) + effective_hierarchy = ExternalDataLoader.effective_hierarchy + pinned_scope = ExternalDataLoader.pinned_scope + + inst = MockInstance() + eff = inst.effective_hierarchy() + eff_keys = [l["key"] for l in eff] + assert expected_removed not in eff_keys + + pinned = inst.pinned_scope() + assert pinned[pin_param] == "test_value" + + def 
test_no_pinning_returns_full_hierarchy(self): + """With no scope params, effective_hierarchy == catalog_hierarchy.""" + loaders = _get_available_loaders() + for key, cls in loaders.items(): + class MockInstance: + params = {} + catalog_hierarchy = staticmethod(cls.catalog_hierarchy) + effective_hierarchy = ExternalDataLoader.effective_hierarchy + pinned_scope = ExternalDataLoader.pinned_scope + + inst = MockInstance() + full = cls.catalog_hierarchy() + eff = inst.effective_hierarchy() + assert eff == full, f"{key}: with no pinning, effective should match full" + + +# ================================================================== +# Tests: auth_mode +# ================================================================== + +class TestAuthModes: + + def test_default_auth_mode_is_connection(self): + """Most loaders use the default 'connection' auth mode.""" + connection_loaders = {"mysql", "postgresql", "mssql", "bigquery", "kusto", + "athena", "mongodb", "s3", "azure_blob"} + for key, cls in _get_available_loaders().items(): + if key in connection_loaders: + assert cls.auth_mode() == "connection", f"{key}: expected 'connection'" + + def test_superset_uses_token_mode(self): + loaders = _get_available_loaders() + if "superset" in loaders: + assert loaders["superset"].auth_mode() == "token" + + +# ================================================================== +# Tests: list_params and auth_instructions +# ================================================================== + +class TestStaticMethods: + + def test_all_loaders_have_list_params(self): + for key, cls in _get_available_loaders().items(): + params = cls.list_params() + assert isinstance(params, list), f"{key}: list_params() must return a list" + assert len(params) > 0, f"{key}: must have at least one param" + for p in params: + assert "name" in p, f"{key}: each param must have 'name'" + assert "type" in p, f"{key}: each param must have 'type'" + + def test_all_loaders_have_auth_instructions(self): + for 
key, cls in _get_available_loaders().items(): + instructions = cls.auth_instructions() + assert isinstance(instructions, str) + assert len(instructions) > 10, f"{key}: auth_instructions should be helpful" + + def test_all_loaders_have_required_host_or_identifier(self): + """Each loader should require at least one identifying connection param.""" + for key, cls in _get_available_loaders().items(): + params = cls.list_params() + required = [p for p in params if p.get("required", False)] + assert len(required) > 0, f"{key}: should have at least one required param" + + def test_rate_limit_returns_dict_or_none(self): + for key, cls in _get_available_loaders().items(): + result = cls.rate_limit() + assert result is None or isinstance(result, dict), ( + f"{key}: rate_limit() must return None or dict" + ) + + +# ================================================================== +# Tests: DataConnector wrapping for all loaders +# ================================================================== + +class TestDataConnectorWrapping: + + def test_all_loaders_can_be_wrapped(self): + """DataConnector.from_loader() works for every registered loader.""" + from data_formulator.data_connector import DataConnector + for key, cls in _get_available_loaders().items(): + source = DataConnector.from_loader(cls, source_id=key) + cfg = source.get_frontend_config() + assert cfg["source_id"] == key + assert len(cfg["hierarchy"]) > 0 + assert len(cfg["params_form"]) > 0 + + def test_all_loaders_blueprints_have_all_routes(self): + """Each wrapped loader should have 9 routes.""" + import flask + from data_formulator.data_connector import DataConnector + expected_suffixes = [ + "/auth/connect", "/auth/disconnect", "/auth/status", + "/catalog/ls", "/catalog/metadata", "/catalog/list_tables", + "/data/import", "/data/refresh", "/data/preview", + ] + for key, cls in _get_available_loaders().items(): + app = flask.Flask(__name__) + app.config["TESTING"] = True + source = 
DataConnector.from_loader(cls, source_id=key)
+            app.register_blueprint(source.create_blueprint())
+            rules = [rule.rule for rule in app.url_map.iter_rules()]
+            for suffix in expected_suffixes:
+                expected_rule = f"/api/connectors/{key}{suffix}"
+                assert expected_rule in rules, (
+                    f"{key}: missing route {expected_rule}"
+                )
diff --git a/tests/backend/unit/test_data_connector_config.py b/tests/backend/unit/test_data_connector_config.py
new file mode 100644
index 00000000..f8fd5420
--- /dev/null
+++ b/tests/backend/unit/test_data_connector_config.py
@@ -0,0 +1,413 @@
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+
+"""Tests for config-driven data source registration.
+
+Covers:
+- YAML configuration parsing and source spec generation
+- Environment variable parsing (DF_SOURCES__<id>__<field>=<value>)
+- Auto-discovery of installed loaders
+- Config priority: env > YAML > auto-discovery
+- Multiple instances of the same loader type
+- DF_AUTO_DISCOVER_SOURCES=false
+- ${ENV_REF} expansion in params
+- register_data_connectors() end-to-end
+"""
+from __future__ import annotations
+
+import os
+import textwrap
+from pathlib import Path
+from typing import Any
+from unittest.mock import MagicMock, patch
+
+import flask
+import pytest
+
+from data_formulator.data_connector import (
+    DATA_CONNECTORS,
+    DataConnector,
+    SourceSpec,
+    _build_source_specs,
+    _load_yaml_config,
+    _parse_env_sources,
+    _resolve_env_refs,
+    register_data_connectors,
+)
+from data_formulator.data_loader.external_data_loader import (
+    CatalogNode,
+    ExternalDataLoader,
+)
+
+pytestmark = [pytest.mark.backend, pytest.mark.plugin]
+
+
+# ------------------------------------------------------------------
+# Minimal mock loader for registration tests
+# ------------------------------------------------------------------
+
+class _StubLoader(ExternalDataLoader):
+    def __init__(self, params):
+        self.params = params
+
+    def test_connection(self):
+        return True
+
+    def list_tables(self, 
table_filter=None): + return [] + + def fetch_data_as_arrow(self, source_table, import_options=None): + import pyarrow as pa + return pa.table({"x": [1]}) + + @staticmethod + def list_params(): + return [ + {"name": "host", "type": "string", "required": True}, + {"name": "database", "type": "string", "required": False}, + ] + + @staticmethod + def auth_instructions(): + return "Stub loader" + + +# ------------------------------------------------------------------ +# Fixtures +# ------------------------------------------------------------------ + +@pytest.fixture(autouse=True) +def _clean_data_connectors(): + """Reset the global DATA_CONNECTORS dict between tests.""" + old = dict(DATA_CONNECTORS) + DATA_CONNECTORS.clear() + yield + DATA_CONNECTORS.clear() + DATA_CONNECTORS.update(old) + + +@pytest.fixture +def app(): + _app = flask.Flask(__name__) + _app.config["TESTING"] = True + _app.secret_key = "test" + return _app + + +# ================================================================== +# Tests: Environment Variable Parsing +# ================================================================== + +class TestEnvVarParsing: + + def test_parse_env_sources_basic(self, monkeypatch): + monkeypatch.setenv("DF_SOURCES__pg_prod__type", "postgresql") + monkeypatch.setenv("DF_SOURCES__pg_prod__name", "Production DB") + monkeypatch.setenv("DF_SOURCES__pg_prod__params__host", "db.example.com") + monkeypatch.setenv("DF_SOURCES__pg_prod__params__database", "prod") + + specs = _parse_env_sources() + assert len(specs) == 1 + s = specs[0] + assert s.source_id == "pg_prod" + assert s.loader_type == "postgresql" + assert s.display_name == "Production DB" + assert s.default_params["host"] == "db.example.com" + assert s.default_params["database"] == "prod" + + def test_parse_env_sources_multiple(self, monkeypatch): + monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") + monkeypatch.setenv("DF_SOURCES__pg__params__host", "pg.local") + monkeypatch.setenv("DF_SOURCES__mysql__type", 
"mysql") + monkeypatch.setenv("DF_SOURCES__mysql__params__host", "mysql.local") + + specs = _parse_env_sources() + assert len(specs) == 2 + types = {s.loader_type for s in specs} + assert types == {"postgresql", "mysql"} + + def test_parse_env_sources_missing_type_skipped(self, monkeypatch): + monkeypatch.setenv("DF_SOURCES__broken__params__host", "localhost") + # No DF_SOURCES__broken__type set + specs = _parse_env_sources() + assert len(specs) == 0 + + def test_parse_env_sources_default_name(self, monkeypatch): + monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") + specs = _parse_env_sources() + assert specs[0].display_name == "Postgresql" + + +# ================================================================== +# Tests: YAML Config Loading +# ================================================================== + +class TestYamlConfigLoading: + + def test_load_yaml_from_cwd(self, tmp_path, monkeypatch): + yaml_content = textwrap.dedent("""\ + auto_discover: false + sources: + - type: postgresql + name: "My PG" + params: + host: pg.example.com + database: mydb + - type: mysql + name: "My MySQL" + params: + host: mysql.example.com + """) + yaml_file = tmp_path / "data-sources.yml" + yaml_file.write_text(yaml_content) + monkeypatch.chdir(tmp_path) + monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) + + config = _load_yaml_config() + assert config is not None + assert config["auto_discover"] is False + assert len(config["sources"]) == 2 + assert config["sources"][0]["type"] == "postgresql" + assert config["sources"][0]["params"]["host"] == "pg.example.com" + + def test_load_yaml_from_df_home(self, tmp_path, monkeypatch): + yaml_content = textwrap.dedent("""\ + sources: + - type: bigquery + params: + project: my-gcp-project + """) + yaml_file = tmp_path / "data-sources.yml" + yaml_file.write_text(yaml_content) + monkeypatch.setenv("DATA_FORMULATOR_HOME", str(tmp_path)) + # Make sure cwd doesn't have one too + monkeypatch.chdir(Path(__file__).parent) + + 
config = _load_yaml_config() + assert config is not None + assert config["sources"][0]["type"] == "bigquery" + + def test_load_yaml_returns_none_if_missing(self, tmp_path, monkeypatch): + monkeypatch.chdir(tmp_path) + monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) + config = _load_yaml_config() + assert config is None + + +# ================================================================== +# Tests: _build_source_specs +# ================================================================== + +class TestBuildSourceSpecs: + + def test_auto_discovery_includes_all_data_loaders(self, monkeypatch): + """Without config, all DATA_LOADERS should appear.""" + monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + # Clear env source vars + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + + mock_loaders = {"stub_a": _StubLoader, "stub_b": _StubLoader} + + with patch("data_formulator.data_connector._load_yaml_config", return_value=None), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, auto_discover = _build_source_specs() + + assert auto_discover is True + ids = {s.source_id for s in specs} + assert "stub_a" in ids + assert "stub_b" in ids + + def test_auto_discovery_disabled_by_env(self, monkeypatch): + monkeypatch.setenv("DF_AUTO_DISCOVER_SOURCES", "false") + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + + mock_loaders = {"stub": _StubLoader} + with patch("data_formulator.data_connector._load_yaml_config", return_value=None), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, auto_discover = _build_source_specs() + + assert auto_discover is False + # No env specs + no yaml specs + no auto-discovery → empty + assert len(specs) == 0 + + def test_auto_discovery_disabled_by_yaml(self, monkeypatch): + for key in list(os.environ): + if 
key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [{"type": "stub", "name": "My Stub"}], + } + mock_loaders = {"stub": _StubLoader, "other": _StubLoader} + + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, auto_discover = _build_source_specs() + + assert auto_discover is False + assert len(specs) == 1 + assert specs[0].loader_type == "stub" + + def test_env_overrides_yaml(self, monkeypatch): + """Env var source with same ID overrides YAML source.""" + monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") + monkeypatch.setenv("DF_SOURCES__pg__name", "Env PG") + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [ + {"type": "postgresql", "id": "pg", "name": "YAML PG"}, + ], + } + mock_loaders = {"postgresql": _StubLoader} + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, _ = _build_source_specs() + + # Env wins + pg_spec = next(s for s in specs if s.source_id == "pg") + assert pg_spec.display_name == "Env PG" + + def test_multiple_instances_same_type(self, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [ + {"type": "stub", "id": "stub_prod", "name": "Production", "params": {"host": "prod.corp"}}, + {"type": "stub", "id": "stub_stage", "name": "Staging", "params": {"host": "stage.corp"}}, + ], + } + mock_loaders = {"stub": _StubLoader} + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + 
patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, _ = _build_source_specs() + + assert len(specs) == 2 + ids = {s.source_id for s in specs} + assert ids == {"stub_prod", "stub_stage"} + + def test_env_ref_resolution_in_yaml_params(self, monkeypatch): + monkeypatch.setenv("DB_PASSWORD", "s3cret") + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [ + {"type": "stub", "params": {"host": "db.corp", "password": "${DB_PASSWORD}"}}, + ], + } + mock_loaders = {"stub": _StubLoader} + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): + specs, _ = _build_source_specs() + + assert specs[0].default_params["password"] == "s3cret" + assert specs[0].default_params["host"] == "db.corp" + + +# ================================================================== +# Tests: register_data_connectors +# ================================================================== + +class TestRegisterConnectedSources: + + def test_registers_blueprints(self, app, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + mock_loaders = {"stub": _StubLoader} + mock_disabled = {} + yaml_config = { + "auto_discover": False, + "sources": [{"type": "stub", "name": "Test Stub"}], + } + + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders), \ + patch("data_formulator.data_loader.DISABLED_LOADERS", mock_disabled): + register_data_connectors(app) + + assert "stub" in DATA_CONNECTORS + rules = [rule.rule for rule in app.url_map.iter_rules()] + assert "/api/connectors/stub/auth/connect" in rules + + 
def test_skips_unknown_loader_type(self, app, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [{"type": "nonexistent"}], + } + + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", {}), \ + patch("data_formulator.data_loader.DISABLED_LOADERS", {}): + register_data_connectors(app) + + assert len(DATA_CONNECTORS) == 0 + + def test_logs_disabled_loaders(self, app, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + yaml_config = { + "auto_discover": False, + "sources": [{"type": "kusto"}], + } + + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", {}), \ + patch("data_formulator.data_loader.DISABLED_LOADERS", {"kusto": "pip install azure-kusto-data"}): + register_data_connectors(app) + + assert len(DATA_CONNECTORS) == 0 + + def test_frontend_config_in_sources(self, app, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) + monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) + + mock_loaders = {"stub": _StubLoader} + yaml_config = { + "auto_discover": False, + "sources": [ + {"type": "stub", "name": "My Stub", "params": {"host": "db.corp"}}, + ], + } + + with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \ + patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders), \ + patch("data_formulator.data_loader.DISABLED_LOADERS", {}): + register_data_connectors(app) + + source = DATA_CONNECTORS["stub"] + cfg = source.get_frontend_config() + assert cfg["name"] == "My Stub" + assert 
cfg["pinned_params"]["host"] == "db.corp" + # host should NOT be in form fields + form_names = {f["name"] for f in cfg["params_form"]} + assert "host" not in form_names + assert "database" in form_names diff --git a/tests/backend/unit/test_data_connector_framework.py b/tests/backend/unit/test_data_connector_framework.py new file mode 100644 index 00000000..f72f4bd6 --- /dev/null +++ b/tests/backend/unit/test_data_connector_framework.py @@ -0,0 +1,659 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""Unit tests for the DataConnector framework. + +Tests the generic lifecycle wrapper using a mock ExternalDataLoader — no +real database or network access required. + +Covers: +- Blueprint creation and route registration +- Auth routes: connect / disconnect / status +- Catalog routes: ls / metadata / list_tables +- Data routes: import / preview / refresh +- Error handling and safe error messages +- Identity-based loader isolation +- Frontend config generation +""" +from __future__ import annotations + +import json +from dataclasses import dataclass, field +from typing import Any +from unittest.mock import MagicMock, patch + +import flask +import pyarrow as pa +import pytest + +from data_formulator.data_connector import ( + DATA_CONNECTORS, + DataConnector, + SourceSpec, + _build_source_specs, + _node_to_dict, + _resolve_env_refs, + _sanitize_error, +) +from data_formulator.data_loader.external_data_loader import ( + CatalogNode, + ExternalDataLoader, +) + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + + +# ------------------------------------------------------------------ +# Mock loader +# ------------------------------------------------------------------ + +class MockLoader(ExternalDataLoader): + """In-memory loader for testing the DataConnector wrapper.""" + + _test_tables = { + "users": pa.table({ + "id": [1, 2, 3, 4, 5], + "name": ["Alice", "Bob", "Carol", "Dave", "Eve"], + "email": ["a@x.com", "b@x.com", "c@x.com", 
"d@x.com", "e@x.com"], + }), + "orders": pa.table({ + "id": [1, 2, 3], + "user_id": [1, 2, 1], + "amount": [100.0, 200.0, 50.0], + }), + } + + def __init__(self, params: dict[str, Any]): + self.params = params + self._connected = True + host = params.get("host", "") + if host == "bad-host": + raise ConnectionError("Connection refused") + + def test_connection(self) -> bool: + return self._connected + + @staticmethod + def catalog_hierarchy() -> list[dict[str, str]]: + return [ + {"key": "database", "label": "Database"}, + {"key": "schema", "label": "Schema"}, + {"key": "table", "label": "Table"}, + ] + + def ls(self, path=None, filter=None): + path = path or [] + eff = self.effective_hierarchy() + if len(path) >= len(eff): + return [] + level_key = eff[len(path)]["key"] + if level_key == "database": + return [CatalogNode("testdb", "namespace", ["testdb"])] + if level_key == "schema": + return [CatalogNode("public", "namespace", [*(path), "public"])] + if level_key == "table": + nodes = [] + for name in self._test_tables: + if filter and filter.lower() not in name.lower(): + continue + nodes.append(CatalogNode( + name, "table", [*path, name], + metadata={"row_count": self._test_tables[name].num_rows}, + )) + return nodes + return [] + + def get_metadata(self, path): + table_name = path[-1] if path else None + if table_name in self._test_tables: + t = self._test_tables[table_name] + return { + "name": table_name, + "row_count": t.num_rows, + "columns": [{"name": c, "type": str(t.schema.field(c).type)} for c in t.column_names], + } + return {} + + def list_tables(self, table_filter=None): + result = [] + for name, t in self._test_tables.items(): + if table_filter and table_filter.lower() not in name.lower(): + continue + result.append({ + "name": f"public.{name}", + "metadata": { + "columns": [{"name": c} for c in t.column_names], + "row_count": t.num_rows, + }, + }) + return result + + def fetch_data_as_arrow(self, source_table, import_options=None): + opts = 
import_options or {} + size = opts.get("size", 1000) + table_key = source_table.split(".")[-1] + t = self._test_tables.get(table_key) + if t is None: + raise ValueError(f"Table not found: {source_table}") + return t.slice(0, min(size, t.num_rows)) + + @staticmethod + def list_params(): + return [ + {"name": "host", "type": "string", "required": True, "description": "Host"}, + {"name": "port", "type": "number", "required": False, "default": 5432}, + {"name": "user", "type": "string", "required": True}, + {"name": "password", "type": "password", "required": True}, + {"name": "database", "type": "string", "required": False, "scope_level": True}, + {"name": "schema", "type": "string", "required": False, "scope_level": True}, + ] + + @staticmethod + def auth_instructions(): + return "Connect with host, user, password." + + +class FailingTestConnectionLoader(MockLoader): + """Loader whose test_connection always returns False.""" + + def test_connection(self) -> bool: + return False + + +# ------------------------------------------------------------------ +# Fixtures +# ------------------------------------------------------------------ + +@pytest.fixture +def app(): + """Minimal Flask app with a DataConnector for MockLoader.""" + _app = flask.Flask(__name__) + _app.config["TESTING"] = True + _app.secret_key = "test-secret" + return _app + + +@pytest.fixture +def source(): + """A DataConnector wrapping MockLoader.""" + return DataConnector.from_loader( + MockLoader, + source_id="mock_db", + display_name="Mock Database", + default_params={"host": "localhost"}, + icon="mock", + ) + + +@pytest.fixture +def source_pinned(): + """A DataConnector with database pre-pinned.""" + return DataConnector.from_loader( + MockLoader, + source_id="mock_pinned", + display_name="Mock Pinned", + default_params={"host": "localhost", "database": "testdb"}, + ) + + +@pytest.fixture +def client(app, source): + """Flask test client with source blueprint registered.""" + bp = 
source.create_blueprint() + app.register_blueprint(bp) + return app.test_client() + + +@pytest.fixture +def connected_client(client): + """Client that is already connected.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "test", "password": "test"}, + }) + assert resp.status_code == 200 + yield client + + +# ================================================================== +# Tests: Blueprint & Registration +# ================================================================== + +class TestBlueprintCreation: + + def test_blueprint_has_correct_prefix(self, source): + bp = source.create_blueprint() + assert bp.url_prefix == "/api/connectors/mock_db" + + def test_blueprint_registers_routes(self, app, source): + bp = source.create_blueprint() + app.register_blueprint(bp) + rules = [rule.rule for rule in app.url_map.iter_rules()] + assert "/api/connectors/mock_db/auth/connect" in rules + assert "/api/connectors/mock_db/auth/disconnect" in rules + assert "/api/connectors/mock_db/auth/status" in rules + assert "/api/connectors/mock_db/catalog/ls" in rules + assert "/api/connectors/mock_db/catalog/metadata" in rules + assert "/api/connectors/mock_db/catalog/list_tables" in rules + assert "/api/connectors/mock_db/data/import" in rules + assert "/api/connectors/mock_db/data/refresh" in rules + assert "/api/connectors/mock_db/data/preview" in rules + + +# ================================================================== +# Tests: Frontend Config +# ================================================================== + +class TestFrontendConfig: + + def test_frontend_config_structure(self, source): + cfg = source.get_frontend_config() + assert cfg["source_id"] == "mock_db" + assert cfg["name"] == "Mock Database" + assert cfg["icon"] == "mock" + assert "params_form" in cfg + assert "pinned_params" in cfg + assert "hierarchy" in cfg + 
assert "effective_hierarchy" in cfg + + def test_pinned_params_excluded_from_form(self, source): + cfg = source.get_frontend_config() + form_names = {f["name"] for f in cfg["params_form"]} + assert "host" not in form_names # host is pre-pinned + assert "user" in form_names + assert "password" in form_names + assert cfg["pinned_params"]["host"] == "localhost" + + def test_hierarchy_included(self, source): + cfg = source.get_frontend_config() + keys = [h["key"] for h in cfg["hierarchy"]] + assert keys == ["database", "schema", "table"] + + def test_pinned_source_effective_hierarchy(self, source_pinned): + cfg = source_pinned.get_frontend_config() + eff_keys = [h["key"] for h in cfg["effective_hierarchy"]] + assert "database" not in eff_keys + assert "schema" in eff_keys + assert "table" in eff_keys + assert cfg["pinned_params"]["database"] == "testdb" + + +# ================================================================== +# Tests: Auth Routes +# ================================================================== + +class TestAuthRoutes: + + def test_connect_success(self, client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "test", "password": "test"}, + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "connected" + assert "hierarchy" in data + assert "effective_hierarchy" in data + + def test_connect_merges_default_params(self, client): + """Default params (host=localhost) merged with user params.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"user": "test", "password": "test"}, + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "connected" + + def test_connect_bad_host_returns_error(self, app): + source = 
DataConnector.from_loader(MockLoader, source_id="mock_bad") + app.register_blueprint(source.create_blueprint()) + c = app.test_client() + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = c.post("/api/connectors/mock_bad/auth/connect", json={ + "params": {"host": "bad-host", "user": "x", "password": "x"}, + }) + assert resp.status_code in (400, 500, 502) + data = resp.get_json() + assert data["status"] == "error" + + def test_connect_fails_when_test_connection_fails(self, app): + source = DataConnector.from_loader( + FailingTestConnectionLoader, source_id="mock_fail" + ) + app.register_blueprint(source.create_blueprint()) + c = app.test_client() + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = c.post("/api/connectors/mock_fail/auth/connect", json={ + "params": {"host": "localhost", "user": "x", "password": "x"}, + }) + assert resp.status_code == 400 + assert resp.get_json()["status"] == "error" + + def test_disconnect(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/auth/disconnect") + assert resp.status_code == 200 + assert resp.get_json()["status"] == "disconnected" + + def test_status_connected(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.get("/api/connectors/mock_db/auth/status") + data = resp.get_json() + assert data["connected"] is True + assert "hierarchy" in data + + def test_status_not_connected(self, client): + with patch.object(DataConnector, "_get_identity", return_value="other-user"): + resp = client.get("/api/connectors/mock_db/auth/status") + data = resp.get_json() + assert data["connected"] is False + assert "params_form" in data + + def test_disconnect_then_status(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + 
connected_client.post("/api/connectors/mock_db/auth/disconnect") + resp = connected_client.get("/api/connectors/mock_db/auth/status") + data = resp.get_json() + assert data["connected"] is False + + def test_safe_params_exclude_password(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.get("/api/connectors/mock_db/auth/status") + data = resp.get_json() + assert "password" not in data.get("params", {}) + + +# ================================================================== +# Tests: Catalog Routes +# ================================================================== + +class TestCatalogRoutes: + + def test_ls_root(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={ + "path": [], + }) + data = resp.get_json() + assert resp.status_code == 200 + assert "nodes" in data + assert len(data["nodes"]) > 0 + # First level is "database" → namespace + assert data["nodes"][0]["node_type"] == "namespace" + + def test_ls_returns_hierarchy(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={"path": []}) + data = resp.get_json() + assert "hierarchy" in data + assert "effective_hierarchy" in data + assert data["hierarchy"][0]["key"] == "database" + + def test_ls_drill_down_to_tables(self, connected_client): + """Expand database → schema → tables.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + # Level 1: databases + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={"path": []}) + db_node = resp.get_json()["nodes"][0] + assert db_node["name"] == "testdb" + + # Level 2: schemas + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={ + "path": db_node["path"], + }) + schema_node 
= resp.get_json()["nodes"][0] + assert schema_node["name"] == "public" + + # Level 3: tables + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={ + "path": schema_node["path"], + }) + tables = resp.get_json()["nodes"] + table_names = {t["name"] for t in tables} + assert "users" in table_names + assert "orders" in table_names + for t in tables: + assert t["node_type"] == "table" + + def test_ls_with_filter(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={ + "path": ["testdb", "public"], + "filter": "user", + }) + tables = resp.get_json()["nodes"] + assert len(tables) == 1 + assert tables[0]["name"] == "users" + + def test_ls_not_connected_returns_error(self, client): + with patch.object(DataConnector, "_get_identity", return_value="nobody"): + resp = client.post("/api/connectors/mock_db/catalog/ls", json={"path": []}) + assert resp.status_code in (400, 500, 502) + + def test_catalog_metadata(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/metadata", json={ + "path": ["testdb", "public", "users"], + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["metadata"]["name"] == "users" + assert data["metadata"]["row_count"] == 5 + assert len(data["metadata"]["columns"]) == 3 + + def test_list_tables_flat(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/list_tables", json={}) + data = resp.get_json() + assert resp.status_code == 200 + assert len(data["tables"]) == 2 + names = {t["name"] for t in data["tables"]} + assert "public.users" in names + assert "public.orders" in names + + def test_list_tables_with_filter(self, connected_client): + with 
patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/catalog/list_tables", json={ + "filter": "order", + }) + data = resp.get_json() + assert len(data["tables"]) == 1 + assert "orders" in data["tables"][0]["name"] + + +# ================================================================== +# Tests: Data Routes +# ================================================================== + +class TestDataRoutes: + + def test_preview(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/data/preview", json={ + "source_table": "public.users", + "size": 3, + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "success" + assert data["row_count"] <= 3 + col_names = {c["name"] for c in data["columns"]} + assert "id" in col_names + assert "name" in col_names + + def test_preview_missing_source_table(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/data/preview", json={}) + assert resp.status_code == 400 + + def test_import_requires_source_table(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/data/import", json={}) + assert resp.status_code == 400 + + def test_import_success(self, connected_client): + """Import calls ingest_to_workspace and returns metadata.""" + mock_meta = MagicMock() + mock_meta.name = "users" + mock_meta.row_count = 5 + + with patch.object(DataConnector, "_get_identity", return_value="test-user"), \ + patch("data_formulator.security.auth.get_identity_id", return_value="test-user"), \ + patch("data_formulator.workspace_factory.get_workspace") as mock_ws, \ + patch.object(MockLoader, "ingest_to_workspace", return_value=mock_meta): + 
resp = connected_client.post("/api/connectors/mock_db/data/import", json={ + "source_table": "public.users", + "table_name": "users", + }) + data = resp.get_json() + assert resp.status_code == 200 + assert data["status"] == "success" + assert data["table_name"] == "users" + assert data["row_count"] == 5 + assert data["refreshable"] is True + + def test_refresh_requires_table_name(self, connected_client): + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = connected_client.post("/api/connectors/mock_db/data/refresh", json={}) + assert resp.status_code == 400 + + +# ================================================================== +# Tests: Error Handling +# ================================================================== + +class TestErrorHandling: + + def test_sanitize_error_connection_refused(self): + msg, code = _sanitize_error(ConnectionError("Connection refused")) + assert code == 502 + assert "Connection failed" in msg + + def test_sanitize_error_permission(self): + msg, code = _sanitize_error(PermissionError("access denied for user")) + assert code == 403 + + def test_sanitize_error_invalid_params(self): + msg, code = _sanitize_error(ValueError("host is required")) + assert code == 400 + + def test_sanitize_error_unknown(self): + msg, code = _sanitize_error(RuntimeError("something weird happened")) + assert code == 500 + # Should not leak the original message + assert "unexpected" in msg.lower() + + def test_error_does_not_leak_internal_details(self, client): + """Errors from loader should not expose connection strings or stack traces.""" + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "bad-host", "user": "x", "password": "secret123"}, + }) + data = resp.get_json() + assert "secret123" not in json.dumps(data) + + +# ================================================================== +# Tests: Identity 
Isolation +# ================================================================== + +class TestIdentityIsolation: + + def test_different_identities_have_separate_loaders(self, client): + """Two users connecting to the same source get separate loader instances.""" + with patch.object(DataConnector, "_get_identity", return_value="user-a"): + client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "A", "password": "A"}, + }) + with patch.object(DataConnector, "_get_identity", return_value="user-b"): + client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "B", "password": "B"}, + }) + # Both should be connected + with patch.object(DataConnector, "_get_identity", return_value="user-a"): + resp = client.get("/api/connectors/mock_db/auth/status") + assert resp.get_json()["connected"] is True + with patch.object(DataConnector, "_get_identity", return_value="user-b"): + resp = client.get("/api/connectors/mock_db/auth/status") + assert resp.get_json()["connected"] is True + + def test_disconnect_does_not_affect_other_user(self, client): + with patch.object(DataConnector, "_get_identity", return_value="user-a"): + client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "A", "password": "A"}, + }) + with patch.object(DataConnector, "_get_identity", return_value="user-b"): + client.post("/api/connectors/mock_db/auth/connect", json={ + "params": {"host": "localhost", "user": "B", "password": "B"}, + }) + # Disconnect user-a + with patch.object(DataConnector, "_get_identity", return_value="user-a"): + client.post("/api/connectors/mock_db/auth/disconnect") + resp = client.get("/api/connectors/mock_db/auth/status") + assert resp.get_json()["connected"] is False + # user-b should still be connected + with patch.object(DataConnector, "_get_identity", return_value="user-b"): + resp = client.get("/api/connectors/mock_db/auth/status") + assert 
resp.get_json()["connected"] is True + + +# ================================================================== +# Tests: Scope Pinning +# ================================================================== + +class TestScopePinning: + + def test_pinned_database_skips_database_level(self, app, source_pinned): + """When database is pinned, ls([]) should start at schema level.""" + app.register_blueprint(source_pinned.create_blueprint()) + c = app.test_client() + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + c.post("/api/connectors/mock_pinned/auth/connect", json={ + "params": {"user": "test", "password": "test"}, + }) + resp = c.post("/api/connectors/mock_pinned/catalog/ls", json={"path": []}) + data = resp.get_json() + # Should skip database level and show schemas directly + eff_keys = [h["key"] for h in data["effective_hierarchy"]] + assert "database" not in eff_keys + nodes = data["nodes"] + assert len(nodes) > 0 + # First node should be a schema namespace + assert nodes[0]["node_type"] == "namespace" + + def test_pinned_scope_in_connect_response(self, app, source_pinned): + app.register_blueprint(source_pinned.create_blueprint()) + c = app.test_client() + with patch.object(DataConnector, "_get_identity", return_value="test-user"): + resp = c.post("/api/connectors/mock_pinned/auth/connect", json={ + "params": {"user": "test", "password": "test"}, + }) + data = resp.get_json() + assert data["pinned_scope"]["database"] == "testdb" + + +# ================================================================== +# Tests: Helpers +# ================================================================== + +class TestHelpers: + + def test_node_to_dict(self): + node = CatalogNode("users", "table", ["db", "public", "users"], {"row_count": 5}) + d = _node_to_dict(node) + assert d["name"] == "users" + assert d["node_type"] == "table" + assert d["path"] == ["db", "public", "users"] + assert d["metadata"]["row_count"] == 5 + + def 
test_resolve_env_refs(self, monkeypatch): + monkeypatch.setenv("MY_SECRET", "hunter2") + result = _resolve_env_refs({"password": "${MY_SECRET}", "host": "localhost"}) + assert result["password"] == "hunter2" + assert result["host"] == "localhost" + + def test_resolve_env_refs_missing(self, monkeypatch): + monkeypatch.delenv("NONEXISTENT_VAR", raising=False) + result = _resolve_env_refs({"password": "${NONEXISTENT_VAR}"}) + assert result["password"] == "" diff --git a/tests/backend/unit/test_data_connector_vault.py b/tests/backend/unit/test_data_connector_vault.py new file mode 100644 index 00000000..380c556c --- /dev/null +++ b/tests/backend/unit/test_data_connector_vault.py @@ -0,0 +1,450 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""Unit tests for DataConnector credential vault integration. + +Tests: +- Credentials stored in vault on connect +- Credentials deleted from vault on disconnect +- Auto-reconnect from vault when in-memory loader is missing +- Stale vault credentials cleaned up when connection test fails +- has_stored_credentials() reflects vault state +- Vault unavailable: graceful fallback to session-only +- DATA_CONNECTORS in app-config response +""" +from __future__ import annotations + +import json +from typing import Any +from unittest.mock import MagicMock, patch + +import flask +import pyarrow as pa +import pytest + +from data_formulator.data_connector import ( + DATA_CONNECTORS, + DataConnector, +) +from data_formulator.credential_vault.base import CredentialVault +from data_formulator.data_loader.external_data_loader import ( + CatalogNode, + ExternalDataLoader, +) + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + +IDENTITY = "user:alice@test.com" + + +# ------------------------------------------------------------------ +# Mock loader (minimal) +# ------------------------------------------------------------------ + +class MockLoader(ExternalDataLoader): + """In-memory loader for testing vault 
integration.""" + + def __init__(self, params: dict[str, Any]): + self.params = params + self._connected = True + + def test_connection(self) -> bool: + return self._connected + + @staticmethod + def catalog_hierarchy(): + return [{"key": "database", "label": "Database"}] + + def ls(self, path=None, filter=None): + return [] + + def get_metadata(self, path): + return {} + + def list_tables(self, table_filter=None): + return [{"name": "t1", "metadata": {"columns": [], "row_count": 0}}] + + def fetch_data_as_arrow(self, source_table, import_options=None): + return pa.table({"a": [1, 2, 3]}) + + @staticmethod + def list_params(): + return [ + {"name": "host", "type": "string", "required": True}, + {"name": "password", "type": "password", "required": True}, + ] + + @staticmethod + def auth_instructions(): + return "Connect with host and password." + + +# ------------------------------------------------------------------ +# Mock vault +# ------------------------------------------------------------------ + +class InMemoryVault(CredentialVault): + """In-memory vault for testing.""" + + def __init__(self): + self._store: dict[tuple[str, str], dict] = {} + + def store(self, user_id: str, source_key: str, credentials: dict) -> None: + self._store[(user_id, source_key)] = credentials + + def retrieve(self, user_id: str, source_key: str): + return self._store.get((user_id, source_key)) + + def delete(self, user_id: str, source_key: str) -> None: + self._store.pop((user_id, source_key), None) + + def list_sources(self, user_id: str) -> list[str]: + return [sk for (uid, sk) in self._store if uid == user_id] + + +# ------------------------------------------------------------------ +# Fixtures +# ------------------------------------------------------------------ + +@pytest.fixture +def vault(): + return InMemoryVault() + + +@pytest.fixture +def source(): + return DataConnector.from_loader( + MockLoader, + source_id="test_db", + display_name="Test DB", + default_params={"host": 
"localhost"}, + ) + + +@pytest.fixture +def app(source): + _app = flask.Flask(__name__) + _app.config["TESTING"] = True + _app.secret_key = "test-secret" + bp = source.create_blueprint() + _app.register_blueprint(bp) + return _app + + +@pytest.fixture +def client(app): + return app.test_client() + + +# ================================================================== +# Tests: Vault store/retrieve/delete helpers +# ================================================================== + +class TestVaultHelpers: + + def test_vault_store_and_retrieve(self, source, vault): + with patch.object(DataConnector, "_get_vault", return_value=vault): + stored = source._vault_store(IDENTITY, {"host": "db", "password": "secret"}) + assert stored is True + retrieved = source._vault_retrieve(IDENTITY) + assert retrieved == {"host": "db", "password": "secret"} + + def test_vault_retrieve_when_empty(self, source, vault): + with patch.object(DataConnector, "_get_vault", return_value=vault): + assert source._vault_retrieve(IDENTITY) is None + + def test_vault_delete(self, source, vault): + with patch.object(DataConnector, "_get_vault", return_value=vault): + source._vault_store(IDENTITY, {"host": "db"}) + source._vault_delete(IDENTITY) + assert source._vault_retrieve(IDENTITY) is None + + def test_vault_unavailable_returns_false(self, source): + with patch.object(DataConnector, "_get_vault", return_value=None): + assert source._vault_store(IDENTITY, {"host": "db"}) is False + assert source._vault_retrieve(IDENTITY) is None + + def test_has_stored_credentials(self, source, vault): + with patch.object(DataConnector, "_get_vault", return_value=vault): + assert source.has_stored_credentials(IDENTITY) is False + source._vault_store(IDENTITY, {"host": "db"}) + assert source.has_stored_credentials(IDENTITY) is True + + def test_vault_exception_is_caught(self, source): + """Vault errors should be logged but not propagated.""" + broken_vault = MagicMock(spec=CredentialVault) + 
broken_vault.store.side_effect = RuntimeError("disk full") + broken_vault.retrieve.side_effect = RuntimeError("disk full") + broken_vault.delete.side_effect = RuntimeError("disk full") + broken_vault.list_sources.side_effect = RuntimeError("disk full") + + with patch.object(DataConnector, "_get_vault", return_value=broken_vault): + assert source._vault_store(IDENTITY, {"x": 1}) is False + assert source._vault_retrieve(IDENTITY) is None + source._vault_delete(IDENTITY) # should not raise + assert source.has_stored_credentials(IDENTITY) is False + + +# ================================================================== +# Tests: Connect stores credentials +# ================================================================== + +class TestConnectStoresCredentials: + + def test_connect_does_not_auto_persist(self, source, vault): + """_connect no longer stores to vault; caller must persist explicitly.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + source._connect({"password": "secret"}) + + assert vault.retrieve(IDENTITY, "test_db") is None + # But loader should be in memory + assert source._get_loader(IDENTITY) is not None + + def test_persist_credentials_stores_in_vault(self, source, vault): + """_persist_credentials stores after explicit call.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + source._connect({"password": "secret"}) + result = source._persist_credentials({"password": "secret"}) + assert result is True + + stored = vault.retrieve(IDENTITY, "test_db") + assert stored is not None + assert stored["user_params"]["password"] == "secret" + + def test_connect_via_route_stores_in_vault(self, client, source, vault): + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + resp = 
client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + }) + data = resp.get_json() + assert data["status"] == "connected" + assert data["persisted"] is True + + stored = vault.retrieve(IDENTITY, "test_db") + assert stored is not None + + def test_connect_via_route_persist_false(self, client, source, vault): + """Route with persist=false should not store in vault.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + resp = client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + "persist": False, + }) + data = resp.get_json() + assert data["status"] == "connected" + assert data["persisted"] is False + + assert vault.retrieve(IDENTITY, "test_db") is None + + def test_connect_persist_false_clears_old_vault_entry(self, client, source, vault): + """Reconnecting with persist=false should delete previously stored credentials.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + # First connect with persist=true + resp = client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + "persist": True, + }) + assert resp.get_json()["persisted"] is True + assert vault.retrieve(IDENTITY, "test_db") is not None + + # Reconnect with persist=false — old entry must be deleted + resp = client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + "persist": False, + }) + assert resp.get_json()["persisted"] is False + assert vault.retrieve(IDENTITY, "test_db") is None + + +# ================================================================== +# Tests: Disconnect deletes credentials +# ================================================================== + +class TestDisconnectDeletesCredentials: + + def test_disconnect_clears_vault(self, source, vault): 
+ with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + source._connect({"password": "secret"}) + source._persist_credentials({"password": "secret"}) + assert vault.retrieve(IDENTITY, "test_db") is not None + + source._disconnect() + assert vault.retrieve(IDENTITY, "test_db") is None + assert source._get_loader(IDENTITY) is None + + def test_disconnect_via_route_clears_vault(self, client, source, vault): + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + # Connect first + client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + }) + assert vault.retrieve(IDENTITY, "test_db") is not None + + # Disconnect + resp = client.post("/api/connectors/test_db/auth/disconnect") + assert resp.get_json()["status"] == "disconnected" + assert vault.retrieve(IDENTITY, "test_db") is None + + +# ================================================================== +# Tests: Auto-reconnect from vault +# ================================================================== + +class TestAutoReconnect: + + def test_require_loader_auto_reconnects(self, source, vault): + """When in-memory loader is gone but vault has creds, auto-reconnect.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + # Store credentials directly in vault + vault.store(IDENTITY, "test_db", { + "user_params": {"password": "secret"}, + "source_id": "test_db", + }) + # No in-memory loader + assert source._get_loader(IDENTITY) is None + + # _require_loader should auto-reconnect + loader = source._require_loader() + assert loader is not None + assert loader.test_connection() is True + # Should now be in memory too + assert source._get_loader(IDENTITY) is not None + + def test_auto_reconnect_cleans_stale_creds(self, 
source, vault): + """If stored credentials fail to connect, delete them.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + # Store creds that will fail (test_connection patched to False) + vault.store(IDENTITY, "test_db", { + "user_params": {"password": "wrong"}, + "source_id": "test_db", + }) + + # Patch MockLoader.test_connection to fail + with patch.object(MockLoader, "test_connection", return_value=False): + loader = source._try_auto_reconnect(IDENTITY) + assert loader is None + # Stale credentials should be deleted + assert vault.retrieve(IDENTITY, "test_db") is None + + def test_auto_reconnect_exception_cleans_stale_creds(self, source, vault): + """If auto-reconnect raises, delete stale creds.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + vault.store(IDENTITY, "test_db", { + "user_params": {"password": "x"}, + "source_id": "test_db", + }) + + with patch.object(MockLoader, "__init__", side_effect=RuntimeError("bad creds")): + loader = source._try_auto_reconnect(IDENTITY) + assert loader is None + assert vault.retrieve(IDENTITY, "test_db") is None + + def test_auth_status_triggers_auto_reconnect(self, client, source, vault): + """GET /auth/status should auto-reconnect from vault if no in-memory loader.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=vault): + # Store credentials in vault (simulating a previous session) + vault.store(IDENTITY, "test_db", { + "user_params": {"password": "secret"}, + "source_id": "test_db", + }) + # Clear any in-memory loader + source._loaders.clear() + + resp = client.get("/api/connectors/test_db/auth/status") + data = resp.get_json() + assert data["connected"] is True + assert data["persisted"] is True + + def 
test_auth_status_not_connected_no_vault(self, client, source): + """GET /auth/status with no loader and no vault = not connected.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=None): + source._loaders.clear() + resp = client.get("/api/connectors/test_db/auth/status") + data = resp.get_json() + assert data["connected"] is False + assert data.get("has_stored_credentials") is False + + +# ================================================================== +# Tests: Vault unavailable (graceful fallback) +# ================================================================== + +class TestNoVaultFallback: + + def test_connect_without_vault(self, source): + """Connection works fine without vault, just in-memory.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=None): + loader = source._connect({"password": "secret"}) + assert loader is not None + assert source._get_loader(IDENTITY) is not None + + def test_connect_route_without_vault_not_persisted(self, client, source): + """Route returns persisted=False when no vault.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=None): + resp = client.post("/api/connectors/test_db/auth/connect", json={ + "params": {"password": "secret"}, + }) + data = resp.get_json() + assert data["status"] == "connected" + assert data["persisted"] is False + + def test_require_loader_no_vault_raises(self, source): + """Without vault and without in-memory loader, require_loader raises.""" + with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \ + patch.object(DataConnector, "_get_vault", return_value=None): + source._loaders.clear() + with pytest.raises(ValueError, match="Not connected"): + source._require_loader() + + +# 
================================================================== +# Tests: Identity isolation +# ================================================================== + +class TestIdentityIsolation: + + def test_different_users_separate_vault_entries(self, source, vault): + """Two users connecting store separate vault entries.""" + with patch.object(DataConnector, "_get_vault", return_value=vault): + with patch.object(DataConnector, "_get_identity", return_value="user:alice"): + source._connect({"password": "alice-pw"}) + source._persist_credentials({"password": "alice-pw"}) + + with patch.object(DataConnector, "_get_identity", return_value="user:bob"): + source._connect({"password": "bob-pw"}) + source._persist_credentials({"password": "bob-pw"}) + + alice_creds = vault.retrieve("user:alice", "test_db") + bob_creds = vault.retrieve("user:bob", "test_db") + assert alice_creds["user_params"]["password"] == "alice-pw" + assert bob_creds["user_params"]["password"] == "bob-pw" + + def test_disconnect_only_affects_own_user(self, source, vault): + """Disconnecting one user doesn't affect another.""" + with patch.object(DataConnector, "_get_vault", return_value=vault): + with patch.object(DataConnector, "_get_identity", return_value="user:alice"): + source._connect({"password": "alice-pw"}) + source._persist_credentials({"password": "alice-pw"}) + with patch.object(DataConnector, "_get_identity", return_value="user:bob"): + source._connect({"password": "bob-pw"}) + source._persist_credentials({"password": "bob-pw"}) + + with patch.object(DataConnector, "_get_identity", return_value="user:alice"): + source._disconnect() + + assert vault.retrieve("user:alice", "test_db") is None + assert vault.retrieve("user:bob", "test_db") is not None diff --git a/tests/plugin/test_mysql/test_mysql_data_connector.py b/tests/plugin/test_mysql/test_mysql_data_connector.py new file mode 100644 index 00000000..d158a6de --- /dev/null +++ b/tests/plugin/test_mysql/test_mysql_data_connector.py @@ 
-0,0 +1,263 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""End-to-end integration tests for MySQL via DataConnector routes. + +Tests the full lifecycle: connect → browse hierarchy → scope pinning → +import → preview → refresh → disconnect. + +Requires MySQL running (e.g. ./tests/run_test_dbs.sh start mysql). +Environment: MYSQL_HOST, MYSQL_PORT (default 3307), MYSQL_USER, MYSQL_PASSWORD, MYSQL_DATABASE. +""" +from __future__ import annotations + +import os +import shutil +import tempfile +import unittest +from pathlib import Path +from typing import Any, Dict +from unittest.mock import patch + +import flask +import pytest + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + + +def get_mysql_config() -> Dict[str, Any]: + return { + "host": os.getenv("MYSQL_HOST", "localhost"), + "port": os.getenv("MYSQL_PORT", "3307"), + "user": os.getenv("MYSQL_USER", "root"), + "password": os.getenv("MYSQL_PASSWORD", "rootpassword"), + "database": os.getenv("MYSQL_DATABASE", "testdb"), + } + + +def mysql_available() -> bool: + import socket + cfg = get_mysql_config() + host = cfg.get("host", "localhost") + port = int(cfg.get("port", "3307")) + if host in ("localhost", "127.0.0.1"): + host = "127.0.0.1" + sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + sock.settimeout(2) + try: + sock.connect((host, port)) + sock.close() + return True + except (socket.error, OSError): + return False + + +def _make_app_and_client(source_id="mysql", default_params=None): + from data_formulator.data_connector import DataConnector + from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader + + app = flask.Flask(__name__) + app.config["TESTING"] = True + app.secret_key = "test-secret" + + source = DataConnector.from_loader( + MySQLDataLoader, + source_id=source_id, + display_name="Test MySQL", + default_params=default_params or {}, + ) + app.register_blueprint(source.create_blueprint()) + return app, app.test_client(), source + + 
+@unittest.skipUnless( + mysql_available(), + "MySQL not available (start with ./tests/run_test_dbs.sh start mysql).", +) +class TestMySQLConnectedSourceE2E(unittest.TestCase): + """End-to-end tests: DataConnector routes → real MySQL.""" + + def setUp(self): + self._workspace_root = None + self._identity = "test-user-mysql-e2e" + + def tearDown(self): + if self._workspace_root and Path(self._workspace_root).exists(): + shutil.rmtree(self._workspace_root, ignore_errors=True) + + def _workspace_root_path(self): + if self._workspace_root is None: + self._workspace_root = tempfile.mkdtemp(prefix="df_test_mysql_e2e_") + return self._workspace_root + + # ============================================================== + # Auth lifecycle + # ============================================================== + + def test_connect_success(self): + app, client, source = _make_app_and_client() + cfg = get_mysql_config() + with patch.object(type(source), "_get_identity", return_value=self._identity): + resp = client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + self.assertEqual(resp.status_code, 200) + data = resp.get_json() + self.assertEqual(data["status"], "connected") + # MySQL hierarchy: database → table + keys = [h["key"] for h in data["hierarchy"]] + self.assertEqual(keys, ["database", "table"]) + + def test_disconnect_and_status(self): + app, client, source = _make_app_and_client() + cfg = get_mysql_config() + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/mysql/auth/disconnect") + self.assertEqual(resp.get_json()["status"], "disconnected") + + resp = client.get("/api/connectors/mysql/auth/status") + self.assertFalse(resp.get_json()["connected"]) + + # ============================================================== + # Catalog browsing — database pinned (default test config) + # 
============================================================== + + def test_browse_with_database_pinned(self): + """With database param set, ls([]) should show tables directly.""" + cfg = get_mysql_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + resp = client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + data = resp.get_json() + # database is pinned → effective_hierarchy only has "table" + eff_keys = [h["key"] for h in data["effective_hierarchy"]] + self.assertNotIn("database", eff_keys) + self.assertIn("table", eff_keys) + + # ls([]) should return tables directly + resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": []}) + nodes = resp.get_json()["nodes"] + self.assertTrue(len(nodes) > 0) + table_names = [n["name"] for n in nodes] + self.assertIn("products", table_names) + for n in nodes: + self.assertEqual(n["node_type"], "table") + + def test_browse_without_database_pinned(self): + """Without database param, ls([]) should list databases first.""" + cfg = get_mysql_config() + unpinned_cfg = {k: v for k, v in cfg.items() if k != "database"} + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + resp = client.post("/api/connectors/mysql/auth/connect", json={ + "params": unpinned_cfg, + }) + data = resp.get_json() + # Should not be pinned + eff_keys = [h["key"] for h in data["effective_hierarchy"]] + self.assertIn("database", eff_keys) + + # ls([]) → databases + resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": []}) + nodes = resp.get_json()["nodes"] + db_names = [n["name"] for n in nodes] + self.assertIn("testdb", db_names) + for n in nodes: + self.assertEqual(n["node_type"], "namespace") + + # ls(["testdb"]) → tables + resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": ["testdb"]}) + nodes = resp.get_json()["nodes"] + 
table_names = [n["name"] for n in nodes] + self.assertIn("products", table_names) + + # ============================================================== + # Data preview + import + # ============================================================== + + def test_preview(self): + cfg = get_mysql_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/mysql/data/preview", json={ + "source_table": "products", + "size": 5, + }) + data = resp.get_json() + self.assertEqual(data["status"], "success") + self.assertLessEqual(data["row_count"], 5) + + def test_import_and_refresh(self): + cfg = get_mysql_config() + app, client, source = _make_app_and_client() + workspace_root = self._workspace_root_path() + + from data_formulator.datalake.workspace import Workspace + workspace = Workspace(self._identity, root_dir=workspace_root) + + with patch.object(type(source), "_get_identity", return_value=self._identity), \ + patch("data_formulator.data_connector.get_identity_id", return_value=self._identity), \ + patch("data_formulator.data_connector.get_workspace", return_value=workspace): + client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + + resp = client.post("/api/connectors/mysql/data/import", json={ + "source_table": "products", + "table_name": "mysql_products", + "import_options": {"size": 50}, + }) + data = resp.get_json() + self.assertEqual(data["status"], "success") + self.assertEqual(data["table_name"], "mysql_products") + self.assertGreater(data["row_count"], 0) + + # Verify workspace + self.assertIn("mysql_products", workspace.list_tables()) + + # Refresh + resp = client.post("/api/connectors/mysql/data/refresh", json={ + "table_name": "mysql_products", + }) + self.assertEqual(resp.get_json()["status"], "success") + + # 
============================================================== + # Flat listing + # ============================================================== + + def test_list_tables_flat(self): + cfg = get_mysql_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/mysql/catalog/list_tables", json={}) + data = resp.get_json() + names = [t["name"] for t in data["tables"]] + self.assertTrue(any("products" in n for n in names)) + + +class TestMySQLConnectedSourceStatic(unittest.TestCase): + + def test_hierarchy(self): + from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader + h = MySQLDataLoader.catalog_hierarchy() + keys = [l["key"] for l in h] + self.assertEqual(keys, ["database", "table"]) + + def test_frontend_config_with_pinned_database(self): + from data_formulator.data_connector import DataConnector + from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader + + source = DataConnector.from_loader( + MySQLDataLoader, + source_id="mysql_test", + display_name="MySQL Test", + default_params={"host": "db.corp", "database": "analytics"}, + ) + cfg = source.get_frontend_config() + self.assertEqual(cfg["pinned_params"]["database"], "analytics") + eff_keys = [h["key"] for h in cfg["effective_hierarchy"]] + self.assertNotIn("database", eff_keys) + self.assertIn("table", eff_keys) diff --git a/tests/plugin/test_postgres/test_postgresql_data_connector.py b/tests/plugin/test_postgres/test_postgresql_data_connector.py new file mode 100644 index 00000000..426f8b18 --- /dev/null +++ b/tests/plugin/test_postgres/test_postgresql_data_connector.py @@ -0,0 +1,430 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""End-to-end integration tests for PostgreSQL via DataConnector routes. 
+ +Tests the full lifecycle: connect → browse hierarchy → scope pinning → +import → preview → refresh → disconnect → reconnect. + +Requires PostgreSQL running (e.g. ./tests/run_test_dbs.sh start postgres). +Environment: PG_HOST, PG_PORT (default 5433), PG_USER, PG_PASSWORD, PG_DATABASE. +""" +from __future__ import annotations + +import os +import shutil +import tempfile +import unittest +from pathlib import Path +from typing import Any, Dict +from unittest.mock import patch + +import flask +import pytest + +pytestmark = [pytest.mark.backend, pytest.mark.plugin] + +# ------------------------------------------------------------------ +# Helpers +# ------------------------------------------------------------------ + +def get_pg_config() -> Dict[str, Any]: + return { + "host": os.getenv("PG_HOST", "localhost"), + "port": os.getenv("PG_PORT", "5433"), + "user": os.getenv("PG_USER", "postgres"), + "password": os.getenv("PG_PASSWORD", "postgres"), + "database": os.getenv("PG_DATABASE", "testdb"), + } + + +def postgres_available() -> bool: + import socket + cfg = get_pg_config() + host = cfg.get("host", "localhost") + port = int(cfg.get("port", "5433")) + if host in ("localhost", "127.0.0.1"): + host = "127.0.0.1" + sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + sock.settimeout(2) + try: + sock.connect((host, port)) + sock.close() + return True + except (socket.error, OSError): + return False + + +def _make_app_and_client(source_id="postgresql", default_params=None): + """Create a Flask test app with DataConnector for PostgreSQL.""" + from data_formulator.data_connector import DataConnector, DATA_CONNECTORS + from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader + + app = flask.Flask(__name__) + app.config["TESTING"] = True + app.secret_key = "test-secret-key" + + source = DataConnector.from_loader( + PostgreSQLDataLoader, + source_id=source_id, + display_name="Test PostgreSQL", + default_params=default_params or {}, + ) + bp = 
source.create_blueprint() + app.register_blueprint(bp) + return app, app.test_client(), source + + +# ------------------------------------------------------------------ +# Tests +# ------------------------------------------------------------------ + +@unittest.skipUnless( + postgres_available(), + "PostgreSQL not available (start with ./tests/run_test_dbs.sh start postgres).", +) +class TestPostgreSQLConnectedSourceE2E(unittest.TestCase): + """End-to-end tests: DataConnector routes → real PostgreSQL.""" + + def setUp(self): + self._workspace_root = None + self._identity = "test-user-pg-e2e" + + def tearDown(self): + if self._workspace_root and Path(self._workspace_root).exists(): + shutil.rmtree(self._workspace_root, ignore_errors=True) + + def _workspace_root_path(self): + if self._workspace_root is None: + self._workspace_root = tempfile.mkdtemp(prefix="df_test_pg_e2e_") + return self._workspace_root + + # ============================================================== + # Auth lifecycle + # ============================================================== + + def test_connect_and_status(self): + """Connect → status shows connected with hierarchy.""" + app, client, source = _make_app_and_client() + cfg = get_pg_config() + with patch.object(type(source), "_get_identity", return_value=self._identity): + resp = client.post("/api/connectors/postgresql/auth/connect", json={ + "params": cfg, + }) + self.assertEqual(resp.status_code, 200) + data = resp.get_json() + self.assertEqual(data["status"], "connected") + self.assertIn("hierarchy", data) + # PostgreSQL: database → schema → table + keys = [h["key"] for h in data["hierarchy"]] + self.assertEqual(keys, ["database", "schema", "table"]) + + # Status should show connected + resp = client.get("/api/connectors/postgresql/auth/status") + self.assertEqual(resp.status_code, 200) + self.assertTrue(resp.get_json()["connected"]) + + def test_connect_bad_credentials(self): + """Bad credentials return error without leaking 
secrets.""" + app, client, source = _make_app_and_client() + cfg = get_pg_config() + cfg["password"] = "wrong-password-xyz" + with patch.object(type(source), "_get_identity", return_value=self._identity): + resp = client.post("/api/connectors/postgresql/auth/connect", json={ + "params": cfg, + }) + self.assertIn(resp.status_code, (400, 500, 502)) + data = resp.get_json() + self.assertEqual(data["status"], "error") + # Must NOT leak the password + import json + self.assertNotIn("wrong-password-xyz", json.dumps(data)) + + def test_disconnect_and_reconnect(self): + app, client, source = _make_app_and_client() + cfg = get_pg_config() + with patch.object(type(source), "_get_identity", return_value=self._identity): + # Connect + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + + # Disconnect + resp = client.post("/api/connectors/postgresql/auth/disconnect") + self.assertEqual(resp.get_json()["status"], "disconnected") + + # Status shows disconnected + resp = client.get("/api/connectors/postgresql/auth/status") + self.assertFalse(resp.get_json()["connected"]) + + # Reconnect + resp = client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + self.assertEqual(resp.get_json()["status"], "connected") + + # ============================================================== + # Catalog browsing — full hierarchy + # ============================================================== + + def test_browse_full_hierarchy(self): + """Connect without database pinned → browse database → schema → table.""" + cfg = get_pg_config() + unpinned_cfg = {k: v for k, v in cfg.items() if k != "database"} + app, client, source = _make_app_and_client(default_params={}) + + with patch.object(type(source), "_get_identity", return_value=self._identity): + # Connect without database pinned + client.post("/api/connectors/postgresql/auth/connect", json={"params": unpinned_cfg}) + + # Level 1: list databases + resp = 
client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + db_names = [n["name"] for n in data["nodes"]] + self.assertIn("testdb", db_names) + for node in data["nodes"]: + self.assertEqual(node["node_type"], "namespace") + + # Level 2: list schemas in testdb + resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": ["testdb"]}) + schemas = resp.get_json()["nodes"] + schema_names = [s["name"] for s in schemas] + self.assertIn("sample", schema_names) + self.assertIn("public", schema_names) + + # Level 3: list tables in sample schema + resp = client.post("/api/connectors/postgresql/catalog/ls", json={ + "path": ["testdb", "sample"], + }) + tables = resp.get_json()["nodes"] + table_names = [t["name"] for t in tables] + self.assertIn("products", table_names) + self.assertIn("customers", table_names) + for t in tables: + self.assertEqual(t["node_type"], "table") + + # ============================================================== + # Catalog browsing — scope pinning + # ============================================================== + + def test_scope_pinning_database(self): + """Connect with database pinned → ls([]) starts at schema level.""" + cfg = get_pg_config() + app, client, source = _make_app_and_client( + default_params={"host": cfg["host"], "port": cfg["port"], "database": cfg["database"]}, + ) + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={ + "params": {"user": cfg["user"], "password": cfg["password"]}, + }) + + # Connect response should show pinned scope + resp = client.get("/api/connectors/postgresql/auth/status") + status = resp.get_json() + eff_keys = [h["key"] for h in status["effective_hierarchy"]] + self.assertNotIn("database", eff_keys) + self.assertIn("schema", eff_keys) + self.assertEqual(status["pinned_scope"]["database"], cfg["database"]) + + # ls([]) 
should show schemas, not databases + resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) + nodes = resp.get_json()["nodes"] + self.assertTrue(len(nodes) > 0) + # These should be schemas (namespace) not databases + for n in nodes: + self.assertEqual(n["node_type"], "namespace") + names = [n["name"] for n in nodes] + self.assertIn("sample", names) + + # ============================================================== + # Catalog metadata + # ============================================================== + + def test_table_metadata(self): + cfg = get_pg_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/postgresql/catalog/metadata", json={ + "path": ["sample", "products"], + }) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + meta = data["metadata"] + self.assertIn("columns", meta) + col_names = [c["name"] for c in meta["columns"]] + self.assertIn("name", col_names) + self.assertIn("price", col_names) + self.assertIn("category", col_names) + + # ============================================================== + # Flat list_tables + # ============================================================== + + def test_list_tables_flat(self): + cfg = get_pg_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/postgresql/catalog/list_tables", json={}) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + names = [t["name"] for t in data["tables"]] + self.assertTrue(any("products" in n for n in names)) + + def test_list_tables_with_filter(self): + cfg = get_pg_config() + app, client, source = _make_app_and_client() + + with 
patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/postgresql/catalog/list_tables", json={ + "filter": "product", + }) + tables = resp.get_json()["tables"] + for t in tables: + self.assertIn("product", t["name"].lower()) + + # ============================================================== + # Data preview + # ============================================================== + + def test_preview(self): + cfg = get_pg_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/postgresql/data/preview", json={ + "source_table": "sample.products", + "size": 5, + }) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + self.assertEqual(data["status"], "success") + self.assertLessEqual(data["row_count"], 5) + col_names = {c["name"] for c in data["columns"]} + self.assertIn("name", col_names) + self.assertIn("price", col_names) + + # ============================================================== + # Data import + refresh + # ============================================================== + + def test_import_and_refresh(self): + """Import a table via connected source routes, then refresh it.""" + cfg = get_pg_config() + app, client, source = _make_app_and_client() + workspace_root = self._workspace_root_path() + + from data_formulator.datalake.workspace import Workspace + workspace = Workspace(self._identity, root_dir=workspace_root) + + with patch.object(type(source), "_get_identity", return_value=self._identity), \ + patch("data_formulator.data_connector.get_identity_id", return_value=self._identity), \ + patch("data_formulator.data_connector.get_workspace", return_value=workspace): + # Connect + 
client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + + # Import + resp = client.post("/api/connectors/postgresql/data/import", json={ + "source_table": "sample.products", + "table_name": "products", + "import_options": {"size": 100}, + }) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + self.assertEqual(data["status"], "success") + self.assertEqual(data["table_name"], "products") + self.assertGreater(data["row_count"], 0) + self.assertTrue(data["refreshable"]) + + # Verify table exists in workspace + self.assertIn("products", workspace.list_tables()) + + # Refresh + resp = client.post("/api/connectors/postgresql/data/refresh", json={ + "table_name": "products", + }) + data = resp.get_json() + self.assertEqual(resp.status_code, 200) + self.assertEqual(data["status"], "success") + self.assertIn("data_changed", data) + + # ============================================================== + # ls() filter + # ============================================================== + + def test_ls_filter(self): + cfg = get_pg_config() + app, client, source = _make_app_and_client() + + with patch.object(type(source), "_get_identity", return_value=self._identity): + client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) + resp = client.post("/api/connectors/postgresql/catalog/ls", json={ + "path": ["sample"], + "filter": "product", + }) + nodes = resp.get_json()["nodes"] + for n in nodes: + self.assertIn("product", n["name"].lower()) + + # ============================================================== + # Operations without connection return error + # ============================================================== + + def test_ls_without_connect_returns_error(self): + app, client, source = _make_app_and_client() + with patch.object(type(source), "_get_identity", return_value="nobody"): + resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) + self.assertIn(resp.status_code, (400, 500)) + + def 
test_import_without_connect_returns_error(self): + app, client, source = _make_app_and_client() + with patch.object(type(source), "_get_identity", return_value="nobody"): + resp = client.post("/api/connectors/postgresql/data/import", json={ + "source_table": "sample.products", + }) + self.assertIn(resp.status_code, (400, 500)) + + +# ------------------------------------------------------------------ +# Static tests (no DB required) +# ------------------------------------------------------------------ + +class TestPostgreSQLConnectedSourceStatic(unittest.TestCase): + + def test_frontend_config(self): + from data_formulator.data_connector import DataConnector + from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader + + source = DataConnector.from_loader( + PostgreSQLDataLoader, + source_id="pg_test", + display_name="PG Test", + default_params={"host": "db.corp", "database": "prod"}, + ) + cfg = source.get_frontend_config() + + # Pinned params should show host and database + self.assertEqual(cfg["pinned_params"]["host"], "db.corp") + self.assertEqual(cfg["pinned_params"]["database"], "prod") + + # Form should not include pinned params + form_names = {f["name"] for f in cfg["params_form"]} + self.assertNotIn("host", form_names) + self.assertNotIn("database", form_names) + self.assertIn("user", form_names) + self.assertIn("password", form_names) + + # Effective hierarchy should exclude database + eff_keys = [h["key"] for h in cfg["effective_hierarchy"]] + self.assertNotIn("database", eff_keys) + self.assertIn("schema", eff_keys) + self.assertIn("table", eff_keys) + + def test_hierarchy(self): + from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader + h = PostgreSQLDataLoader.catalog_hierarchy() + keys = [l["key"] for l in h] + self.assertEqual(keys, ["database", "schema", "table"]) diff --git a/tests/superset/.env.superset b/tests/superset/.env.superset index a52561de..1c856f0a 100644 --- 
a/tests/superset/.env.superset +++ b/tests/superset/.env.superset @@ -7,5 +7,5 @@ PLG_SUPERSET_URL=http://localhost:8088 # -- Optional: credential vault (Fernet-encrypted SQLite) -- # CREDENTIAL_VAULT=local -# -- Optional: SSO login URL (not needed for password testing) -- -# PLG_SUPERSET_SSO_LOGIN_URL=http://localhost:8088/login/?next=/df-sso-bridge/ +# -- Optional: SSO login URL (enabled by default for local test Superset) -- +PLG_SUPERSET_SSO_LOGIN_URL=http://localhost:8088/df-sso-bridge/ diff --git a/tests/superset/README.md b/tests/superset/README.md index 755d1d1a..43ba7ba0 100644 --- a/tests/superset/README.md +++ b/tests/superset/README.md @@ -45,6 +45,19 @@ Plus Superset's built-in example datasets (if `load_examples` succeeds). 5. Click it, then log in with `admin` / `admin` 6. Browse datasets and load one into Data Formulator +### Token-based Login (via Superset) + +The test Docker Compose setup mounts a custom `superset_config.py` that adds a `/df-sso-bridge/` endpoint. This lets you test the delegated login flow, in which DF obtains JWT tokens by having the user log in directly on Superset: + +1. Start both services: `./tests/superset/start.sh` +2. Log into the Superset UI at http://localhost:8088 with `admin` / `admin` (this creates a session) +3. Open http://localhost:5567 (or http://localhost:5173 for Vite dev) +4. In the Superset login panel, click the **Login via Superset** button +5. A popup opens → Superset sees the existing session → issues JWT tokens via `postMessage` +6. DF receives the tokens and you're logged in without entering credentials in DF + +> **Note**: You must have an active Superset session (step 2) for the bridge to work. If you're not logged into Superset, the popup will redirect you to Superset's login page first. 
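On the DF side, the listener for the bridge's `postMessage` can be sketched roughly as follows — a minimal sketch only; `handleBridgeMessage` and the payload field names are assumptions for illustration, not the actual DF frontend code:

```javascript
// Hypothetical handler for the token message posted by the SSO bridge popup.
// Assumes the bridge posts { access_token, refresh_token, user } to df_origin.
function handleBridgeMessage(event, supersetOrigin) {
  // Only accept messages from the Superset origin that opened the popup.
  if (event.origin !== supersetOrigin) return null;
  const data = event.data || {};
  if (!data.access_token || !data.refresh_token) return null;
  return {
    accessToken: data.access_token,
    refreshToken: data.refresh_token,
    username: data.user ? data.user.username : undefined,
  };
}

// In the app this would be wired up as:
//   window.addEventListener("message", (e) => handleBridgeMessage(e, "http://localhost:8088"));
```

The origin check is the important part: without it, any page the user has open could inject tokens into DF.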
+ ## Manual Setup (without the script) ```bash diff --git a/tests/superset/docker-compose.yml b/tests/superset/docker-compose.yml index 7c2a6bed..517d9cc4 100644 --- a/tests/superset/docker-compose.yml +++ b/tests/superset/docker-compose.yml @@ -24,6 +24,7 @@ services: - superset-data:/app/superset_home - ./init-superset.sh:/docker-entrypoint-initdb.d/init-superset.sh:ro - ./sample_data.py:/tmp/sample_data.py:ro + - ./superset_config.py:/app/pythonpath/superset_config.py:ro # Override entrypoint to run init then start entrypoint: ["/bin/bash", "-c"] command: diff --git a/tests/superset/start.sh b/tests/superset/start.sh index ae72eae0..0879448e 100755 --- a/tests/superset/start.sh +++ b/tests/superset/start.sh @@ -33,11 +33,13 @@ set +a if docker ps --format '{{.Names}}' | grep -q "^df-test-superset$"; then info "Superset already running" else + # Remove stopped container if it exists (avoids port/name conflicts) + docker rm -f df-test-superset 2>/dev/null || true info "Starting Superset (first run takes ~2 min)..." - docker compose -f "$COMPOSE_FILE" up -d + docker compose -f "$COMPOSE_FILE" up -d --force-recreate info "Waiting for Superset..." until curl -sf http://localhost:8088/health > /dev/null 2>&1; do sleep 3; done - info "Superset ready" + info "Superset ready (SSO bridge enabled at /df-sso-bridge/)" fi # --- Start DF backend --- diff --git a/tests/superset/superset_config.py b/tests/superset/superset_config.py new file mode 100644 index 00000000..50787076 --- /dev/null +++ b/tests/superset/superset_config.py @@ -0,0 +1,78 @@ +# Custom Superset config for Data Formulator test instance. +# Adds the SSO bridge endpoint so DF can test token-based login. 
+ +import json +import logging +from urllib.parse import urlencode + +from flask import Blueprint, Response, redirect, request, url_for +from flask_login import current_user + +logger = logging.getLogger(__name__) + +# -- SSO bridge as a plain Flask Blueprint (registered via BLUEPRINTS) -------- + +df_sso_bp = Blueprint("df_sso_bridge", __name__) + + +@df_sso_bp.route("/df-sso-bridge/", methods=["GET"]) +def df_sso_bridge(): + """Issue JWT tokens to an authenticated Superset user and post them + back to the Data Formulator frontend via ``postMessage``. + + Query params: + df_origin: The DF frontend origin (e.g. http://localhost:5567). + """ + df_origin = request.args.get("df_origin", "*") + + if not current_user.is_authenticated: + # Redirect to Superset login, then back here after authentication. + bridge_url = request.url # preserves df_origin query param + return redirect(f"/login/?next={bridge_url}") + + from flask_jwt_extended import create_access_token, create_refresh_token + + access_token = create_access_token(identity=current_user.id, fresh=True) + refresh_token = create_refresh_token(identity=current_user.id) + + user_data = { + "id": current_user.id, + "username": current_user.username, + "first_name": getattr(current_user, "first_name", "") or "", + "last_name": getattr(current_user, "last_name", "") or "", + } + + html = f""" +SSO Bridge + +
+<p>Completing login...</p>
+<script>
+// NOTE: script body reconstructed from context; the payload field names are illustrative.
+(function () {{
+  var payload = {{
+    access_token: {json.dumps(access_token)},
+    refresh_token: {json.dumps(refresh_token)},
+    user: {json.dumps(user_data)},
+  }};
+  if (window.opener) {{
+    window.opener.postMessage(payload, {json.dumps(df_origin)});
+  }}
+  window.close();
+}})();
+</script>
+ +""" + return Response(html, mimetype="text/html") + + +# -- Register the blueprint with Superset ------------------------------------ +BLUEPRINTS = [df_sso_bp] + +# Allow embedding in popups from DF dev server origins +TALISMAN_ENABLED = False + +# CORS is configured via environment variables in docker-compose.yml +# (SUPERSET_CORS_ENABLED / SUPERSET_CORS_ORIGINS). +# Do NOT set ENABLE_CORS here — the official image lacks flask-cors. diff --git a/tests/test_plan.md b/tests/test_plan.md index 82911cd9..c78aace0 100644 --- a/tests/test_plan.md +++ b/tests/test_plan.md @@ -62,12 +62,12 @@ pytest tests/plugin/ -v # run all loader tests | Layer | Location | Runner | Count | |-------|----------|--------|-------| -| Backend unit | `tests/backend/unit/` | pytest | 17 files | +| Backend unit | `tests/backend/unit/` | pytest | 20 files | | Backend security | `tests/backend/security/` | pytest | 6 files | -| Backend integration | `tests/backend/integration/` | pytest | 7 files (6 route tests + sandbox) | +| Backend integration | `tests/backend/integration/` | pytest | 8 files (7 route tests + sandbox) | | Backend contract | `tests/backend/contract/` | pytest | 2 files | | Backend benchmarks | `tests/backend/benchmarks/` | manual | 2 files | -| Plugin (data loaders) | `tests/plugin/` | pytest (manual) | 5 suites (requires Docker) | +| Plugin (data loaders) | `tests/plugin/` | pytest (manual) | 7 suites (requires Docker) | | Frontend unit | `tests/frontend/unit/` | vitest | 4 files | `tests/backend/` runs by default with `pytest` — no Docker required. diff --git a/yarn.lock b/yarn.lock index 0c0eb3fe..746457fc 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1,6507 +1,6068 @@ -# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY. 
-# yarn lockfile v1 - - -"@adobe/css-tools@^4.4.0": - version "4.4.4" - resolved "https://registry.yarnpkg.com/@adobe/css-tools/-/css-tools-4.4.4.tgz#2856c55443d3d461693f32d2b96fb6ea92e1ffa9" - integrity sha512-Elp+iwUx5rN5+Y8xLt5/GRoG20WGoDCQ/1Fb+1LiGtvwbDavuSk0jhD/eZdckHAuzcDzccnkv+rEjyWfRx18gg== - -"@asamuzakjp/css-color@^5.0.1": - version "5.0.1" - resolved "https://registry.yarnpkg.com/@asamuzakjp/css-color/-/css-color-5.0.1.tgz#3b9462a9b52f3c6680a0945a3d0851881017550f" - integrity sha512-2SZFvqMyvboVV1d15lMf7XiI3m7SDqXUuKaTymJYLN6dSGadqp+fVojqJlVoMlbZnlTmu3S0TLwLTJpvBMO1Aw== - dependencies: - "@csstools/css-calc" "^3.1.1" - "@csstools/css-color-parser" "^4.0.2" - "@csstools/css-parser-algorithms" "^4.0.0" - "@csstools/css-tokenizer" "^4.0.0" - lru-cache "^11.2.6" - -"@asamuzakjp/dom-selector@^7.0.3": - version "7.0.4" - resolved "https://registry.yarnpkg.com/@asamuzakjp/dom-selector/-/dom-selector-7.0.4.tgz#744ad572c70b00cc8e92e76d539b14afb7bd99b1" - integrity sha512-jXR6x4AcT3eIrS2fSNAwJpwirOkGcd+E7F7CP3zjdTqz9B/2huHOL8YJZBgekKwLML+u7qB/6P1LXQuMScsx0w== - dependencies: - "@asamuzakjp/nwsapi" "^2.3.9" - bidi-js "^1.0.3" - css-tree "^3.2.1" - is-potential-custom-element-name "^1.0.1" - lru-cache "^11.2.7" - -"@asamuzakjp/nwsapi@^2.3.9": - version "2.3.9" - resolved "https://registry.yarnpkg.com/@asamuzakjp/nwsapi/-/nwsapi-2.3.9.tgz#ad5549322dfe9d153d4b4dd6f7ff2ae234b06e24" - integrity sha512-n8GuYSrI9bF7FFZ/SjhwevlHc8xaVlb/7HmHelnc/PZXBD2ZR49NnN9sMMuDdEGPeeRQ5d0hqlSlEpgCX3Wl0Q== - -"@babel/code-frame@^7.0.0", "@babel/code-frame@^7.10.4", "@babel/code-frame@^7.28.6", "@babel/code-frame@^7.29.0": - version "7.29.0" - resolved "https://registry.yarnpkg.com/@babel/code-frame/-/code-frame-7.29.0.tgz#7cd7a59f15b3cc0dcd803038f7792712a7d0b15c" - integrity sha512-9NhCeYjq9+3uxgdtp20LSiJXJvN0FeCtNGpJxuMFZ1Kv3cWUNb6DOhJwUvcVCzKGR66cw4njwM6hrJLqgOwbcw== - dependencies: - "@babel/helper-validator-identifier" "^7.28.5" - js-tokens "^4.0.0" - picocolors "^1.1.1" - 
-"@babel/generator@^7.29.0": - version "7.29.1" - resolved "https://registry.yarnpkg.com/@babel/generator/-/generator-7.29.1.tgz#d09876290111abbb00ef962a7b83a5307fba0d50" - integrity sha512-qsaF+9Qcm2Qv8SRIMMscAvG4O3lJ0F1GuMo5HR/Bp02LopNgnZBC/EkbevHFeGs4ls/oPz9v+Bsmzbkbe+0dUw== - dependencies: - "@babel/parser" "^7.29.0" - "@babel/types" "^7.29.0" - "@jridgewell/gen-mapping" "^0.3.12" - "@jridgewell/trace-mapping" "^0.3.28" - jsesc "^3.0.2" - -"@babel/helper-globals@^7.28.0": - version "7.28.0" - resolved "https://registry.yarnpkg.com/@babel/helper-globals/-/helper-globals-7.28.0.tgz#b9430df2aa4e17bc28665eadeae8aa1d985e6674" - integrity sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw== - -"@babel/helper-module-imports@^7.16.7": - version "7.28.6" - resolved "https://registry.yarnpkg.com/@babel/helper-module-imports/-/helper-module-imports-7.28.6.tgz#60632cbd6ffb70b22823187201116762a03e2d5c" - integrity sha512-l5XkZK7r7wa9LucGw9LwZyyCUscb4x37JWTPz7swwFE/0FMQAGpiWUZn8u9DzkSBWEcK25jmvubfpw2dnAMdbw== - dependencies: - "@babel/traverse" "^7.28.6" - "@babel/types" "^7.28.6" - -"@babel/helper-string-parser@^7.27.1": - version "7.27.1" - resolved "https://registry.yarnpkg.com/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz#54da796097ab19ce67ed9f88b47bb2ec49367687" - integrity sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA== - -"@babel/helper-validator-identifier@^7.28.5": - version "7.28.5" - resolved "https://registry.yarnpkg.com/@babel/helper-validator-identifier/-/helper-validator-identifier-7.28.5.tgz#010b6938fab7cb7df74aa2bbc06aa503b8fe5fb4" - integrity sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q== - -"@babel/parser@^7.28.6", "@babel/parser@^7.29.0": - version "7.29.2" - resolved "https://registry.yarnpkg.com/@babel/parser/-/parser-7.29.2.tgz#58bd50b9a7951d134988a1ae177a35ef9a703ba1" - integrity 
sha512-4GgRzy/+fsBa72/RZVJmGKPmZu9Byn8o4MoLpmNe1m8ZfYnz5emHLQz3U4gLud6Zwl0RZIcgiLD7Uq7ySFuDLA== - dependencies: - "@babel/types" "^7.29.0" - -"@babel/runtime@^7.12.1", "@babel/runtime@^7.12.5", "@babel/runtime@^7.18.3", "@babel/runtime@^7.23.2", "@babel/runtime@^7.28.6", "@babel/runtime@^7.29.2", "@babel/runtime@^7.5.5", "@babel/runtime@^7.8.7", "@babel/runtime@^7.9.2": - version "7.29.2" - resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.29.2.tgz#9a6e2d05f4b6692e1801cd4fb176ad823930ed5e" - integrity sha512-JiDShH45zKHWyGe4ZNVRrCjBz8Nh9TMmZG1kh4QTK8hCBTWBi8Da+i7s1fJw7/lYpM4ccepSNfqzZ/QvABBi5g== - -"@babel/template@^7.28.6": - version "7.28.6" - resolved "https://registry.yarnpkg.com/@babel/template/-/template-7.28.6.tgz#0e7e56ecedb78aeef66ce7972b082fce76a23e57" - integrity sha512-YA6Ma2KsCdGb+WC6UpBVFJGXL58MDA6oyONbjyF/+5sBgxY/dwkhLogbMT2GXXyU84/IhRw/2D1Os1B/giz+BQ== - dependencies: - "@babel/code-frame" "^7.28.6" - "@babel/parser" "^7.28.6" - "@babel/types" "^7.28.6" - -"@babel/traverse@^7.28.6": - version "7.29.0" - resolved "https://registry.yarnpkg.com/@babel/traverse/-/traverse-7.29.0.tgz#f323d05001440253eead3c9c858adbe00b90310a" - integrity sha512-4HPiQr0X7+waHfyXPZpWPfWL/J7dcN1mx9gL6WdQVMbPnF3+ZhSMs8tCxN7oHddJE9fhNE7+lxdnlyemKfJRuA== - dependencies: - "@babel/code-frame" "^7.29.0" - "@babel/generator" "^7.29.0" - "@babel/helper-globals" "^7.28.0" - "@babel/parser" "^7.29.0" - "@babel/template" "^7.28.6" - "@babel/types" "^7.29.0" - debug "^4.3.1" - -"@babel/types@^7.28.6", "@babel/types@^7.29.0": - version "7.29.0" - resolved "https://registry.yarnpkg.com/@babel/types/-/types-7.29.0.tgz#9f5b1e838c446e72cf3cd4b918152b8c605e37c7" - integrity sha512-LwdZHpScM4Qz8Xw2iKSzS+cfglZzJGvofQICy7W7v4caru4EaAmyUuO6BGrbyQ2mYV11W0U8j5mBhd14dd3B0A== - dependencies: - "@babel/helper-string-parser" "^7.27.1" - "@babel/helper-validator-identifier" "^7.28.5" - -"@bramus/specificity@^2.4.2": - version "2.4.2" - resolved 
"https://registry.yarnpkg.com/@bramus/specificity/-/specificity-2.4.2.tgz#aa8db8eb173fdee7324f82284833106adeecc648"
-  integrity sha512-ctxtJ/eA+t+6q2++vj5j7FYX3nRu311q1wfYH3xjlLOsczhlhxAg2FWNUXhpGvAw3BWo1xBcvOV6/YLc2r5FJw==
-  dependencies:
-    css-tree "^3.0.0"
-
-"@csstools/color-helpers@^6.0.2":
-  version "6.0.2"
-  resolved "https://registry.yarnpkg.com/@csstools/color-helpers/-/color-helpers-6.0.2.tgz#82c59fd30649cf0b4d3c82160489748666e6550b"
-  integrity sha512-LMGQLS9EuADloEFkcTBR3BwV/CGHV7zyDxVRtVDTwdI2Ca4it0CCVTT9wCkxSgokjE5Ho41hEPgb8OEUwoXr6Q==
-
-"@csstools/css-calc@^3.1.1":
-  version "3.1.1"
-  resolved "https://registry.yarnpkg.com/@csstools/css-calc/-/css-calc-3.1.1.tgz#78b494996dac41a02797dcca18ac3b46d25b3fd7"
-  integrity sha512-HJ26Z/vmsZQqs/o3a6bgKslXGFAungXGbinULZO3eMsOyNJHeBBZfup5FiZInOghgoM4Hwnmw+OgbJCNg1wwUQ==
-
-"@csstools/css-color-parser@^4.0.2":
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/@csstools/css-color-parser/-/css-color-parser-4.0.2.tgz#c27e03a3770d0352db92d668d6dde427a37859e5"
-  integrity sha512-0GEfbBLmTFf0dJlpsNU7zwxRIH0/BGEMuXLTCvFYxuL1tNhqzTbtnFICyJLTNK4a+RechKP75e7w42ClXSnJQw==
-  dependencies:
-    "@csstools/color-helpers" "^6.0.2"
-    "@csstools/css-calc" "^3.1.1"
-
-"@csstools/css-parser-algorithms@^4.0.0":
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/@csstools/css-parser-algorithms/-/css-parser-algorithms-4.0.0.tgz#e1c65dc09378b42f26a111fca7f7075fc2c26164"
-  integrity sha512-+B87qS7fIG3L5h3qwJ/IFbjoVoOe/bpOdh9hAjXbvx0o8ImEmUsGXN0inFOnk2ChCFgqkkGFQ+TpM5rbhkKe4w==
-
-"@csstools/css-syntax-patches-for-csstree@^1.1.1":
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/@csstools/css-syntax-patches-for-csstree/-/css-syntax-patches-for-csstree-1.1.1.tgz#ce4c9a0cbe30590491fcd5c03fe6426d22ba89e4"
-  integrity sha512-BvqN0AMWNAnLk9G8jnUT77D+mUbY/H2b3uDTvg2isJkHaOufUE2R3AOwxWo7VBQKT1lOdwdvorddo2B/lk64+w==
-
-"@csstools/css-tokenizer@^4.0.0":
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/@csstools/css-tokenizer/-/css-tokenizer-4.0.0.tgz#798a33950d11226a0ebb6acafa60f5594424967f"
-  integrity sha512-QxULHAm7cNu72w97JUNCBFODFaXpbDg+dP8b/oWFAZ2MTRppA3U00Y2L1HqaS4J6yBqxwa/Y3nMBaxVKbB/NsA==
-
-"@emnapi/core@^1.7.1":
-  version "1.9.1"
-  resolved "https://registry.yarnpkg.com/@emnapi/core/-/core-1.9.1.tgz#2143069c744ca2442074f8078462e51edd63c7bd"
-  integrity sha512-mukuNALVsoix/w1BJwFzwXBN/dHeejQtuVzcDsfOEsdpCumXb/E9j8w11h5S54tT1xhifGfbbSm/ICrObRb3KA==
-  dependencies:
-    "@emnapi/wasi-threads" "1.2.0"
-    tslib "^2.4.0"
-
-"@emnapi/runtime@^1.7.1":
-  version "1.9.1"
-  resolved "https://registry.yarnpkg.com/@emnapi/runtime/-/runtime-1.9.1.tgz#115ff2a0d589865be6bd8e9d701e499c473f2a8d"
-  integrity sha512-VYi5+ZVLhpgK4hQ0TAjiQiZ6ol0oe4mBx7mVv7IflsiEp0OWoVsp/+f9Vc1hOhE0TtkORVrI1GvzyreqpgWtkA==
-  dependencies:
-    tslib "^2.4.0"
-
-"@emnapi/wasi-threads@1.2.0":
-  version "1.2.0"
-  resolved "https://registry.yarnpkg.com/@emnapi/wasi-threads/-/wasi-threads-1.2.0.tgz#a19d9772cc3d195370bf6e2a805eec40aa75e18e"
-  integrity sha512-N10dEJNSsUx41Z6pZsXU8FjPjpBEplgH24sfkmITrBED1/U2Esum9F3lfLrMjKHHjmi557zQn7kR9R+XWXu5Rg==
-  dependencies:
-    tslib "^2.4.0"
-
-"@emotion/babel-plugin@^11.13.5":
-  version "11.13.5"
-  resolved "https://registry.yarnpkg.com/@emotion/babel-plugin/-/babel-plugin-11.13.5.tgz#eab8d65dbded74e0ecfd28dc218e75607c4e7bc0"
-  integrity sha512-pxHCpT2ex+0q+HH91/zsdHkw/lXd468DIN2zvfvLtPKLLMo6gQj7oLObq8PhkrxOZb/gGCq03S3Z7PDhS8pduQ==
-  dependencies:
-    "@babel/helper-module-imports" "^7.16.7"
-    "@babel/runtime" "^7.18.3"
-    "@emotion/hash" "^0.9.2"
-    "@emotion/memoize" "^0.9.0"
-    "@emotion/serialize" "^1.3.3"
-    babel-plugin-macros "^3.1.0"
-    convert-source-map "^1.5.0"
-    escape-string-regexp "^4.0.0"
-    find-root "^1.1.0"
-    source-map "^0.5.7"
-    stylis "4.2.0"
-
-"@emotion/cache@^11.14.0":
-  version "11.14.0"
-  resolved "https://registry.yarnpkg.com/@emotion/cache/-/cache-11.14.0.tgz#ee44b26986eeb93c8be82bb92f1f7a9b21b2ed76"
-  integrity sha512-L/B1lc/TViYk4DcpGxtAVbx0ZyiKM5ktoIyafGkH6zg/tj+mA+NE//aPYKG0k8kCHSHVJrpLpcAlOBEXQ3SavA==
-  dependencies:
-    "@emotion/memoize" "^0.9.0"
-    "@emotion/sheet" "^1.4.0"
-    "@emotion/utils" "^1.4.2"
-    "@emotion/weak-memoize" "^0.4.0"
-    stylis "4.2.0"
-
-"@emotion/hash@^0.9.2":
-  version "0.9.2"
-  resolved "https://registry.yarnpkg.com/@emotion/hash/-/hash-0.9.2.tgz#ff9221b9f58b4dfe61e619a7788734bd63f6898b"
-  integrity sha512-MyqliTZGuOm3+5ZRSaaBGP3USLw6+EGykkwZns2EPC5g8jJ4z9OrdZY9apkl3+UP9+sdz76YYkwCKP5gh8iY3g==
-
-"@emotion/is-prop-valid@^1.3.0":
-  version "1.4.0"
-  resolved "https://registry.yarnpkg.com/@emotion/is-prop-valid/-/is-prop-valid-1.4.0.tgz#e9ad47adff0b5c94c72db3669ce46de33edf28c0"
-  integrity sha512-QgD4fyscGcbbKwJmqNvUMSE02OsHUa+lAWKdEUIJKgqe5IwRSKd7+KhibEWdaKwgjLj0DRSHA9biAIqGBk05lw==
-  dependencies:
-    "@emotion/memoize" "^0.9.0"
-
-"@emotion/memoize@^0.9.0":
-  version "0.9.0"
-  resolved "https://registry.yarnpkg.com/@emotion/memoize/-/memoize-0.9.0.tgz#745969d649977776b43fc7648c556aaa462b4102"
-  integrity sha512-30FAj7/EoJ5mwVPOWhAyCX+FPfMDrVecJAM+Iw9NRoSl4BBAQeqj4cApHHUXOVvIPgLVDsCFoz/hGD+5QQD1GQ==
-
-"@emotion/react@^11.14.0":
-  version "11.14.0"
-  resolved "https://registry.yarnpkg.com/@emotion/react/-/react-11.14.0.tgz#cfaae35ebc67dd9ef4ea2e9acc6cd29e157dd05d"
-  integrity sha512-O000MLDBDdk/EohJPFUqvnp4qnHeYkVP5B0xEG0D/L7cOKP9kefu2DXn8dj74cQfsEzUqh+sr1RzFqiL1o+PpA==
-  dependencies:
-    "@babel/runtime" "^7.18.3"
-    "@emotion/babel-plugin" "^11.13.5"
-    "@emotion/cache" "^11.14.0"
-    "@emotion/serialize" "^1.3.3"
-    "@emotion/use-insertion-effect-with-fallbacks" "^1.2.0"
-    "@emotion/utils" "^1.4.2"
-    "@emotion/weak-memoize" "^0.4.0"
-    hoist-non-react-statics "^3.3.1"
-
-"@emotion/serialize@^1.3.3":
-  version "1.3.3"
-  resolved "https://registry.yarnpkg.com/@emotion/serialize/-/serialize-1.3.3.tgz#d291531005f17d704d0463a032fe679f376509e8"
-  integrity sha512-EISGqt7sSNWHGI76hC7x1CksiXPahbxEOrC5RjmFRJTqLyEK9/9hZvBbiYn70dw4wuwMKiEMCUlR6ZXTSWQqxA==
-  dependencies:
-    "@emotion/hash" "^0.9.2"
-    "@emotion/memoize" "^0.9.0"
-    "@emotion/unitless" "^0.10.0"
-    "@emotion/utils" "^1.4.2"
-    csstype "^3.0.2"
-
-"@emotion/sheet@^1.4.0":
-  version "1.4.0"
-  resolved "https://registry.yarnpkg.com/@emotion/sheet/-/sheet-1.4.0.tgz#c9299c34d248bc26e82563735f78953d2efca83c"
-  integrity sha512-fTBW9/8r2w3dXWYM4HCB1Rdp8NLibOw2+XELH5m5+AkWiL/KqYX6dc0kKYlaYyKjrQ6ds33MCdMPEwgs2z1rqg==
-
-"@emotion/styled@^11.14.0":
-  version "11.14.1"
-  resolved "https://registry.yarnpkg.com/@emotion/styled/-/styled-11.14.1.tgz#8c34bed2948e83e1980370305614c20955aacd1c"
-  integrity sha512-qEEJt42DuToa3gurlH4Qqc1kVpNq8wO8cJtDzU46TjlzWjDlsVyevtYCRijVq3SrHsROS+gVQ8Fnea108GnKzw==
-  dependencies:
-    "@babel/runtime" "^7.18.3"
-    "@emotion/babel-plugin" "^11.13.5"
-    "@emotion/is-prop-valid" "^1.3.0"
-    "@emotion/serialize" "^1.3.3"
-    "@emotion/use-insertion-effect-with-fallbacks" "^1.2.0"
-    "@emotion/utils" "^1.4.2"
-
-"@emotion/unitless@^0.10.0":
-  version "0.10.0"
-  resolved "https://registry.yarnpkg.com/@emotion/unitless/-/unitless-0.10.0.tgz#2af2f7c7e5150f497bdabd848ce7b218a27cf745"
-  integrity sha512-dFoMUuQA20zvtVTuxZww6OHoJYgrzfKM1t52mVySDJnMSEa08ruEvdYQbhvyu6soU+NeLVd3yKfTfT0NeV6qGg==
-
-"@emotion/use-insertion-effect-with-fallbacks@^1.2.0":
-  version "1.2.0"
-  resolved "https://registry.yarnpkg.com/@emotion/use-insertion-effect-with-fallbacks/-/use-insertion-effect-with-fallbacks-1.2.0.tgz#8a8cb77b590e09affb960f4ff1e9a89e532738bf"
-  integrity sha512-yJMtVdH59sxi/aVJBpk9FQq+OR8ll5GT8oWd57UpeaKEVGab41JWaCFA7FRLoMLloOZF/c/wsPoe+bfGmRKgDg==
-
-"@emotion/utils@^1.4.2":
-  version "1.4.2"
-  resolved "https://registry.yarnpkg.com/@emotion/utils/-/utils-1.4.2.tgz#6df6c45881fcb1c412d6688a311a98b7f59c1b52"
-  integrity sha512-3vLclRofFziIa3J2wDh9jjbkUz9qk5Vi3IZ/FSTKViB0k+ef0fPV7dYrUIugbgupYDx7v9ud/SjrtEP8Y4xLoA==
-
-"@emotion/weak-memoize@^0.4.0":
-  version "0.4.0"
-  resolved "https://registry.yarnpkg.com/@emotion/weak-memoize/-/weak-memoize-0.4.0.tgz#5e13fac887f08c44f76b0ccaf3370eb00fec9bb6"
-  integrity sha512-snKqtPW01tN0ui7yu9rGv69aJXr/a/Ywvl11sUjNtEcRc+ng/mQriFL0wLXMef74iHa/EkftbDzU9F8iFbH+zg==
-
-"@esbuild/aix-ppc64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/aix-ppc64/-/aix-ppc64-0.21.5.tgz#c7184a326533fcdf1b8ee0733e21c713b975575f"
-  integrity sha512-1SDgH6ZSPTlggy1yI6+Dbkiz8xzpHJEVAlF/AM1tHPLsf5STom9rwtjE4hKAF20FfXXNTFqEYXyJNWh1GiZedQ==
-
-"@esbuild/android-arm64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/android-arm64/-/android-arm64-0.21.5.tgz#09d9b4357780da9ea3a7dfb833a1f1ff439b4052"
-  integrity sha512-c0uX9VAUBQ7dTDCjq+wdyGLowMdtR/GoC2U5IYk/7D1H1JYC0qseD7+11iMP2mRLN9RcCMRcjC4YMclCzGwS/A==
-
-"@esbuild/android-arm@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/android-arm/-/android-arm-0.21.5.tgz#9b04384fb771926dfa6d7ad04324ecb2ab9b2e28"
-  integrity sha512-vCPvzSjpPHEi1siZdlvAlsPxXl7WbOVUBBAowWug4rJHb68Ox8KualB+1ocNvT5fjv6wpkX6o/iEpbDrf68zcg==
-
-"@esbuild/android-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/android-x64/-/android-x64-0.21.5.tgz#29918ec2db754cedcb6c1b04de8cd6547af6461e"
-  integrity sha512-D7aPRUUNHRBwHxzxRvp856rjUHRFW1SdQATKXH2hqA0kAZb1hKmi02OpYRacl0TxIGz/ZmXWlbZgjwWYaCakTA==
-
-"@esbuild/darwin-arm64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/darwin-arm64/-/darwin-arm64-0.21.5.tgz#e495b539660e51690f3928af50a76fb0a6ccff2a"
-  integrity sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ==
-
-"@esbuild/darwin-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/darwin-x64/-/darwin-x64-0.21.5.tgz#c13838fa57372839abdddc91d71542ceea2e1e22"
-  integrity sha512-se/JjF8NlmKVG4kNIuyWMV/22ZaerB+qaSi5MdrXtd6R08kvs2qCN4C09miupktDitvh8jRFflwGFBQcxZRjbw==
-
-"@esbuild/freebsd-arm64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/freebsd-arm64/-/freebsd-arm64-0.21.5.tgz#646b989aa20bf89fd071dd5dbfad69a3542e550e"
-  integrity sha512-5JcRxxRDUJLX8JXp/wcBCy3pENnCgBR9bN6JsY4OmhfUtIHe3ZW0mawA7+RDAcMLrMIZaf03NlQiX9DGyB8h4g==
-
-"@esbuild/freebsd-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/freebsd-x64/-/freebsd-x64-0.21.5.tgz#aa615cfc80af954d3458906e38ca22c18cf5c261"
-  integrity sha512-J95kNBj1zkbMXtHVH29bBriQygMXqoVQOQYA+ISs0/2l3T9/kj42ow2mpqerRBxDJnmkUDCaQT/dfNXWX/ZZCQ==
-
-"@esbuild/linux-arm64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-arm64/-/linux-arm64-0.21.5.tgz#70ac6fa14f5cb7e1f7f887bcffb680ad09922b5b"
-  integrity sha512-ibKvmyYzKsBeX8d8I7MH/TMfWDXBF3db4qM6sy+7re0YXya+K1cem3on9XgdT2EQGMu4hQyZhan7TeQ8XkGp4Q==
-
-"@esbuild/linux-arm@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-arm/-/linux-arm-0.21.5.tgz#fc6fd11a8aca56c1f6f3894f2bea0479f8f626b9"
-  integrity sha512-bPb5AHZtbeNGjCKVZ9UGqGwo8EUu4cLq68E95A53KlxAPRmUyYv2D6F0uUI65XisGOL1hBP5mTronbgo+0bFcA==
-
-"@esbuild/linux-ia32@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-ia32/-/linux-ia32-0.21.5.tgz#3271f53b3f93e3d093d518d1649d6d68d346ede2"
-  integrity sha512-YvjXDqLRqPDl2dvRODYmmhz4rPeVKYvppfGYKSNGdyZkA01046pLWyRKKI3ax8fbJoK5QbxblURkwK/MWY18Tg==
-
-"@esbuild/linux-loong64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-loong64/-/linux-loong64-0.21.5.tgz#ed62e04238c57026aea831c5a130b73c0f9f26df"
-  integrity sha512-uHf1BmMG8qEvzdrzAqg2SIG/02+4/DHB6a9Kbya0XDvwDEKCoC8ZRWI5JJvNdUjtciBGFQ5PuBlpEOXQj+JQSg==
-
-"@esbuild/linux-mips64el@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-mips64el/-/linux-mips64el-0.21.5.tgz#e79b8eb48bf3b106fadec1ac8240fb97b4e64cbe"
-  integrity sha512-IajOmO+KJK23bj52dFSNCMsz1QP1DqM6cwLUv3W1QwyxkyIWecfafnI555fvSGqEKwjMXVLokcV5ygHW5b3Jbg==
-
-"@esbuild/linux-ppc64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-ppc64/-/linux-ppc64-0.21.5.tgz#5f2203860a143b9919d383ef7573521fb154c3e4"
-  integrity sha512-1hHV/Z4OEfMwpLO8rp7CvlhBDnjsC3CttJXIhBi+5Aj5r+MBvy4egg7wCbe//hSsT+RvDAG7s81tAvpL2XAE4w==
-
-"@esbuild/linux-riscv64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-riscv64/-/linux-riscv64-0.21.5.tgz#07bcafd99322d5af62f618cb9e6a9b7f4bb825dc"
-  integrity sha512-2HdXDMd9GMgTGrPWnJzP2ALSokE/0O5HhTUvWIbD3YdjME8JwvSCnNGBnTThKGEB91OZhzrJ4qIIxk/SBmyDDA==
-
-"@esbuild/linux-s390x@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-s390x/-/linux-s390x-0.21.5.tgz#b7ccf686751d6a3e44b8627ababc8be3ef62d8de"
-  integrity sha512-zus5sxzqBJD3eXxwvjN1yQkRepANgxE9lgOW2qLnmr8ikMTphkjgXu1HR01K4FJg8h1kEEDAqDcZQtbrRnB41A==
-
-"@esbuild/linux-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/linux-x64/-/linux-x64-0.21.5.tgz#6d8f0c768e070e64309af8004bb94e68ab2bb3b0"
-  integrity sha512-1rYdTpyv03iycF1+BhzrzQJCdOuAOtaqHTWJZCWvijKD2N5Xu0TtVC8/+1faWqcP9iBCWOmjmhoH94dH82BxPQ==
-
-"@esbuild/netbsd-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/netbsd-x64/-/netbsd-x64-0.21.5.tgz#bbe430f60d378ecb88decb219c602667387a6047"
-  integrity sha512-Woi2MXzXjMULccIwMnLciyZH4nCIMpWQAs049KEeMvOcNADVxo0UBIQPfSmxB3CWKedngg7sWZdLvLczpe0tLg==
-
-"@esbuild/openbsd-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/openbsd-x64/-/openbsd-x64-0.21.5.tgz#99d1cf2937279560d2104821f5ccce220cb2af70"
-  integrity sha512-HLNNw99xsvx12lFBUwoT8EVCsSvRNDVxNpjZ7bPn947b8gJPzeHWyNVhFsaerc0n3TsbOINvRP2byTZ5LKezow==
-
-"@esbuild/sunos-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/sunos-x64/-/sunos-x64-0.21.5.tgz#08741512c10d529566baba837b4fe052c8f3487b"
-  integrity sha512-6+gjmFpfy0BHU5Tpptkuh8+uw3mnrvgs+dSPQXQOv3ekbordwnzTVEb4qnIvQcYXq6gzkyTnoZ9dZG+D4garKg==
-
-"@esbuild/win32-arm64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/win32-arm64/-/win32-arm64-0.21.5.tgz#675b7385398411240735016144ab2e99a60fc75d"
-  integrity sha512-Z0gOTd75VvXqyq7nsl93zwahcTROgqvuAcYDUr+vOv8uHhNSKROyU961kgtCD1e95IqPKSQKH7tBTslnS3tA8A==
-
-"@esbuild/win32-ia32@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/win32-ia32/-/win32-ia32-0.21.5.tgz#1bfc3ce98aa6ca9a0969e4d2af72144c59c1193b"
-  integrity sha512-SWXFF1CL2RVNMaVs+BBClwtfZSvDgtL//G/smwAc5oVK/UPu2Gu9tIaRgFmYFFKrmg3SyAjSrElf0TiJ1v8fYA==
-
-"@esbuild/win32-x64@0.21.5":
-  version "0.21.5"
-  resolved "https://registry.yarnpkg.com/@esbuild/win32-x64/-/win32-x64-0.21.5.tgz#acad351d582d157bb145535db2a6ff53dd514b5c"
-  integrity sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw==
-
-"@eslint-community/eslint-utils@^4.8.0", "@eslint-community/eslint-utils@^4.9.1":
-  version "4.9.1"
-  resolved "https://registry.yarnpkg.com/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz#4e90af67bc51ddee6cdef5284edf572ec376b595"
-  integrity sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ==
-  dependencies:
-    eslint-visitor-keys "^3.4.3"
-
-"@eslint-community/regexpp@^4.12.1", "@eslint-community/regexpp@^4.12.2":
-  version "4.12.2"
-  resolved "https://registry.yarnpkg.com/@eslint-community/regexpp/-/regexpp-4.12.2.tgz#bccdf615bcf7b6e8db830ec0b8d21c9a25de597b"
-  integrity sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew==
-
-"@eslint/config-array@^0.21.2":
-  version "0.21.2"
-  resolved "https://registry.yarnpkg.com/@eslint/config-array/-/config-array-0.21.2.tgz#f29e22057ad5316cf23836cee9a34c81fffcb7e6"
-  integrity sha512-nJl2KGTlrf9GjLimgIru+V/mzgSK0ABCDQRvxw5BjURL7WfH5uoWmizbH7QB6MmnMBd8cIC9uceWnezL1VZWWw==
-  dependencies:
-    "@eslint/object-schema" "^2.1.7"
-    debug "^4.3.1"
-    minimatch "^3.1.5"
-
-"@eslint/config-helpers@^0.4.2":
-  version "0.4.2"
-  resolved "https://registry.yarnpkg.com/@eslint/config-helpers/-/config-helpers-0.4.2.tgz#1bd006ceeb7e2e55b2b773ab318d300e1a66aeda"
-  integrity sha512-gBrxN88gOIf3R7ja5K9slwNayVcZgK6SOUORm2uBzTeIEfeVaIhOpCtTox3P6R7o2jLFwLFTLnC7kU/RGcYEgw==
-  dependencies:
-    "@eslint/core" "^0.17.0"
-
-"@eslint/core@^0.17.0":
-  version "0.17.0"
-  resolved "https://registry.yarnpkg.com/@eslint/core/-/core-0.17.0.tgz#77225820413d9617509da9342190a2019e78761c"
-  integrity sha512-yL/sLrpmtDaFEiUj1osRP4TI2MDz1AddJL+jZ7KSqvBuliN4xqYY54IfdN8qD8Toa6g1iloph1fxQNkjOxrrpQ==
-  dependencies:
-    "@types/json-schema" "^7.0.15"
-
-"@eslint/eslintrc@^3.3.5":
-  version "3.3.5"
-  resolved "https://registry.yarnpkg.com/@eslint/eslintrc/-/eslintrc-3.3.5.tgz#c131793cfc1a7b96f24a83e0a8bbd4b881558c60"
-  integrity sha512-4IlJx0X0qftVsN5E+/vGujTRIFtwuLbNsVUe7TO6zYPDR1O6nFwvwhIKEKSrl6dZchmYBITazxKoUYOjdtjlRg==
-  dependencies:
-    ajv "^6.14.0"
-    debug "^4.3.2"
-    espree "^10.0.1"
-    globals "^14.0.0"
-    ignore "^5.2.0"
-    import-fresh "^3.2.1"
-    js-yaml "^4.1.1"
-    minimatch "^3.1.5"
-    strip-json-comments "^3.1.1"
-
-"@eslint/js@9.39.4", "@eslint/js@^9.15.0":
-  version "9.39.4"
-  resolved "https://registry.yarnpkg.com/@eslint/js/-/js-9.39.4.tgz#a3f83bfc6fd9bf33a853dfacd0b49b398eb596c1"
-  integrity sha512-nE7DEIchvtiFTwBw4Lfbu59PG+kCofhjsKaCWzxTpt4lfRjRMqG6uMBzKXuEcyXhOHoUp9riAm7/aWYGhXZ9cw==
-
-"@eslint/object-schema@^2.1.7":
-  version "2.1.7"
-  resolved "https://registry.yarnpkg.com/@eslint/object-schema/-/object-schema-2.1.7.tgz#6e2126a1347e86a4dedf8706ec67ff8e107ebbad"
-  integrity sha512-VtAOaymWVfZcmZbp6E2mympDIHvyjXs/12LqWYjVw6qjrfF+VK+fyG33kChz3nnK+SU5/NeHOqrTEHS8sXO3OA==
-
-"@eslint/plugin-kit@^0.4.1":
-  version "0.4.1"
-  resolved "https://registry.yarnpkg.com/@eslint/plugin-kit/-/plugin-kit-0.4.1.tgz#9779e3fd9b7ee33571a57435cf4335a1794a6cb2"
-  integrity sha512-43/qtrDUokr7LJqoF2c3+RInu/t4zfrpYdoSDfYyhg52rwLV6TnOvdG4fXm7IkSB3wErkcmJS9iEhjVtOSEjjA==
-  dependencies:
-    "@eslint/core" "^0.17.0"
-    levn "^0.4.1"
-
-"@exodus/bytes@^1.11.0", "@exodus/bytes@^1.15.0", "@exodus/bytes@^1.6.0":
-  version "1.15.0"
-  resolved "https://registry.yarnpkg.com/@exodus/bytes/-/bytes-1.15.0.tgz#54479e0f406cbad024d6fe1c3190ecca4468df3b"
-  integrity sha512-UY0nlA+feH81UGSHv92sLEPLCeZFjXOuHhrIo0HQydScuQc8s0A7kL/UdgwgDq8g8ilksmuoF35YVTNphV2aBQ==
-
-"@fast-csv/format@4.3.5":
-  version "4.3.5"
-  resolved "https://registry.yarnpkg.com/@fast-csv/format/-/format-4.3.5.tgz#90d83d1b47b6aaf67be70d6118f84f3e12ee1ff3"
-  integrity sha512-8iRn6QF3I8Ak78lNAa+Gdl5MJJBM5vRHivFtMRUWINdevNo00K7OXxS2PshawLKTejVwieIlPmK5YlLu6w4u8A==
-  dependencies:
-    "@types/node" "^14.0.1"
-    lodash.escaperegexp "^4.1.2"
-    lodash.isboolean "^3.0.3"
-    lodash.isequal "^4.5.0"
-    lodash.isfunction "^3.0.9"
-    lodash.isnil "^4.0.0"
-
-"@fast-csv/parse@4.3.6":
-  version "4.3.6"
-  resolved "https://registry.yarnpkg.com/@fast-csv/parse/-/parse-4.3.6.tgz#ee47d0640ca0291034c7aa94039a744cfb019264"
-  integrity sha512-uRsLYksqpbDmWaSmzvJcuApSEe38+6NQZBUsuAyMZKqHxH0g1wcJgsKUvN3WC8tewaqFjBMMGrkHmC+T7k8LvA==
-  dependencies:
-    "@types/node" "^14.0.1"
-    lodash.escaperegexp "^4.1.2"
-    lodash.groupby "^4.6.0"
-    lodash.isfunction "^3.0.9"
-    lodash.isnil "^4.0.0"
-    lodash.isundefined "^3.0.1"
-    lodash.uniq "^4.5.0"
-
-"@floating-ui/core@^1.7.5":
-  version "1.7.5"
-  resolved "https://registry.npmjs.org/@floating-ui/core/-/core-1.7.5.tgz#d4af157a03330af5a60e69da7a4692507ada0622"
-  integrity sha512-1Ih4WTWyw0+lKyFMcBHGbb5U5FtuHJuujoyyr5zTaWS5EYMeT6Jb2AuDeftsCsEuchO+mM2ij5+q9crhydzLhQ==
-  dependencies:
-    "@floating-ui/utils" "^0.2.11"
-
-"@floating-ui/dom@^1.0.0":
-  version "1.7.6"
-  resolved "https://registry.npmjs.org/@floating-ui/dom/-/dom-1.7.6.tgz#f915bba5abbb177e1f227cacee1b4d0634b187bf"
-  integrity sha512-9gZSAI5XM36880PPMm//9dfiEngYoC6Am2izES1FF406YFsjvyBMmeJ2g4SAju3xWwtuynNRFL2s9hgxpLI5SQ==
-  dependencies:
-    "@floating-ui/core" "^1.7.5"
-    "@floating-ui/utils" "^0.2.11"
-
-"@floating-ui/utils@^0.2.11":
-  version "0.2.11"
-  resolved "https://registry.npmjs.org/@floating-ui/utils/-/utils-0.2.11.tgz#a269e055e40e2f45873bae9d1a2fdccbd314ea3f"
-  integrity sha512-RiB/yIh78pcIxl6lLMG0CgBXAZ2Y0eVHqMPYugu+9U0AeT6YBeiJpf7lbdJNIugFP5SIjwNRgo4DhR1Qxi26Gg==
-
-"@fontsource/roboto@^4.5.5":
-  version "4.5.8"
-  resolved "https://registry.yarnpkg.com/@fontsource/roboto/-/roboto-4.5.8.tgz#56347764786079838faf43f0eeda22dd7328437f"
-  integrity sha512-CnD7zLItIzt86q4Sj3kZUiLcBk1dSk81qcqgMGaZe7SQ1P8hFNxhMl5AZthK1zrDM5m74VVhaOpuMGIL4gagaA==
-
-"@humanfs/core@^0.19.1":
-  version "0.19.1"
-  resolved "https://registry.yarnpkg.com/@humanfs/core/-/core-0.19.1.tgz#17c55ca7d426733fe3c561906b8173c336b40a77"
-  integrity sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==
-
-"@humanfs/node@^0.16.6":
-  version "0.16.7"
-  resolved "https://registry.yarnpkg.com/@humanfs/node/-/node-0.16.7.tgz#822cb7b3a12c5a240a24f621b5a2413e27a45f26"
-  integrity sha512-/zUx+yOsIrG4Y43Eh2peDeKCxlRt/gET6aHfaKpuq267qXdYDFViVHfMaLyygZOnl0kGWxFIgsBy8QFuTLUXEQ==
-  dependencies:
-    "@humanfs/core" "^0.19.1"
-    "@humanwhocodes/retry" "^0.4.0"
-
-"@humanwhocodes/module-importer@^1.0.1":
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/@humanwhocodes/module-importer/-/module-importer-1.0.1.tgz#af5b2691a22b44be847b0ca81641c5fb6ad0172c"
-  integrity sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA==
-
-"@humanwhocodes/retry@^0.4.0", "@humanwhocodes/retry@^0.4.2":
-  version "0.4.3"
-  resolved "https://registry.yarnpkg.com/@humanwhocodes/retry/-/retry-0.4.3.tgz#c2b9d2e374ee62c586d3adbea87199b1d7a7a6ba"
-  integrity sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==
-
-"@jridgewell/gen-mapping@^0.3.12":
-  version "0.3.13"
-  resolved "https://registry.yarnpkg.com/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz#6342a19f44347518c93e43b1ac69deb3c4656a1f"
-  integrity sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==
-  dependencies:
-    "@jridgewell/sourcemap-codec" "^1.5.0"
-    "@jridgewell/trace-mapping" "^0.3.24"
-
-"@jridgewell/resolve-uri@^3.1.0":
-  version "3.1.2"
-  resolved "https://registry.yarnpkg.com/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz#7a0ee601f60f99a20c7c7c5ff0c80388c1189bd6"
-  integrity sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==
-
-"@jridgewell/sourcemap-codec@^1.4.14", "@jridgewell/sourcemap-codec@^1.5.0", "@jridgewell/sourcemap-codec@^1.5.5":
-  version "1.5.5"
-  resolved "https://registry.yarnpkg.com/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz#6912b00d2c631c0d15ce1a7ab57cd657f2a8f8ba"
-  integrity sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==
-
-"@jridgewell/trace-mapping@^0.3.24", "@jridgewell/trace-mapping@^0.3.28":
-  version "0.3.31"
-  resolved "https://registry.yarnpkg.com/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz#db15d6781c931f3a251a3dac39501c98a6082fd0"
-  integrity sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==
-  dependencies:
-    "@jridgewell/resolve-uri" "^3.1.0"
-    "@jridgewell/sourcemap-codec" "^1.4.14"
-
-"@kurkle/color@^0.3.0":
-  version "0.3.4"
-  resolved "https://registry.yarnpkg.com/@kurkle/color/-/color-0.3.4.tgz#4d4ff677e1609214fc71c580125ddddd86abcabf"
-  integrity sha512-M5UknZPHRu3DEDWoipU6sE8PdkZ6Z/S+v4dD+Ke8IaNlpdSQah50lz1KtcFBa2vsdOnwbbnxJwVM4wty6udA5w==
-
-"@mui/core-downloads-tracker@^7.3.9":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/core-downloads-tracker/-/core-downloads-tracker-7.3.9.tgz#d944e385f8f7f5e680e5ba479b39ff8602bd4939"
-  integrity sha512-MOkOCTfbMJwLshlBCKJ59V2F/uaLYfmKnN76kksj6jlGUVdI25A9Hzs08m+zjBRdLv+sK7Rqdsefe8X7h/6PCw==
-
-"@mui/icons-material@^7.1.1":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/icons-material/-/icons-material-7.3.9.tgz#4f6dc62bfe8954f3848b0eecb3650cff10f6a7ec"
-  integrity sha512-BT+zPJXss8Hg/oEMRmHl17Q97bPACG4ufFSfGEdhiE96jOyR5Dz1ty7ZWt1fVGR0y1p+sSgEwQT/MNZQmoWDCw==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-
-"@mui/lab@^7.0.1-beta.18":
-  version "7.0.1-beta.23"
-  resolved "https://registry.yarnpkg.com/@mui/lab/-/lab-7.0.1-beta.23.tgz#43a7b8f7bb352b3ceaf3e20b4e6d2e275772cb03"
-  integrity sha512-661LhBtL33DWeRk7DXXu4LvbHUmTRkoybiVgKkdLx6gA4Nbr1r6B1U+yZGcTm5GfY25nrtS083aoy3P0wuuJ3A==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@mui/system" "^7.3.9"
-    "@mui/types" "^7.4.12"
-    "@mui/utils" "^7.3.9"
-    clsx "^2.1.1"
-    prop-types "^15.8.1"
-
-"@mui/material@^7.1.1":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/material/-/material-7.3.9.tgz#b3bf698b4b82ad630422df55f3d5f4e44a7bad11"
-  integrity sha512-I8yO3t4T0y7bvDiR1qhIN6iBWZOTBfVOnmLlM7K6h3dx5YX2a7rnkuXzc2UkZaqhxY9NgTnEbdPlokR1RxCNRQ==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@mui/core-downloads-tracker" "^7.3.9"
-    "@mui/system" "^7.3.9"
-    "@mui/types" "^7.4.12"
-    "@mui/utils" "^7.3.9"
-    "@popperjs/core" "^2.11.8"
-    "@types/react-transition-group" "^4.4.12"
-    clsx "^2.1.1"
-    csstype "^3.2.3"
-    prop-types "^15.8.1"
-    react-is "^19.2.3"
-    react-transition-group "^4.4.5"
-
-"@mui/private-theming@^7.3.9":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/private-theming/-/private-theming-7.3.9.tgz#c785dc429b7ed62cf3952140be703cbe95704a13"
-  integrity sha512-ErIyRQvsiQEq7Yvcvfw9UDHngaqjMy9P3JDPnRAaKG5qhpl2C4tX/W1S4zJvpu+feihmZJStjIyvnv6KDbIrlw==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@mui/utils" "^7.3.9"
-    prop-types "^15.8.1"
-
-"@mui/styled-engine@^7.3.9":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/styled-engine/-/styled-engine-7.3.9.tgz#e425ca7b5cb559bde01b8fa4a7a842e9b5916f53"
-  integrity sha512-JqujWt5bX4okjUPGpVof/7pvgClqh7HvIbsIBIOOlCh2u3wG/Bwp4+E1bc1dXSwkrkp9WUAoNdI5HEC+5HKvMw==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@emotion/cache" "^11.14.0"
-    "@emotion/serialize" "^1.3.3"
-    "@emotion/sheet" "^1.4.0"
-    csstype "^3.2.3"
-    prop-types "^15.8.1"
-
-"@mui/system@^7.3.9":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/system/-/system-7.3.9.tgz#d8181dd9ad8c5e9afdf50eb7009062c506976ab1"
-  integrity sha512-aL1q9am8XpRrSabv9qWf5RHhJICJql34wnrc1nz0MuOglPRYF/liN+c8VqZdTvUn9qg+ZjRVbKf4sJVFfIDtmg==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@mui/private-theming" "^7.3.9"
-    "@mui/styled-engine" "^7.3.9"
-    "@mui/types" "^7.4.12"
-    "@mui/utils" "^7.3.9"
-    clsx "^2.1.1"
-    csstype "^3.2.3"
-    prop-types "^15.8.1"
-
-"@mui/types@^7.4.12":
-  version "7.4.12"
-  resolved "https://registry.yarnpkg.com/@mui/types/-/types-7.4.12.tgz#e4eba37a7506419ea5c5e0604322ba82b271bf46"
-  integrity sha512-iKNAF2u9PzSIj40CjvKJWxFXJo122jXVdrmdh0hMYd+FR+NuJMkr/L88XwWLCRiJ5P1j+uyac25+Kp6YC4hu6w==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-
-"@mui/utils@^7.3.9":
-  version "7.3.9"
-  resolved "https://registry.yarnpkg.com/@mui/utils/-/utils-7.3.9.tgz#8af5093fc93c2e582fa3d047f561c7b690509bc2"
-  integrity sha512-U6SdZaGbfb65fqTsH3V5oJdFj9uYwyLE2WVuNvmbggTSDBb8QHrFsqY8BN3taK9t3yJ8/BPHD/kNvLNyjwM7Yw==
-  dependencies:
-    "@babel/runtime" "^7.28.6"
-    "@mui/types" "^7.4.12"
-    "@types/prop-types" "^15.7.15"
-    clsx "^2.1.1"
-    prop-types "^15.8.1"
-    react-is "^19.2.3"
-
-"@napi-rs/wasm-runtime@^1.1.1":
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-runtime/-/wasm-runtime-1.1.1.tgz#c3705ab549d176b8dc5172723d6156c3dc426af2"
-  integrity sha512-p64ah1M1ld8xjWv3qbvFwHiFVWrq1yFvV4f7w+mzaqiR4IlSgkqhcRdHwsGgomwzBH51sRY4NEowLxnaBjcW/A==
-  dependencies:
-    "@emnapi/core" "^1.7.1"
-    "@emnapi/runtime" "^1.7.1"
-    "@tybys/wasm-util" "^0.10.1"
-
-"@oxc-project/types@=0.122.0":
-  version "0.122.0"
-  resolved "https://registry.yarnpkg.com/@oxc-project/types/-/types-0.122.0.tgz#2f4e77a3b183c87b2a326affd703ef71ba836601"
-  integrity sha512-oLAl5kBpV4w69UtFZ9xqcmTi+GENWOcPF7FCrczTiBbmC0ibXxCwyvZGbO39rCVEuLGAZM84DH0pUIyyv/YJzA==
-
-"@parcel/watcher-android-arm64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-android-arm64/-/watcher-android-arm64-2.5.6.tgz#5f32e0dba356f4ac9a11068d2a5c134ca3ba6564"
-  integrity sha512-YQxSS34tPF/6ZG7r/Ih9xy+kP/WwediEUsqmtf0cuCV5TPPKw/PQHRhueUo6JdeFJaqV3pyjm0GdYjZotbRt/A==
-
-"@parcel/watcher-darwin-arm64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-darwin-arm64/-/watcher-darwin-arm64-2.5.6.tgz#88d3e720b59b1eceffce98dac46d7c40e8be5e8e"
-  integrity sha512-Z2ZdrnwyXvvvdtRHLmM4knydIdU9adO3D4n/0cVipF3rRiwP+3/sfzpAwA/qKFL6i1ModaabkU7IbpeMBgiVEA==
-
-"@parcel/watcher-darwin-x64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-darwin-x64/-/watcher-darwin-x64-2.5.6.tgz#bf05d76a78bc15974f15ec3671848698b0838063"
-  integrity sha512-HgvOf3W9dhithcwOWX9uDZyn1lW9R+7tPZ4sug+NGrGIo4Rk1hAXLEbcH1TQSqxts0NYXXlOWqVpvS1SFS4fRg==
-
-"@parcel/watcher-freebsd-x64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-freebsd-x64/-/watcher-freebsd-x64-2.5.6.tgz#8bc26e9848e7303ac82922a5ae1b1ef1bdb48a53"
-  integrity sha512-vJVi8yd/qzJxEKHkeemh7w3YAn6RJCtYlE4HPMoVnCpIXEzSrxErBW5SJBgKLbXU3WdIpkjBTeUNtyBVn8TRng==
-
-"@parcel/watcher-linux-arm-glibc@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-arm-glibc/-/watcher-linux-arm-glibc-2.5.6.tgz#1328fee1deb0c2d7865079ef53a2ba4cc2f8b40a"
-  integrity sha512-9JiYfB6h6BgV50CCfasfLf/uvOcJskMSwcdH1PHH9rvS1IrNy8zad6IUVPVUfmXr+u+Km9IxcfMLzgdOudz9EQ==
-
-"@parcel/watcher-linux-arm-musl@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-arm-musl/-/watcher-linux-arm-musl-2.5.6.tgz#bad0f45cb3e2157746db8b9d22db6a125711f152"
-  integrity sha512-Ve3gUCG57nuUUSyjBq/MAM0CzArtuIOxsBdQ+ftz6ho8n7s1i9E1Nmk/xmP323r2YL0SONs1EuwqBp2u1k5fxg==
-
-"@parcel/watcher-linux-arm64-glibc@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-arm64-glibc/-/watcher-linux-arm64-glibc-2.5.6.tgz#b75913fbd501d9523c5f35d420957bf7d0204809"
-  integrity sha512-f2g/DT3NhGPdBmMWYoxixqYr3v/UXcmLOYy16Bx0TM20Tchduwr4EaCbmxh1321TABqPGDpS8D/ggOTaljijOA==
-
-"@parcel/watcher-linux-arm64-musl@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-arm64-musl/-/watcher-linux-arm64-musl-2.5.6.tgz#da5621a6a576070c8c0de60dea8b46dc9c3827d4"
-  integrity sha512-qb6naMDGlbCwdhLj6hgoVKJl2odL34z2sqkC7Z6kzir8b5W65WYDpLB6R06KabvZdgoHI/zxke4b3zR0wAbDTA==
-
-"@parcel/watcher-linux-x64-glibc@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-x64-glibc/-/watcher-linux-x64-glibc-2.5.6.tgz#ce437accdc4b30f93a090b4a221fd95cd9b89639"
-  integrity sha512-kbT5wvNQlx7NaGjzPFu8nVIW1rWqV780O7ZtkjuWaPUgpv2NMFpjYERVi0UYj1msZNyCzGlaCWEtzc+exjMGbQ==
-
-"@parcel/watcher-linux-x64-musl@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-linux-x64-musl/-/watcher-linux-x64-musl-2.5.6.tgz#02400c54b4a67efcc7e2327b249711920ac969e2"
-  integrity sha512-1JRFeC+h7RdXwldHzTsmdtYR/Ku8SylLgTU/reMuqdVD7CtLwf0VR1FqeprZ0eHQkO0vqsbvFLXUmYm/uNKJBg==
-
-"@parcel/watcher-win32-arm64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-win32-arm64/-/watcher-win32-arm64-2.5.6.tgz#caae3d3c7583ca0a7171e6bd142c34d20ea1691e"
-  integrity sha512-3ukyebjc6eGlw9yRt678DxVF7rjXatWiHvTXqphZLvo7aC5NdEgFufVwjFfY51ijYEWpXbqF5jtrK275z52D4Q==
-
-"@parcel/watcher-win32-ia32@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-win32-ia32/-/watcher-win32-ia32-2.5.6.tgz#9ac922550896dfe47bfc5ae3be4f1bcaf8155d6d"
-  integrity sha512-k35yLp1ZMwwee3Ez/pxBi5cf4AoBKYXj00CZ80jUz5h8prpiaQsiRPKQMxoLstNuqe2vR4RNPEAEcjEFzhEz/g==
-
-"@parcel/watcher-win32-x64@2.5.6":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher-win32-x64/-/watcher-win32-x64-2.5.6.tgz#73fdafba2e21c448f0e456bbe13178d8fe11739d"
-  integrity sha512-hbQlYcCq5dlAX9Qx+kFb0FHue6vbjlf0FrNzSKdYK2APUf7tGfGxQCk2ihEREmbR6ZMc0MVAD5RIX/41gpUzTw==
-
-"@parcel/watcher@^2.4.1":
-  version "2.5.6"
-  resolved "https://registry.yarnpkg.com/@parcel/watcher/-/watcher-2.5.6.tgz#3f932828c894f06d0ad9cfefade1756ecc6ef1f1"
-  integrity sha512-tmmZ3lQxAe/k/+rNnXQRawJ4NjxO2hqiOLTHvWchtGZULp4RyFeh6aU4XdOYBFe2KE1oShQTv4AblOs2iOrNnQ==
-  dependencies:
-    detect-libc "^2.0.3"
-    is-glob "^4.0.3"
-    node-addon-api "^7.0.0"
-    picomatch "^4.0.3"
-  optionalDependencies:
-    "@parcel/watcher-android-arm64" "2.5.6"
-    "@parcel/watcher-darwin-arm64" "2.5.6"
-    "@parcel/watcher-darwin-x64" "2.5.6"
-    "@parcel/watcher-freebsd-x64" "2.5.6"
-    "@parcel/watcher-linux-arm-glibc" "2.5.6"
-    "@parcel/watcher-linux-arm-musl" "2.5.6"
-    "@parcel/watcher-linux-arm64-glibc" "2.5.6"
-    "@parcel/watcher-linux-arm64-musl" "2.5.6"
-    "@parcel/watcher-linux-x64-glibc" "2.5.6"
-    "@parcel/watcher-linux-x64-musl" "2.5.6"
-    "@parcel/watcher-win32-arm64" "2.5.6"
-    "@parcel/watcher-win32-ia32" "2.5.6"
-    "@parcel/watcher-win32-x64" "2.5.6"
-
-"@popperjs/core@^2.11.8":
-  version "2.11.8"
-  resolved "https://registry.yarnpkg.com/@popperjs/core/-/core-2.11.8.tgz#6b79032e760a0899cd4204710beede972a3a185f"
-  integrity sha512-P1st0aksCrn9sGZhp8GMYwBnQsbvAWsZAX44oXNNvLHGqAOcoVxmjZiohstwQ7SqKnbR47akdNi+uleWD8+g6A==
-
-"@react-dnd/asap@^5.0.1":
-  version "5.0.2"
-  resolved "https://registry.yarnpkg.com/@react-dnd/asap/-/asap-5.0.2.tgz#1f81f124c1cd6f39511c11a881cfb0f715343488"
-  integrity sha512-WLyfoHvxhs0V9U+GTsGilGgf2QsPl6ZZ44fnv0/b8T3nQyvzxidxsg/ZltbWssbsRDlYW8UKSQMTGotuTotZ6A==
-
-"@react-dnd/invariant@^4.0.1":
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/@react-dnd/invariant/-/invariant-4.0.2.tgz#b92edffca10a26466643349fac7cdfb8799769df"
-  integrity sha512-xKCTqAK/FFauOM9Ta2pswIyT3D8AQlfrYdOi/toTPEhqCuAs1v5tcJ3Y08Izh1cJ5Jchwy9SeAXmMg6zrKs2iw==
-
-"@react-dnd/shallowequal@^4.0.1":
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/@react-dnd/shallowequal/-/shallowequal-4.0.2.tgz#d1b4befa423f692fa4abf1c79209702e7d8ae4b4"
-  integrity sha512-/RVXdLvJxLg4QKvMoM5WlwNR9ViO9z8B/qPcc+C0Sa/teJY7QG7kJ441DwzOjMYEY7GmU4dj5EcGHIkKZiQZCA==
-
-"@reduxjs/toolkit@^1.8.6":
-  version "1.9.7"
-  resolved "https://registry.yarnpkg.com/@reduxjs/toolkit/-/toolkit-1.9.7.tgz#7fc07c0b0ebec52043f8cb43510cf346405f78a6"
-  integrity sha512-t7v8ZPxhhKgOKtU+uyJT13lu4vL7az5aFi4IdoDs/eS548edn2M8Ik9h8fxgvMjGoAUVFSt6ZC1P5cWmQ014QQ==
-  dependencies:
-    immer "^9.0.21"
-    redux "^4.2.1"
-    redux-thunk "^2.4.2"
-    reselect "^4.1.8"
-
-"@remirror/core-constants@3.0.0":
-  version "3.0.0"
-  resolved "https://registry.npmjs.org/@remirror/core-constants/-/core-constants-3.0.0.tgz#96fdb89d25c62e7b6a5d08caf0ce5114370e3b8f"
-  integrity sha512-42aWfPrimMfDKDi4YegyS7x+/0tlzaqwPQCULLanv3DMIlu96KTJR0fM5isWX2UViOqlGnX6YFgqWepcX+XMNg==
-
-"@remix-run/router@1.23.2":
-  version "1.23.2"
-  resolved "https://registry.yarnpkg.com/@remix-run/router/-/router-1.23.2.tgz#156c4b481c0bee22a19f7924728a67120de06971"
-  integrity sha512-Ic6m2U/rMjTkhERIa/0ZtXJP17QUi2CbWE7cqx4J58M8aA3QTfW+2UlQ4psvTX9IO1RfNVhK3pcpdjej7L+t2w==
-
-"@rolldown/binding-android-arm64@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-android-arm64/-/binding-android-arm64-1.0.0-rc.11.tgz#25a584227ed97239fd564451c0db2c359751b42a"
-  integrity sha512-SJ+/g+xNnOh6NqYxD0V3uVN4W3VfnrGsC9/hoglicgTNfABFG9JjISvkkU0dNY84MNHLWyOgxP9v9Y9pX4S7+A==
-
-"@rolldown/binding-darwin-arm64@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-darwin-arm64/-/binding-darwin-arm64-1.0.0-rc.11.tgz#dcfa96c4d8c7baa47f5b90294ce8ebf1b0b1dbf9"
-  integrity sha512-7WQgR8SfOPwmDZGFkThUvsmd/nwAWv91oCO4I5LS7RKrssPZmOt7jONN0cW17ydGC1n/+puol1IpoieKqQidmg==
-
-"@rolldown/binding-darwin-x64@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-darwin-x64/-/binding-darwin-x64-1.0.0-rc.11.tgz#6e751ea2067cacee0c94f0e8b087761dde62f9ea"
-  integrity sha512-39Ks6UvIHq4rEogIfQBoBRusj0Q0nPVWIvqmwBLaT6aqQGIakHdESBVOPRRLacy4WwUPIx4ZKzfZ9PMW+IeyUQ==
-
-"@rolldown/binding-freebsd-x64@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-freebsd-x64/-/binding-freebsd-x64-1.0.0-rc.11.tgz#b7582b959398c5871034b94ba0a8ecde0425a8e7"
-  integrity sha512-jfsm0ZHfhiqrvWjJAmzsqiIFPz5e7mAoCOPBNTcNgkiid/LaFKiq92+0ojH+nmJmKYkre4t71BWXUZDNp7vsag==
-
-"@rolldown/binding-linux-arm-gnueabihf@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-arm-gnueabihf/-/binding-linux-arm-gnueabihf-1.0.0-rc.11.tgz#3b8c5e071d6a0ed1cb1880c1948c6fece553502a"
-  integrity sha512-zjQaUtSyq1nVe3nxmlSCuR96T1LPlpvmJ0SZy0WJFEsV4kFbXcq2u68L4E6O0XeFj4aex9bEauqjW8UQBeAvfQ==
-
-"@rolldown/binding-linux-arm64-gnu@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-arm64-gnu/-/binding-linux-arm64-gnu-1.0.0-rc.11.tgz#2533165620137b077ae4ede92b752a63cd85cfcb"
-  integrity sha512-WMW1yE6IOnehTcFE9eipFkm3XN63zypWlrJQ2iF7NrQ9b2LDRjumFoOGJE8RJJTJCTBAdmLMnJ8uVitACUUo1Q==
-
-"@rolldown/binding-linux-arm64-musl@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-arm64-musl/-/binding-linux-arm64-musl-1.0.0-rc.11.tgz#b04cf5b806a012027a4e8b139e0f86b2ff7621c0"
-  integrity sha512-jfndI9tsfm4APzjNt6QdBkYwre5lRPUgHeDHoI7ydKUuJvz3lZeCfMsI56BZj+7BYqiKsJm7cfd/6KYV7ubrBg==
-
-"@rolldown/binding-linux-ppc64-gnu@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-ppc64-gnu/-/binding-linux-ppc64-gnu-1.0.0-rc.11.tgz#bda9c11fe03482033d5dac6a943802b3e7579550"
-  integrity sha512-ZlFgw46NOAGMgcdvdYwAGu2Q+SLFA9LzbJLW+iyMOJyhj5wk6P3KEE9Gct4xWwSzFoPI7JCdYmYMzVtlgQ+zfw==
-
-"@rolldown/binding-linux-s390x-gnu@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-s390x-gnu/-/binding-linux-s390x-gnu-1.0.0-rc.11.tgz#55daa2d35f92f62e958fc44e12db1c16e1f271c5"
-  integrity sha512-hIOYmuT6ofM4K04XAZd3OzMySEO4K0/nc9+jmNcxNAxRi6c5UWpqfw3KMFV4MVFWL+jQsSh+bGw2VqmaPMTLyw==
-
-"@rolldown/binding-linux-x64-gnu@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-x64-gnu/-/binding-linux-x64-gnu-1.0.0-rc.11.tgz#8ca1abf607bbe2f7fdd6f6416192937dc9ea1e54"
-  integrity sha512-qXBQQO9OvkjjQPLdUVr7Nr2t3QTZI7s4KZtfw7HzBgjbmAPSFwSv4rmET9lLSgq3rH/ndA3ngv3Qb8l2njoPNA==
-
-"@rolldown/binding-linux-x64-musl@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-linux-x64-musl/-/binding-linux-x64-musl-1.0.0-rc.11.tgz#36a52beee8ac97a79d1ed8f1b94fab677e3e4d11"
-  integrity sha512-/tpFfoSTzUkH9LPY+cYbqZBDyyX62w5fICq9qzsHLL8uTI6BHip3Q9Uzft0wylk/i8OOwKik8OxW+QAhDmzwmg==
-
-"@rolldown/binding-openharmony-arm64@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-openharmony-arm64/-/binding-openharmony-arm64-1.0.0-rc.11.tgz#91c74fd23b3f3f3942fe4b3aefc9428ecbaa55fd"
-  integrity sha512-mcp3Rio2w72IvdZG0oQ4bM2c2oumtwHfUfKncUM6zGgz0KgPz4YmDPQfnXEiY5t3+KD/i8HG2rOB/LxdmieK2g==
-
-"@rolldown/binding-wasm32-wasi@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-wasm32-wasi/-/binding-wasm32-wasi-1.0.0-rc.11.tgz#6520bafe57ff1cd2fb45f8f22b1cb6d57be44e79"
-  integrity sha512-LXk5Hii1Ph9asuGRjBuz8TUxdc1lWzB7nyfdoRgI0WGPZKmCxvlKk8KfYysqtr4MfGElu/f/pEQRh8fcEgkrWw==
-  dependencies:
-    "@napi-rs/wasm-runtime" "^1.1.1"
-
-"@rolldown/binding-win32-arm64-msvc@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-win32-arm64-msvc/-/binding-win32-arm64-msvc-1.0.0-rc.11.tgz#73dd1c4737473c8270b61cd2e42b05a34453ffc0"
-  integrity sha512-dDwf5otnx0XgRY1yqxOC4ITizcdzS/8cQ3goOWv3jFAo4F+xQYni+hnMuO6+LssHHdJW7+OCVL3CoU4ycnh35Q==
-
-"@rolldown/binding-win32-x64-msvc@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/binding-win32-x64-msvc/-/binding-win32-x64-msvc-1.0.0-rc.11.tgz#4d922aa6dd6bf27c73eba93fec9a0aed62549095"
-  integrity sha512-LN4/skhSggybX71ews7dAj6r2geaMJfm3kMbK2KhFMg9B10AZXnKoLCVVgzhMHL0S+aKtr4p8QbAW8k+w95bAA==
-
-"@rolldown/pluginutils@1.0.0-beta.27":
-  version "1.0.0-beta.27"
-  resolved "https://registry.yarnpkg.com/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz#47d2bf4cef6d470b22f5831b420f8964e0bf755f"
-  integrity sha512-+d0F4MKMCbeVUJwG96uQ4SgAznZNSq93I3V+9NHA4OpvqG8mRCpGdKmK8l/dl02h2CCDHwW2FqilnTyDcAnqjA==
-
-"@rolldown/pluginutils@1.0.0-rc.11":
-  version "1.0.0-rc.11"
-  resolved "https://registry.yarnpkg.com/@rolldown/pluginutils/-/pluginutils-1.0.0-rc.11.tgz#110d8cc72990c4e36a79791eeafe7cca979e00c9"
-  integrity sha512-xQO9vbwBecJRv9EUcQ/y0dzSTJgA7Q6UVN7xp6B81+tBGSLVAK03yJ9NkJaUA7JFD91kbjxRSC/mDnmvXzbHoQ==
-
-"@rollup/rollup-android-arm-eabi@4.60.0":
-  version "4.60.0"
-  resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.60.0.tgz#7e158ddfc16f78da99c0d5ccbae6cae403ef3284"
-  integrity sha512-WOhNW9K8bR3kf4zLxbfg6Pxu2ybOUbB2AjMDHSQx86LIF4rH4Ft7vmMwNt0loO0eonglSNy4cpD3MKXXKQu0/A==
-
-"@rollup/rollup-android-arm64@4.60.0":
-  version "4.60.0"
-
resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.60.0.tgz#49f4ae0e22b6f9ffbcd3818b9a0758fa2d10b1cd" - integrity sha512-u6JHLll5QKRvjciE78bQXDmqRqNs5M/3GVqZeMwvmjaNODJih/WIrJlFVEihvV0MiYFmd+ZyPr9wxOVbPAG2Iw== - -"@rollup/rollup-darwin-arm64@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.60.0.tgz#bb200269069acf5c1c4d79ad142524f77e8b8236" - integrity sha512-qEF7CsKKzSRc20Ciu2Zw1wRrBz4g56F7r/vRwY430UPp/nt1x21Q/fpJ9N5l47WWvJlkNCPJz3QRVw008fi7yA== - -"@rollup/rollup-darwin-x64@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.60.0.tgz#1bf7a92b27ebdd5e0d1d48503c7811160773be1a" - integrity sha512-WADYozJ4QCnXCH4wPB+3FuGmDPoFseVCUrANmA5LWwGmC6FL14BWC7pcq+FstOZv3baGX65tZ378uT6WG8ynTw== - -"@rollup/rollup-freebsd-arm64@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.60.0.tgz#5ccf537b99c5175008444702193ad0b1c36f7f16" - integrity sha512-6b8wGHJlDrGeSE3aH5mGNHBjA0TTkxdoNHik5EkvPHCt351XnigA4pS7Wsj/Eo9Y8RBU6f35cjN9SYmCFBtzxw== - -"@rollup/rollup-freebsd-x64@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.60.0.tgz#1196ecd7bf4e128624ef83cd1f9d785114474a77" - integrity sha512-h25Ga0t4jaylMB8M/JKAyrvvfxGRjnPQIR8lnCayyzEjEOx2EJIlIiMbhpWxDRKGKF8jbNH01NnN663dH638mA== - -"@rollup/rollup-linux-arm-gnueabihf@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.60.0.tgz#cc147633a4af229fee83a737bf2334fbac3dc28e" - integrity sha512-RzeBwv0B3qtVBWtcuABtSuCzToo2IEAIQrcyB/b2zMvBWVbjo8bZDjACUpnaafaxhTw2W+imQbP2BD1usasK4g== - -"@rollup/rollup-linux-arm-musleabihf@4.60.0": - version "4.60.0" - resolved 
"https://registry.yarnpkg.com/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.60.0.tgz#3559f9f060153ea54594a42c3b87a297bedcc26e" - integrity sha512-Sf7zusNI2CIU1HLzuu9Tc5YGAHEZs5Lu7N1ssJG4Tkw6e0MEsN7NdjUDDfGNHy2IU+ENyWT+L2obgWiguWibWQ== - -"@rollup/rollup-linux-arm64-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.60.0.tgz#e91f887b154123485cfc4b59befe2080fcd8f2df" - integrity sha512-DX2x7CMcrJzsE91q7/O02IJQ5/aLkVtYFryqCjduJhUfGKG6yJV8hxaw8pZa93lLEpPTP/ohdN4wFz7yp/ry9A== - -"@rollup/rollup-linux-arm64-musl@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.60.0.tgz#660752f040df9ba44a24765df698928917c0bf21" - integrity sha512-09EL+yFVbJZlhcQfShpswwRZ0Rg+z/CsSELFCnPt3iK+iqwGsI4zht3secj5vLEs957QvFFXnzAT0FFPIxSrkQ== - -"@rollup/rollup-linux-loong64-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-loong64-gnu/-/rollup-linux-loong64-gnu-4.60.0.tgz#cb0e939a5fa479ccef264f3f45b31971695f869c" - integrity sha512-i9IcCMPr3EXm8EQg5jnja0Zyc1iFxJjZWlb4wr7U2Wx/GrddOuEafxRdMPRYVaXjgbhvqalp6np07hN1w9kAKw== - -"@rollup/rollup-linux-loong64-musl@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-loong64-musl/-/rollup-linux-loong64-musl-4.60.0.tgz#42f86fbc82cd1a81be2d346476dd3231cf5ee442" - integrity sha512-DGzdJK9kyJ+B78MCkWeGnpXJ91tK/iKA6HwHxF4TAlPIY7GXEvMe8hBFRgdrR9Ly4qebR/7gfUs9y2IoaVEyog== - -"@rollup/rollup-linux-ppc64-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.60.0.tgz#39776a647a789dc95ea049277c5ef8f098df77f9" - integrity sha512-RwpnLsqC8qbS8z1H1AxBA1H6qknR4YpPR9w2XX0vo2Sz10miu57PkNcnHVaZkbqyw/kUWfKMI73jhmfi9BRMUQ== - -"@rollup/rollup-linux-ppc64-musl@4.60.0": - version "4.60.0" - resolved 
"https://registry.yarnpkg.com/@rollup/rollup-linux-ppc64-musl/-/rollup-linux-ppc64-musl-4.60.0.tgz#466f20029a8e8b3bb2954c7ddebc9586420cac2c" - integrity sha512-Z8pPf54Ly3aqtdWC3G4rFigZgNvd+qJlOE52fmko3KST9SoGfAdSRCwyoyG05q1HrrAblLbk1/PSIV+80/pxLg== - -"@rollup/rollup-linux-riscv64-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.60.0.tgz#cff9877c78f12e7aa6246f6902ad913e99edb2b7" - integrity sha512-3a3qQustp3COCGvnP4SvrMHnPQ9d1vzCakQVRTliaz8cIp/wULGjiGpbcqrkv0WrHTEp8bQD/B3HBjzujVWLOA== - -"@rollup/rollup-linux-riscv64-musl@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.60.0.tgz#9a762fb99b5a82a921017f56491b7e892b9fb17d" - integrity sha512-pjZDsVH/1VsghMJ2/kAaxt6dL0psT6ZexQVrijczOf+PeP2BUqTHYejk3l6TlPRydggINOeNRhvpLa0AYpCWSQ== - -"@rollup/rollup-linux-s390x-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.60.0.tgz#9d25ad8ac7dab681935baf78ac5ea92d14629cdf" - integrity sha512-3ObQs0BhvPgiUVZrN7gqCSvmFuMWvWvsjG5ayJ3Lraqv+2KhOsp+pUbigqbeWqueGIsnn+09HBw27rJ+gYK4VQ== - -"@rollup/rollup-linux-x64-gnu@4.60.0", "@rollup/rollup-linux-x64-gnu@^4.24.4": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.60.0.tgz#5e5139e11819fa38a052368da79422cb4afcf466" - integrity sha512-EtylprDtQPdS5rXvAayrNDYoJhIz1/vzN2fEubo3yLE7tfAw+948dO0g4M0vkTVFhKojnF+n6C8bDNe+gDRdTg== - -"@rollup/rollup-linux-x64-musl@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.60.0.tgz#b6211d46e11b1f945f5504cc794fce839331ed08" - integrity sha512-k09oiRCi/bHU9UVFqD17r3eJR9bn03TyKraCrlz5ULFJGdJGi7VOmm9jl44vOJvRJ6P7WuBi/s2A97LxxHGIdw== - -"@rollup/rollup-openbsd-x64@4.60.0": - version "4.60.0" - resolved 
"https://registry.yarnpkg.com/@rollup/rollup-openbsd-x64/-/rollup-openbsd-x64-4.60.0.tgz#e6e09eebaa7012bb9c7331b437a9e992bd94ca35" - integrity sha512-1o/0/pIhozoSaDJoDcec+IVLbnRtQmHwPV730+AOD29lHEEo4F5BEUB24H0OBdhbBBDwIOSuf7vgg0Ywxdfiiw== - -"@rollup/rollup-openharmony-arm64@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-openharmony-arm64/-/rollup-openharmony-arm64-4.60.0.tgz#f7d99ae857032498e57a5e7259fb7100fd24a87e" - integrity sha512-pESDkos/PDzYwtyzB5p/UoNU/8fJo68vcXM9ZW2V0kjYayj1KaaUfi1NmTUTUpMn4UhU4gTuK8gIaFO4UGuMbA== - -"@rollup/rollup-win32-arm64-msvc@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.60.0.tgz#41e392f5d9f3bf1253fdaf2f6d6f6b1bfc452856" - integrity sha512-hj1wFStD7B1YBeYmvY+lWXZ7ey73YGPcViMShYikqKT1GtstIKQAtfUI6yrzPjAy/O7pO0VLXGmUVWXQMaYgTQ== - -"@rollup/rollup-win32-ia32-msvc@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.60.0.tgz#f41b0490be0e5d3cf459b4dc076a192b532adea9" - integrity sha512-SyaIPFoxmUPlNDq5EHkTbiKzmSEmq/gOYFI/3HHJ8iS/v1mbugVa7dXUzcJGQfoytp9DJFLhHH4U3/eTy2Bq4w== - -"@rollup/rollup-win32-x64-gnu@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-gnu/-/rollup-win32-x64-gnu-4.60.0.tgz#0fcf9f1fcb750f0317b13aac3b3231687e6397a5" - integrity sha512-RdcryEfzZr+lAr5kRm2ucN9aVlCCa2QNq4hXelZxb8GG0NJSazq44Z3PCCc8wISRuCVnGs0lQJVX5Vp6fKA+IA== - -"@rollup/rollup-win32-x64-msvc@4.60.0": - version "4.60.0" - resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.60.0.tgz#3afdb30405f6d4248df5e72e1ca86c5eab55fab8" - integrity sha512-PrsWNQ8BuE00O3Xsx3ALh2Df8fAj9+cvvX9AIA6o4KpATR98c9mud4XtDWVvsEuyia5U4tVSTKygawyJkjm60w== - -"@standard-schema/spec@^1.1.0": - version "1.1.0" - resolved 
"https://registry.yarnpkg.com/@standard-schema/spec/-/spec-1.1.0.tgz#a79b55dbaf8604812f52d140b2c9ab41bc150bb8" - integrity sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w== - -"@swc/core-darwin-arm64@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-darwin-arm64/-/core-darwin-arm64-1.15.21.tgz#7153201537954b5f3b5748c315cdf0e0dcd533a8" - integrity sha512-SA8SFg9dp0qKRH8goWsax6bptFE2EdmPf2YRAQW9WoHGf3XKM1bX0nd5UdwxmC5hXsBUZAYf7xSciCler6/oyA== - -"@swc/core-darwin-x64@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-darwin-x64/-/core-darwin-x64-1.15.21.tgz#05ff28c00a7045d9760c847e19604fff02b6e3ea" - integrity sha512-//fOVntgowz9+V90lVsNCtyyrtbHp3jWH6Rch7MXHXbcvbLmbCTmssl5DeedUWLLGiAAW1wksBdqdGYOTjaNLw== - -"@swc/core-linux-arm-gnueabihf@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-arm-gnueabihf/-/core-linux-arm-gnueabihf-1.15.21.tgz#d52a0fac1933fe4e4180a196417053571d6c255f" - integrity sha512-meNI4Sh6h9h8DvIfEc0l5URabYMSuNvyisLmG6vnoYAS43s8ON3NJR8sDHvdP7NJTrLe0q/x2XCn6yL/BeHcZg== - -"@swc/core-linux-arm64-gnu@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-arm64-gnu/-/core-linux-arm64-gnu-1.15.21.tgz#32cd1b9d0d4be4d53ccfbc122ac61289f37735b9" - integrity sha512-QrXlNQnHeXqU2EzLlnsPoWEh8/GtNJLvfMiPsDhk+ht6Xv8+vhvZ5YZ/BokNWSIZiWPKLAqR0M7T92YF5tmD3g== - -"@swc/core-linux-arm64-musl@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-arm64-musl/-/core-linux-arm64-musl-1.15.21.tgz#0993e8b2ffac4f1141fa7b158e8dd982c2476c1a" - integrity sha512-8/yGCMO333ultDaMQivE5CjO6oXDPeeg1IV4sphojPkb0Pv0i6zvcRIkgp60xDB+UxLr6VgHgt+BBgqS959E9g== - -"@swc/core-linux-ppc64-gnu@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-ppc64-gnu/-/core-linux-ppc64-gnu-1.15.21.tgz#5f6765d9a36235d95fd5c69f6d848973e85d8180" - integrity 
sha512-ucW0HzPx0s1dgRvcvuLSPSA/2Kk/VYTv9st8qe1Kc22Gu0Q0rH9+6TcBTmMuNIp0Xs4BPr1uBttmbO1wEGI49Q== - -"@swc/core-linux-s390x-gnu@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-s390x-gnu/-/core-linux-s390x-gnu-1.15.21.tgz#f96779dc2ba8d47298bca3ceaa961e0f460aa0bd" - integrity sha512-ulTnOGc5I7YRObE/9NreAhQg94QkiR5qNhhcUZ1iFAYjzg/JGAi1ch+s/Ixe61pMIr8bfVrF0NOaB0f8wjaAfA== - -"@swc/core-linux-x64-gnu@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-x64-gnu/-/core-linux-x64-gnu-1.15.21.tgz#0ffe779d5fd060bfb7992176f51d317c81c6aaaf" - integrity sha512-D0RokxtM+cPvSqJIKR6uja4hbD+scI9ezo95mBhfSyLUs9wnPPl26sLp1ZPR/EXRdYm3F3S6RUtVi+8QXhT24Q== - -"@swc/core-linux-x64-musl@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-linux-x64-musl/-/core-linux-x64-musl-1.15.21.tgz#2ea9fab26555d27c715aed6a08604a8296e4af50" - integrity sha512-nER8u7VeRfmU6fMDzl1NQAbbB/G7O2avmvCOwIul1uGkZ2/acbPH+DCL9h5+0yd/coNcxMBTL6NGepIew+7C2w== - -"@swc/core-win32-arm64-msvc@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-win32-arm64-msvc/-/core-win32-arm64-msvc-1.15.21.tgz#b401f34f38d744ca2b800bf2574ef5f7b20ca52f" - integrity sha512-+/AgNBnjYugUA8C0Do4YzymgvnGbztv7j8HKSQLvR/DQgZPoXQ2B3PqB2mTtGh/X5DhlJWiqnunN35JUgWcAeQ== - -"@swc/core-win32-ia32-msvc@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-win32-ia32-msvc/-/core-win32-ia32-msvc-1.15.21.tgz#c761e981725d137abd7abcecff88d1dc2d76baad" - integrity sha512-IkSZj8PX/N4HcaFhMQtzmkV8YSnuNoJ0E6OvMwFiOfejPhiKXvl7CdDsn1f4/emYEIDO3fpgZW9DTaCRMDxaDA== - -"@swc/core-win32-x64-msvc@1.15.21": - version "1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core-win32-x64-msvc/-/core-win32-x64-msvc-1.15.21.tgz#4878cd851b4f98033e19fca78953201aef736edd" - integrity sha512-zUyWso7OOENB6e1N1hNuNn8vbvLsTdKQ5WKLgt/JcBNfJhKy/6jmBmqI3GXk/MyvQKd5SLvP7A0F36p7TeDqvw== - -"@swc/core@^1.12.11": - version 
"1.15.21" - resolved "https://registry.yarnpkg.com/@swc/core/-/core-1.15.21.tgz#84e1a2dded1372efda7036a86749ded817d05ea2" - integrity sha512-fkk7NJcBscrR3/F8jiqlMptRHP650NxqDnspBMrRe5d8xOoCy9MLL5kOBLFXjFLfMo3KQQHhk+/jUULOMlR1uQ== - dependencies: - "@swc/counter" "^0.1.3" - "@swc/types" "^0.1.25" - optionalDependencies: - "@swc/core-darwin-arm64" "1.15.21" - "@swc/core-darwin-x64" "1.15.21" - "@swc/core-linux-arm-gnueabihf" "1.15.21" - "@swc/core-linux-arm64-gnu" "1.15.21" - "@swc/core-linux-arm64-musl" "1.15.21" - "@swc/core-linux-ppc64-gnu" "1.15.21" - "@swc/core-linux-s390x-gnu" "1.15.21" - "@swc/core-linux-x64-gnu" "1.15.21" - "@swc/core-linux-x64-musl" "1.15.21" - "@swc/core-win32-arm64-msvc" "1.15.21" - "@swc/core-win32-ia32-msvc" "1.15.21" - "@swc/core-win32-x64-msvc" "1.15.21" - -"@swc/counter@^0.1.3": - version "0.1.3" - resolved "https://registry.yarnpkg.com/@swc/counter/-/counter-0.1.3.tgz#cc7463bd02949611c6329596fccd2b0ec782b0e9" - integrity sha512-e2BR4lsJkkRlKZ/qCHPw9ZaSxc0MVUd7gtbtaB7aMvHeJVYe8sOB8DBZkP2DtISHGSku9sCK6T6cnY0CtXrOCQ== - -"@swc/types@^0.1.25": - version "0.1.26" - resolved "https://registry.yarnpkg.com/@swc/types/-/types-0.1.26.tgz#2a976a1870caef1992316dda1464150ee36968b5" - integrity sha512-lyMwd7WGgG79RS7EERZV3T8wMdmPq3xwyg+1nmAM64kIhx5yl+juO2PYIHb7vTiPgPCj8LYjsNV2T5wiQHUEaw== - dependencies: - "@swc/counter" "^0.1.3" - -"@testing-library/dom@^10.4.1": - version "10.4.1" - resolved "https://registry.yarnpkg.com/@testing-library/dom/-/dom-10.4.1.tgz#d444f8a889e9a46e9a3b4f3b88e0fcb3efb6cf95" - integrity sha512-o4PXJQidqJl82ckFaXUeoAW+XysPLauYI43Abki5hABd853iMhitooc6znOnczgbTYmEP6U6/y1ZyKAIsvMKGg== - dependencies: - "@babel/code-frame" "^7.10.4" - "@babel/runtime" "^7.12.5" - "@types/aria-query" "^5.0.1" - aria-query "5.3.0" - dom-accessibility-api "^0.5.9" - lz-string "^1.5.0" - picocolors "1.1.1" - pretty-format "^27.0.2" - -"@testing-library/jest-dom@^6.9.1": - version "6.9.1" - resolved 
"https://registry.yarnpkg.com/@testing-library/jest-dom/-/jest-dom-6.9.1.tgz#7613a04e146dd2976d24ddf019730d57a89d56c2" - integrity sha512-zIcONa+hVtVSSep9UT3jZ5rizo2BsxgyDYU7WFD5eICBE7no3881HGeb/QkGfsJs6JTkY1aQhT7rIPC7e+0nnA== - dependencies: - "@adobe/css-tools" "^4.4.0" - aria-query "^5.0.0" - css.escape "^1.5.1" - dom-accessibility-api "^0.6.3" - picocolors "^1.1.1" - redent "^3.0.0" - -"@testing-library/react@^16.3.2": - version "16.3.2" - resolved "https://registry.yarnpkg.com/@testing-library/react/-/react-16.3.2.tgz#672883b7acb8e775fc0492d9e9d25e06e89786d0" - integrity sha512-XU5/SytQM+ykqMnAnvB2umaJNIOsLF3PVv//1Ew4CTcpz0/BRyy/af40qqrt7SjKpDdT1saBMc42CUok5gaw+g== - dependencies: - "@babel/runtime" "^7.12.5" - -"@tiptap/core@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/core/-/core-3.22.2.tgz#2352ffa67bfa1a3528898524d13ba9bde5c74b37" - integrity sha512-atq35NkpeEphH6vNYJ0pTLLBA73FAbvTV9Ovd3AaTC5s99/KF5Q86zVJXvml8xPRcMGM6dLp+eSSd06oTscMSA== - -"@tiptap/extension-blockquote@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-blockquote/-/extension-blockquote-3.22.2.tgz#bfa2db6f9d65bd411a74ca5f3610f5094adc322e" - integrity sha512-iTdlmGFcgxi4LKaOW2Rc9/yD83qTXgRm5BN3vCHWy5+TbEnReYxYqU5qKsbtTbKy30sO8TJTdAXTZ29uomShQQ== - -"@tiptap/extension-bold@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-bold/-/extension-bold-3.22.2.tgz#980484072b2f45cb8794869283af67017cefcc1a" - integrity sha512-bqsPJyKcT/RWse4e16U2EKhraR8a2+98TUuk1amG3yCyFJZStoO/j+pN0IqZdZZjr3WtxFyvwWp7Kc59UN+jUA== - -"@tiptap/extension-bubble-menu@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-bubble-menu/-/extension-bubble-menu-3.22.2.tgz#3945c6cc7b403b732aa590debf79bbfa2a0d5f50" - integrity sha512-5hbyDOSkJwA2uh0v9Mm0Dd9bb9inx6tHBEDSH2tCB9Rm23poz3yOreB7SNX8xDMe5L0/PQesfWC14RitcmhKPg== - dependencies: - "@floating-ui/dom" "^1.0.0" - 
-"@tiptap/extension-bullet-list@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-bullet-list/-/extension-bullet-list-3.22.2.tgz#b3dc949be2600a6692363038aeca71ae38ce4e4e" - integrity sha512-llrTJnA72RGcWLLO+ro0QN4sjHynhaCerhpV+GZE/ATd8BqV/ekQFdBLJrvC/09My2XQfCwLsyCh92NPXUdELA== - -"@tiptap/extension-code-block@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-code-block/-/extension-code-block-3.22.2.tgz#e8c9827e9fe817ac55a9e280f7b86f54dd4f8473" - integrity sha512-PEwFlDyvtKF19WCrOFg77qJV9WqhvjCY4ZoXlHP9Hx0KTcOA8W39mtw8d4NWU5pLRK94yHKF1DVVL8UUkEOnww== - -"@tiptap/extension-code@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-code/-/extension-code-3.22.2.tgz#ad59ddd20feef71fcfbff7c4e2389f7111a2f4a6" - integrity sha512-iYFY+yzfYA9MKt7nupyW/PzqL9XC2D0mC8l1z2Y10i0/fGL8NbqIYjhNUAyXGqH3QWcI+DirI66842y2OadPOg== - -"@tiptap/extension-document@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-document/-/extension-document-3.22.2.tgz#8567a2df5a0e7b32cb350f90849a8a9ada82bbe5" - integrity sha512-yPw9pQeVC4QDh86TuyKCZxxM4g0NAw7mEtGnAo6EpxaBQr1wyBr9yFpys+QTsQpRTmyTf1VHp4iTTLuWHMljIw== - -"@tiptap/extension-dropcursor@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-dropcursor/-/extension-dropcursor-3.22.2.tgz#79a74011eb03df1f6057fca93fe359286d51ad03" - integrity sha512-sDv3fv4LtX0X4nqwh9Gn3C/aZXT+C2JlK7tJovPOpaYP/a6hr03Sn35X5moAfgMCSiWFygEvlTriqwmCsJuxog== - -"@tiptap/extension-floating-menu@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-floating-menu/-/extension-floating-menu-3.22.2.tgz#345b8ad14aa8130c80ccb016a35b50f0f4071ecc" - integrity sha512-r0ZTeh9rNtj9Api+G0YyaB+tAKPDn7aYWg+qSrmAC5EyUPee6Zjn3zlw0q4renCeQflvNRK20xHM8zokC41jOA== - -"@tiptap/extension-gapcursor@^3.22.2": - version "3.22.2" - resolved 
"https://registry.npmjs.org/@tiptap/extension-gapcursor/-/extension-gapcursor-3.22.2.tgz#b8ec46740dc6b5060abde6b4359410d36602583b" - integrity sha512-rR2OLrl/k2kj7xehaZHq0Y7T+1wy2DOTabir9LsTrktTFEcklrh9qY1KC6rEBkwMKaWrmignR1l39kS6RlKFNw== - -"@tiptap/extension-hard-break@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-hard-break/-/extension-hard-break-3.22.2.tgz#d1fc488660b33d76b8773bfea98265939e670b95" - integrity sha512-ChsoqF4XRp6EWatTRlXL4LMFh/ggwRVCyt09brSfjJV5knFaXlECSa5/+rKLMLMULaj6dVlJqoAD15exgu2HHA== - -"@tiptap/extension-heading@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-heading/-/extension-heading-3.22.2.tgz#b4404f040c10f2de17ed4ed7b1e338de4b5e3c2f" - integrity sha512-QPHLef+ikAyf7RVc4EdGeKxH4OEGb3ueCEwJ41RcYPtZ1BX9ueei7FC936guTdL1U7w3vQ65qfy86HznzkYgvw== - -"@tiptap/extension-horizontal-rule@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-horizontal-rule/-/extension-horizontal-rule-3.22.2.tgz#01a299823c07df99d9f8045d6b9ac2209fb3d0c0" - integrity sha512-Oz8KN5KJAWV1mFNE9UIWXdMD6xa5zPf/0yLsT8V4sgaRm+VsdFKllN58BY9qCZf/kIZbaOez5KkaoeAcm0MAZg== - -"@tiptap/extension-image@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-image/-/extension-image-3.22.2.tgz#f34f57963bfe18130ea9cc6ce2428811764a94e9" - integrity sha512-xFCgwreF6sn5mQ/hFDQKn41NIbbfks/Ou9j763Djf3pWsastgzdgwifQOpXVI3aSsqlKUO3o8/8R/yQczvZcwg== - -"@tiptap/extension-italic@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-italic/-/extension-italic-3.22.2.tgz#3631598c4a0ae357f81774d83aad6b09a25d9072" - integrity sha512-fmtQu2HDnV3sOZPdz0+1lOLI7UtrIhusohJj2UwOLQxG8qqhLwbvWx2OQTlfblgY0z+CjLRr6ANbNDxOTIblfg== - -"@tiptap/extension-link@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-link/-/extension-link-3.22.2.tgz#3477af30aa558b9efc14dbb95ea3901c9f61f94c" - integrity 
sha512-TXfSoKmng5pecvQUZqdsx6ICeob5V5hhYOj2vCEtjfcjWsyCndqFIl1w+Nt/yI5ehrFNOVPyj3ZvcELuuAW6pw== - dependencies: - linkifyjs "^4.3.2" - -"@tiptap/extension-list-item@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-list-item/-/extension-list-item-3.22.2.tgz#91ac0771858ef3cf1aacaedb39eb403640922f0f" - integrity sha512-Mk+iiLIFh8Pfuarr6mWfTO7QJbd2ZQd0nGNhNWXlGAO7DJCb4BP9nj4bEIJ17SbcykGRjsi4WMqY50z4MHXqKQ== - -"@tiptap/extension-list-keymap@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-list-keymap/-/extension-list-keymap-3.22.2.tgz#9fc2d63e62d4bda58171d265a400dbf2fd38ea72" - integrity sha512-TozU9V2vldMUPpTXnfLCO33EO06jLxn7uEJTMBnN4iX/dLV3cBVCbE4kHyDKS0sLd7joUeekS06vYP9uQb1hFw== - -"@tiptap/extension-list@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-list/-/extension-list-3.22.2.tgz#aa62a648f1e9087c316e3fd1e7d4878eb3e9f3f7" - integrity sha512-Vq9xScgkA2A3Zj9dQ4WUBKK7u7UCzeSFRz9FcKTQVZHRPbZoqFGnlRUVngqsE7JXrCOthXQ1dXxgk40nAsBFRw== - -"@tiptap/extension-ordered-list@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-ordered-list/-/extension-ordered-list-3.22.2.tgz#664585d4fac27439a03257db9db5aebbcdc9cc53" - integrity sha512-K7qxoBKmsVkAd3kW64ZRCUPFrDcNGpXRDUBx9YgAO/bTfsfxtH2oil+igsUWGXPczpP4yoHPKjTfhpBpLjGl6Q== - -"@tiptap/extension-paragraph@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-paragraph/-/extension-paragraph-3.22.2.tgz#38ba161093094860dff9be3dbd5c565c5cb2eb70" - integrity sha512-EHZZzxVhvzEPDPWtRBF1YKhB+WCUjd1C2NhjHfL3Dl71PBqM3ZWA6qN7NDGPyNyGGWauui/NR/4X+5AfPqlHyA== - -"@tiptap/extension-strike@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-strike/-/extension-strike-3.22.2.tgz#e6ca0685355df5ed443ac3641f857867a0472f6d" - integrity sha512-YFC3elKU1L8PiGbcB6tqd/7vWPF5IbydJz0POJpHzSjstX+VfT8VsvS7ubxVuSIWQ11kGkH3mzX6LX8JHsHZxg== - 
-"@tiptap/extension-text@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-text/-/extension-text-3.22.2.tgz#a93a0ba750196060c07a96ded678866b427223ce" - integrity sha512-J1w7JwijfSD7ah0WfiwZ/DVWCIGT9x369RM4RJc57i44mIBElj7tl1dh+N5KPGOXKUup4gr7sSJAE38lgeaDMg== - -"@tiptap/extension-underline@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extension-underline/-/extension-underline-3.22.2.tgz#eedd60cdf25b4e60343ee294bb79268621779557" - integrity sha512-BaV6WOowxdkGTLWiU7DdZ3Twh633O4RGqwUM5dDas5LvaqL8AMWGTO8Wg9yAaaKXzd9MtKI1ZCqS/+MtzusgkQ== - -"@tiptap/extensions@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/extensions/-/extensions-3.22.2.tgz#9bae5ee6f6b426df38dbdf314cce7a5bec652cc2" - integrity sha512-s7MZmm2Xdq+8feIXgY3v7gVpQ5ClqBZi20KheouS7KSbBlrY4fu2irYR1EGc6r1UUVaHMxEa+cx5knhx+mIPUw== - -"@tiptap/pm@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/pm/-/pm-3.22.2.tgz#4866e1a14a0ba5e354d855d60f19960fd32d4194" - integrity sha512-G2ENwIazoSKkAnN5MN5yN91TIZNFm6TxB74kPf3Empr2k9W51Hkcier70jHGpArhgcEaL4BVreuU1PRDRwCeGw== - dependencies: - prosemirror-changeset "^2.3.0" - prosemirror-collab "^1.3.1" - prosemirror-commands "^1.6.2" - prosemirror-dropcursor "^1.8.1" - prosemirror-gapcursor "^1.3.2" - prosemirror-history "^1.4.1" - prosemirror-inputrules "^1.4.0" - prosemirror-keymap "^1.2.2" - prosemirror-markdown "^1.13.1" - prosemirror-menu "^1.2.4" - prosemirror-model "^1.24.1" - prosemirror-schema-basic "^1.2.3" - prosemirror-schema-list "^1.5.0" - prosemirror-state "^1.4.3" - prosemirror-tables "^1.6.4" - prosemirror-trailing-node "^3.0.0" - prosemirror-transform "^1.10.2" - prosemirror-view "^1.38.1" - -"@tiptap/react@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/react/-/react-3.22.2.tgz#ddeed95154b084580922a7918e94547bfaac7996" - integrity 
sha512-tyGKG69e/MkpoD/JTpVPz0XydEHxh1MSAYnLb3gRvyvBDv2r/veLea+cApkmjQaCfkKC/CWwTFXBYlOB0caSBA== - dependencies: - "@types/use-sync-external-store" "^0.0.6" - fast-equals "^5.3.3" - use-sync-external-store "^1.4.0" - optionalDependencies: - "@tiptap/extension-bubble-menu" "^3.22.2" - "@tiptap/extension-floating-menu" "^3.22.2" - -"@tiptap/starter-kit@^3.22.2": - version "3.22.2" - resolved "https://registry.npmjs.org/@tiptap/starter-kit/-/starter-kit-3.22.2.tgz#165f17d7f13a81a59b1798bab5093b5520ef30a2" - integrity sha512-+CCKX8tOQ/ZPb2k/z6em4AQCFYAcdd8+0TOzPWiuLxRyCHRPBBVhnPsXOKgKwE4OO3E8BsezquuYRYRwsyzCqg== - dependencies: - "@tiptap/core" "^3.22.2" - "@tiptap/extension-blockquote" "^3.22.2" - "@tiptap/extension-bold" "^3.22.2" - "@tiptap/extension-bullet-list" "^3.22.2" - "@tiptap/extension-code" "^3.22.2" - "@tiptap/extension-code-block" "^3.22.2" - "@tiptap/extension-document" "^3.22.2" - "@tiptap/extension-dropcursor" "^3.22.2" - "@tiptap/extension-gapcursor" "^3.22.2" - "@tiptap/extension-hard-break" "^3.22.2" - "@tiptap/extension-heading" "^3.22.2" - "@tiptap/extension-horizontal-rule" "^3.22.2" - "@tiptap/extension-italic" "^3.22.2" - "@tiptap/extension-link" "^3.22.2" - "@tiptap/extension-list" "^3.22.2" - "@tiptap/extension-list-item" "^3.22.2" - "@tiptap/extension-list-keymap" "^3.22.2" - "@tiptap/extension-ordered-list" "^3.22.2" - "@tiptap/extension-paragraph" "^3.22.2" - "@tiptap/extension-strike" "^3.22.2" - "@tiptap/extension-text" "^3.22.2" - "@tiptap/extension-underline" "^3.22.2" - "@tiptap/extensions" "^3.22.2" - "@tiptap/pm" "^3.22.2" - -"@tybys/wasm-util@^0.10.1": - version "0.10.1" - resolved "https://registry.yarnpkg.com/@tybys/wasm-util/-/wasm-util-0.10.1.tgz#ecddd3205cf1e2d5274649ff0eedd2991ed7f414" - integrity sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg== - dependencies: - tslib "^2.4.0" - -"@types/aria-query@^5.0.1": - version "5.0.4" - resolved 
"https://registry.yarnpkg.com/@types/aria-query/-/aria-query-5.0.4.tgz#1a31c3d378850d2778dabb6374d036dcba4ba708" - integrity sha512-rfT93uj5s0PRL7EzccGMs3brplhcrghnDoV26NqKhCAS1hVo+WdNsPvE/yb6ilfr5hi2MEk6d5EWJTKdxg8jVw== - -"@types/chai@^5.2.2": - version "5.2.3" - resolved "https://registry.yarnpkg.com/@types/chai/-/chai-5.2.3.tgz#8e9cd9e1c3581fa6b341a5aed5588eb285be0b4a" - integrity sha512-Mw558oeA9fFbv65/y4mHtXDs9bPnFMZAL/jxdPFUpOHHIXX91mcgEHbS5Lahr+pwZFR8A7GQleRWeI6cGFC2UA== - dependencies: - "@types/deep-eql" "*" - assertion-error "^2.0.1" - -"@types/d3-array@*", "@types/d3-array@^3.2.1": - version "3.2.2" - resolved "https://registry.yarnpkg.com/@types/d3-array/-/d3-array-3.2.2.tgz#e02151464d02d4a1b44646d0fcdb93faf88fde8c" - integrity sha512-hOLWVbm7uRza0BYXpIIW5pxfrKe0W+D5lrFiAEYR+pb6w3N2SwSMaJbXdUfSEv+dT4MfHBLtn5js0LAWaO6otw== - -"@types/d3-axis@*": - version "3.0.6" - resolved "https://registry.yarnpkg.com/@types/d3-axis/-/d3-axis-3.0.6.tgz#e760e5765b8188b1defa32bc8bb6062f81e4c795" - integrity sha512-pYeijfZuBd87T0hGn0FO1vQ/cgLk6E1ALJjfkC0oJ8cbwkZl3TpgS8bVBLZN+2jjGgg38epgxb2zmoGtSfvgMw== - dependencies: - "@types/d3-selection" "*" - -"@types/d3-brush@*": - version "3.0.6" - resolved "https://registry.yarnpkg.com/@types/d3-brush/-/d3-brush-3.0.6.tgz#c2f4362b045d472e1b186cdbec329ba52bdaee6c" - integrity sha512-nH60IZNNxEcrh6L1ZSMNA28rj27ut/2ZmI3r96Zd+1jrZD++zD3LsMIjWlvg4AYrHn/Pqz4CF3veCxGjtbqt7A== - dependencies: - "@types/d3-selection" "*" - -"@types/d3-chord@*": - version "3.0.6" - resolved "https://registry.yarnpkg.com/@types/d3-chord/-/d3-chord-3.0.6.tgz#1706ca40cf7ea59a0add8f4456efff8f8775793d" - integrity sha512-LFYWWd8nwfwEmTZG9PfQxd17HbNPksHBiJHaKuY1XeqscXacsS2tyoo6OdRsjf+NQYeB6XrNL3a25E3gH69lcg== - -"@types/d3-color@*": - version "3.1.3" - resolved "https://registry.yarnpkg.com/@types/d3-color/-/d3-color-3.1.3.tgz#368c961a18de721da8200e80bf3943fb53136af2" - integrity 
sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A== - -"@types/d3-contour@*": - version "3.0.6" - resolved "https://registry.yarnpkg.com/@types/d3-contour/-/d3-contour-3.0.6.tgz#9ada3fa9c4d00e3a5093fed0356c7ab929604231" - integrity sha512-BjzLgXGnCWjUSYGfH1cpdo41/hgdWETu4YxpezoztawmqsvCeep+8QGfiY6YbDvfgHz/DkjeIkkZVJavB4a3rg== - dependencies: - "@types/d3-array" "*" - "@types/geojson" "*" - -"@types/d3-delaunay@*": - version "6.0.4" - resolved "https://registry.yarnpkg.com/@types/d3-delaunay/-/d3-delaunay-6.0.4.tgz#185c1a80cc807fdda2a3fe960f7c11c4a27952e1" - integrity sha512-ZMaSKu4THYCU6sV64Lhg6qjf1orxBthaC161plr5KuPHo3CNm8DTHiLw/5Eq2b6TsNP0W0iJrUOFscY6Q450Hw== - -"@types/d3-dispatch@*": - version "3.0.7" - resolved "https://registry.yarnpkg.com/@types/d3-dispatch/-/d3-dispatch-3.0.7.tgz#ef004d8a128046cfce434d17182f834e44ef95b2" - integrity sha512-5o9OIAdKkhN1QItV2oqaE5KMIiXAvDWBDPrD85e58Qlz1c1kI/J0NcqbEG88CoTwJrYe7ntUCVfeUl2UJKbWgA== - -"@types/d3-drag@*": - version "3.0.7" - resolved "https://registry.yarnpkg.com/@types/d3-drag/-/d3-drag-3.0.7.tgz#b13aba8b2442b4068c9a9e6d1d82f8bcea77fc02" - integrity sha512-HE3jVKlzU9AaMazNufooRJ5ZpWmLIoc90A37WU2JMmeq28w1FQqCZswHZ3xR+SuxYftzHq6WU6KJHvqxKzTxxQ== - dependencies: - "@types/d3-selection" "*" - -"@types/d3-dsv@*": - version "3.0.7" - resolved "https://registry.yarnpkg.com/@types/d3-dsv/-/d3-dsv-3.0.7.tgz#0a351f996dc99b37f4fa58b492c2d1c04e3dac17" - integrity sha512-n6QBF9/+XASqcKK6waudgL0pf/S5XHPPI8APyMLLUHd8NqouBGLsU8MgtO7NINGtPBtk9Kko/W4ea0oAspwh9g== - -"@types/d3-ease@*": - version "3.0.2" - resolved "https://registry.yarnpkg.com/@types/d3-ease/-/d3-ease-3.0.2.tgz#e28db1bfbfa617076f7770dd1d9a48eaa3b6c51b" - integrity sha512-NcV1JjO5oDzoK26oMzbILE6HW7uVXOHLQvHshBUW4UMdZGfiY6v5BeQwh9a9tCzv+CeefZQHJt5SRgK154RtiA== - -"@types/d3-fetch@*": - version "3.0.7" - resolved "https://registry.yarnpkg.com/@types/d3-fetch/-/d3-fetch-3.0.7.tgz#c04a2b4f23181aa376f30af0283dbc7b3b569980" - 
integrity sha512-fTAfNmxSb9SOWNB9IoG5c8Hg6R+AzUHDRlsXsDZsNp6sxAEOP0tkP3gKkNSO/qmHPoBFTxNrjDprVHDQDvo5aA== - dependencies: - "@types/d3-dsv" "*" - -"@types/d3-force@*": - version "3.0.10" - resolved "https://registry.yarnpkg.com/@types/d3-force/-/d3-force-3.0.10.tgz#6dc8fc6e1f35704f3b057090beeeb7ac674bff1a" - integrity sha512-ZYeSaCF3p73RdOKcjj+swRlZfnYpK1EbaDiYICEEp5Q6sUiqFaFQ9qgoshp5CzIyyb/yD09kD9o2zEltCexlgw== - -"@types/d3-format@*": - version "3.0.4" - resolved "https://registry.yarnpkg.com/@types/d3-format/-/d3-format-3.0.4.tgz#b1e4465644ddb3fdf3a263febb240a6cd616de90" - integrity sha512-fALi2aI6shfg7vM5KiR1wNJnZ7r6UuggVqtDA+xiEdPZQwy/trcQaHnwShLuLdta2rTymCNpxYTiMZX/e09F4g== - -"@types/d3-geo@*": - version "3.1.0" - resolved "https://registry.yarnpkg.com/@types/d3-geo/-/d3-geo-3.1.0.tgz#b9e56a079449174f0a2c8684a9a4df3f60522440" - integrity sha512-856sckF0oP/diXtS4jNsiQw/UuK5fQG8l/a9VVLeSouf1/PPbBE1i1W852zVwKwYCBkFJJB7nCFTbk6UMEXBOQ== - dependencies: - "@types/geojson" "*" - -"@types/d3-hierarchy@*": - version "3.1.7" - resolved "https://registry.yarnpkg.com/@types/d3-hierarchy/-/d3-hierarchy-3.1.7.tgz#6023fb3b2d463229f2d680f9ac4b47466f71f17b" - integrity sha512-tJFtNoYBtRtkNysX1Xq4sxtjK8YgoWUNpIiUee0/jHGRwqvzYxkq0hGVbbOGSz+JgFxxRu4K8nb3YpG3CMARtg== - -"@types/d3-interpolate@*": - version "3.0.4" - resolved "https://registry.yarnpkg.com/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz#412b90e84870285f2ff8a846c6eb60344f12a41c" - integrity sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA== - dependencies: - "@types/d3-color" "*" - -"@types/d3-path@*": - version "3.1.1" - resolved "https://registry.yarnpkg.com/@types/d3-path/-/d3-path-3.1.1.tgz#f632b380c3aca1dba8e34aa049bcd6a4af23df8a" - integrity sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg== - -"@types/d3-polygon@*": - version "3.0.2" - resolved 
"https://registry.yarnpkg.com/@types/d3-polygon/-/d3-polygon-3.0.2.tgz#dfae54a6d35d19e76ac9565bcb32a8e54693189c" - integrity sha512-ZuWOtMaHCkN9xoeEMr1ubW2nGWsp4nIql+OPQRstu4ypeZ+zk3YKqQT0CXVe/PYqrKpZAi+J9mTs05TKwjXSRA== - -"@types/d3-quadtree@*": - version "3.0.6" - resolved "https://registry.yarnpkg.com/@types/d3-quadtree/-/d3-quadtree-3.0.6.tgz#d4740b0fe35b1c58b66e1488f4e7ed02952f570f" - integrity sha512-oUzyO1/Zm6rsxKRHA1vH0NEDG58HrT5icx/azi9MF1TWdtttWl0UIUsjEQBBh+SIkrpd21ZjEv7ptxWys1ncsg== - -"@types/d3-random@*": - version "3.0.3" - resolved "https://registry.yarnpkg.com/@types/d3-random/-/d3-random-3.0.3.tgz#ed995c71ecb15e0cd31e22d9d5d23942e3300cfb" - integrity sha512-Imagg1vJ3y76Y2ea0871wpabqp613+8/r0mCLEBfdtqC7xMSfj9idOnmBYyMoULfHePJyxMAw3nWhJxzc+LFwQ== - -"@types/d3-scale-chromatic@*": - version "3.1.0" - resolved "https://registry.yarnpkg.com/@types/d3-scale-chromatic/-/d3-scale-chromatic-3.1.0.tgz#dc6d4f9a98376f18ea50bad6c39537f1b5463c39" - integrity sha512-iWMJgwkK7yTRmWqRB5plb1kadXyQ5Sj8V/zYlFGMUBbIPKQScw+Dku9cAAMgJG+z5GYDoMjWGLVOvjghDEFnKQ== - -"@types/d3-scale@*": - version "4.0.9" - resolved "https://registry.yarnpkg.com/@types/d3-scale/-/d3-scale-4.0.9.tgz#57a2f707242e6fe1de81ad7bfcccaaf606179afb" - integrity sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw== - dependencies: - "@types/d3-time" "*" - -"@types/d3-selection@*": - version "3.0.11" - resolved "https://registry.yarnpkg.com/@types/d3-selection/-/d3-selection-3.0.11.tgz#bd7a45fc0a8c3167a631675e61bc2ca2b058d4a3" - integrity sha512-bhAXu23DJWsrI45xafYpkQ4NtcKMwWnAC/vKrd2l+nxMFuvOT3XMYTIj2opv8vq8AO5Yh7Qac/nSeP/3zjTK0w== - -"@types/d3-shape@*": - version "3.1.8" - resolved "https://registry.yarnpkg.com/@types/d3-shape/-/d3-shape-3.1.8.tgz#d1516cc508753be06852cd06758e3bb54a22b0e3" - integrity sha512-lae0iWfcDeR7qt7rA88BNiqdvPS5pFVPpo5OfjElwNaT2yyekbM0C9vK+yqBqEmHr6lDkRnYNoTBYlAgJa7a4w== - dependencies: - "@types/d3-path" "*" - 
-"@types/d3-time-format@*": - version "4.0.3" - resolved "https://registry.yarnpkg.com/@types/d3-time-format/-/d3-time-format-4.0.3.tgz#d6bc1e6b6a7db69cccfbbdd4c34b70632d9e9db2" - integrity sha512-5xg9rC+wWL8kdDj153qZcsJ0FWiFt0J5RB6LYUNZjwSnesfblqrI/bJ1wBdJ8OQfncgbJG5+2F+qfqnqyzYxyg== - -"@types/d3-time@*": - version "3.0.4" - resolved "https://registry.yarnpkg.com/@types/d3-time/-/d3-time-3.0.4.tgz#8472feecd639691450dd8000eb33edd444e1323f" - integrity sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g== - -"@types/d3-timer@*": - version "3.0.2" - resolved "https://registry.yarnpkg.com/@types/d3-timer/-/d3-timer-3.0.2.tgz#70bbda77dc23aa727413e22e214afa3f0e852f70" - integrity sha512-Ps3T8E8dZDam6fUyNiMkekK3XUsaUEik+idO9/YjPtfj2qruF8tFBXS7XhtE4iIXBLxhmLjP3SXpLhVf21I9Lw== - -"@types/d3-transition@*": - version "3.0.9" - resolved "https://registry.yarnpkg.com/@types/d3-transition/-/d3-transition-3.0.9.tgz#1136bc57e9ddb3c390dccc9b5ff3b7d2b8d94706" - integrity sha512-uZS5shfxzO3rGlu0cC3bjmMFKsXv+SmZZcgp0KD22ts4uGXp5EVYGzu/0YdwZeKmddhcAccYtREJKkPfXkZuCg== - dependencies: - "@types/d3-selection" "*" - -"@types/d3-zoom@*": - version "3.0.8" - resolved "https://registry.yarnpkg.com/@types/d3-zoom/-/d3-zoom-3.0.8.tgz#dccb32d1c56b1e1c6e0f1180d994896f038bc40b" - integrity sha512-iqMC4/YlFCSlO8+2Ii1GGGliCAY4XdeG748w5vQUbevlbDu0zSjH/+jojorQVBK/se0j6DUFNPBGSqD3YWYnDw== - dependencies: - "@types/d3-interpolate" "*" - "@types/d3-selection" "*" - -"@types/d3@^7.4.3": - version "7.4.3" - resolved "https://registry.yarnpkg.com/@types/d3/-/d3-7.4.3.tgz#d4550a85d08f4978faf0a4c36b848c61eaac07e2" - integrity sha512-lZXZ9ckh5R8uiFVt8ogUNf+pIrK4EsWrx2Np75WvF/eTpJ0FMHNhjXk8CKEx/+gpHbNQyJWehbFaTvqmHWB3ww== - dependencies: - "@types/d3-array" "*" - "@types/d3-axis" "*" - "@types/d3-brush" "*" - "@types/d3-chord" "*" - "@types/d3-color" "*" - "@types/d3-contour" "*" - "@types/d3-delaunay" "*" - "@types/d3-dispatch" "*" - "@types/d3-drag" "*" - 
"@types/d3-dsv" "*" - "@types/d3-ease" "*" - "@types/d3-fetch" "*" - "@types/d3-force" "*" - "@types/d3-format" "*" - "@types/d3-geo" "*" - "@types/d3-hierarchy" "*" - "@types/d3-interpolate" "*" - "@types/d3-path" "*" - "@types/d3-polygon" "*" - "@types/d3-quadtree" "*" - "@types/d3-random" "*" - "@types/d3-scale" "*" - "@types/d3-scale-chromatic" "*" - "@types/d3-selection" "*" - "@types/d3-shape" "*" - "@types/d3-time" "*" - "@types/d3-time-format" "*" - "@types/d3-timer" "*" - "@types/d3-transition" "*" - "@types/d3-zoom" "*" - -"@types/deep-eql@*": - version "4.0.2" - resolved "https://registry.yarnpkg.com/@types/deep-eql/-/deep-eql-4.0.2.tgz#334311971d3a07121e7eb91b684a605e7eea9cbd" - integrity sha512-c9h9dVVMigMPc4bwTvC5dxqtqJZwQPePsWjPlpSOnojbor6pGqdk541lfA7AqFQr5pB1BRdq0juY9db81BwyFw== - -"@types/dompurify@^3.0.5": - version "3.2.0" - resolved "https://registry.yarnpkg.com/@types/dompurify/-/dompurify-3.2.0.tgz#56610bf3e4250df57744d61fbd95422e07dfb840" - integrity sha512-Fgg31wv9QbLDA0SpTOXO3MaxySc4DKGLi8sna4/Utjo4r3ZRPdCt4UQee8BWr+Q5z21yifghREPJGYaEOEIACg== - dependencies: - dompurify "*" - -"@types/estree@1.0.8", "@types/estree@^1.0.0", "@types/estree@^1.0.6", "@types/estree@^1.0.8": - version "1.0.8" - resolved "https://registry.yarnpkg.com/@types/estree/-/estree-1.0.8.tgz#958b91c991b1867ced318bedea0e215ee050726e" - integrity sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w== - -"@types/geojson@*", "@types/geojson@7946.0.16": - version "7946.0.16" - resolved "https://registry.yarnpkg.com/@types/geojson/-/geojson-7946.0.16.tgz#8ebe53d69efada7044454e3305c19017d97ced2a" - integrity sha512-6C8nqWur3j98U6+lXDfTUWIfgvZU+EumvpHKcYjujKH7woYyLj2sUmff0tRhrqM7BohUw7Pz3ZB1jj2gW9Fvmg== - -"@types/hoist-non-react-statics@^3.3.1": - version "3.3.7" - resolved "https://registry.yarnpkg.com/@types/hoist-non-react-statics/-/hoist-non-react-statics-3.3.7.tgz#306e3a3a73828522efa1341159da4846e7573a6c" - integrity 
sha512-PQTyIulDkIDro8P+IHbKCsw7U2xxBYflVzW/FgWdCAePD9xGSidgA76/GeJ6lBKoblyhf9pBY763gbrN+1dI8g== - dependencies: - hoist-non-react-statics "^3.3.0" - -"@types/json-schema@^7.0.15": - version "7.0.15" - resolved "https://registry.yarnpkg.com/@types/json-schema/-/json-schema-7.0.15.tgz#596a1747233694d50f6ad8a7869fcb6f56cf5841" - integrity sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA== - -"@types/linkify-it@^3": - version "3.0.5" - resolved "https://registry.npmjs.org/@types/linkify-it/-/linkify-it-3.0.5.tgz#1e78a3ac2428e6d7e6c05c1665c242023a4601d8" - integrity sha512-yg6E+u0/+Zjva+buc3EIb+29XEg4wltq7cSmd4Uc2EE/1nUVmxyzpX6gUXD0V8jIrG0r7YeOGVIbYRkxeooCtw== - -"@types/linkify-it@^5": - version "5.0.0" - resolved "https://registry.npmjs.org/@types/linkify-it/-/linkify-it-5.0.0.tgz#21413001973106cda1c3a9b91eedd4ccd5469d76" - integrity sha512-sVDA58zAw4eWAffKOaQH5/5j3XeayukzDk+ewSsnv3p4yJEZHCCzMDiZM8e0OUrRvmpGZ85jf4yDHkHsgBNr9Q== - -"@types/lodash@^4.17.7": - version "4.17.24" - resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.17.24.tgz#4ae334fc62c0e915ca8ed8e35dcc6d4eeb29215f" - integrity sha512-gIW7lQLZbue7lRSWEFql49QJJWThrTFFeIMJdp3eH4tKoxm1OvEPg02rm4wCCSHS0cL3/Fizimb35b7k8atwsQ== - -"@types/markdown-it@^13.0.7": - version "13.0.9" - resolved "https://registry.npmjs.org/@types/markdown-it/-/markdown-it-13.0.9.tgz#df79221eae698df5b4e982c7e91128dd8e525743" - integrity sha512-1XPwR0+MgXLWfTn9gCsZ55AHOKW1WN+P9vr0PaQh5aerR9LLQXUbjfEAFhjmEmyoYFWAyuN2Mqkn40MZ4ukjBw== - dependencies: - "@types/linkify-it" "^3" - "@types/mdurl" "^1" - -"@types/markdown-it@^14.0.0": - version "14.1.2" - resolved "https://registry.npmjs.org/@types/markdown-it/-/markdown-it-14.1.2.tgz#57f2532a0800067d9b934f3521429a2e8bfb4c61" - integrity sha512-promo4eFwuiW+TfGxhi+0x3czqTYJkG8qB17ZUJiVF10Xm7NLVRSLUsfRTU/6h1e24VvRnXCx+hG7li58lkzog== - dependencies: - "@types/linkify-it" "^5" - "@types/mdurl" "^2" - -"@types/mdurl@^1": - version "1.0.5" - 
resolved "https://registry.npmjs.org/@types/mdurl/-/mdurl-1.0.5.tgz#3e0d2db570e9fb6ccb2dc8fde0be1d79ac810d39" - integrity sha512-6L6VymKTzYSrEf4Nev4Xa1LCHKrlTlYCBMTlQKFuddo1CvQcE52I0mwfOJayueUC7MJuXOeHTcIU683lzd0cUA== - -"@types/mdurl@^2": - version "2.0.0" - resolved "https://registry.npmjs.org/@types/mdurl/-/mdurl-2.0.0.tgz#d43878b5b20222682163ae6f897b20447233bdfd" - integrity sha512-RGdgjQUZba5p6QEFAVx2OGb8rQDL/cPRG7GiedRzMcJ1tYnUANBncjbSB1NRGwbvjcPeikRABz2nshyPk1bhWg== - -"@types/node@^14.0.1": - version "14.18.63" - resolved "https://registry.yarnpkg.com/@types/node/-/node-14.18.63.tgz#1788fa8da838dbb5f9ea994b834278205db6ca2b" - integrity sha512-fAtCfv4jJg+ExtXhvCkCqUKZ+4ok/JQk01qDKhL5BDDoS3AxKXhV5/MAVUZyQnSEd2GT92fkgZl0pz0Q0AzcIQ== - -"@types/node@^20.14.10": - version "20.19.37" - resolved "https://registry.yarnpkg.com/@types/node/-/node-20.19.37.tgz#b4fb4033408dd97becce63ec932c9ec57a9e2919" - integrity sha512-8kzdPJ3FsNsVIurqBs7oodNnCEVbni9yUEkaHbgptDACOPW04jimGagZ51E6+lXUwJjgnBw+hyko/lkFWCldqw== - dependencies: - undici-types "~6.21.0" - -"@types/parse-json@^4.0.0": - version "4.0.2" - resolved "https://registry.yarnpkg.com/@types/parse-json/-/parse-json-4.0.2.tgz#5950e50960793055845e956c427fc2b0d70c5239" - integrity sha512-dISoDXWWQwUquiKsyZ4Ng+HX2KsPL7LyHKHQwgGFEA3IaKac4Obd+h2a/a6waisAoepJlBcx9paWqjA8/HVjCw== - -"@types/prismjs@^1.26.0": - version "1.26.6" - resolved "https://registry.yarnpkg.com/@types/prismjs/-/prismjs-1.26.6.tgz#6ea27c126d645319ae4f7055eda63a9e835c0187" - integrity sha512-vqlvI7qlMvcCBbVe0AKAb4f97//Hy0EBTaiW8AalRnG/xAN5zOiWWyrNqNXeq8+KAuvRewjCVY1+IPxk4RdNYw== - -"@types/prop-types@*", "@types/prop-types@^15.7.15": - version "15.7.15" - resolved "https://registry.yarnpkg.com/@types/prop-types/-/prop-types-15.7.15.tgz#e6e5a86d602beaca71ce5163fadf5f95d70931c7" - integrity sha512-F6bEyamV9jKGAFBEmlQnesRPGOQqS2+Uwi0Em15xenOxHaf2hv6L8YCVn3rPdPJOiJfPiCnLIRyvwVaqMY3MIw== - -"@types/react-dom@^18.3.0": - version "18.3.7" - resolved 
"https://registry.yarnpkg.com/@types/react-dom/-/react-dom-18.3.7.tgz#b89ddf2cd83b4feafcc4e2ea41afdfb95a0d194f" - integrity sha512-MEe3UeoENYVFXzoXEWsvcpg6ZvlrFNlOQ7EOsvhI3CfAXwzPfO8Qwuxd40nepsYKqyyVQnTdEfv68q91yLcKrQ== - -"@types/react-katex@^3.0.4": - version "3.0.4" - resolved "https://registry.yarnpkg.com/@types/react-katex/-/react-katex-3.0.4.tgz#2b60eebf76938bb385337fd850d99cc53ad6ef67" - integrity sha512-aLkykKzSKLpXI6REJ3uClao6P47HAFfR1gcHOZwDeTuALsyjgMhz+oynLV4gX0kiJVnvHrBKF/TLXqyNTpHDUg== - dependencies: - "@types/react" "*" - -"@types/react-transition-group@^4.4.12": - version "4.4.12" - resolved "https://registry.yarnpkg.com/@types/react-transition-group/-/react-transition-group-4.4.12.tgz#b5d76568485b02a307238270bfe96cb51ee2a044" - integrity sha512-8TV6R3h2j7a91c+1DXdJi3Syo69zzIZbz7Lg5tORM5LEJG7X/E6a1V3drRyBRZq7/utz7A+c4OgYLiLcYGHG6w== - -"@types/react@*": - version "19.2.14" - resolved "https://registry.yarnpkg.com/@types/react/-/react-19.2.14.tgz#39604929b5e3957e3a6fa0001dafb17c7af70bad" - integrity sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w== - dependencies: - csstype "^3.2.2" - -"@types/react@^18.3.3": - version "18.3.28" - resolved "https://registry.yarnpkg.com/@types/react/-/react-18.3.28.tgz#0a85b1a7243b4258d9f626f43797ba18eb5f8781" - integrity sha512-z9VXpC7MWrhfWipitjNdgCauoMLRdIILQsAEV+ZesIzBq/oUlxk0m3ApZuMFCXdnS4U7KrI+l3WRUEGQ8K1QKw== - dependencies: - "@types/prop-types" "*" - csstype "^3.2.2" - -"@types/trusted-types@^2.0.7": - version "2.0.7" - resolved "https://registry.yarnpkg.com/@types/trusted-types/-/trusted-types-2.0.7.tgz#baccb07a970b91707df3a3e8ba6896c57ead2d11" - integrity sha512-ScaPdn1dQczgbl0QFTeTOmVHFULt394XJgOQNoyVhZ6r2vLnMLJfBPd53SB52T/3G36VI1/g2MZaX0cwDuXsfw== - -"@types/use-sync-external-store@^0.0.3": - version "0.0.3" - resolved "https://registry.yarnpkg.com/@types/use-sync-external-store/-/use-sync-external-store-0.0.3.tgz#b6725d5f4af24ace33b36fafd295136e75509f43" - 
integrity sha512-EwmlvuaxPNej9+T4v5AuBPJa2x2UOJVdjCtDHgcDqitUeOtjnJKJ+apYjVcAoBEMjKW1VVFGZLUb5+qqa09XFA== - -"@types/use-sync-external-store@^0.0.6": - version "0.0.6" - resolved "https://registry.npmjs.org/@types/use-sync-external-store/-/use-sync-external-store-0.0.6.tgz#60be8d21baab8c305132eb9cb912ed497852aadc" - integrity sha512-zFDAD+tlpf2r4asuHEj0XH6pY6i0g5NeAHPn+15wk3BV6JA69eERFXC1gyGThDkVa1zCyKr5jox1+2LbV/AMLg== - -"@types/validator@^13.12.2": - version "13.15.10" - resolved "https://registry.yarnpkg.com/@types/validator/-/validator-13.15.10.tgz#742b77ec34d58554b94a76a14cef30d59e3c16b9" - integrity sha512-T8L6i7wCuyoK8A/ZeLYt1+q0ty3Zb9+qbSSvrIVitzT3YjZqkTZ40IbRsPanlB4h1QB3JVL1SYCdR6ngtFYcuA== - -"@typescript-eslint/eslint-plugin@8.57.2", "@typescript-eslint/eslint-plugin@^8.16.0": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.57.2.tgz#ad0dcefeca9c2ecbe09f730d478063666aee010b" - integrity sha512-NZZgp0Fm2IkD+La5PR81sd+g+8oS6JwJje+aRWsDocxHkjyRw0J5L5ZTlN3LI1LlOcGL7ph3eaIUmTXMIjLk0w== - dependencies: - "@eslint-community/regexpp" "^4.12.2" - "@typescript-eslint/scope-manager" "8.57.2" - "@typescript-eslint/type-utils" "8.57.2" - "@typescript-eslint/utils" "8.57.2" - "@typescript-eslint/visitor-keys" "8.57.2" - ignore "^7.0.5" - natural-compare "^1.4.0" - ts-api-utils "^2.4.0" - -"@typescript-eslint/parser@8.57.2", "@typescript-eslint/parser@^8.16.0": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-8.57.2.tgz#b819955e39f976c0d4f95b5ed67fe22f85cd6898" - integrity sha512-30ScMRHIAD33JJQkgfGW1t8CURZtjc2JpTrq5n2HFhOefbAhb7ucc7xJwdWcrEtqUIYJ73Nybpsggii6GtAHjA== - dependencies: - "@typescript-eslint/scope-manager" "8.57.2" - "@typescript-eslint/types" "8.57.2" - "@typescript-eslint/typescript-estree" "8.57.2" - "@typescript-eslint/visitor-keys" "8.57.2" - debug "^4.4.3" - -"@typescript-eslint/project-service@8.57.2": - version "8.57.2" - resolved 
"https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.57.2.tgz#dfbc7777f9f633f2b06b558cda3836e76f856e3c" - integrity sha512-FuH0wipFywXRTHf+bTTjNyuNQQsQC3qh/dYzaM4I4W0jrCqjCVuUh99+xd9KamUfmCGPvbO8NDngo/vsnNVqgw== - dependencies: - "@typescript-eslint/tsconfig-utils" "^8.57.2" - "@typescript-eslint/types" "^8.57.2" - debug "^4.4.3" - -"@typescript-eslint/scope-manager@8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.57.2.tgz#734dcde40677f430b5d963108337295bdbc09dae" - integrity sha512-snZKH+W4WbWkrBqj4gUNRIGb/jipDW3qMqVJ4C9rzdFc+wLwruxk+2a5D+uoFcKPAqyqEnSb4l2ULuZf95eSkw== - dependencies: - "@typescript-eslint/types" "8.57.2" - "@typescript-eslint/visitor-keys" "8.57.2" - -"@typescript-eslint/tsconfig-utils@8.57.2", "@typescript-eslint/tsconfig-utils@^8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.57.2.tgz#cf82dc11e884103ec13188a7352591efaa1a887e" - integrity sha512-3Lm5DSM+DCowsUOJC+YqHHnKEfFh5CoGkj5Z31NQSNF4l5wdOwqGn99wmwN/LImhfY3KJnmordBq/4+VDe2eKw== - -"@typescript-eslint/type-utils@8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-8.57.2.tgz#3ec65a94e73776252991a3cf0a15d220734c28f5" - integrity sha512-Co6ZCShm6kIbAM/s+oYVpKFfW7LBc6FXoPXjTRQ449PPNBY8U0KZXuevz5IFuuUj2H9ss40atTaf9dlGLzbWZg== - dependencies: - "@typescript-eslint/types" "8.57.2" - "@typescript-eslint/typescript-estree" "8.57.2" - "@typescript-eslint/utils" "8.57.2" - debug "^4.4.3" - ts-api-utils "^2.4.0" - -"@typescript-eslint/types@8.57.2", "@typescript-eslint/types@^8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.57.2.tgz#efe0da4c28b505ed458f113aa960dce2c5c671f4" - integrity sha512-/iZM6FnM4tnx9csuTxspMW4BOSegshwX5oBDznJ7S4WggL7Vczz5d2W11ecc4vRrQMQHXRSxzrCsyG5EsPPTbA== - 
-"@typescript-eslint/typescript-estree@8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.57.2.tgz#432e61a6cf2ab565837da387e5262c159672abea" - integrity sha512-2MKM+I6g8tJxfSmFKOnHv2t8Sk3T6rF20A1Puk0svLK+uVapDZB/4pfAeB7nE83uAZrU6OxW+HmOd5wHVdXwXA== - dependencies: - "@typescript-eslint/project-service" "8.57.2" - "@typescript-eslint/tsconfig-utils" "8.57.2" - "@typescript-eslint/types" "8.57.2" - "@typescript-eslint/visitor-keys" "8.57.2" - debug "^4.4.3" - minimatch "^10.2.2" - semver "^7.7.3" - tinyglobby "^0.2.15" - ts-api-utils "^2.4.0" - -"@typescript-eslint/utils@8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-8.57.2.tgz#46a8974c24326fb8899486728428a0f1a3115014" - integrity sha512-krRIbvPK1ju1WBKIefiX+bngPs+odIQUtR7kymzPfo1POVw3jlF+nLkmexdSSd4UCbDcQn+wMBATOOmpBbqgKg== - dependencies: - "@eslint-community/eslint-utils" "^4.9.1" - "@typescript-eslint/scope-manager" "8.57.2" - "@typescript-eslint/types" "8.57.2" - "@typescript-eslint/typescript-estree" "8.57.2" - -"@typescript-eslint/visitor-keys@8.57.2": - version "8.57.2" - resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-8.57.2.tgz#a5c9605774247336c0412beb7dc288ab2a07c11e" - integrity sha512-zhahknjobV2FiD6Ee9iLbS7OV9zi10rG26odsQdfBO/hjSzUQbkIYgda+iNKK1zNiW2ey+Lf8MU5btN17V3dUw== - dependencies: - "@typescript-eslint/types" "8.57.2" - eslint-visitor-keys "^5.0.0" - -"@vitejs/plugin-react-swc@^3.7.0": - version "3.11.0" - resolved "https://registry.yarnpkg.com/@vitejs/plugin-react-swc/-/plugin-react-swc-3.11.0.tgz#d82cc307d530197a77b50238860cf319890ffc17" - integrity sha512-YTJCGFdNMHCMfjODYtxRNVAYmTWQ1Lb8PulP/2/f/oEEtglw8oKxKIZmmRkyXrVrHfsKOaVkAc3NT9/dMutO5w== - dependencies: - "@rolldown/pluginutils" "1.0.0-beta.27" - "@swc/core" "^1.12.11" - -"@vitest/expect@4.1.1": - version "4.1.1" - resolved 
"https://registry.yarnpkg.com/@vitest/expect/-/expect-4.1.1.tgz#875b3fcfa3e8803d6a69cf6ddb58613eab7ae772" - integrity sha512-xAV0fqBTk44Rn6SjJReEQkHP3RrqbJo6JQ4zZ7/uVOiJZRarBtblzrOfFIZeYUrukp2YD6snZG6IBqhOoHTm+A== - dependencies: - "@standard-schema/spec" "^1.1.0" - "@types/chai" "^5.2.2" - "@vitest/spy" "4.1.1" - "@vitest/utils" "4.1.1" - chai "^6.2.2" - tinyrainbow "^3.0.3" - -"@vitest/mocker@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/mocker/-/mocker-4.1.1.tgz#fe804fbb561e6638864ea8ac4f16a71f5a6f1b91" - integrity sha512-h3BOylsfsCLPeceuCPAAJ+BvNwSENgJa4hXoXu4im0bs9Lyp4URc4JYK4pWLZ4pG/UQn7AT92K6IByi6rE6g3A== - dependencies: - "@vitest/spy" "4.1.1" - estree-walker "^3.0.3" - magic-string "^0.30.21" - -"@vitest/pretty-format@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/pretty-format/-/pretty-format-4.1.1.tgz#ec0e5e7c1def39c5fac037429166278ef2f85de8" - integrity sha512-GM+TEQN5WhOygr1lp7skeVjdLPqqWMHsfzXrcHAqZJi/lIVh63H0kaRCY8MDhNWikx19zBUK8ceaLB7X5AH9NQ== - dependencies: - tinyrainbow "^3.0.3" - -"@vitest/runner@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/runner/-/runner-4.1.1.tgz#bedc6eef9f932788a0253c97f7bee82c0b52334c" - integrity sha512-f7+FPy75vN91QGWsITueq0gedwUZy1fLtHOCMeQpjs8jTekAHeKP80zfDEnhrleviLHzVSDXIWuCIOFn3D3f8A== - dependencies: - "@vitest/utils" "4.1.1" - pathe "^2.0.3" - -"@vitest/snapshot@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/snapshot/-/snapshot-4.1.1.tgz#ffc080d1aa15ce976bf61dcef29dbe489099be69" - integrity sha512-kMVSgcegWV2FibXEx9p9WIKgje58lcTbXgnJixfcg15iK8nzCXhmalL0ZLtTWLW9PH1+1NEDShiFFedB3tEgWg== - dependencies: - "@vitest/pretty-format" "4.1.1" - "@vitest/utils" "4.1.1" - magic-string "^0.30.21" - pathe "^2.0.3" - -"@vitest/spy@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/spy/-/spy-4.1.1.tgz#7eb25de32a3d65810cb9adb31a2872c1e8341be5" - integrity 
sha512-6Ti/KT5OVaiupdIZEuZN7l3CZcR0cxnxt70Z0//3CtwgObwA6jZhmVBA3yrXSVN3gmwjgd7oDNLlsXz526gpRA== - -"@vitest/utils@4.1.1": - version "4.1.1" - resolved "https://registry.yarnpkg.com/@vitest/utils/-/utils-4.1.1.tgz#1c8b1f3405008a10bc2cf74eb8ba1b32c1dbc789" - integrity sha512-cNxAlaB3sHoCdL6pj6yyUXv9Gry1NHNg0kFTXdvSIZXLHsqKH7chiWOkwJ5s5+d/oMwcoG9T0bKU38JZWKusrQ== - dependencies: - "@vitest/pretty-format" "4.1.1" - convert-source-map "^2.0.0" - tinyrainbow "^3.0.3" - -acorn-jsx@^5.3.2: - version "5.3.2" - resolved "https://registry.yarnpkg.com/acorn-jsx/-/acorn-jsx-5.3.2.tgz#7ed5bb55908b3b2f1bc55c6af1653bada7f07937" - integrity sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ== - -acorn@^8.15.0: - version "8.16.0" - resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.16.0.tgz#4ce79c89be40afe7afe8f3adb902a1f1ce9ac08a" - integrity sha512-UVJyE9MttOsBQIDKw1skb9nAwQuR5wuGD3+82K6JgJlm/Y+KI92oNsMNGZCYdDsVtRHSak0pcV5Dno5+4jh9sw== - -ajv@^6.14.0: - version "6.14.0" - resolved "https://registry.yarnpkg.com/ajv/-/ajv-6.14.0.tgz#fd067713e228210636ebb08c60bd3765d6dbe73a" - integrity sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw== - dependencies: - fast-deep-equal "^3.1.1" - fast-json-stable-stringify "^2.0.0" - json-schema-traverse "^0.4.1" - uri-js "^4.2.2" - -allotment@^1.20.4: - version "1.20.5" - resolved "https://registry.yarnpkg.com/allotment/-/allotment-1.20.5.tgz#f3458f8b77309cd6046e45e50bf70f3318bf7afd" - integrity sha512-7i4NT7ieXEyAd5lBrXmE7WHz/e7hRuo97+j+TwrPE85ha6kyFURoc76nom0dWSZ1pTKVEAMJy/+f3/Isfu/41A== - dependencies: - classnames "^2.3.0" - eventemitter3 "^5.0.0" - fast-deep-equal "^3.1.3" - lodash.clamp "^4.0.0" - lodash.debounce "^4.0.0" - usehooks-ts "^3.1.1" - -ansi-regex@^5.0.1: - version "5.0.1" - resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-5.0.1.tgz#082cb2c89c9fe8659a311a53bd6a4dc5301db304" - integrity 
sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ== - -ansi-regex@^6.2.2: - version "6.2.2" - resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-6.2.2.tgz#60216eea464d864597ce2832000738a0589650c1" - integrity sha512-Bq3SmSpyFHaWjPk8If9yc6svM8c56dB5BAtW4Qbw5jHTwwXXcTLoRMkpDJp6VL0XzlWaCHTXrkFURMYmD0sLqg== - -ansi-styles@^4.1.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-4.3.0.tgz#edd803628ae71c04c85ae7a0906edad34b648937" - integrity sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg== - dependencies: - color-convert "^2.0.1" - -ansi-styles@^5.0.0: - version "5.2.0" - resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-5.2.0.tgz#07449690ad45777d1924ac2abb2fc8895dba836b" - integrity sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA== - -ansi-styles@^6.2.1: - version "6.2.3" - resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-6.2.3.tgz#c044d5dcc521a076413472597a1acb1f103c4041" - integrity sha512-4Dj6M28JB+oAH8kFkTLUo+a2jwOFkuqb3yucU0CANcRRUbxS0cP0nZYCGjcc3BNXwRIsUVmDGgzawme7zvJHvg== - -archiver-utils@^2.1.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/archiver-utils/-/archiver-utils-2.1.0.tgz#e8a460e94b693c3e3da182a098ca6285ba9249e2" - integrity sha512-bEL/yUb/fNNiNTuUz979Z0Yg5L+LzLxGJz8x79lYmR54fmTIb6ob/hNQgkQnIUDWIFjZVQwl9Xs356I6BAMHfw== - dependencies: - glob "^7.1.4" - graceful-fs "^4.2.0" - lazystream "^1.0.0" - lodash.defaults "^4.2.0" - lodash.difference "^4.5.0" - lodash.flatten "^4.4.0" - lodash.isplainobject "^4.0.6" - lodash.union "^4.6.0" - normalize-path "^3.0.0" - readable-stream "^2.0.0" - -archiver-utils@^3.0.4: - version "3.0.4" - resolved "https://registry.yarnpkg.com/archiver-utils/-/archiver-utils-3.0.4.tgz#a0d201f1cf8fce7af3b5a05aea0a337329e96ec7" - integrity 
sha512-KVgf4XQVrTjhyWmx6cte4RxonPLR9onExufI1jhvw/MQ4BB6IsZD5gT8Lq+u/+pRkWna/6JoHpiQioaqFP5Rzw== - dependencies: - glob "^7.2.3" - graceful-fs "^4.2.0" - lazystream "^1.0.0" - lodash.defaults "^4.2.0" - lodash.difference "^4.5.0" - lodash.flatten "^4.4.0" - lodash.isplainobject "^4.0.6" - lodash.union "^4.6.0" - normalize-path "^3.0.0" - readable-stream "^3.6.0" - -archiver@^5.0.0: - version "5.3.2" - resolved "https://registry.yarnpkg.com/archiver/-/archiver-5.3.2.tgz#99991d5957e53bd0303a392979276ac4ddccf3b0" - integrity sha512-+25nxyyznAXF7Nef3y0EbBeqmGZgeN/BxHX29Rs39djAfaFalmQ89SE6CWyDCHzGL0yt/ycBtNOmGTW0FyGWNw== - dependencies: - archiver-utils "^2.1.0" - async "^3.2.4" - buffer-crc32 "^0.2.1" - readable-stream "^3.6.0" - readdir-glob "^1.1.2" - tar-stream "^2.2.0" - zip-stream "^4.1.0" - -argparse@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/argparse/-/argparse-2.0.1.tgz#246f50f3ca78a3240f6c997e8a9bd1eac49e4b38" - integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q== - -aria-query@5.3.0: - version "5.3.0" - resolved "https://registry.yarnpkg.com/aria-query/-/aria-query-5.3.0.tgz#650c569e41ad90b51b3d7df5e5eed1c7549c103e" - integrity sha512-b0P0sZPKtyu8HkeRAfCq0IfURZK+SuwMjY1UXGBU27wpAiTwQAIlq56IbIO+ytk/JjS1fMR14ee5WBBfKi5J6A== - dependencies: - dequal "^2.0.3" - -aria-query@^5.0.0, aria-query@^5.3.2: - version "5.3.2" - resolved "https://registry.yarnpkg.com/aria-query/-/aria-query-5.3.2.tgz#93f81a43480e33a338f19163a3d10a50c01dcd59" - integrity sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw== - -array-buffer-byte-length@^1.0.1, array-buffer-byte-length@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/array-buffer-byte-length/-/array-buffer-byte-length-1.0.2.tgz#384d12a37295aec3769ab022ad323a18a51ccf8b" - integrity sha512-LHE+8BuR7RYGDKvnrmcuSq3tDcKv9OFEXQt/HpbZhY7V6h0zlUXutnAD82GiFx9rdieCMjkvtcsPqBwgUl1Iiw== - dependencies: - 
call-bound "^1.0.3" - is-array-buffer "^3.0.5" - -array-includes@^3.1.6, array-includes@^3.1.8: - version "3.1.9" - resolved "https://registry.yarnpkg.com/array-includes/-/array-includes-3.1.9.tgz#1f0ccaa08e90cdbc3eb433210f903ad0f17c3f3a" - integrity sha512-FmeCCAenzH0KH381SPT5FZmiA/TmpndpcaShhfgEN9eCVjnFBqq3l1xrI42y8+PPLI6hypzou4GXw00WHmPBLQ== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.4" - define-properties "^1.2.1" - es-abstract "^1.24.0" - es-object-atoms "^1.1.1" - get-intrinsic "^1.3.0" - is-string "^1.1.1" - math-intrinsics "^1.1.0" - -array.prototype.findlast@^1.2.5: - version "1.2.5" - resolved "https://registry.yarnpkg.com/array.prototype.findlast/-/array.prototype.findlast-1.2.5.tgz#3e4fbcb30a15a7f5bf64cf2faae22d139c2e4904" - integrity sha512-CVvd6FHg1Z3POpBLxO6E6zr+rSKEQ9L6rZHAaY7lLfhKsWYUBBOuMs0e9o24oopj6H+geRCX0YJ+TJLBK2eHyQ== - dependencies: - call-bind "^1.0.7" - define-properties "^1.2.1" - es-abstract "^1.23.2" - es-errors "^1.3.0" - es-object-atoms "^1.0.0" - es-shim-unscopables "^1.0.2" - -array.prototype.flat@^1.3.1: - version "1.3.3" - resolved "https://registry.yarnpkg.com/array.prototype.flat/-/array.prototype.flat-1.3.3.tgz#534aaf9e6e8dd79fb6b9a9917f839ef1ec63afe5" - integrity sha512-rwG/ja1neyLqCuGZ5YYrznA62D4mZXg0i1cIskIUKSiqF3Cje9/wXAls9B9s1Wa2fomMsIv8czB8jZcPmxCXFg== - dependencies: - call-bind "^1.0.8" - define-properties "^1.2.1" - es-abstract "^1.23.5" - es-shim-unscopables "^1.0.2" - -array.prototype.flatmap@^1.3.2, array.prototype.flatmap@^1.3.3: - version "1.3.3" - resolved "https://registry.yarnpkg.com/array.prototype.flatmap/-/array.prototype.flatmap-1.3.3.tgz#712cc792ae70370ae40586264629e33aab5dd38b" - integrity sha512-Y7Wt51eKJSyi80hFrJCePGGNo5ktJCslFuboqJsbf57CCPcm5zztluPlc4/aD8sWsKvlwatezpV4U1efk8kpjg== - dependencies: - call-bind "^1.0.8" - define-properties "^1.2.1" - es-abstract "^1.23.5" - es-shim-unscopables "^1.0.2" - -array.prototype.tosorted@^1.1.4: - version "1.1.4" - resolved 
"https://registry.yarnpkg.com/array.prototype.tosorted/-/array.prototype.tosorted-1.1.4.tgz#fe954678ff53034e717ea3352a03f0b0b86f7ffc" - integrity sha512-p6Fx8B7b7ZhL/gmUsAy0D15WhvDccw3mnGNbZpi3pmeJdxtWsj2jEaI4Y6oo3XiHfzuSgPwKc04MYt6KgvC/wA== - dependencies: - call-bind "^1.0.7" - define-properties "^1.2.1" - es-abstract "^1.23.3" - es-errors "^1.3.0" - es-shim-unscopables "^1.0.2" - -arraybuffer.prototype.slice@^1.0.4: - version "1.0.4" - resolved "https://registry.yarnpkg.com/arraybuffer.prototype.slice/-/arraybuffer.prototype.slice-1.0.4.tgz#9d760d84dbdd06d0cbf92c8849615a1a7ab3183c" - integrity sha512-BNoCY6SXXPQ7gF2opIP4GBE+Xw7U+pHMYKuzjgCN3GwiaIR09UUeKfheyIry77QtrCBlC0KK0q5/TER/tYh3PQ== - dependencies: - array-buffer-byte-length "^1.0.1" - call-bind "^1.0.8" - define-properties "^1.2.1" - es-abstract "^1.23.5" - es-errors "^1.3.0" - get-intrinsic "^1.2.6" - is-array-buffer "^3.0.4" - -assertion-error@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/assertion-error/-/assertion-error-2.0.1.tgz#f641a196b335690b1070bf00b6e7593fec190bf7" - integrity sha512-Izi8RQcffqCeNVgFigKli1ssklIbpHnCYc6AknXGYoB6grJqyeby7jv12JUQgmTAnIDnbck1uxksT4dzN3PWBA== - -ast-types-flow@^0.0.8: - version "0.0.8" - resolved "https://registry.yarnpkg.com/ast-types-flow/-/ast-types-flow-0.0.8.tgz#0a85e1c92695769ac13a428bb653e7538bea27d6" - integrity sha512-OH/2E5Fg20h2aPrbe+QL8JZQFko0YZaF+j4mnQ7BGhfavO7OpSLa8a0y9sBwomHdSbkhTS8TQNayBfnW5DwbvQ== - -async-function@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/async-function/-/async-function-1.0.0.tgz#509c9fca60eaf85034c6829838188e4e4c8ffb2b" - integrity sha512-hsU18Ae8CDTR6Kgu9DYf0EbCr/a5iGL0rytQDobUcdpYOKokk8LEjVphnXkDkgpi0wYVsqrXuP0bZxJaTqdgoA== - -async@^3.2.4: - version "3.2.6" - resolved "https://registry.yarnpkg.com/async/-/async-3.2.6.tgz#1b0728e14929d51b85b449b7f06e27c1145e38ce" - integrity sha512-htCUDlxyyCLMgaM3xXg0C0LW2xqfuQ6p05pCEIsXuyQ+a1koYKTuBMzRNwmybfLgvJDMd0r1LTn4+E0Ti6C2AA== - 
-available-typed-arrays@^1.0.7: - version "1.0.7" - resolved "https://registry.yarnpkg.com/available-typed-arrays/-/available-typed-arrays-1.0.7.tgz#a5cc375d6a03c2efc87a553f3e0b1522def14846" - integrity sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ== - dependencies: - possible-typed-array-names "^1.0.0" - -axe-core@^4.10.0: - version "4.11.1" - resolved "https://registry.yarnpkg.com/axe-core/-/axe-core-4.11.1.tgz#052ff9b2cbf543f5595028b583e4763b40c78ea7" - integrity sha512-BASOg+YwO2C+346x3LZOeoovTIoTrRqEsqMa6fmfAV0P+U9mFr9NsyOEpiYvFjbc64NMrSswhV50WdXzdb/Z5A== - -axobject-query@^4.1.0: - version "4.1.0" - resolved "https://registry.yarnpkg.com/axobject-query/-/axobject-query-4.1.0.tgz#28768c76d0e3cff21bc62a9e2d0b6ac30042a1ee" - integrity sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ== - -babel-plugin-macros@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/babel-plugin-macros/-/babel-plugin-macros-3.1.0.tgz#9ef6dc74deb934b4db344dc973ee851d148c50c1" - integrity sha512-Cg7TFGpIr01vOQNODXOOaGz2NpCU5gl8x1qJFbb6hbZxR7XrcE2vtbAsTAbJ7/xwJtUuJEw8K8Zr/AE0LHlesg== - dependencies: - "@babel/runtime" "^7.12.5" - cosmiconfig "^7.0.0" - resolve "^1.19.0" - -balanced-match@^1.0.0: - version "1.0.2" - resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.2.tgz#e83e3a7e3f300b34cb9d87f615fa0cbf357690ee" - integrity sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw== - -balanced-match@^4.0.2: - version "4.0.4" - resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-4.0.4.tgz#bfb10662feed8196a2c62e7c68e17720c274179a" - integrity sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA== - -base64-arraybuffer@^1.0.2: - version "1.0.2" - resolved 
"https://registry.yarnpkg.com/base64-arraybuffer/-/base64-arraybuffer-1.0.2.tgz#1c37589a7c4b0746e34bd1feb951da2df01c1bdc" - integrity sha512-I3yl4r9QB5ZRY3XuJVEPfc2XhZO6YweFPI+UovAzn+8/hb3oJ6lnysaFcjVpkCPfVWFUDvoZ8kmVDP7WyRtYtQ== - -base64-js@^1.3.1: - version "1.5.1" - resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.5.1.tgz#1b1b440160a5bf7ad40b650f095963481903930a" - integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA== - -bidi-js@^1.0.3: - version "1.0.3" - resolved "https://registry.yarnpkg.com/bidi-js/-/bidi-js-1.0.3.tgz#6f8bcf3c877c4d9220ddf49b9bb6930c88f877d2" - integrity sha512-RKshQI1R3YQ+n9YJz2QQ147P66ELpa1FQEg20Dk8oW9t2KgLbpDLLp9aGZ7y8WHSshDknG0bknqGw5/tyCs5tw== - dependencies: - require-from-string "^2.0.2" - -big-integer@^1.6.17: - version "1.6.52" - resolved "https://registry.yarnpkg.com/big-integer/-/big-integer-1.6.52.tgz#60a887f3047614a8e1bffe5d7173490a97dc8c85" - integrity sha512-QxD8cf2eVqJOOz63z6JIN9BzvVs/dlySa5HGSBH5xtR8dPteIRQnBxxKqkNTiT6jbDTF6jAfrd4oMcND9RGbQg== - -binary@~0.3.0: - version "0.3.0" - resolved "https://registry.yarnpkg.com/binary/-/binary-0.3.0.tgz#9f60553bc5ce8c3386f3b553cff47462adecaa79" - integrity sha512-D4H1y5KYwpJgK8wk1Cue5LLPgmwHKYSChkbspQg5JtVuR5ulGckxfR62H3AE9UDkdMC8yyXlqYihuz3Aqg2XZg== - dependencies: - buffers "~0.1.1" - chainsaw "~0.1.0" - -bl@^4.0.3: - version "4.1.0" - resolved "https://registry.yarnpkg.com/bl/-/bl-4.1.0.tgz#451535264182bec2fbbc83a62ab98cf11d9f7b3a" - integrity sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w== - dependencies: - buffer "^5.5.0" - inherits "^2.0.4" - readable-stream "^3.4.0" - -bluebird@~3.4.1: - version "3.4.7" - resolved "https://registry.yarnpkg.com/bluebird/-/bluebird-3.4.7.tgz#f72d760be09b7f76d08ed8fae98b289a8d05fab3" - integrity sha512-iD3898SR7sWVRHbiQv+sHUtHnMvC1o3nW5rAcqnq3uOn07DSAppZYUkIGslDz6gXC7HfunPe7YVBgoEJASPcHA== - -brace-expansion@^1.1.7: - version 
"1.1.12" - resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-1.1.12.tgz#ab9b454466e5a8cc3a187beaad580412a9c5b843" - integrity sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg== - dependencies: - balanced-match "^1.0.0" - concat-map "0.0.1" - -brace-expansion@^2.0.1: - version "2.0.2" - resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-2.0.2.tgz#54fc53237a613d854c7bd37463aad17df87214e7" - integrity sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ== - dependencies: - balanced-match "^1.0.0" - -brace-expansion@^5.0.2: - version "5.0.5" - resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-5.0.5.tgz#dcc3a37116b79f3e1b46db994ced5d570e930fdb" - integrity sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ== - dependencies: - balanced-match "^4.0.2" - -bubblesets-js@^3.0.0: - version "3.0.1" - resolved "https://registry.yarnpkg.com/bubblesets-js/-/bubblesets-js-3.0.1.tgz#b2c2d991ee5900d4d3057649354eb0ea4e835859" - integrity sha512-EKPfysvIU5+u5RLW3mOr94wxzA3nKzqMBX0F95L95BPBDZPVgLBUnT0kJNz4UK/TXbGs8G7yEgl5MvibRBCQoQ== - -buffer-crc32@^0.2.1, buffer-crc32@^0.2.13: - version "0.2.13" - resolved "https://registry.yarnpkg.com/buffer-crc32/-/buffer-crc32-0.2.13.tgz#0d333e3f00eac50aa1454abd30ef8c2a5d9a7242" - integrity sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ== - -buffer-indexof-polyfill@~1.0.0: - version "1.0.2" - resolved "https://registry.yarnpkg.com/buffer-indexof-polyfill/-/buffer-indexof-polyfill-1.0.2.tgz#d2732135c5999c64b277fcf9b1abe3498254729c" - integrity sha512-I7wzHwA3t1/lwXQh+A5PbNvJxgfo5r3xulgpYDB5zckTu/Z9oUK9biouBKQUjEqzaz3HnAT6TYoovmE+GqSf7A== - -buffer@^5.5.0: - version "5.7.1" - resolved "https://registry.yarnpkg.com/buffer/-/buffer-5.7.1.tgz#ba62e7c13133053582197160851a8f648e99eed0" - integrity 
sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ== - dependencies: - base64-js "^1.3.1" - ieee754 "^1.1.13" - -buffers@~0.1.1: - version "0.1.1" - resolved "https://registry.yarnpkg.com/buffers/-/buffers-0.1.1.tgz#b24579c3bed4d6d396aeee6d9a8ae7f5482ab7bb" - integrity sha512-9q/rDEGSb/Qsvv2qvzIzdluL5k7AaJOTrw23z9reQthrbF7is4CtlT0DXyO1oei2DCp4uojjzQ7igaSHp1kAEQ== - -call-bind-apply-helpers@^1.0.0, call-bind-apply-helpers@^1.0.1, call-bind-apply-helpers@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz#4b5428c222be985d79c3d82657479dbe0b59b2d6" - integrity sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ== - dependencies: - es-errors "^1.3.0" - function-bind "^1.1.2" - -call-bind@^1.0.7, call-bind@^1.0.8: - version "1.0.8" - resolved "https://registry.yarnpkg.com/call-bind/-/call-bind-1.0.8.tgz#0736a9660f537e3388826f440d5ec45f744eaa4c" - integrity sha512-oKlSFMcMwpUg2ednkhQ454wfWiU/ul3CkJe/PEHcTKuiX6RpbehUiFMXu13HalGZxfUwCQzZG747YXBn1im9ww== - dependencies: - call-bind-apply-helpers "^1.0.0" - es-define-property "^1.0.0" - get-intrinsic "^1.2.4" - set-function-length "^1.2.2" - -call-bound@^1.0.2, call-bound@^1.0.3, call-bound@^1.0.4: - version "1.0.4" - resolved "https://registry.yarnpkg.com/call-bound/-/call-bound-1.0.4.tgz#238de935d2a2a692928c538c7ccfa91067fd062a" - integrity sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg== - dependencies: - call-bind-apply-helpers "^1.0.2" - get-intrinsic "^1.3.0" - -callsites@^3.0.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/callsites/-/callsites-3.1.0.tgz#b3630abd8943432f54b3f0519238e33cd7df2f73" - integrity sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ== - -canvas@^3.2.1: - version "3.2.2" - resolved 
"https://registry.yarnpkg.com/canvas/-/canvas-3.2.2.tgz#56d6f2177b7f729a1f83d651e0ae384f19786b30" - integrity sha512-duEt4h1HHu9sJZyVKfLRXR6tsKPY7cEELzxSRJkwddOXYvQT3P/+es98SV384JA0zMOZ5s+9gatnGfM6sL4Drg== - dependencies: - node-addon-api "^7.0.0" - prebuild-install "^7.1.3" - -chai@^6.2.2: - version "6.2.2" - resolved "https://registry.yarnpkg.com/chai/-/chai-6.2.2.tgz#ae41b52c9aca87734505362717f3255facda360e" - integrity sha512-NUPRluOfOiTKBKvWPtSD4PhFvWCqOi0BGStNWs57X9js7XGTprSmFoz5F0tWhR4WPjNeR9jXqdC7/UpSJTnlRg== - -chainsaw@~0.1.0: - version "0.1.0" - resolved "https://registry.yarnpkg.com/chainsaw/-/chainsaw-0.1.0.tgz#5eab50b28afe58074d0d58291388828b5e5fbc98" - integrity sha512-75kWfWt6MEKNC8xYXIdRpDehRYY/tNSgwKaJq+dbbDcxORuVrrQ+SEHoWsniVn9XPYfP4gmdWIeDk/4YNp1rNQ== - dependencies: - traverse ">=0.3.0 <0.4" - -chalk@^4.0.0: - version "4.1.2" - resolved "https://registry.yarnpkg.com/chalk/-/chalk-4.1.2.tgz#aac4e2b7734a740867aeb16bf02aad556a1e7a01" - integrity sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA== - dependencies: - ansi-styles "^4.1.0" - supports-color "^7.1.0" - -chart.js@^4.5.1: - version "4.5.1" - resolved "https://registry.yarnpkg.com/chart.js/-/chart.js-4.5.1.tgz#19dd1a9a386a3f6397691672231cb5fc9c052c35" - integrity sha512-GIjfiT9dbmHRiYi6Nl2yFCq7kkwdkp1W/lp2J99rX0yo9tgJGn3lKQATztIjb5tVtevcBtIdICNWqlq5+E8/Pw== - dependencies: - "@kurkle/color" "^0.3.0" - -chokidar@^4.0.0: - version "4.0.3" - resolved "https://registry.yarnpkg.com/chokidar/-/chokidar-4.0.3.tgz#7be37a4c03c9aee1ecfe862a4a23b2c70c205d30" - integrity sha512-Qgzu8kfBvo+cA4962jnP1KkS6Dop5NS6g7R5LFYJr4b8Ub94PPQXUksCw9PvXoeXPRRddRNC5C1JQUR2SMGtnA== - dependencies: - readdirp "^4.0.1" - -chownr@^1.1.1: - version "1.1.4" - resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.1.4.tgz#6fc9d7b42d32a583596337666e7d08084da2cc6b" - integrity sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg== - 
-chroma-js@^3.1.2: - version "3.2.0" - resolved "https://registry.yarnpkg.com/chroma-js/-/chroma-js-3.2.0.tgz#4e9e665290b9bbfece524fccf759d5120e351ff2" - integrity sha512-os/OippSlX1RlWWr+QDPcGUZs0uoqr32urfxESG9U93lhUfbnlyckte84Q8P1UQY/qth983AS1JONKmLS4T0nw== - -classnames@^2.3.0: - version "2.5.1" - resolved "https://registry.yarnpkg.com/classnames/-/classnames-2.5.1.tgz#ba774c614be0f016da105c858e7159eae8e7687b" - integrity sha512-saHYOzhIQs6wy2sVxTM6bUDsQO4F50V9RQ22qBpEdCW+I+/Wmke2HOl6lS6dTpdxVhb88/I6+Hs+438c3lfUow== - -cliui@^9.0.1: - version "9.0.1" - resolved "https://registry.yarnpkg.com/cliui/-/cliui-9.0.1.tgz#6f7890f386f6f1f79953adc1f78dec46fcc2d291" - integrity sha512-k7ndgKhwoQveBL+/1tqGJYNz097I7WOvwbmmU2AR5+magtbjPWQTS1C5vzGkBC8Ym8UWRzfKUzUUqFLypY4Q+w== - dependencies: - string-width "^7.2.0" - strip-ansi "^7.1.0" - wrap-ansi "^9.0.0" - -clsx@^2.1.1: - version "2.1.1" - resolved "https://registry.yarnpkg.com/clsx/-/clsx-2.1.1.tgz#eed397c9fd8bd882bfb18deab7102049a2f32999" - integrity sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA== - -color-convert@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-2.0.1.tgz#72d3a68d598c9bdb3af2ad1e84f21d896abd4de3" - integrity sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ== - dependencies: - color-name "~1.1.4" - -color-name@~1.1.4: - version "1.1.4" - resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2" - integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA== - -commander@2: - version "2.20.3" - resolved "https://registry.yarnpkg.com/commander/-/commander-2.20.3.tgz#fd485e84c03eb4881c20722ba48035e8531aeb33" - integrity sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ== - -commander@7: - version "7.2.0" - resolved 
"https://registry.yarnpkg.com/commander/-/commander-7.2.0.tgz#a36cb57d0b501ce108e4d20559a150a391d97ab7" - integrity sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw== - -commander@^8.3.0: - version "8.3.0" - resolved "https://registry.yarnpkg.com/commander/-/commander-8.3.0.tgz#4837ea1b2da67b9c616a67afbb0fafee567bca66" - integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww== - -compress-commons@^4.1.2: - version "4.1.2" - resolved "https://registry.yarnpkg.com/compress-commons/-/compress-commons-4.1.2.tgz#6542e59cb63e1f46a8b21b0e06f9a32e4c8b06df" - integrity sha512-D3uMHtGc/fcO1Gt1/L7i1e33VOvD4A9hfQLP+6ewd+BvG/gQ84Yh4oftEhAdjSMgBgwGL+jsppT7JYNpo6MHHg== - dependencies: - buffer-crc32 "^0.2.13" - crc32-stream "^4.0.2" - normalize-path "^3.0.0" - readable-stream "^3.6.0" - -concat-map@0.0.1: - version "0.0.1" - resolved "https://registry.yarnpkg.com/concat-map/-/concat-map-0.0.1.tgz#d8a96bd77fd68df7793a73036a3ba0d5405d477b" - integrity sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg== - -convert-source-map@^1.5.0: - version "1.9.0" - resolved "https://registry.yarnpkg.com/convert-source-map/-/convert-source-map-1.9.0.tgz#7faae62353fb4213366d0ca98358d22e8368b05f" - integrity sha512-ASFBup0Mz1uyiIjANan1jzLQami9z1PoYSZCiiYW2FczPbenXc45FZdBZLzOT+r6+iciuEModtmCti+hjaAk0A== - -convert-source-map@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/convert-source-map/-/convert-source-map-2.0.0.tgz#4b560f649fc4e918dd0ab75cf4961e8bc882d82a" - integrity sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg== - -core-util-is@~1.0.0: - version "1.0.3" - resolved "https://registry.yarnpkg.com/core-util-is/-/core-util-is-1.0.3.tgz#a6042d3634c2b27e9328f837b965fac83808db85" - integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ== - 
-cosmiconfig@^7.0.0: - version "7.1.0" - resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-7.1.0.tgz#1443b9afa596b670082ea46cbd8f6a62b84635f6" - integrity sha512-AdmX6xUzdNASswsFtmwSt7Vj8po9IuqXm0UXz7QKPuEUmPB4XyjGfaAr2PSuELMwkRMVH1EpIkX5bTZGRB3eCA== - dependencies: - "@types/parse-json" "^4.0.0" - import-fresh "^3.2.1" - parse-json "^5.0.0" - path-type "^4.0.0" - yaml "^1.10.0" - -crc-32@^1.2.0: - version "1.2.2" - resolved "https://registry.yarnpkg.com/crc-32/-/crc-32-1.2.2.tgz#3cad35a934b8bf71f25ca524b6da51fb7eace2ff" - integrity sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ== - -crc32-stream@^4.0.2: - version "4.0.3" - resolved "https://registry.yarnpkg.com/crc32-stream/-/crc32-stream-4.0.3.tgz#85dd677eb78fa7cad1ba17cc506a597d41fc6f33" - integrity sha512-NT7w2JVU7DFroFdYkeq8cywxrgjPHWkdX1wjpRQXPX5Asews3tA+Ght6lddQO5Mkumffp3X7GEqku3epj2toIw== - dependencies: - crc-32 "^1.2.0" - readable-stream "^3.4.0" - -crelt@^1.0.0: - version "1.0.6" - resolved "https://registry.npmjs.org/crelt/-/crelt-1.0.6.tgz#7cc898ea74e190fb6ef9dae57f8f81cf7302df72" - integrity sha512-VQ2MBenTq1fWZUH9DJNGti7kKv6EeAuYr3cLwxUWhIu1baTaXh4Ib5W2CqHVqib4/MqbYGJqiL3Zb8GJZr3l4g== - -cross-spawn@^7.0.6: - version "7.0.6" - resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.6.tgz#8a58fe78f00dcd70c370451759dfbfaf03e8ee9f" - integrity sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA== - dependencies: - path-key "^3.1.0" - shebang-command "^2.0.0" - which "^2.0.1" - -css-line-break@^2.1.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/css-line-break/-/css-line-break-2.1.0.tgz#bfef660dfa6f5397ea54116bb3cb4873edbc4fa0" - integrity sha512-FHcKFCZcAha3LwfVBhCQbW2nCNbkZXn7KVUJcsT5/P8YmfsVja0FMPJr0B903j/E69HUphKiV9iQArX8SDYA4w== - dependencies: - utrie "^1.0.2" - -css-tree@^3.0.0, css-tree@^3.2.1: - version "3.2.1" - resolved 
"https://registry.yarnpkg.com/css-tree/-/css-tree-3.2.1.tgz#86cac7011561272b30e6b1e042ba6ce047aa7518" - integrity sha512-X7sjQzceUhu1u7Y/ylrRZFU2FS6LRiFVp6rKLPg23y3x3c3DOKAwuXGDp+PAGjh6CSnCjYeAul8pcT8bAl+lSA== - dependencies: - mdn-data "2.27.1" - source-map-js "^1.2.1" - -css.escape@^1.5.1: - version "1.5.1" - resolved "https://registry.yarnpkg.com/css.escape/-/css.escape-1.5.1.tgz#42e27d4fa04ae32f931a4b4d4191fa9cddee97cb" - integrity sha512-YUifsXXuknHlUsmlgyY0PKzgPOr7/FjCePfHNt0jxm83wHZi44VDMQ7/fGNkjY3/jV1MC+1CmZbaHzugyeRtpg== - -csstype@^3.0.2, csstype@^3.1.0, csstype@^3.2.2, csstype@^3.2.3: - version "3.2.3" - resolved "https://registry.yarnpkg.com/csstype/-/csstype-3.2.3.tgz#ec48c0f3e993e50648c86da559e2610995cf989a" - integrity sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ== - -culori@^4.0.2: - version "4.0.2" - resolved "https://registry.yarnpkg.com/culori/-/culori-4.0.2.tgz#fbb28dbeb8d13d0eeab7520191f74ab822a8ca71" - integrity sha512-1+BhOB8ahCn4O0cep0Sh2l9KCOfOdY+BXJnKMHFFzDEouSr/el18QwXEMRlOj9UY5nCeA8UN3a/82rUWRBeyBw== - -"d3-array@1 - 3", "d3-array@2 - 3", "d3-array@2.10.0 - 3", "d3-array@2.5.0 - 3", d3-array@3, d3-array@3.2.4, d3-array@^3.2.0, d3-array@^3.2.4: - version "3.2.4" - resolved "https://registry.yarnpkg.com/d3-array/-/d3-array-3.2.4.tgz#15fec33b237f97ac5d7c986dc77da273a8ed0bb5" - integrity sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg== - dependencies: - internmap "1 - 2" - -d3-axis@3: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-axis/-/d3-axis-3.0.0.tgz#c42a4a13e8131d637b745fc2973824cfeaf93322" - integrity sha512-IH5tgjV4jE/GhHkRV0HiVYPDtvfjHQlQfJHs0usq7M30XcSBvOotpmH1IgkcXsO/5gEQZD43B//fc7SRT5S+xw== - -d3-brush@3: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-brush/-/d3-brush-3.0.0.tgz#6f767c4ed8dcb79de7ede3e1c0f89e63ef64d31c" - integrity 
sha512-ALnjWlVYkXsVIGlOsuWH1+3udkYFI48Ljihfnh8FZPF2QS9o+PzGLBslO0PjzVoHLZ2KCVgAM8NVkXPJB2aNnQ== - dependencies: - d3-dispatch "1 - 3" - d3-drag "2 - 3" - d3-interpolate "1 - 3" - d3-selection "3" - d3-transition "3" - -d3-chord@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-chord/-/d3-chord-3.0.1.tgz#d156d61f485fce8327e6abf339cb41d8cbba6966" - integrity sha512-VE5S6TNa+j8msksl7HwjxMHDM2yNK3XCkusIlpX5kwauBfXuyLAtNg9jCp/iHH61tgI4sb6R/EIMWCqEIdjT/g== - dependencies: - d3-path "1 - 3" - -"d3-color@1 - 3", d3-color@3, d3-color@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/d3-color/-/d3-color-3.1.0.tgz#395b2833dfac71507f12ac2f7af23bf819de24e2" - integrity sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA== - -d3-contour@4: - version "4.0.2" - resolved "https://registry.yarnpkg.com/d3-contour/-/d3-contour-4.0.2.tgz#bb92063bc8c5663acb2422f99c73cbb6c6ae3bcc" - integrity sha512-4EzFTRIikzs47RGmdxbeUvLWtGedDUNkTcmzoeyg4sP/dvCexO47AaQL7VKy/gul85TOxw+IBgA8US2xwbToNA== - dependencies: - d3-array "^3.2.0" - -d3-delaunay@6, d3-delaunay@^6.0.4: - version "6.0.4" - resolved "https://registry.yarnpkg.com/d3-delaunay/-/d3-delaunay-6.0.4.tgz#98169038733a0a5babbeda55054f795bb9e4a58b" - integrity sha512-mdjtIZ1XLAM8bm/hx3WwjfHt6Sggek7qH043O8KEjDXN40xi3vx/6pYSVTwLjEgiXQTbvaouWKynLBiUZ6SK6A== - dependencies: - delaunator "5" - -"d3-dispatch@1 - 3", d3-dispatch@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-dispatch/-/d3-dispatch-3.0.1.tgz#5fc75284e9c2375c36c839411a0cf550cbfc4d5e" - integrity sha512-rzUyPU/S7rwUflMyLc1ETDeBj0NRuHKKAcvukozwhshr6g6c5d8zh4c2gQjY2bZ0dXeGLWc1PF174P2tVvKhfg== - -"d3-drag@2 - 3", d3-drag@3: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-drag/-/d3-drag-3.0.0.tgz#994aae9cd23c719f53b5e10e3a0a6108c69607ba" - integrity sha512-pWbUJLdETVA8lQNJecMxoXfH6x+mO2UQo8rSmZ+QqxcbyA3hfeprFgIT//HW2nlHChWeIIMwS2Fq+gEARkhTkg== - dependencies: - d3-dispatch "1 - 3" - 
d3-selection "3" - -"d3-dsv@1 - 3", d3-dsv@3, d3-dsv@^3.0.1: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-dsv/-/d3-dsv-3.0.1.tgz#c63af978f4d6a0d084a52a673922be2160789b73" - integrity sha512-UG6OvdI5afDIFP9w4G0mNq50dSOsXHJaRE8arAS5o9ApWnIElp8GZw1Dun8vP8OyHOZ/QJUKUJwxiiCCnUwm+Q== - dependencies: - commander "7" - iconv-lite "0.6" - rw "1" - -"d3-ease@1 - 3", d3-ease@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-ease/-/d3-ease-3.0.1.tgz#9658ac38a2140d59d346160f1f6c30fda0bd12f4" - integrity sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w== - -d3-fetch@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-fetch/-/d3-fetch-3.0.1.tgz#83141bff9856a0edb5e38de89cdcfe63d0a60a22" - integrity sha512-kpkQIM20n3oLVBKGg6oHrUchHM3xODkTzjMoj7aWQFq5QEM+R6E4WkzT5+tojDY7yjez8KgCBRoj4aEr99Fdqw== - dependencies: - d3-dsv "1 - 3" - -d3-force@3, d3-force@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-force/-/d3-force-3.0.0.tgz#3e2ba1a61e70888fe3d9194e30d6d14eece155c4" - integrity sha512-zxV/SsA+U4yte8051P4ECydjD/S+qeYtnaIyAs9tgHCqfguma/aAQDjo85A9Z6EKhBirHRJHXIgJUlffT4wdLg== - dependencies: - d3-dispatch "1 - 3" - d3-quadtree "1 - 3" - d3-timer "1 - 3" - -"d3-format@1 - 3", d3-format@3, d3-format@^3.1.0: - version "3.1.2" - resolved "https://registry.yarnpkg.com/d3-format/-/d3-format-3.1.2.tgz#01fdb46b58beb1f55b10b42ad70b6e344d5eb2ae" - integrity sha512-AJDdYOdnyRDV5b6ArilzCPPwc1ejkHcoyFarqlPqT7zRYjhavcT3uSrqcMvsgh2CgoPbK3RCwyHaVyxYcP2Arg== - -d3-geo-projection@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/d3-geo-projection/-/d3-geo-projection-4.0.0.tgz#dc229e5ead78d31869a4e87cf1f45bd2716c48ca" - integrity sha512-p0bK60CEzph1iqmnxut7d/1kyTmm3UWtPlwdkM31AU+LW+BXazd5zJdoCn7VFxNCHXRngPHRnsNn5uGjLRGndg== - dependencies: - commander "7" - d3-array "1 - 3" - d3-geo "1.12.0 - 3" - -"d3-geo@1.12.0 - 3", d3-geo@3, d3-geo@^3.1.1: - version "3.1.1" - resolved 
"https://registry.yarnpkg.com/d3-geo/-/d3-geo-3.1.1.tgz#6027cf51246f9b2ebd64f99e01dc7c3364033a4d" - integrity sha512-637ln3gXKXOwhalDzinUgY83KzNWZRKbYubaG+fGVuc/dxO64RRljtCTnf5ecMyE1RIdtqpkVcq0IbtU2S8j2Q== - dependencies: - d3-array "2.5.0 - 3" - -d3-hierarchy@3, d3-hierarchy@^3.1.2: - version "3.1.2" - resolved "https://registry.yarnpkg.com/d3-hierarchy/-/d3-hierarchy-3.1.2.tgz#b01cd42c1eed3d46db77a5966cf726f8c09160c6" - integrity sha512-FX/9frcub54beBdugHjDCdikxThEqjnR93Qt7PvQTOHxyiNCAlvMrHhclk3cD5VeAaq9fxmfRp+CnWw9rEMBuA== - -"d3-interpolate@1 - 3", "d3-interpolate@1.2.0 - 3", d3-interpolate@3, d3-interpolate@^3.0.1: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-interpolate/-/d3-interpolate-3.0.1.tgz#3c47aa5b32c5b3dfb56ef3fd4342078a632b400d" - integrity sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g== - dependencies: - d3-color "1 - 3" - -"d3-path@1 - 3", d3-path@3, d3-path@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/d3-path/-/d3-path-3.1.0.tgz#22df939032fb5a71ae8b1800d61ddb7851c42526" - integrity sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ== - -d3-polygon@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-polygon/-/d3-polygon-3.0.1.tgz#0b45d3dd1c48a29c8e057e6135693ec80bf16398" - integrity sha512-3vbA7vXYwfe1SYhED++fPUQlWSYTTGmFmQiany/gdbiWgU/iEyQzyymwL9SkJjFFuCS4902BSzewVGsHHmHtXg== - -"d3-quadtree@1 - 3", d3-quadtree@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-quadtree/-/d3-quadtree-3.0.1.tgz#6dca3e8be2b393c9a9d514dabbd80a92deef1a4f" - integrity sha512-04xDrxQTDTCFwP5H6hRhsRcb9xxv2RzkcsygFzmkSIOJy3PeRJP7sNk3VRIbKXcog561P9oU0/rVH6vDROAgUw== - -d3-random@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-random/-/d3-random-3.0.1.tgz#d4926378d333d9c0bfd1e6fa0194d30aebaa20f4" - integrity sha512-FXMe9GfxTxqd5D6jFsQ+DJ8BJS4E/fT5mqqdjovykEB2oFbTMDVdg1MGFxfQW+FBOGoB++k8swBrgwSHT1cUXQ== 
- -d3-scale-chromatic@3, d3-scale-chromatic@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/d3-scale-chromatic/-/d3-scale-chromatic-3.1.0.tgz#34c39da298b23c20e02f1a4b239bd0f22e7f1314" - integrity sha512-A3s5PWiZ9YCXFye1o246KoscMWqf8BsD9eRiJ3He7C9OBaxKhAd5TFCdEx/7VbKtxxTsu//1mMJFrEt572cEyQ== - dependencies: - d3-color "1 - 3" - d3-interpolate "1 - 3" - -d3-scale@4, d3-scale@^4.0.2: - version "4.0.2" - resolved "https://registry.yarnpkg.com/d3-scale/-/d3-scale-4.0.2.tgz#82b38e8e8ff7080764f8dcec77bd4be393689396" - integrity sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ== - dependencies: - d3-array "2.10.0 - 3" - d3-format "1 - 3" - d3-interpolate "1.2.0 - 3" - d3-time "2.1.1 - 3" - d3-time-format "2 - 4" - -"d3-selection@2 - 3", d3-selection@3: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-selection/-/d3-selection-3.0.0.tgz#c25338207efa72cc5b9bd1458a1a41901f1e1b31" - integrity sha512-fmTRWbNMmsmWq6xJV8D19U/gw/bwrHfNXxrIN+HfZgnzqTHp9jOmKMhsTUjXOJnZOdZY9Q28y4yebKzqDKlxlQ== - -d3-shape@3, d3-shape@^3.2.0: - version "3.2.0" - resolved "https://registry.yarnpkg.com/d3-shape/-/d3-shape-3.2.0.tgz#a1a839cbd9ba45f28674c69d7f855bcf91dfc6a5" - integrity sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA== - dependencies: - d3-path "^3.1.0" - -"d3-time-format@2 - 4", d3-time-format@4, d3-time-format@^4.1.0: - version "4.1.0" - resolved "https://registry.yarnpkg.com/d3-time-format/-/d3-time-format-4.1.0.tgz#7ab5257a5041d11ecb4fe70a5c7d16a195bb408a" - integrity sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg== - dependencies: - d3-time "1 - 3" - -"d3-time@1 - 3", "d3-time@2.1.1 - 3", d3-time@3, d3-time@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/d3-time/-/d3-time-3.1.0.tgz#9310db56e992e3c0175e1ef385e545e48a9bb5c7" - integrity 
sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q== - dependencies: - d3-array "2 - 3" - -"d3-timer@1 - 3", d3-timer@3, d3-timer@^3.0.1: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-timer/-/d3-timer-3.0.1.tgz#6284d2a2708285b1abb7e201eda4380af35e63b0" - integrity sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA== - -"d3-transition@2 - 3", d3-transition@3: - version "3.0.1" - resolved "https://registry.yarnpkg.com/d3-transition/-/d3-transition-3.0.1.tgz#6869fdde1448868077fdd5989200cb61b2a1645f" - integrity sha512-ApKvfjsSR6tg06xrL434C0WydLr7JewBB3V+/39RMHsaXTOG0zmt/OAXeng5M5LBm0ojmxJrpomQVZ1aPvBL4w== - dependencies: - d3-color "1 - 3" - d3-dispatch "1 - 3" - d3-ease "1 - 3" - d3-interpolate "1 - 3" - d3-timer "1 - 3" - -d3-zoom@3: - version "3.0.0" - resolved "https://registry.yarnpkg.com/d3-zoom/-/d3-zoom-3.0.0.tgz#d13f4165c73217ffeaa54295cd6969b3e7aee8f3" - integrity sha512-b8AmV3kfQaqWAuacbPuNbL6vahnOJflOhexLzMMNLga62+/nh0JzvJ0aO/5a5MVgUFGS7Hu1P9P03o3fJkDCyw== - dependencies: - d3-dispatch "1 - 3" - d3-drag "2 - 3" - d3-interpolate "1 - 3" - d3-selection "2 - 3" - d3-transition "2 - 3" - -d3@^7.3.0: - version "7.9.0" - resolved "https://registry.yarnpkg.com/d3/-/d3-7.9.0.tgz#579e7acb3d749caf8860bd1741ae8d371070cd5d" - integrity sha512-e1U46jVP+w7Iut8Jt8ri1YsPOvFpg46k+K8TpCb0P+zjCkjkPnV7WzfDJzMHy1LnA+wj5pLT1wjO901gLXeEhA== - dependencies: - d3-array "3" - d3-axis "3" - d3-brush "3" - d3-chord "3" - d3-color "3" - d3-contour "4" - d3-delaunay "6" - d3-dispatch "3" - d3-drag "3" - d3-dsv "3" - d3-ease "3" - d3-fetch "3" - d3-force "3" - d3-format "3" - d3-geo "3" - d3-hierarchy "3" - d3-interpolate "3" - d3-path "3" - d3-polygon "3" - d3-quadtree "3" - d3-random "3" - d3-scale "4" - d3-scale-chromatic "3" - d3-selection "3" - d3-shape "3" - d3-time "3" - d3-time-format "4" - d3-timer "3" - d3-transition "3" - d3-zoom "3" - -damerau-levenshtein@^1.0.8: - version 
"1.0.8" - resolved "https://registry.yarnpkg.com/damerau-levenshtein/-/damerau-levenshtein-1.0.8.tgz#b43d286ccbd36bc5b2f7ed41caf2d0aba1f8a6e7" - integrity sha512-sdQSFB7+llfUcQHUQO3+B8ERRj0Oa4w9POWMI/puGtuf7gFywGmkaLCElnudfTiKZV+NvHqL0ifzdrI8Ro7ESA== - -data-urls@^7.0.0: - version "7.0.0" - resolved "https://registry.yarnpkg.com/data-urls/-/data-urls-7.0.0.tgz#6dce8b63226a1ecfdd907ce18a8ccfb1eee506d3" - integrity sha512-23XHcCF+coGYevirZceTVD7NdJOqVn+49IHyxgszm+JIiHLoB2TkmPtsYkNWT1pvRSGkc35L6NHs0yHkN2SumA== - dependencies: - whatwg-mimetype "^5.0.0" - whatwg-url "^16.0.0" - -data-view-buffer@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/data-view-buffer/-/data-view-buffer-1.0.2.tgz#211a03ba95ecaf7798a8c7198d79536211f88570" - integrity sha512-EmKO5V3OLXh1rtK2wgXRansaK1/mtVdTUEiEI0W8RkvgT05kfxaH29PliLnpLP73yYO6142Q72QNa8Wx/A5CqQ== - dependencies: - call-bound "^1.0.3" - es-errors "^1.3.0" - is-data-view "^1.0.2" - -data-view-byte-length@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/data-view-byte-length/-/data-view-byte-length-1.0.2.tgz#9e80f7ca52453ce3e93d25a35318767ea7704735" - integrity sha512-tuhGbE6CfTM9+5ANGf+oQb72Ky/0+s3xKUpHvShfiz2RxMFgFPjsXuRLBVMtvMs15awe45SRb83D6wH4ew6wlQ== - dependencies: - call-bound "^1.0.3" - es-errors "^1.3.0" - is-data-view "^1.0.2" - -data-view-byte-offset@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/data-view-byte-offset/-/data-view-byte-offset-1.0.1.tgz#068307f9b71ab76dbbe10291389e020856606191" - integrity sha512-BS8PfmtDGnrgYdOonGZQdLZslWIeCGFP9tpan0hi1Co2Zr2NKADsvGYA8XxuG/4UWgJ6Cjtv+YJnB6MM69QGlQ== - dependencies: - call-bound "^1.0.2" - es-errors "^1.3.0" - is-data-view "^1.0.1" - -dayjs@^1.8.34: - version "1.11.20" - resolved "https://registry.yarnpkg.com/dayjs/-/dayjs-1.11.20.tgz#88d919fd639dc991415da5f4cb6f1b6650811938" - integrity sha512-YbwwqR/uYpeoP4pu043q+LTDLFBLApUP6VxRihdfNTqu4ubqMlGDLd6ErXhEgsyvY0K6nCs7nggYumAN+9uEuQ== - -debug@^4.3.1, debug@^4.3.2, 
debug@^4.4.3: - version "4.4.3" - resolved "https://registry.yarnpkg.com/debug/-/debug-4.4.3.tgz#c6ae432d9bd9662582fce08709b038c58e9e3d6a" - integrity sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA== - dependencies: - ms "^2.1.3" - -decimal.js@^10.6.0: - version "10.6.0" - resolved "https://registry.yarnpkg.com/decimal.js/-/decimal.js-10.6.0.tgz#e649a43e3ab953a72192ff5983865e509f37ed9a" - integrity sha512-YpgQiITW3JXGntzdUmyUR1V812Hn8T1YVXhCu+wO3OpS4eU9l4YdD3qjyiKdV6mvV29zapkMeD390UVEf2lkUg== - -decompress-response@^6.0.0: - version "6.0.0" - resolved "https://registry.yarnpkg.com/decompress-response/-/decompress-response-6.0.0.tgz#ca387612ddb7e104bd16d85aab00d5ecf09c66fc" - integrity sha512-aW35yZM6Bb/4oJlZncMH2LCoZtJXTRxES17vE3hoRiowU2kWHaJKFkSBDnDR+cm9J+9QhXmREyIfv0pji9ejCQ== - dependencies: - mimic-response "^3.1.0" - -deep-extend@^0.6.0: - version "0.6.0" - resolved "https://registry.yarnpkg.com/deep-extend/-/deep-extend-0.6.0.tgz#c4fa7c95404a17a9c3e8ca7e1537312b736330ac" - integrity sha512-LOHxIOaPYdHlJRtCQfDIVZtfw/ufM8+rVj649RIHzcm/vGwQRXFt6OPqIFWsm2XEMrNIEtWR64sY1LEKD2vAOA== - -deep-is@^0.1.3: - version "0.1.4" - resolved "https://registry.yarnpkg.com/deep-is/-/deep-is-0.1.4.tgz#a6f2dce612fadd2ef1f519b73551f17e85199831" - integrity sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ== - -define-data-property@^1.0.1, define-data-property@^1.1.4: - version "1.1.4" - resolved "https://registry.yarnpkg.com/define-data-property/-/define-data-property-1.1.4.tgz#894dc141bb7d3060ae4366f6a0107e68fbe48c5e" - integrity sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A== - dependencies: - es-define-property "^1.0.0" - es-errors "^1.3.0" - gopd "^1.0.1" - -define-properties@^1.1.3, define-properties@^1.2.1: - version "1.2.1" - resolved 
"https://registry.yarnpkg.com/define-properties/-/define-properties-1.2.1.tgz#10781cc616eb951a80a034bafcaa7377f6af2b6c" - integrity sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg== - dependencies: - define-data-property "^1.0.1" - has-property-descriptors "^1.0.0" - object-keys "^1.1.1" - -delaunator@5: - version "5.1.0" - resolved "https://registry.yarnpkg.com/delaunator/-/delaunator-5.1.0.tgz#d13271fbf3aff6753f9ea6e235557f20901046ea" - integrity sha512-AGrQ4QSgssa1NGmWmLPqN5NY2KajF5MqxetNEO+o0n3ZwZZeTmt7bBnvzHWrmkZFxGgr4HdyFgelzgi06otLuQ== - dependencies: - robust-predicates "^3.0.2" - -dequal@^2.0.3: - version "2.0.3" - resolved "https://registry.yarnpkg.com/dequal/-/dequal-2.0.3.tgz#2644214f1997d39ed0ee0ece72335490a7ac67be" - integrity sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA== - -detect-libc@^2.0.0, detect-libc@^2.0.3: - version "2.1.2" - resolved "https://registry.yarnpkg.com/detect-libc/-/detect-libc-2.1.2.tgz#689c5dcdc1900ef5583a4cb9f6d7b473742074ad" - integrity sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ== - -dnd-core@^16.0.1: - version "16.0.1" - resolved "https://registry.yarnpkg.com/dnd-core/-/dnd-core-16.0.1.tgz#a1c213ed08961f6bd1959a28bb76f1a868360d19" - integrity sha512-HK294sl7tbw6F6IeuK16YSBUoorvHpY8RHO+9yFfaJyCDVb6n7PRcezrOEOa2SBCqiYpemh5Jx20ZcjKdFAVng== - dependencies: - "@react-dnd/asap" "^5.0.1" - "@react-dnd/invariant" "^4.0.1" - redux "^4.2.0" - -doctrine@^2.1.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/doctrine/-/doctrine-2.1.0.tgz#5cd01fc101621b42c4cd7f5d1a66243716d3f39d" - integrity sha512-35mSku4ZXK0vfCuHEDAwt55dg2jNajHZ1odvF+8SSr82EsZY4QmXfuWso8oEd8zRhVObSN18aM0CjSdoBX7zIw== - dependencies: - esutils "^2.0.2" - -dom-accessibility-api@^0.5.9: - version "0.5.16" - resolved 
"https://registry.yarnpkg.com/dom-accessibility-api/-/dom-accessibility-api-0.5.16.tgz#5a7429e6066eb3664d911e33fb0e45de8eb08453" - integrity sha512-X7BJ2yElsnOJ30pZF4uIIDfBEVgF4XEBxL9Bxhy6dnrm5hkzqmsWHGTiHqRiITNhMyFLyAiWndIJP7Z1NTteDg== - -dom-accessibility-api@^0.6.3: - version "0.6.3" - resolved "https://registry.yarnpkg.com/dom-accessibility-api/-/dom-accessibility-api-0.6.3.tgz#993e925cc1d73f2c662e7d75dd5a5445259a8fd8" - integrity sha512-7ZgogeTnjuHbo+ct10G9Ffp0mif17idi0IyWNVA/wcwcm7NPOD/WEHVP3n7n3MhXqxoIYm8d6MuZohYWIZ4T3w== - -dom-helpers@^5.0.1: - version "5.2.1" - resolved "https://registry.yarnpkg.com/dom-helpers/-/dom-helpers-5.2.1.tgz#d9400536b2bf8225ad98fe052e029451ac40e902" - integrity sha512-nRCa7CK3VTrM2NmGkIy4cbK7IZlgBE/PYMn55rrXefr5xXDP0LdtfPnblFDoVdcAfslJ7or6iqAUnx0CCGIWQA== - dependencies: - "@babel/runtime" "^7.8.7" - csstype "^3.0.2" - -dompurify@*, dompurify@^3.2.4: - version "3.3.3" - resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-3.3.3.tgz#680cae8af3e61320ddf3666a3bc843f7b291b2b6" - integrity sha512-Oj6pzI2+RqBfFG+qOaOLbFXLQ90ARpcGG6UePL82bJLtdsa6CYJD7nmiU8MW9nQNOtCHV3lZ/Bzq1X0QYbBZCA== - optionalDependencies: - "@types/trusted-types" "^2.0.7" - -dunder-proto@^1.0.0, dunder-proto@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/dunder-proto/-/dunder-proto-1.0.1.tgz#d7ae667e1dc83482f8b70fd0f6eefc50da30f58a" - integrity sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A== - dependencies: - call-bind-apply-helpers "^1.0.1" - es-errors "^1.3.0" - gopd "^1.2.0" - -duplexer2@~0.1.4: - version "0.1.4" - resolved "https://registry.yarnpkg.com/duplexer2/-/duplexer2-0.1.4.tgz#8b12dab878c0d69e3e7891051662a32fc6bddcc1" - integrity sha512-asLFVfWWtJ90ZyOUHMqk7/S2w2guQKxUI2itj3d92ADHhxUSbCMGi1f1cBcJ7xM1To+pE/Khbwo1yuNbMEPKeA== - dependencies: - readable-stream "^2.0.2" - -echarts@^6.0.0: - version "6.0.0" - resolved 
"https://registry.yarnpkg.com/echarts/-/echarts-6.0.0.tgz#2935aa7751c282d1abbbf7d719d397199a15b9e7" - integrity sha512-Tte/grDQRiETQP4xz3iZWSvoHrkCQtwqd6hs+mifXcjrCuo2iKWbajFObuLJVBlDIJlOzgQPd1hsaKt/3+OMkQ== - dependencies: - tslib "2.3.0" - zrender "6.0.0" - -emoji-regex@^10.3.0: - version "10.6.0" - resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-10.6.0.tgz#bf3d6e8f7f8fd22a65d9703475bc0147357a6b0d" - integrity sha512-toUI84YS5YmxW219erniWD0CIVOo46xGKColeNQRgOzDorgBi1v4D71/OFzgD9GO2UGKIv1C3Sp8DAn0+j5w7A== - -emoji-regex@^9.2.2: - version "9.2.2" - resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-9.2.2.tgz#840c8803b0d8047f4ff0cf963176b32d4ef3ed72" - integrity sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg== - -end-of-stream@^1.1.0, end-of-stream@^1.4.1: - version "1.4.5" - resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.5.tgz#7344d711dea40e0b74abc2ed49778743ccedb08c" - integrity sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg== - dependencies: - once "^1.4.0" - -entities@^4.4.0: - version "4.5.0" - resolved "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz#5d268ea5e7113ec74c4d033b79ea5a35a488fb48" - integrity sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw== - -entities@^6.0.0: - version "6.0.1" - resolved "https://registry.yarnpkg.com/entities/-/entities-6.0.1.tgz#c28c34a43379ca7f61d074130b2f5f7020a30694" - integrity sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g== - -error-ex@^1.3.1: - version "1.3.4" - resolved "https://registry.yarnpkg.com/error-ex/-/error-ex-1.3.4.tgz#b3a8d8bb6f92eecc1629e3e27d3c8607a8a32414" - integrity sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ== - dependencies: - is-arrayish "^0.2.1" - -es-abstract@^1.17.5, es-abstract@^1.23.2, 
es-abstract@^1.23.3, es-abstract@^1.23.5, es-abstract@^1.23.6, es-abstract@^1.23.9, es-abstract@^1.24.0, es-abstract@^1.24.1: - version "1.24.1" - resolved "https://registry.yarnpkg.com/es-abstract/-/es-abstract-1.24.1.tgz#f0c131ed5ea1bb2411134a8dd94def09c46c7899" - integrity sha512-zHXBLhP+QehSSbsS9Pt23Gg964240DPd6QCf8WpkqEXxQ7fhdZzYsocOr5u7apWonsS5EjZDmTF+/slGMyasvw== - dependencies: - array-buffer-byte-length "^1.0.2" - arraybuffer.prototype.slice "^1.0.4" - available-typed-arrays "^1.0.7" - call-bind "^1.0.8" - call-bound "^1.0.4" - data-view-buffer "^1.0.2" - data-view-byte-length "^1.0.2" - data-view-byte-offset "^1.0.1" - es-define-property "^1.0.1" - es-errors "^1.3.0" - es-object-atoms "^1.1.1" - es-set-tostringtag "^2.1.0" - es-to-primitive "^1.3.0" - function.prototype.name "^1.1.8" - get-intrinsic "^1.3.0" - get-proto "^1.0.1" - get-symbol-description "^1.1.0" - globalthis "^1.0.4" - gopd "^1.2.0" - has-property-descriptors "^1.0.2" - has-proto "^1.2.0" - has-symbols "^1.1.0" - hasown "^2.0.2" - internal-slot "^1.1.0" - is-array-buffer "^3.0.5" - is-callable "^1.2.7" - is-data-view "^1.0.2" - is-negative-zero "^2.0.3" - is-regex "^1.2.1" - is-set "^2.0.3" - is-shared-array-buffer "^1.0.4" - is-string "^1.1.1" - is-typed-array "^1.1.15" - is-weakref "^1.1.1" - math-intrinsics "^1.1.0" - object-inspect "^1.13.4" - object-keys "^1.1.1" - object.assign "^4.1.7" - own-keys "^1.0.1" - regexp.prototype.flags "^1.5.4" - safe-array-concat "^1.1.3" - safe-push-apply "^1.0.0" - safe-regex-test "^1.1.0" - set-proto "^1.0.0" - stop-iteration-iterator "^1.1.0" - string.prototype.trim "^1.2.10" - string.prototype.trimend "^1.0.9" - string.prototype.trimstart "^1.0.8" - typed-array-buffer "^1.0.3" - typed-array-byte-length "^1.0.3" - typed-array-byte-offset "^1.0.4" - typed-array-length "^1.0.7" - unbox-primitive "^1.1.0" - which-typed-array "^1.1.19" - -es-define-property@^1.0.0, es-define-property@^1.0.1: - version "1.0.1" - resolved 
"https://registry.yarnpkg.com/es-define-property/-/es-define-property-1.0.1.tgz#983eb2f9a6724e9303f61addf011c72e09e0b0fa" - integrity sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g== - -es-errors@^1.3.0: - version "1.3.0" - resolved "https://registry.yarnpkg.com/es-errors/-/es-errors-1.3.0.tgz#05f75a25dab98e4fb1dcd5e1472c0546d5057c8f" - integrity sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw== - -es-iterator-helpers@^1.2.1: - version "1.3.1" - resolved "https://registry.yarnpkg.com/es-iterator-helpers/-/es-iterator-helpers-1.3.1.tgz#3be0f4e63438d6c5a1fb5f33b891aaad3f7dae06" - integrity sha512-zWwRvqWiuBPr0muUG/78cW3aHROFCNIQ3zpmYDpwdbnt2m+xlNyRWpHBpa2lJjSBit7BQ+RXA1iwbSmu5yJ/EQ== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.4" - define-properties "^1.2.1" - es-abstract "^1.24.1" - es-errors "^1.3.0" - es-set-tostringtag "^2.1.0" - function-bind "^1.1.2" - get-intrinsic "^1.3.0" - globalthis "^1.0.4" - gopd "^1.2.0" - has-property-descriptors "^1.0.2" - has-proto "^1.2.0" - has-symbols "^1.1.0" - internal-slot "^1.1.0" - iterator.prototype "^1.1.5" - math-intrinsics "^1.1.0" - safe-array-concat "^1.1.3" - -es-module-lexer@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/es-module-lexer/-/es-module-lexer-2.0.0.tgz#f657cd7a9448dcdda9c070a3cb75e5dc1e85f5b1" - integrity sha512-5POEcUuZybH7IdmGsD8wlf0AI55wMecM9rVBTI/qEAy2c1kTOm3DjFYjrBdI2K3BaJjJYfYFeRtM0t9ssnRuxw== - -es-object-atoms@^1.0.0, es-object-atoms@^1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/es-object-atoms/-/es-object-atoms-1.1.1.tgz#1c4f2c4837327597ce69d2ca190a7fdd172338c1" - integrity sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA== - dependencies: - es-errors "^1.3.0" - -es-set-tostringtag@^2.1.0: - version "2.1.0" - resolved 
"https://registry.yarnpkg.com/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz#f31dbbe0c183b00a6d26eb6325c810c0fd18bd4d" - integrity sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA== - dependencies: - es-errors "^1.3.0" - get-intrinsic "^1.2.6" - has-tostringtag "^1.0.2" - hasown "^2.0.2" - -es-shim-unscopables@^1.0.2: - version "1.1.0" - resolved "https://registry.yarnpkg.com/es-shim-unscopables/-/es-shim-unscopables-1.1.0.tgz#438df35520dac5d105f3943d927549ea3b00f4b5" - integrity sha512-d9T8ucsEhh8Bi1woXCf+TIKDIROLG5WCkxg8geBCbvk22kzwC5G2OnXVMO6FUsvQlgUUXQ2itephWDLqDzbeCw== - dependencies: - hasown "^2.0.2" - -es-to-primitive@^1.3.0: - version "1.3.0" - resolved "https://registry.yarnpkg.com/es-to-primitive/-/es-to-primitive-1.3.0.tgz#96c89c82cc49fd8794a24835ba3e1ff87f214e18" - integrity sha512-w+5mJ3GuFL+NjVtJlvydShqE1eN3h3PbI7/5LAsYJP/2qtuMXjfL2LpHSRqo4b4eSF5K/DH1JXKUAHSB2UW50g== - dependencies: - is-callable "^1.2.7" - is-date-object "^1.0.5" - is-symbol "^1.0.4" - -esbuild@^0.21.3: - version "0.21.5" - resolved "https://registry.yarnpkg.com/esbuild/-/esbuild-0.21.5.tgz#9ca301b120922959b766360d8ac830da0d02997d" - integrity sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw== - optionalDependencies: - "@esbuild/aix-ppc64" "0.21.5" - "@esbuild/android-arm" "0.21.5" - "@esbuild/android-arm64" "0.21.5" - "@esbuild/android-x64" "0.21.5" - "@esbuild/darwin-arm64" "0.21.5" - "@esbuild/darwin-x64" "0.21.5" - "@esbuild/freebsd-arm64" "0.21.5" - "@esbuild/freebsd-x64" "0.21.5" - "@esbuild/linux-arm" "0.21.5" - "@esbuild/linux-arm64" "0.21.5" - "@esbuild/linux-ia32" "0.21.5" - "@esbuild/linux-loong64" "0.21.5" - "@esbuild/linux-mips64el" "0.21.5" - "@esbuild/linux-ppc64" "0.21.5" - "@esbuild/linux-riscv64" "0.21.5" - "@esbuild/linux-s390x" "0.21.5" - "@esbuild/linux-x64" "0.21.5" - "@esbuild/netbsd-x64" "0.21.5" - "@esbuild/openbsd-x64" "0.21.5" - "@esbuild/sunos-x64" "0.21.5" - 
"@esbuild/win32-arm64" "0.21.5" - "@esbuild/win32-ia32" "0.21.5" - "@esbuild/win32-x64" "0.21.5" - -escalade@^3.1.1: - version "3.2.0" - resolved "https://registry.yarnpkg.com/escalade/-/escalade-3.2.0.tgz#011a3f69856ba189dffa7dc8fcce99d2a87903e5" - integrity sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA== - -escape-string-regexp@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz#14ba83a5d373e3d311e5afca29cf5bfad965bf34" - integrity sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA== - -eslint-plugin-jsx-a11y@^6.10.2: - version "6.10.2" - resolved "https://registry.yarnpkg.com/eslint-plugin-jsx-a11y/-/eslint-plugin-jsx-a11y-6.10.2.tgz#d2812bb23bf1ab4665f1718ea442e8372e638483" - integrity sha512-scB3nz4WmG75pV8+3eRUQOHZlNSUhFNq37xnpgRkCCELU3XMvXAxLk1eqWWyE22Ki4Q01Fnsw9BA3cJHDPgn2Q== - dependencies: - aria-query "^5.3.2" - array-includes "^3.1.8" - array.prototype.flatmap "^1.3.2" - ast-types-flow "^0.0.8" - axe-core "^4.10.0" - axobject-query "^4.1.0" - damerau-levenshtein "^1.0.8" - emoji-regex "^9.2.2" - hasown "^2.0.2" - jsx-ast-utils "^3.3.5" - language-tags "^1.0.9" - minimatch "^3.1.2" - object.fromentries "^2.0.8" - safe-regex-test "^1.0.3" - string.prototype.includes "^2.0.1" - -eslint-plugin-react@^7.37.2: - version "7.37.5" - resolved "https://registry.yarnpkg.com/eslint-plugin-react/-/eslint-plugin-react-7.37.5.tgz#2975511472bdda1b272b34d779335c9b0e877065" - integrity sha512-Qteup0SqU15kdocexFNAJMvCJEfa2xUKNV4CC1xsVMrIIqEy3SQ/rqyxCWNzfrd3/ldy6HMlD2e0JDVpDg2qIA== - dependencies: - array-includes "^3.1.8" - array.prototype.findlast "^1.2.5" - array.prototype.flatmap "^1.3.3" - array.prototype.tosorted "^1.1.4" - doctrine "^2.1.0" - es-iterator-helpers "^1.2.1" - estraverse "^5.3.0" - hasown "^2.0.2" - jsx-ast-utils "^2.4.1 || ^3.0.0" - minimatch "^3.1.2" - object.entries "^1.1.9" - 
object.fromentries "^2.0.8" - object.values "^1.2.1" - prop-types "^15.8.1" - resolve "^2.0.0-next.5" - semver "^6.3.1" - string.prototype.matchall "^4.0.12" - string.prototype.repeat "^1.0.0" - -eslint-scope@^8.4.0: - version "8.4.0" - resolved "https://registry.yarnpkg.com/eslint-scope/-/eslint-scope-8.4.0.tgz#88e646a207fad61436ffa39eb505147200655c82" - integrity sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg== - dependencies: - esrecurse "^4.3.0" - estraverse "^5.2.0" - -eslint-visitor-keys@^3.4.3: - version "3.4.3" - resolved "https://registry.yarnpkg.com/eslint-visitor-keys/-/eslint-visitor-keys-3.4.3.tgz#0cd72fe8550e3c2eae156a96a4dddcd1c8ac5800" - integrity sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag== - -eslint-visitor-keys@^4.2.1: - version "4.2.1" - resolved "https://registry.yarnpkg.com/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz#4cfea60fe7dd0ad8e816e1ed026c1d5251b512c1" - integrity sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ== - -eslint-visitor-keys@^5.0.0: - version "5.0.1" - resolved "https://registry.yarnpkg.com/eslint-visitor-keys/-/eslint-visitor-keys-5.0.1.tgz#9e3c9489697824d2d4ce3a8ad12628f91e9f59be" - integrity sha512-tD40eHxA35h0PEIZNeIjkHoDR4YjjJp34biM0mDvplBe//mB+IHCqHDGV7pxF+7MklTvighcCPPZC7ynWyjdTA== - -eslint@^9.15.0: - version "9.39.4" - resolved "https://registry.yarnpkg.com/eslint/-/eslint-9.39.4.tgz#855da1b2e2ad66dc5991195f35e262bcec8117b5" - integrity sha512-XoMjdBOwe/esVgEvLmNsD3IRHkm7fbKIUGvrleloJXUZgDHig2IPWNniv+GwjyJXzuNqVjlr5+4yVUZjycJwfQ== - dependencies: - "@eslint-community/eslint-utils" "^4.8.0" - "@eslint-community/regexpp" "^4.12.1" - "@eslint/config-array" "^0.21.2" - "@eslint/config-helpers" "^0.4.2" - "@eslint/core" "^0.17.0" - "@eslint/eslintrc" "^3.3.5" - "@eslint/js" "9.39.4" - "@eslint/plugin-kit" "^0.4.1" - "@humanfs/node" "^0.16.6" - 
"@humanwhocodes/module-importer" "^1.0.1" - "@humanwhocodes/retry" "^0.4.2" - "@types/estree" "^1.0.6" - ajv "^6.14.0" - chalk "^4.0.0" - cross-spawn "^7.0.6" - debug "^4.3.2" - escape-string-regexp "^4.0.0" - eslint-scope "^8.4.0" - eslint-visitor-keys "^4.2.1" - espree "^10.4.0" - esquery "^1.5.0" - esutils "^2.0.2" - fast-deep-equal "^3.1.3" - file-entry-cache "^8.0.0" - find-up "^5.0.0" - glob-parent "^6.0.2" - ignore "^5.2.0" - imurmurhash "^0.1.4" - is-glob "^4.0.0" - json-stable-stringify-without-jsonify "^1.0.1" - lodash.merge "^4.6.2" - minimatch "^3.1.5" - natural-compare "^1.4.0" - optionator "^0.9.3" - -espree@^10.0.1, espree@^10.4.0: - version "10.4.0" - resolved "https://registry.yarnpkg.com/espree/-/espree-10.4.0.tgz#d54f4949d4629005a1fa168d937c3ff1f7e2a837" - integrity sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ== - dependencies: - acorn "^8.15.0" - acorn-jsx "^5.3.2" - eslint-visitor-keys "^4.2.1" - -esquery@^1.5.0: - version "1.7.0" - resolved "https://registry.yarnpkg.com/esquery/-/esquery-1.7.0.tgz#08d048f261f0ddedb5bae95f46809463d9c9496d" - integrity sha512-Ap6G0WQwcU/LHsvLwON1fAQX9Zp0A2Y6Y/cJBl9r/JbW90Zyg4/zbG6zzKa2OTALELarYHmKu0GhpM5EO+7T0g== - dependencies: - estraverse "^5.1.0" - -esrecurse@^4.3.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/esrecurse/-/esrecurse-4.3.0.tgz#7ad7964d679abb28bee72cec63758b1c5d2c9921" - integrity sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag== - dependencies: - estraverse "^5.2.0" - -estraverse@^5.1.0, estraverse@^5.2.0, estraverse@^5.3.0: - version "5.3.0" - resolved "https://registry.yarnpkg.com/estraverse/-/estraverse-5.3.0.tgz#2eea5290702f26ab8fe5370370ff86c965d21123" - integrity sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA== - -estree-walker@^3.0.3: - version "3.0.3" - resolved 
"https://registry.yarnpkg.com/estree-walker/-/estree-walker-3.0.3.tgz#67c3e549ec402a487b4fc193d1953a524752340d" - integrity sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g== - dependencies: - "@types/estree" "^1.0.0" - -esutils@^2.0.2: - version "2.0.3" - resolved "https://registry.yarnpkg.com/esutils/-/esutils-2.0.3.tgz#74d2eb4de0b8da1293711910d50775b9b710ef64" - integrity sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g== - -eventemitter3@^5.0.0: - version "5.0.4" - resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-5.0.4.tgz#a86d66170433712dde814707ac52b5271ceb1feb" - integrity sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw== - -exceljs@^4.4.0: - version "4.4.0" - resolved "https://registry.yarnpkg.com/exceljs/-/exceljs-4.4.0.tgz#cfb1cb8dcc82c760a9fc9faa9e52dadab66b0156" - integrity sha512-XctvKaEMaj1Ii9oDOqbW/6e1gXknSY4g/aLCDicOXqBE4M0nRWkUu0PTp++UPNzoFY12BNHMfs/VadKIS6llvg== - dependencies: - archiver "^5.0.0" - dayjs "^1.8.34" - fast-csv "^4.3.1" - jszip "^3.10.1" - readable-stream "^3.6.0" - saxes "^5.0.1" - tmp "^0.2.0" - unzipper "^0.10.11" - uuid "^8.3.0" - -expand-template@^2.0.3: - version "2.0.3" - resolved "https://registry.yarnpkg.com/expand-template/-/expand-template-2.0.3.tgz#6e14b3fcee0f3a6340ecb57d2e8918692052a47c" - integrity sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg== - -expect-type@^1.3.0: - version "1.3.0" - resolved "https://registry.yarnpkg.com/expect-type/-/expect-type-1.3.0.tgz#0d58ed361877a31bbc4dd6cf71bbfef7faf6bd68" - integrity sha512-knvyeauYhqjOYvQ66MznSMs83wmHrCycNEN6Ao+2AeYEfxUIkuiVxdEa1qlGEPK+We3n0THiDciYSsCcgW/DoA== - -fast-csv@^4.3.1: - version "4.3.6" - resolved "https://registry.yarnpkg.com/fast-csv/-/fast-csv-4.3.6.tgz#70349bdd8fe4d66b1130d8c91820b64a21bc4a63" - integrity 
sha512-2RNSpuwwsJGP0frGsOmTb9oUF+VkFSM4SyLTDgwf2ciHWTarN0lQTC+F2f/t5J9QjW+c65VFIAAu85GsvMIusw== - dependencies: - "@fast-csv/format" "4.3.5" - "@fast-csv/parse" "4.3.6" - -fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3: - version "3.1.3" - resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525" - integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q== - -fast-equals@^5.3.3: - version "5.4.0" - resolved "https://registry.npmjs.org/fast-equals/-/fast-equals-5.4.0.tgz#b60073b8764f27029598447f05773c7534ba7f1e" - integrity sha512-jt2DW/aNFNwke7AUd+Z+e6pz39KO5rzdbbFCg2sGafS4mk13MI7Z8O5z9cADNn5lhGODIgLwug6TZO2ctf7kcw== - -fast-json-patch@^3.0.0-1, fast-json-patch@^3.1.1: - version "3.1.1" - resolved "https://registry.yarnpkg.com/fast-json-patch/-/fast-json-patch-3.1.1.tgz#85064ea1b1ebf97a3f7ad01e23f9337e72c66947" - integrity sha512-vf6IHUX2SBcA+5/+4883dsIjpBTqmfBjmYiWK1savxQmFk4JfBMLa7ynTYOs1Rolp/T1betJxHiGD3g1Mn8lUQ== - -fast-json-stable-stringify@^2.0.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz#874bf69c6f404c2b5d99c481341399fd55892633" - integrity sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw== - -fast-levenshtein@^2.0.6: - version "2.0.6" - resolved "https://registry.yarnpkg.com/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz#3d8a5c66883a16a30ca8643e851f19baa7797917" - integrity sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw== - -fdir@^6.5.0: - version "6.5.0" - resolved "https://registry.yarnpkg.com/fdir/-/fdir-6.5.0.tgz#ed2ab967a331ade62f18d077dae192684d50d350" - integrity sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg== - -file-entry-cache@^8.0.0: - version "8.0.0" - resolved 
"https://registry.yarnpkg.com/file-entry-cache/-/file-entry-cache-8.0.0.tgz#7787bddcf1131bffb92636c69457bbc0edd6d81f" - integrity sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ== - dependencies: - flat-cache "^4.0.0" - -find-root@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/find-root/-/find-root-1.1.0.tgz#abcfc8ba76f708c42a97b3d685b7e9450bfb9ce4" - integrity sha512-NKfW6bec6GfKc0SGx1e07QZY9PE99u0Bft/0rzSD5k3sO/vwkVUpDUKVm5Gpp5Ue3YfShPFTX2070tDs5kB9Ng== - -find-up@^5.0.0: - version "5.0.0" - resolved "https://registry.yarnpkg.com/find-up/-/find-up-5.0.0.tgz#4c92819ecb7083561e4f4a240a86be5198f536fc" - integrity sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng== - dependencies: - locate-path "^6.0.0" - path-exists "^4.0.0" - -flat-cache@^4.0.0: - version "4.0.1" - resolved "https://registry.yarnpkg.com/flat-cache/-/flat-cache-4.0.1.tgz#0ece39fcb14ee012f4b0410bd33dd9c1f011127c" - integrity sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw== - dependencies: - flatted "^3.2.9" - keyv "^4.5.4" - -flatted@^3.2.9: - version "3.4.2" - resolved "https://registry.yarnpkg.com/flatted/-/flatted-3.4.2.tgz#f5c23c107f0f37de8dbdf24f13722b3b98d52726" - integrity sha512-PjDse7RzhcPkIJwy5t7KPWQSZ9cAbzQXcafsetQoD7sOJRQlGikNbx7yZp2OotDnJyrDcbyRq3Ttb18iYOqkxA== - -for-each@^0.3.3, for-each@^0.3.5: - version "0.3.5" - resolved "https://registry.yarnpkg.com/for-each/-/for-each-0.3.5.tgz#d650688027826920feeb0af747ee7b9421a41d47" - integrity sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg== - dependencies: - is-callable "^1.2.7" - -fs-constants@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/fs-constants/-/fs-constants-1.0.0.tgz#6be0de9be998ce16af8afc24497b9ee9b7ccd9ad" - integrity 
sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow== - -fs.realpath@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/fs.realpath/-/fs.realpath-1.0.0.tgz#1504ad2523158caa40db4a2787cb01411994ea4f" - integrity sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw== - -fsevents@~2.3.2, fsevents@~2.3.3: - version "2.3.3" - resolved "https://registry.yarnpkg.com/fsevents/-/fsevents-2.3.3.tgz#cac6407785d03675a2a5e1a5305c697b347d90d6" - integrity sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw== - -fstream@^1.0.12: - version "1.0.12" - resolved "https://registry.yarnpkg.com/fstream/-/fstream-1.0.12.tgz#4e8ba8ee2d48be4f7d0de505455548eae5932045" - integrity sha512-WvJ193OHa0GHPEL+AycEJgxvBEwyfRkN1vhjca23OaPVMCaLCXTd5qAu82AjTcgP1UJmytkOKb63Ypde7raDIg== - dependencies: - graceful-fs "^4.1.2" - inherits "~2.0.0" - mkdirp ">=0.5 0" - rimraf "2" - -function-bind@^1.1.2: - version "1.1.2" - resolved "https://registry.yarnpkg.com/function-bind/-/function-bind-1.1.2.tgz#2c02d864d97f3ea6c8830c464cbd11ab6eab7a1c" - integrity sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA== - -function.prototype.name@^1.1.6, function.prototype.name@^1.1.8: - version "1.1.8" - resolved "https://registry.yarnpkg.com/function.prototype.name/-/function.prototype.name-1.1.8.tgz#e68e1df7b259a5c949eeef95cdbde53edffabb78" - integrity sha512-e5iwyodOHhbMr/yNrc7fDYG4qlbIvI5gajyzPnb5TCwyhjApznQh1BMFou9b30SevY43gCJKXycoCBjMbsuW0Q== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.3" - define-properties "^1.2.1" - functions-have-names "^1.2.3" - hasown "^2.0.2" - is-callable "^1.2.7" - -functions-have-names@^1.2.3: - version "1.2.3" - resolved "https://registry.yarnpkg.com/functions-have-names/-/functions-have-names-1.2.3.tgz#0404fe4ee2ba2f607f0e0ec3c80bae994133b834" - integrity 
sha512-xckBUXyTIqT97tq2x2AMb+g163b5JFysYk0x4qxNFwbfQkmNZoiRHb6sPzI9/QV33WeuvVYBUIiD4NzNIyqaRQ== - -generator-function@^2.0.0: - version "2.0.1" - resolved "https://registry.yarnpkg.com/generator-function/-/generator-function-2.0.1.tgz#0e75dd410d1243687a0ba2e951b94eedb8f737a2" - integrity sha512-SFdFmIJi+ybC0vjlHN0ZGVGHc3lgE0DxPAT0djjVg+kjOnSqclqmj0KQ7ykTOLP6YxoqOvuAODGdcHJn+43q3g== - -get-caller-file@^2.0.5: - version "2.0.5" - resolved "https://registry.yarnpkg.com/get-caller-file/-/get-caller-file-2.0.5.tgz#4f94412a82db32f36e3b0b9741f8a97feb031f7e" - integrity sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg== - -get-east-asian-width@^1.0.0: - version "1.5.0" - resolved "https://registry.yarnpkg.com/get-east-asian-width/-/get-east-asian-width-1.5.0.tgz#ce7008fe345edcf5497a6f557cfa54bc318a9ce7" - integrity sha512-CQ+bEO+Tva/qlmw24dCejulK5pMzVnUOFOijVogd3KQs07HnRIgp8TGipvCCRT06xeYEbpbgwaCxglFyiuIcmA== - -get-intrinsic@^1.2.4, get-intrinsic@^1.2.5, get-intrinsic@^1.2.6, get-intrinsic@^1.2.7, get-intrinsic@^1.3.0: - version "1.3.0" - resolved "https://registry.yarnpkg.com/get-intrinsic/-/get-intrinsic-1.3.0.tgz#743f0e3b6964a93a5491ed1bffaae054d7f98d01" - integrity sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ== - dependencies: - call-bind-apply-helpers "^1.0.2" - es-define-property "^1.0.1" - es-errors "^1.3.0" - es-object-atoms "^1.1.1" - function-bind "^1.1.2" - get-proto "^1.0.1" - gopd "^1.2.0" - has-symbols "^1.1.0" - hasown "^2.0.2" - math-intrinsics "^1.1.0" - -get-proto@^1.0.0, get-proto@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/get-proto/-/get-proto-1.0.1.tgz#150b3f2743869ef3e851ec0c49d15b1d14d00ee1" - integrity sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g== - dependencies: - dunder-proto "^1.0.1" - es-object-atoms "^1.0.0" - -get-symbol-description@^1.1.0: - version "1.1.0" - resolved 
"https://registry.yarnpkg.com/get-symbol-description/-/get-symbol-description-1.1.0.tgz#7bdd54e0befe8ffc9f3b4e203220d9f1e881b6ee" - integrity sha512-w9UMqWwJxHNOvoNzSJ2oPF5wvYcvP7jUvYzhp67yEhTi17ZDBBC1z9pTdGuzjD+EFIqLSYRweZjqfiPzQ06Ebg== - dependencies: - call-bound "^1.0.3" - es-errors "^1.3.0" - get-intrinsic "^1.2.6" - -github-from-package@0.0.0: - version "0.0.0" - resolved "https://registry.yarnpkg.com/github-from-package/-/github-from-package-0.0.0.tgz#97fb5d96bfde8973313f20e8288ef9a167fa64ce" - integrity sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw== - -glob-parent@^6.0.2: - version "6.0.2" - resolved "https://registry.yarnpkg.com/glob-parent/-/glob-parent-6.0.2.tgz#6d237d99083950c79290f24c7642a3de9a28f9e3" - integrity sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A== - dependencies: - is-glob "^4.0.3" - -glob@^7.1.3, glob@^7.1.4, glob@^7.2.3: - version "7.2.3" - resolved "https://registry.yarnpkg.com/glob/-/glob-7.2.3.tgz#b8df0fb802bbfa8e89bd1d938b4e16578ed44f2b" - integrity sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q== - dependencies: - fs.realpath "^1.0.0" - inflight "^1.0.4" - inherits "2" - minimatch "^3.1.1" - once "^1.3.0" - path-is-absolute "^1.0.0" - -globals@^14.0.0: - version "14.0.0" - resolved "https://registry.yarnpkg.com/globals/-/globals-14.0.0.tgz#898d7413c29babcf6bafe56fcadded858ada724e" - integrity sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ== - -globals@^15.12.0: - version "15.15.0" - resolved "https://registry.yarnpkg.com/globals/-/globals-15.15.0.tgz#7c4761299d41c32b075715a4ce1ede7897ff72a8" - integrity sha512-7ACyT3wmyp3I61S4fG682L0VA2RGD9otkqGJIwNUMF1SWUombIIk+af1unuDYgMm082aHYwD+mzJvv9Iu8dsgg== - -globalthis@^1.0.4: - version "1.0.4" - resolved 
"https://registry.yarnpkg.com/globalthis/-/globalthis-1.0.4.tgz#7430ed3a975d97bfb59bcce41f5cabbafa651236" - integrity sha512-DpLKbNU4WylpxJykQujfCcwYWiV/Jhm50Goo0wrVILAv5jOr9d+H+UR3PhSCD2rCCEIg0uc+G+muBTwD54JhDQ== - dependencies: - define-properties "^1.2.1" - gopd "^1.0.1" - -gofish-graphics@^0.0.22: - version "0.0.22" - resolved "https://registry.yarnpkg.com/gofish-graphics/-/gofish-graphics-0.0.22.tgz#3102f094f77806b2e9a79934f0121221970427bb" - integrity sha512-zRDziOMXIJFAL8Z3mirXKaIC05ZhNd+yWQ3prKsX41MEty+ohs6xH/D43fMhwrKo5AxC8ohdBJ+bvu2m9Z+6cw== - dependencies: - "@types/d3-array" "^3.2.1" - bubblesets-js "^3.0.0" - chroma-js "^3.1.2" - culori "^4.0.2" - d3-array "^3.2.4" - lodash "^4.17.21" - rybitten "^0.22.0" - solid-js "^1.9.5" - spectral.js "^2.0.2" - -gopd@^1.0.1, gopd@^1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/gopd/-/gopd-1.2.0.tgz#89f56b8217bdbc8802bd299df6d7f1081d7e51a1" - integrity sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg== - -graceful-fs@^4.1.2, graceful-fs@^4.2.0, graceful-fs@^4.2.2: - version "4.2.11" - resolved "https://registry.yarnpkg.com/graceful-fs/-/graceful-fs-4.2.11.tgz#4183e4e8bf08bb6e05bbb2f7d2e0c8f712ca40e3" - integrity sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ== - -has-bigints@^1.0.2: - version "1.1.0" - resolved "https://registry.yarnpkg.com/has-bigints/-/has-bigints-1.1.0.tgz#28607e965ac967e03cd2a2c70a2636a1edad49fe" - integrity sha512-R3pbpkcIqv2Pm3dUwgjclDRVmWpTJW2DcMzcIhEXEx1oh/CEMObMm3KLmRJOdvhM7o4uQBnwr8pzRK2sJWIqfg== - -has-flag@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-4.0.0.tgz#944771fd9c81c81265c4d6941860da06bb59479b" - integrity sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ== - -has-property-descriptors@^1.0.0, has-property-descriptors@^1.0.2: - version "1.0.2" - resolved 
"https://registry.yarnpkg.com/has-property-descriptors/-/has-property-descriptors-1.0.2.tgz#963ed7d071dc7bf5f084c5bfbe0d1b6222586854" - integrity sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg== - dependencies: - es-define-property "^1.0.0" - -has-proto@^1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/has-proto/-/has-proto-1.2.0.tgz#5de5a6eabd95fdffd9818b43055e8065e39fe9d5" - integrity sha512-KIL7eQPfHQRC8+XluaIw7BHUwwqL19bQn4hzNgdr+1wXoU0KKj6rufu47lhY7KbJR2C6T6+PfyN0Ea7wkSS+qQ== - dependencies: - dunder-proto "^1.0.0" - -has-symbols@^1.0.3, has-symbols@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/has-symbols/-/has-symbols-1.1.0.tgz#fc9c6a783a084951d0b971fe1018de813707a338" - integrity sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ== - -has-tostringtag@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/has-tostringtag/-/has-tostringtag-1.0.2.tgz#2cdc42d40bef2e5b4eeab7c01a73c54ce7ab5abc" - integrity sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw== - dependencies: - has-symbols "^1.0.3" - -hasown@^2.0.2: - version "2.0.2" - resolved "https://registry.yarnpkg.com/hasown/-/hasown-2.0.2.tgz#003eaf91be7adc372e84ec59dc37252cedb80003" - integrity sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ== - dependencies: - function-bind "^1.1.2" - -hoist-non-react-statics@^3.3.0, hoist-non-react-statics@^3.3.1, hoist-non-react-statics@^3.3.2: - version "3.3.2" - resolved "https://registry.yarnpkg.com/hoist-non-react-statics/-/hoist-non-react-statics-3.3.2.tgz#ece0acaf71d62c2969c2ec59feff42a4b1a85b45" - integrity sha512-/gGivxi8JPKWNm/W0jSmzcMPpfpPLc3dY/6GxhX2hQ9iGj3aDfklV4ET7NjKpSinLpJ5vafa9iiGIEZg10SfBw== - dependencies: - react-is "^16.7.0" - -html-encoding-sniffer@^6.0.0: - version "6.0.0" - resolved 
"https://registry.yarnpkg.com/html-encoding-sniffer/-/html-encoding-sniffer-6.0.0.tgz#f8d9390b3b348b50d4f61c16dd2ef5c05980a882" - integrity sha512-CV9TW3Y3f8/wT0BRFc1/KAVQ3TUHiXmaAb6VW9vtiMFf7SLoMd1PdAc4W3KFOFETBJUb90KatHqlsZMWV+R9Gg== - dependencies: - "@exodus/bytes" "^1.6.0" - -html-parse-stringify@^3.0.1: - version "3.0.1" - resolved "https://registry.yarnpkg.com/html-parse-stringify/-/html-parse-stringify-3.0.1.tgz#dfc1017347ce9f77c8141a507f233040c59c55d2" - integrity sha512-KknJ50kTInJ7qIScF3jeaFRpMpE8/lfiTdzf/twXyPBLAGrLRTmkz3AdTnKeh40X8k9L2fdYwEp/42WGXIRGcg== - dependencies: - void-elements "3.1.0" - -html2canvas@^1.4.1: - version "1.4.1" - resolved "https://registry.yarnpkg.com/html2canvas/-/html2canvas-1.4.1.tgz#7cef1888311b5011d507794a066041b14669a543" - integrity sha512-fPU6BHNpsyIhr8yyMpTLLxAbkaK8ArIBcmZIRiBLiDhjeqvXolaEmDGmELFuX9I4xDcaKKcJl+TKZLqruBbmWA== - dependencies: - css-line-break "^2.1.0" - text-segmentation "^1.0.3" - -i18next-browser-languagedetector@^8.2.1: - version "8.2.1" - resolved "https://registry.yarnpkg.com/i18next-browser-languagedetector/-/i18next-browser-languagedetector-8.2.1.tgz#f17a918d376a97aa12a5b63fd8ea559a6231935b" - integrity sha512-bZg8+4bdmaOiApD7N7BPT9W8MLZG+nPTOFlLiJiT8uzKXFjhxw4v2ierCXOwB5sFDMtuA5G4kgYZ0AznZxQ/cw== - dependencies: - "@babel/runtime" "^7.23.2" - -i18next@^26.0.1: - version "26.0.1" - resolved "https://registry.npmjs.org/i18next/-/i18next-26.0.1.tgz#de5df38603b96f6863933f1a0a4937b6451c5960" - integrity sha512-vtz5sXU4+nkCm8yEU+JJ6yYIx0mkg9e68W0G0PXpnOsmzLajNsW5o28DJMqbajxfsfq0gV3XdrBudsDQnwxfsQ== - dependencies: - "@babel/runtime" "^7.29.2" - -iconv-lite@0.6: - version "0.6.3" - resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.6.3.tgz#a52f80bf38da1952eb5c681790719871a1a72501" - integrity sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw== - dependencies: - safer-buffer ">= 2.1.2 < 3.0.0" - -ieee754@^1.1.13: - version "1.2.1" - resolved 
"https://registry.yarnpkg.com/ieee754/-/ieee754-1.2.1.tgz#8eb7a10a63fff25d15a57b001586d177d1b0d352"
-  integrity sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==
-
-ignore@^5.2.0:
-  version "5.3.2"
-  resolved "https://registry.yarnpkg.com/ignore/-/ignore-5.3.2.tgz#3cd40e729f3643fd87cb04e50bf0eb722bc596f5"
-  integrity sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==
-
-ignore@^7.0.5:
-  version "7.0.5"
-  resolved "https://registry.yarnpkg.com/ignore/-/ignore-7.0.5.tgz#4cb5f6cd7d4c7ab0365738c7aea888baa6d7efd9"
-  integrity sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==
-
-immediate@~3.0.5:
-  version "3.0.6"
-  resolved "https://registry.yarnpkg.com/immediate/-/immediate-3.0.6.tgz#9db1dbd0faf8de6fbe0f5dd5e56bb606280de69b"
-  integrity sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==
-
-immer@^9.0.21:
-  version "9.0.21"
-  resolved "https://registry.yarnpkg.com/immer/-/immer-9.0.21.tgz#1e025ea31a40f24fb064f1fef23e931496330176"
-  integrity sha512-bc4NBHqOqSfRW7POMkHd51LvClaeMXpm8dx0e8oE2GORbq5aRK7Bxl4FyzVLdGtLmvLKL7BTDBG5ACQm4HWjTA==
-
-immutable@^5.1.5:
-  version "5.1.5"
-  resolved "https://registry.yarnpkg.com/immutable/-/immutable-5.1.5.tgz#93ee4db5c2a9ab42a4a783069f3c5d8847d40165"
-  integrity sha512-t7xcm2siw+hlUM68I+UEOK+z84RzmN59as9DZ7P1l0994DKUWV7UXBMQZVxaoMSRQ+PBZbHCOoBt7a2wxOMt+A==
-
-import-fresh@^3.2.1:
-  version "3.3.1"
-  resolved "https://registry.yarnpkg.com/import-fresh/-/import-fresh-3.3.1.tgz#9cecb56503c0ada1f2741dbbd6546e4b13b57ccf"
-  integrity sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==
-  dependencies:
-    parent-module "^1.0.0"
-    resolve-from "^4.0.0"
-
-imurmurhash@^0.1.4:
-  version "0.1.4"
-  resolved "https://registry.yarnpkg.com/imurmurhash/-/imurmurhash-0.1.4.tgz#9218b9b2b928a238b13dc4fb6b6d576f231453ea"
-  integrity sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==
-
-indent-string@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/indent-string/-/indent-string-4.0.0.tgz#624f8f4497d619b2d9768531d58f4122854d7251"
-  integrity sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg==
-
-inflight@^1.0.4:
-  version "1.0.6"
-  resolved "https://registry.yarnpkg.com/inflight/-/inflight-1.0.6.tgz#49bd6331d7d02d0c09bc910a1075ba8165b56df9"
-  integrity sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA==
-  dependencies:
-    once "^1.3.0"
-    wrappy "1"
-
-inherits@2, inherits@^2.0.3, inherits@^2.0.4, inherits@~2.0.0, inherits@~2.0.3:
-  version "2.0.4"
-  resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.4.tgz#0fa2c64f932917c3433a0ded55363aae37416b7c"
-  integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==
-
-ini@~1.3.0:
-  version "1.3.8"
-  resolved "https://registry.yarnpkg.com/ini/-/ini-1.3.8.tgz#a29da425b48806f34767a4efce397269af28432c"
-  integrity sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew==
-
-internal-slot@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/internal-slot/-/internal-slot-1.1.0.tgz#1eac91762947d2f7056bc838d93e13b2e9604961"
-  integrity sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw==
-  dependencies:
-    es-errors "^1.3.0"
-    hasown "^2.0.2"
-    side-channel "^1.1.0"
-
-"internmap@1 - 2":
-  version "2.0.3"
-  resolved "https://registry.yarnpkg.com/internmap/-/internmap-2.0.3.tgz#6685f23755e43c524e251d29cbc97248e3061009"
-  integrity sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg==
-
-is-array-buffer@^3.0.4, is-array-buffer@^3.0.5:
-  version "3.0.5"
-  resolved "https://registry.yarnpkg.com/is-array-buffer/-/is-array-buffer-3.0.5.tgz#65742e1e687bd2cc666253068fd8707fe4d44280"
-  integrity sha512-DDfANUiiG2wC1qawP66qlTugJeL5HyzMpfr8lLK+jMQirGzNod0B12cFB/9q838Ru27sBwfw78/rdoU7RERz6A==
-  dependencies:
-    call-bind "^1.0.8"
-    call-bound "^1.0.3"
-    get-intrinsic "^1.2.6"
-
-is-arrayish@^0.2.1:
-  version "0.2.1"
-  resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
-  integrity sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==
-
-is-async-function@^2.0.0:
-  version "2.1.1"
-  resolved "https://registry.yarnpkg.com/is-async-function/-/is-async-function-2.1.1.tgz#3e69018c8e04e73b738793d020bfe884b9fd3523"
-  integrity sha512-9dgM/cZBnNvjzaMYHVoxxfPj2QXt22Ev7SuuPrs+xav0ukGB0S6d4ydZdEiM48kLx5kDV+QBPrpVnFyefL8kkQ==
-  dependencies:
-    async-function "^1.0.0"
-    call-bound "^1.0.3"
-    get-proto "^1.0.1"
-    has-tostringtag "^1.0.2"
-    safe-regex-test "^1.1.0"
-
-is-bigint@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/is-bigint/-/is-bigint-1.1.0.tgz#dda7a3445df57a42583db4228682eba7c4170672"
-  integrity sha512-n4ZT37wG78iz03xPRKJrHTdZbe3IicyucEtdRsV5yglwc3GyUfbAfpSeD0FJ41NbUNSt5wbhqfp1fS+BgnvDFQ==
-  dependencies:
-    has-bigints "^1.0.2"
-
-is-boolean-object@^1.2.1:
-  version "1.2.2"
-  resolved "https://registry.yarnpkg.com/is-boolean-object/-/is-boolean-object-1.2.2.tgz#7067f47709809a393c71ff5bb3e135d8a9215d9e"
-  integrity sha512-wa56o2/ElJMYqjCjGkXri7it5FbebW5usLw/nPmCMs5DeZ7eziSYZhSmPRn0txqeW4LnAmQQU7FgqLpsEFKM4A==
-  dependencies:
-    call-bound "^1.0.3"
-    has-tostringtag "^1.0.2"
-
-is-callable@^1.2.7:
-  version "1.2.7"
-  resolved "https://registry.yarnpkg.com/is-callable/-/is-callable-1.2.7.tgz#3bc2a85ea742d9e36205dcacdd72ca1fdc51b055"
-  integrity sha512-1BC0BVFhS/p0qtw6enp8e+8OD0UrK0oFLztSjNzhcKA3WDuJxxAPXzPuPtKkjEY9UUoEWlX/8fgKeu2S8i9JTA==
-
-is-core-module@^2.16.1:
-  version "2.16.1"
-  resolved "https://registry.yarnpkg.com/is-core-module/-/is-core-module-2.16.1.tgz#2a98801a849f43e2add644fbb6bc6229b19a4ef4"
-  integrity sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==
-  dependencies:
-    hasown "^2.0.2"
-
-is-data-view@^1.0.1, is-data-view@^1.0.2:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/is-data-view/-/is-data-view-1.0.2.tgz#bae0a41b9688986c2188dda6657e56b8f9e63b8e"
-  integrity sha512-RKtWF8pGmS87i2D6gqQu/l7EYRlVdfzemCJN/P3UOs//x1QE7mfhvzHIApBTRf7axvT6DMGwSwBXYCT0nfB9xw==
-  dependencies:
-    call-bound "^1.0.2"
-    get-intrinsic "^1.2.6"
-    is-typed-array "^1.1.13"
-
-is-date-object@^1.0.5, is-date-object@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/is-date-object/-/is-date-object-1.1.0.tgz#ad85541996fc7aa8b2729701d27b7319f95d82f7"
-  integrity sha512-PwwhEakHVKTdRNVOw+/Gyh0+MzlCl4R6qKvkhuvLtPMggI1WAHt9sOwZxQLSGpUaDnrdyDsomoRgNnCfKNSXXg==
-  dependencies:
-    call-bound "^1.0.2"
-    has-tostringtag "^1.0.2"
-
-is-extglob@^2.1.1:
-  version "2.1.1"
-  resolved "https://registry.yarnpkg.com/is-extglob/-/is-extglob-2.1.1.tgz#a88c02535791f02ed37c76a1b9ea9773c833f8c2"
-  integrity sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==
-
-is-finalizationregistry@^1.1.0:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-finalizationregistry/-/is-finalizationregistry-1.1.1.tgz#eefdcdc6c94ddd0674d9c85887bf93f944a97c90"
-  integrity sha512-1pC6N8qWJbWoPtEjgcL2xyhQOP491EQjeUo3qTKcmV8YSDDJrOepfG8pcC7h/QgnQHYSv0mJ3Z/ZWxmatVrysg==
-  dependencies:
-    call-bound "^1.0.3"
-
-is-generator-function@^1.0.10:
-  version "1.1.2"
-  resolved "https://registry.yarnpkg.com/is-generator-function/-/is-generator-function-1.1.2.tgz#ae3b61e3d5ea4e4839b90bad22b02335051a17d5"
-  integrity sha512-upqt1SkGkODW9tsGNG5mtXTXtECizwtS2kA161M+gJPc1xdb/Ax629af6YrTwcOeQHbewrPNlE5Dx7kzvXTizA==
-  dependencies:
-    call-bound "^1.0.4"
-    generator-function "^2.0.0"
-    get-proto "^1.0.1"
-    has-tostringtag "^1.0.2"
-    safe-regex-test "^1.1.0"
-
-is-glob@^4.0.0, is-glob@^4.0.3:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/is-glob/-/is-glob-4.0.3.tgz#64f61e42cbbb2eec2071a9dac0b28ba1e65d5084"
-  integrity sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==
-  dependencies:
-    is-extglob "^2.1.1"
-
-is-map@^2.0.3:
-  version "2.0.3"
-  resolved "https://registry.yarnpkg.com/is-map/-/is-map-2.0.3.tgz#ede96b7fe1e270b3c4465e3a465658764926d62e"
-  integrity sha512-1Qed0/Hr2m+YqxnM09CjA2d/i6YZNfF6R2oRAOj36eUdS6qIV/huPJNSEpKbupewFs+ZsJlxsjjPbc0/afW6Lw==
-
-is-negative-zero@^2.0.3:
-  version "2.0.3"
-  resolved "https://registry.yarnpkg.com/is-negative-zero/-/is-negative-zero-2.0.3.tgz#ced903a027aca6381b777a5743069d7376a49747"
-  integrity sha512-5KoIu2Ngpyek75jXodFvnafB6DJgr3u8uuK0LEZJjrU19DrMD3EVERaR8sjz8CCGgpZvxPl9SuE1GMVPFHx1mw==
-
-is-number-object@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-number-object/-/is-number-object-1.1.1.tgz#144b21e95a1bc148205dcc2814a9134ec41b2541"
-  integrity sha512-lZhclumE1G6VYD8VHe35wFaIif+CTy5SJIi5+3y4psDgWu4wPDoBhF8NxUOinEc7pHgiTsT6MaBb92rKhhD+Xw==
-  dependencies:
-    call-bound "^1.0.3"
-    has-tostringtag "^1.0.2"
-
-is-potential-custom-element-name@^1.0.1:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/is-potential-custom-element-name/-/is-potential-custom-element-name-1.0.1.tgz#171ed6f19e3ac554394edf78caa05784a45bebb5"
-  integrity sha512-bCYeRA2rVibKZd+s2625gGnGF/t7DSqDs4dP7CrLA1m7jKWz6pps0LpYLJN8Q64HtmPKJ1hrN3nzPNKFEKOUiQ==
-
-is-regex@^1.2.1:
-  version "1.2.1"
-  resolved "https://registry.yarnpkg.com/is-regex/-/is-regex-1.2.1.tgz#76d70a3ed10ef9be48eb577887d74205bf0cad22"
-  integrity sha512-MjYsKHO5O7mCsmRGxWcLWheFqN9DJ/2TmngvjKXihe6efViPqc274+Fx/4fYj/r03+ESvBdTXK0V6tA3rgez1g==
-  dependencies:
-    call-bound "^1.0.2"
-    gopd "^1.2.0"
-    has-tostringtag "^1.0.2"
-    hasown "^2.0.2"
-
-is-set@^2.0.3:
-  version "2.0.3"
-  resolved "https://registry.yarnpkg.com/is-set/-/is-set-2.0.3.tgz#8ab209ea424608141372ded6e0cb200ef1d9d01d"
-  integrity sha512-iPAjerrse27/ygGLxw+EBR9agv9Y6uLeYVJMu+QNCoouJ1/1ri0mGrcWpfCqFZuzzx3WjtwxG098X+n4OuRkPg==
-
-is-shared-array-buffer@^1.0.4:
-  version "1.0.4"
-  resolved "https://registry.yarnpkg.com/is-shared-array-buffer/-/is-shared-array-buffer-1.0.4.tgz#9b67844bd9b7f246ba0708c3a93e34269c774f6f"
-  integrity sha512-ISWac8drv4ZGfwKl5slpHG9OwPNty4jOWPRIhBpxOoD+hqITiwuipOQ2bNthAzwA3B4fIjO4Nln74N0S9byq8A==
-  dependencies:
-    call-bound "^1.0.3"
-
-is-string@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-string/-/is-string-1.1.1.tgz#92ea3f3d5c5b6e039ca8677e5ac8d07ea773cbb9"
-  integrity sha512-BtEeSsoaQjlSPBemMQIrY1MY0uM6vnS1g5fmufYOtnxLGUZM2178PKbhsk7Ffv58IX+ZtcvoGwccYsh0PglkAA==
-  dependencies:
-    call-bound "^1.0.3"
-    has-tostringtag "^1.0.2"
-
-is-symbol@^1.0.4, is-symbol@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-symbol/-/is-symbol-1.1.1.tgz#f47761279f532e2b05a7024a7506dbbedacd0634"
-  integrity sha512-9gGx6GTtCQM73BgmHQXfDmLtfjjTUDSyoxTCbp5WtoixAhfgsDirWIcVQ/IHpvI5Vgd5i/J5F7B9cN/WlVbC/w==
-  dependencies:
-    call-bound "^1.0.2"
-    has-symbols "^1.1.0"
-    safe-regex-test "^1.1.0"
-
-is-typed-array@^1.1.13, is-typed-array@^1.1.14, is-typed-array@^1.1.15:
-  version "1.1.15"
-  resolved "https://registry.yarnpkg.com/is-typed-array/-/is-typed-array-1.1.15.tgz#4bfb4a45b61cee83a5a46fba778e4e8d59c0ce0b"
-  integrity sha512-p3EcsicXjit7SaskXHs1hA91QxgTw46Fv6EFKKGS5DRFLD8yKnohjF3hxoju94b/OcMZoQukzpPpBE9uLVKzgQ==
-  dependencies:
-    which-typed-array "^1.1.16"
-
-is-weakmap@^2.0.2:
-  version "2.0.2"
-  resolved "https://registry.yarnpkg.com/is-weakmap/-/is-weakmap-2.0.2.tgz#bf72615d649dfe5f699079c54b83e47d1ae19cfd"
-  integrity sha512-K5pXYOm9wqY1RgjpL3YTkF39tni1XajUIkawTLUo9EZEVUFga5gSQJF8nNS7ZwJQ02y+1YCNYcMh+HIf1ZqE+w==
-
-is-weakref@^1.0.2, is-weakref@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-weakref/-/is-weakref-1.1.1.tgz#eea430182be8d64174bd96bffbc46f21bf3f9293"
-  integrity sha512-6i9mGWSlqzNMEqpCp93KwRS1uUOodk2OJ6b+sq7ZPDSy2WuI5NFIxp/254TytR8ftefexkWn5xNiHUNpPOfSew==
-  dependencies:
-    call-bound "^1.0.3"
-
-is-weakset@^2.0.3:
-  version "2.0.4"
-  resolved "https://registry.yarnpkg.com/is-weakset/-/is-weakset-2.0.4.tgz#c9f5deb0bc1906c6d6f1027f284ddf459249daca"
-  integrity sha512-mfcwb6IzQyOKTs84CQMrOwW4gQcaTOAWJ0zzJCl2WSPDrWk/OzDaImWFH3djXhb24g4eudZfLRozAvPGw4d9hQ==
-  dependencies:
-    call-bound "^1.0.3"
-    get-intrinsic "^1.2.6"
-
-isarray@^2.0.5:
-  version "2.0.5"
-  resolved "https://registry.yarnpkg.com/isarray/-/isarray-2.0.5.tgz#8af1e4c1221244cc62459faf38940d4e644a5723"
-  integrity sha512-xHjhDr3cNBK0BzdUJSPXZntQUx/mwMS5Rw4A7lPJ90XGAO6ISP/ePDNuo0vhqOZU+UD5JoodwCAAoZQd3FeAKw==
-
-isarray@~1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/isarray/-/isarray-1.0.0.tgz#bb935d48582cba168c06834957a54a3e07124f11"
-  integrity sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ==
-
-isexe@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/isexe/-/isexe-2.0.0.tgz#e8fbf374dc556ff8947a10dcb0572d633f2cfa10"
-  integrity sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==
-
-iterator.prototype@^1.1.5:
-  version "1.1.5"
-  resolved "https://registry.yarnpkg.com/iterator.prototype/-/iterator.prototype-1.1.5.tgz#12c959a29de32de0aa3bbbb801f4d777066dae39"
-  integrity sha512-H0dkQoCa3b2VEeKQBOxFph+JAbcrQdE7KC0UkqwpLmv2EC4P41QXP+rqo9wYodACiG5/WM5s9oDApTU8utwj9g==
-  dependencies:
-    define-data-property "^1.1.4"
-    es-object-atoms "^1.0.0"
-    get-intrinsic "^1.2.6"
-    get-proto "^1.0.0"
-    has-symbols "^1.1.0"
-    set-function-name "^2.0.2"
-
-"js-tokens@^3.0.0 || ^4.0.0", js-tokens@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-4.0.0.tgz#19203fb59991df98e3a287050d4647cdeaf32499"
-  integrity sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==
-
-js-yaml@^4.1.1:
-  version "4.1.1"
-  resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-4.1.1.tgz#854c292467705b699476e1a2decc0c8a3458806b"
-  integrity sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==
-  dependencies:
-    argparse "^2.0.1"
-
-jsdom@^29.0.1:
-  version "29.0.1"
-  resolved "https://registry.yarnpkg.com/jsdom/-/jsdom-29.0.1.tgz#b2db17191533dd5ba1e0d4c61fe9fa2289e87be9"
-  integrity sha512-z6JOK5gRO7aMybVq/y/MlIpKh8JIi68FBKMUtKkK2KH/wMSRlCxQ682d08LB9fYXplyY/UXG8P4XXTScmdjApg==
-  dependencies:
-    "@asamuzakjp/css-color" "^5.0.1"
-    "@asamuzakjp/dom-selector" "^7.0.3"
-    "@bramus/specificity" "^2.4.2"
-    "@csstools/css-syntax-patches-for-csstree" "^1.1.1"
-    "@exodus/bytes" "^1.15.0"
-    css-tree "^3.2.1"
-    data-urls "^7.0.0"
-    decimal.js "^10.6.0"
-    html-encoding-sniffer "^6.0.0"
-    is-potential-custom-element-name "^1.0.1"
-    lru-cache "^11.2.7"
-    parse5 "^8.0.0"
-    saxes "^6.0.0"
-    symbol-tree "^3.2.4"
-    tough-cookie "^6.0.1"
-    undici "^7.24.5"
-    w3c-xmlserializer "^5.0.0"
-    webidl-conversions "^8.0.1"
-    whatwg-mimetype "^5.0.0"
-    whatwg-url "^16.0.1"
-    xml-name-validator "^5.0.0"
-
-jsesc@^3.0.2:
-  version "3.1.0"
-  resolved "https://registry.yarnpkg.com/jsesc/-/jsesc-3.1.0.tgz#74d335a234f67ed19907fdadfac7ccf9d409825d"
-  integrity sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA==
-
-json-buffer@3.0.1:
-  version "3.0.1"
-  resolved "https://registry.yarnpkg.com/json-buffer/-/json-buffer-3.0.1.tgz#9338802a30d3b6605fbe0613e094008ca8c05a13"
-  integrity sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ==
-
-json-parse-even-better-errors@^2.3.0:
-  version "2.3.1"
-  resolved "https://registry.yarnpkg.com/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz#7c47805a94319928e05777405dc12e1f7a4ee02d"
-  integrity sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w==
-
-json-schema-traverse@^0.4.1:
-  version "0.4.1"
-  resolved "https://registry.yarnpkg.com/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz#69f6a87d9513ab8bb8fe63bdb0979c448e684660"
-  integrity sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==
-
-json-stable-stringify-without-jsonify@^1.0.1:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/json-stable-stringify-without-jsonify/-/json-stable-stringify-without-jsonify-1.0.1.tgz#9db7b59496ad3f3cfef30a75142d2d930ad72651"
-  integrity sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==
-
-json-stringify-pretty-compact@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/json-stringify-pretty-compact/-/json-stringify-pretty-compact-2.0.0.tgz#e77c419f52ff00c45a31f07f4c820c2433143885"
-  integrity sha512-WRitRfs6BGq4q8gTgOy4ek7iPFXjbra0H3PmDLKm2xnZ+Gh1HUhiKGgCZkSPNULlP7mvfu6FV/mOLhCarspADQ==
-
-json-stringify-pretty-compact@^4.0.0, json-stringify-pretty-compact@~4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/json-stringify-pretty-compact/-/json-stringify-pretty-compact-4.0.0.tgz#cf4844770bddee3cb89a6170fe4b00eee5dbf1d4"
-  integrity sha512-3CNZ2DnrpByG9Nqj6Xo8vqbjT4F6N+tb4Gb28ESAZjYZ5yqvmc56J+/kuIwkaAMOyblTQhUW7PxMkUb8Q36N3Q==
-
-"jsx-ast-utils@^2.4.1 || ^3.0.0", jsx-ast-utils@^3.3.5:
-  version "3.3.5"
-  resolved "https://registry.yarnpkg.com/jsx-ast-utils/-/jsx-ast-utils-3.3.5.tgz#4766bd05a8e2a11af222becd19e15575e52a853a"
-  integrity sha512-ZZow9HBI5O6EPgSJLUb8n2NKgmVWTwCvHGwFuJlMjvLFqlGG6pjirPhtdsseaLZjSibD8eegzmYpUZwoIlj2cQ==
-  dependencies:
-    array-includes "^3.1.6"
-    array.prototype.flat "^1.3.1"
-    object.assign "^4.1.4"
-    object.values "^1.1.6"
-
-jszip@^3.10.1:
-  version "3.10.1"
-  resolved "https://registry.yarnpkg.com/jszip/-/jszip-3.10.1.tgz#34aee70eb18ea1faec2f589208a157d1feb091c2"
-  integrity sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==
-  dependencies:
-    lie "~3.3.0"
-    pako "~1.0.2"
-    readable-stream "~2.3.6"
-    setimmediate "^1.0.5"
-
-jwt-decode@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/jwt-decode/-/jwt-decode-4.0.0.tgz#2270352425fd413785b2faf11f6e755c5151bd4b"
-  integrity sha512-+KJGIyHgkGuIq3IEBNftfhW/LfWhXUIY6OmyVWjliu5KH1y0fw7VQ8YndE2O4qZdMSd9SqbnC8GOcZEy0Om7sA==
-
-katex@^0.16.0, katex@^0.16.22:
-  version "0.16.42"
-  resolved "https://registry.yarnpkg.com/katex/-/katex-0.16.42.tgz#444e81ad095e1c14e54229f65c26b5ea4e5e153c"
-  integrity sha512-sZ4jqyEXfHTLEFK+qsFYToa3UZ0rtFcPGwKpyiRYh2NJn8obPWOQ+/u7ux0F6CAU/y78+Mksh1YkxTPXTh47TQ==
-  dependencies:
-    commander "^8.3.0"
-
-keyv@^4.5.4:
-  version "4.5.4"
-  resolved "https://registry.yarnpkg.com/keyv/-/keyv-4.5.4.tgz#a879a99e29452f942439f2a405e3af8b31d4de93"
-  integrity sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==
-  dependencies:
-    json-buffer "3.0.1"
-
-language-subtag-registry@^0.3.20:
-  version "0.3.23"
-  resolved "https://registry.yarnpkg.com/language-subtag-registry/-/language-subtag-registry-0.3.23.tgz#23529e04d9e3b74679d70142df3fd2eb6ec572e7"
-  integrity sha512-0K65Lea881pHotoGEa5gDlMxt3pctLi2RplBb7Ezh4rRdLEOtgi7n4EwK9lamnUCkKBqaeKRVebTq6BAxSkpXQ==
-
-language-tags@^1.0.9:
-  version "1.0.9"
-  resolved "https://registry.yarnpkg.com/language-tags/-/language-tags-1.0.9.tgz#1ffdcd0ec0fafb4b1be7f8b11f306ad0f9c08777"
-  integrity sha512-MbjN408fEndfiQXbFQ1vnd+1NoLDsnQW41410oQBXiyXDMYH5z505juWa4KUE1LqxRC7DgOgZDbKLxHIwm27hA==
-  dependencies:
-    language-subtag-registry "^0.3.20"
-
-lazystream@^1.0.0:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/lazystream/-/lazystream-1.0.1.tgz#494c831062f1f9408251ec44db1cba29242a2638"
-  integrity sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==
-  dependencies:
-    readable-stream "^2.0.5"
-
-levn@^0.4.1:
-  version "0.4.1"
-  resolved "https://registry.yarnpkg.com/levn/-/levn-0.4.1.tgz#ae4562c007473b932a6200d403268dd2fffc6ade"
-  integrity sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==
-  dependencies:
-    prelude-ls "^1.2.1"
-    type-check "~0.4.0"
-
-lie@3.1.1:
-  version "3.1.1"
-  resolved "https://registry.yarnpkg.com/lie/-/lie-3.1.1.tgz#9a436b2cc7746ca59de7a41fa469b3efb76bd87e"
-  integrity sha512-RiNhHysUjhrDQntfYSfY4MU24coXXdEOgw9WGcKHNeEwffDYbF//u87M1EWaMGzuFoSbqW0C9C6lEEhDOAswfw==
-  dependencies:
-    immediate "~3.0.5"
-
-lie@~3.3.0:
-  version "3.3.0"
-  resolved "https://registry.yarnpkg.com/lie/-/lie-3.3.0.tgz#dcf82dee545f46074daf200c7c1c5a08e0f40f6a"
-  integrity sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ==
-  dependencies:
-    immediate "~3.0.5"
-
-lightningcss-android-arm64@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-android-arm64/-/lightningcss-android-arm64-1.32.0.tgz#f033885116dfefd9c6f54787523e3514b61e1968"
-  integrity sha512-YK7/ClTt4kAK0vo6w3X+Pnm0D2cf2vPHbhOXdoNti1Ga0al1P4TBZhwjATvjNwLEBCnKvjJc2jQgHXH0NEwlAg==
-
-lightningcss-darwin-arm64@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-darwin-arm64/-/lightningcss-darwin-arm64-1.32.0.tgz#50b71871b01c8199584b649e292547faea7af9b5"
-  integrity sha512-RzeG9Ju5bag2Bv1/lwlVJvBE3q6TtXskdZLLCyfg5pt+HLz9BqlICO7LZM7VHNTTn/5PRhHFBSjk5lc4cmscPQ==
-
-lightningcss-darwin-x64@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-darwin-x64/-/lightningcss-darwin-x64-1.32.0.tgz#35f3e97332d130b9ca181e11b568ded6aebc6d5e"
-  integrity sha512-U+QsBp2m/s2wqpUYT/6wnlagdZbtZdndSmut/NJqlCcMLTWp5muCrID+K5UJ6jqD2BFshejCYXniPDbNh73V8w==
-
-lightningcss-freebsd-x64@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-freebsd-x64/-/lightningcss-freebsd-x64-1.32.0.tgz#9777a76472b64ed6ff94342ad64c7bafd794a575"
-  integrity sha512-JCTigedEksZk3tHTTthnMdVfGf61Fky8Ji2E4YjUTEQX14xiy/lTzXnu1vwiZe3bYe0q+SpsSH/CTeDXK6WHig==
-
-lightningcss-linux-arm-gnueabihf@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-linux-arm-gnueabihf/-/lightningcss-linux-arm-gnueabihf-1.32.0.tgz#13ae652e1ab73b9135d7b7da172f666c410ad53d"
-  integrity sha512-x6rnnpRa2GL0zQOkt6rts3YDPzduLpWvwAF6EMhXFVZXD4tPrBkEFqzGowzCsIWsPjqSK+tyNEODUBXeeVHSkw==
-
-lightningcss-linux-arm64-gnu@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-linux-arm64-gnu/-/lightningcss-linux-arm64-gnu-1.32.0.tgz#417858795a94592f680123a1b1f9da8a0e1ef335"
-  integrity sha512-0nnMyoyOLRJXfbMOilaSRcLH3Jw5z9HDNGfT/gwCPgaDjnx0i8w7vBzFLFR1f6CMLKF8gVbebmkUN3fa/kQJpQ==
-
-lightningcss-linux-arm64-musl@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-linux-arm64-musl/-/lightningcss-linux-arm64-musl-1.32.0.tgz#6be36692e810b718040802fd809623cffe732133"
-  integrity sha512-UpQkoenr4UJEzgVIYpI80lDFvRmPVg6oqboNHfoH4CQIfNA+HOrZ7Mo7KZP02dC6LjghPQJeBsvXhJod/wnIBg==
-
-lightningcss-linux-x64-gnu@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-linux-x64-gnu/-/lightningcss-linux-x64-gnu-1.32.0.tgz#0b7803af4eb21cfd38dd39fe2abbb53c7dd091f6"
-  integrity sha512-V7Qr52IhZmdKPVr+Vtw8o+WLsQJYCTd8loIfpDaMRWGUZfBOYEJeyJIkqGIDMZPwPx24pUMfwSxxI8phr/MbOA==
-
-lightningcss-linux-x64-musl@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-linux-x64-musl/-/lightningcss-linux-x64-musl-1.32.0.tgz#88dc8ba865ddddb1ac5ef04b0f161804418c163b"
-  integrity sha512-bYcLp+Vb0awsiXg/80uCRezCYHNg1/l3mt0gzHnWV9XP1W5sKa5/TCdGWaR/zBM2PeF/HbsQv/j2URNOiVuxWg==
-
-lightningcss-win32-arm64-msvc@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-win32-arm64-msvc/-/lightningcss-win32-arm64-msvc-1.32.0.tgz#4f30ba3fa5e925f5b79f945e8cc0d176c3b1ab38"
-  integrity sha512-8SbC8BR40pS6baCM8sbtYDSwEVQd4JlFTOlaD3gWGHfThTcABnNDBda6eTZeqbofalIJhFx0qKzgHJmcPTnGdw==
-
-lightningcss-win32-x64-msvc@1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss-win32-x64-msvc/-/lightningcss-win32-x64-msvc-1.32.0.tgz#141aa5605645064928902bb4af045fa7d9f4220a"
-  integrity sha512-Amq9B/SoZYdDi1kFrojnoqPLxYhQ4Wo5XiL8EVJrVsB8ARoC1PWW6VGtT0WKCemjy8aC+louJnjS7U18x3b06Q==
-
-lightningcss@^1.32.0:
-  version "1.32.0"
-  resolved "https://registry.yarnpkg.com/lightningcss/-/lightningcss-1.32.0.tgz#b85aae96486dcb1bf49a7c8571221273f4f1e4a9"
-  integrity sha512-NXYBzinNrblfraPGyrbPoD19C1h9lfI/1mzgWYvXUTe414Gz/X1FD2XBZSZM7rRTrMA8JL3OtAaGifrIKhQ5yQ==
-  dependencies:
-    detect-libc "^2.0.3"
-  optionalDependencies:
-    lightningcss-android-arm64 "1.32.0"
-    lightningcss-darwin-arm64 "1.32.0"
-    lightningcss-darwin-x64 "1.32.0"
-    lightningcss-freebsd-x64 "1.32.0"
-    lightningcss-linux-arm-gnueabihf "1.32.0"
-    lightningcss-linux-arm64-gnu "1.32.0"
-    lightningcss-linux-arm64-musl "1.32.0"
-    lightningcss-linux-x64-gnu "1.32.0"
-    lightningcss-linux-x64-musl "1.32.0"
-    lightningcss-win32-arm64-msvc "1.32.0"
-    lightningcss-win32-x64-msvc "1.32.0"
-
-lines-and-columns@^1.1.6:
-  version "1.2.4"
-  resolved "https://registry.yarnpkg.com/lines-and-columns/-/lines-and-columns-1.2.4.tgz#eca284f75d2965079309dc0ad9255abb2ebc1632"
-  integrity sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==
-
-linkify-it@^5.0.0:
-  version "5.0.0"
-  resolved "https://registry.npmjs.org/linkify-it/-/linkify-it-5.0.0.tgz#9ef238bfa6dc70bd8e7f9572b52d369af569b421"
-  integrity sha512-5aHCbzQRADcdP+ATqnDuhhJ/MRIqDkZX5pyjFHRRysS8vZ5AbqGEoFIb6pYHPZ+L/OC2Lc+xT8uHVVR5CAK/wQ==
-  dependencies:
-    uc.micro "^2.0.0"
-
-linkifyjs@^4.3.2:
-  version "4.3.2"
-  resolved "https://registry.npmjs.org/linkifyjs/-/linkifyjs-4.3.2.tgz#d97eb45419aabf97ceb4b05a7adeb7b8c8ade2b1"
-  integrity sha512-NT1CJtq3hHIreOianA8aSXn6Cw0JzYOuDQbOrSPe7gqFnCpKP++MQe3ODgO3oh2GJFORkAAdqredOa60z63GbA==
-
-listenercount@~1.0.1:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/listenercount/-/listenercount-1.0.1.tgz#84c8a72ab59c4725321480c975e6508342e70937"
-  integrity sha512-3mk/Zag0+IJxeDrxSgaDPy4zZ3w05PRZeJNnlWhzFz5OkX49J4krc+A8X2d2M69vGMBEX0uyl8M+W+8gH+kBqQ==
-
-localforage@^1.10.0:
-  version "1.10.0"
-  resolved "https://registry.yarnpkg.com/localforage/-/localforage-1.10.0.tgz#5c465dc5f62b2807c3a84c0c6a1b1b3212781dd4"
-  integrity sha512-14/H1aX7hzBBmmh7sGPd+AOMkkIrHM3Z1PAyGgZigA1H1p5O5ANnMyWzvpAETtG68/dC4pC0ncy3+PPGzXZHPg==
-  dependencies:
-    lie "3.1.1"
-
-locate-path@^6.0.0:
-  version "6.0.0"
-  resolved "https://registry.yarnpkg.com/locate-path/-/locate-path-6.0.0.tgz#55321eb309febbc59c4801d931a72452a681d286"
-  integrity sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==
-  dependencies:
-    p-locate "^5.0.0"
-
-lodash.clamp@^4.0.0:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/lodash.clamp/-/lodash.clamp-4.0.3.tgz#5c24bedeeeef0753560dc2b4cb4671f90a6ddfaa"
-  integrity sha512-HvzRFWjtcguTW7yd8NJBshuNaCa8aqNFtnswdT7f/cMd/1YKy5Zzoq4W/Oxvnx9l7aeY258uSdDfM793+eLsVg==
-
-lodash.debounce@^4.0.0, lodash.debounce@^4.0.8:
-  version "4.0.8"
-  resolved "https://registry.yarnpkg.com/lodash.debounce/-/lodash.debounce-4.0.8.tgz#82d79bff30a67c4005ffd5e2515300ad9ca4d7af"
-  integrity sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow==
-
-lodash.defaults@^4.2.0:
-  version "4.2.0"
-  resolved "https://registry.yarnpkg.com/lodash.defaults/-/lodash.defaults-4.2.0.tgz#d09178716ffea4dde9e5fb7b37f6f0802274580c"
-  integrity sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ==
-
-lodash.difference@^4.5.0:
-  version "4.5.0"
-  resolved "https://registry.yarnpkg.com/lodash.difference/-/lodash.difference-4.5.0.tgz#9ccb4e505d486b91651345772885a2df27fd017c"
-  integrity sha512-dS2j+W26TQ7taQBGN8Lbbq04ssV3emRw4NY58WErlTO29pIqS0HmoT5aJ9+TUQ1N3G+JOZSji4eugsWwGp9yPA==
-
-lodash.escaperegexp@^4.1.2:
-  version "4.1.2"
-  resolved "https://registry.yarnpkg.com/lodash.escaperegexp/-/lodash.escaperegexp-4.1.2.tgz#64762c48618082518ac3df4ccf5d5886dae20347"
-  integrity sha512-TM9YBvyC84ZxE3rgfefxUWiQKLilstD6k7PTGt6wfbtXF8ixIJLOL3VYyV/z+ZiPLsVxAsKAFVwWlWeb2Y8Yyw==
-
-lodash.flatten@^4.4.0:
-  version "4.4.0"
-  resolved "https://registry.yarnpkg.com/lodash.flatten/-/lodash.flatten-4.4.0.tgz#f31c22225a9632d2bbf8e4addbef240aa765a61f"
-  integrity sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==
-
-lodash.groupby@^4.6.0:
-  version "4.6.0"
-  resolved "https://registry.yarnpkg.com/lodash.groupby/-/lodash.groupby-4.6.0.tgz#0b08a1dcf68397c397855c3239783832df7403d1"
-  integrity sha512-5dcWxm23+VAoz+awKmBaiBvzox8+RqMgFhi7UvX9DHZr2HdxHXM/Wrf8cfKpsW37RNrvtPn6hSwNqurSILbmJw==
-
-lodash.isboolean@^3.0.3:
-  version "3.0.3"
-  resolved "https://registry.yarnpkg.com/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz#6c2e171db2a257cd96802fd43b01b20d5f5870f6"
-  integrity sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg==
-
-lodash.isequal@^4.5.0:
-  version "4.5.0"
-  resolved "https://registry.yarnpkg.com/lodash.isequal/-/lodash.isequal-4.5.0.tgz#415c4478f2bcc30120c22ce10ed3226f7d3e18e0"
-  integrity sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ==
-
-lodash.isfunction@^3.0.9:
-  version "3.0.9"
-  resolved "https://registry.yarnpkg.com/lodash.isfunction/-/lodash.isfunction-3.0.9.tgz#06de25df4db327ac931981d1bdb067e5af68d051"
-  integrity sha512-AirXNj15uRIMMPihnkInB4i3NHeb4iBtNg9WRWuK2o31S+ePwwNmDPaTL3o7dTJ+VXNZim7rFs4rxN4YU1oUJw==
-
-lodash.isnil@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/lodash.isnil/-/lodash.isnil-4.0.0.tgz#49e28cd559013458c814c5479d3c663a21bfaa6c"
-  integrity sha512-up2Mzq3545mwVnMhTDMdfoG1OurpA/s5t88JmQX809eH3C8491iu2sfKhTfhQtKY78oPNhiaHJUpT/dUDAAtng==
-
-lodash.isplainobject@^4.0.6:
-  version "4.0.6"
-  resolved "https://registry.yarnpkg.com/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz#7c526a52d89b45c45cc690b88163be0497f550cb"
-  integrity sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA==
-
-lodash.isundefined@^3.0.1:
-  version "3.0.1"
-  resolved "https://registry.yarnpkg.com/lodash.isundefined/-/lodash.isundefined-3.0.1.tgz#23ef3d9535565203a66cefd5b830f848911afb48"
-  integrity sha512-MXB1is3s899/cD8jheYYE2V9qTHwKvt+npCwpD+1Sxm3Q3cECXCiYHjeHWXNwr6Q0SOBPrYUDxendrO6goVTEA==
-
-lodash.merge@^4.6.2:
-  version "4.6.2"
-  resolved "https://registry.yarnpkg.com/lodash.merge/-/lodash.merge-4.6.2.tgz#558aa53b43b661e1925a0afdfa36a9a1085fe57a"
-  integrity sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==
-
-lodash.union@^4.6.0:
-  version "4.6.0"
-  resolved "https://registry.yarnpkg.com/lodash.union/-/lodash.union-4.6.0.tgz#48bb5088409f16f1821666641c44dd1aaae3cd88"
-  integrity sha512-c4pB2CdGrGdjMKYLA+XiRDO7Y0PRQbm/Gzg8qMj+QH+pFVAoTp5sBpO0odL3FjoPCGjK96p6qsP+yQoiLoOBcw==
-
-lodash.uniq@^4.5.0:
-  version "4.5.0"
-  resolved "https://registry.yarnpkg.com/lodash.uniq/-/lodash.uniq-4.5.0.tgz#d0225373aeb652adc1bc82e4945339a842754773"
-  integrity sha512-xfBaXQd9ryd9dlSDvnvI0lvxfLJlYAZzXomUYzLKtUeOQvOP5piqAWuGtrhWeqaXK9hhoM/iyJc5AV+XfsX3HQ==
-
-lodash@^4.17.21, lodash@^4.17.23:
-  version "4.17.23"
-  resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.23.tgz#f113b0378386103be4f6893388c73d0bde7f2c5a"
-  integrity sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w==
-
-loose-envify@^1.1.0, loose-envify@^1.4.0:
-  version "1.4.0"
-  resolved "https://registry.yarnpkg.com/loose-envify/-/loose-envify-1.4.0.tgz#71ee51fa7be4caec1a63839f7e682d8132d30caf"
-  integrity sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==
-  dependencies:
-    js-tokens "^3.0.0 || ^4.0.0"
-
-lru-cache@^11.2.6, lru-cache@^11.2.7:
-  version "11.2.7"
-  resolved "https://registry.yarnpkg.com/lru-cache/-/lru-cache-11.2.7.tgz#9127402617f34cd6767b96daee98c28e74458d35"
-  integrity sha512-aY/R+aEsRelme17KGQa/1ZSIpLpNYYrhcrepKTZgE+W3WM16YMCaPwOHLHsmopZHELU0Ojin1lPVxKR0MihncA==
-
-lz-string@^1.5.0:
-  version "1.5.0"
-  resolved "https://registry.yarnpkg.com/lz-string/-/lz-string-1.5.0.tgz#c1ab50f77887b712621201ba9fd4e3a6ed099941"
-  integrity sha512-h5bgJWpxJNswbU7qCrV0tIKQCaS3blPDrqKWx+QxzuzL1zGUzij9XCWLrSLsJPu5t+eWA/ycetzYAO5IOMcWAQ==
-
-magic-string@^0.30.21:
-  version "0.30.21"
-  resolved "https://registry.yarnpkg.com/magic-string/-/magic-string-0.30.21.tgz#56763ec09a0fa8091df27879fd94d19078c00d91"
-  integrity sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==
-  dependencies:
-    "@jridgewell/sourcemap-codec" "^1.5.5"
-
-markdown-it-task-lists@^2.1.1:
-  version "2.1.1"
-  resolved "https://registry.npmjs.org/markdown-it-task-lists/-/markdown-it-task-lists-2.1.1.tgz#f68f4d2ac2bad5a2c373ba93081a1a6848417088"
-  integrity sha512-TxFAc76Jnhb2OUu+n3yz9RMu4CwGfaT788br6HhEDlvWfdeJcLUsxk1Hgw2yJio0OXsxv7pyIPmvECY7bMbluA==
-
-markdown-it@^14.0.0, markdown-it@^14.1.0:
-  version "14.1.1"
-  resolved "https://registry.npmjs.org/markdown-it/-/markdown-it-14.1.1.tgz#856f90b66fc39ae70affd25c1b18b581d7deee1f"
-  integrity sha512-BuU2qnTti9YKgK5N+IeMubp14ZUKUUw7yeJbkjtosvHiP0AZ5c8IAgEMk79D0eC8F23r4Ac/q8cAIFdm2FtyoA==
-  dependencies:
-    argparse "^2.0.1"
-    entities "^4.4.0"
-    linkify-it "^5.0.0"
-    mdurl "^2.0.0"
-    punycode.js "^2.3.1"
-    uc.micro "^2.1.0"
-
-markdown-to-jsx@^7.4.0:
-  version "7.7.17"
-  resolved "https://registry.yarnpkg.com/markdown-to-jsx/-/markdown-to-jsx-7.7.17.tgz#6e997d6aa4dbe2e69c423c65745541846777483c"
-  integrity sha512-7mG/1feQ0TX5I7YyMZVDgCC/y2I3CiEhIRQIhyov9nGBP5eoVrOXXHuL5ZP8GRfxVZKRiXWJgwXkb9It+nQZfQ==
-
-math-intrinsics@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/math-intrinsics/-/math-intrinsics-1.1.0.tgz#a0dd74be81e2aa5c2f27e65ce283605ee4e2b7f9"
-  integrity sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==
-
-mdn-data@2.27.1:
-  version "2.27.1"
-  resolved "https://registry.yarnpkg.com/mdn-data/-/mdn-data-2.27.1.tgz#e37b9c50880b75366c4d40ac63d9bbcacdb61f0e"
-  integrity sha512-9Yubnt3e8A0OKwxYSXyhLymGW4sCufcLG6VdiDdUGVkPhpqLxlvP5vl1983gQjJl3tqbrM731mjaZaP68AgosQ==
-
-mdurl@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.npmjs.org/mdurl/-/mdurl-2.0.0.tgz#80676ec0433025dd3e17ee983d0fe8de5a2237e0"
-  integrity sha512-Lf+9+2r+Tdp5wXDXC4PcIBjTDtq4UKjCPMQhKIuzpJNW0b96kVqSwW0bT7FhRSfmAiFYgP+SCRvdrDozfh0U5w==
-
-mimic-response@^3.1.0:
-  version "3.1.0"
-  resolved "https://registry.yarnpkg.com/mimic-response/-/mimic-response-3.1.0.tgz#2d1d59af9c1b129815accc2c46a022a5ce1fa3c9"
-  integrity sha512-z0yWI+4FDrrweS8Zmt4Ej5HdJmky15+L2e6Wgn3+iK5fWzb6T3fhNFq2+MeTRb064c6Wr4N/wv0DzQTjNzHNGQ==
-
-min-indent@^1.0.0:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.1.tgz#a63f681673b30571fbe8bc25686ae746eefa9869"
-  integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg==
-
-minimatch@^10.2.2:
-  version "10.2.4"
-  resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-10.2.4.tgz#465b3accbd0218b8281f5301e27cedc697f96fde"
-  integrity sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg==
-  dependencies:
-    brace-expansion "^5.0.2"
-
-minimatch@^3.1.1, minimatch@^3.1.2, minimatch@^3.1.5:
-  version "3.1.5"
-  resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.5.tgz#580c88f8d5445f2bd6aa8f3cadefa0de79fbd69e"
-  integrity sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==
-  dependencies:
-    brace-expansion "^1.1.7"
-
-minimatch@^5.1.0:
-  version "5.1.9"
-  resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-5.1.9.tgz#1293ef15db0098b394540e8f9f744f9fda8dee4b"
-  integrity sha512-7o1wEA2RyMP7Iu7GNba9vc0RWWGACJOCZBJX2GJWip0ikV+wcOsgVuY9uE8CPiyQhkGFSlhuSkZPavN7u1c2Fw==
-  dependencies:
-    brace-expansion "^2.0.1"
-
-minimist@^1.2.0, minimist@^1.2.3, minimist@^1.2.6:
-  version "1.2.8"
-  resolved "https://registry.yarnpkg.com/minimist/-/minimist-1.2.8.tgz#c1a464e7693302e082a075cee0c057741ac4772c"
-  integrity sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==
-
-mkdirp-classic@^0.5.2, mkdirp-classic@^0.5.3:
-  version "0.5.3"
-  resolved "https://registry.yarnpkg.com/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz#fa10c9115cc6d8865be221ba47ee9bed78601113"
-  integrity sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A==
-
-"mkdirp@>=0.5 0":
-  version "0.5.6"
-  resolved "https://registry.yarnpkg.com/mkdirp/-/mkdirp-0.5.6.tgz#7def03d2432dcae4ba1d611445c48396062255f6"
-  integrity sha512-FP+p8RB8OWpF3YZBCrP5gtADmtXApB5AMLn+vdyA+PyxCjrCs00mjyUozssO33cwDeT3wNGdLxJ5M//YqtHAJw==
-  dependencies:
-    minimist "^1.2.6"
-
-ms@^2.1.3:
-  version "2.1.3"
-  resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2"
-  integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==
-
-nanoid@^3.3.11:
-  version "3.3.11"
-  resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.11.tgz#4f4f112cefbe303202f2199838128936266d185b"
-  integrity sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==
-
-napi-build-utils@^2.0.0:
-  version "2.0.0"
-  resolved
"https://registry.yarnpkg.com/napi-build-utils/-/napi-build-utils-2.0.0.tgz#13c22c0187fcfccce1461844136372a47ddc027e" - integrity sha512-GEbrYkbfF7MoNaoh2iGG84Mnf/WZfB0GdGEsM8wz7Expx/LlWf5U8t9nvJKXSp3qr5IsEbK04cBGhol/KwOsWA== - -natural-compare@^1.4.0: - version "1.4.0" - resolved "https://registry.yarnpkg.com/natural-compare/-/natural-compare-1.4.0.tgz#4abebfeed7541f2c27acfb29bdbbd15c8d5ba4f7" - integrity sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw== - -node-abi@^3.3.0: - version "3.89.0" - resolved "https://registry.yarnpkg.com/node-abi/-/node-abi-3.89.0.tgz#eea98bf89d4534743bbbf2defa9f4f9bd3bdccfd" - integrity sha512-6u9UwL0HlAl21+agMN3YAMXcKByMqwGx+pq+P76vii5f7hTPtKDp08/H9py6DY+cfDw7kQNTGEj/rly3IgbNQA== - dependencies: - semver "^7.3.5" - -node-addon-api@^7.0.0: - version "7.1.1" - resolved "https://registry.yarnpkg.com/node-addon-api/-/node-addon-api-7.1.1.tgz#1aba6693b0f255258a049d621329329322aad558" - integrity sha512-5m3bsyrjFWE1xf7nz7YXdN4udnVtXK6/Yfgn5qnahL6bCkf2yKt4k3nuTKAtT4r3IG8JNR2ncsIMdZuAzJjHQQ== - -node-exports-info@^1.6.0: - version "1.6.0" - resolved "https://registry.yarnpkg.com/node-exports-info/-/node-exports-info-1.6.0.tgz#1aedafb01a966059c9a5e791a94a94d93f5c2a13" - integrity sha512-pyFS63ptit/P5WqUkt+UUfe+4oevH+bFeIiPPdfb0pFeYEu/1ELnJu5l+5EcTKYL5M7zaAa7S8ddywgXypqKCw== - dependencies: - array.prototype.flatmap "^1.3.3" - es-errors "^1.3.0" - object.entries "^1.1.9" - semver "^6.3.1" - -normalize-path@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/normalize-path/-/normalize-path-3.0.0.tgz#0dcd69ff23a1c9b11fd0978316644a0388216a65" - integrity sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA== - -object-assign@^4.1.1: - version "4.1.1" - resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863" - integrity 
sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg== - -object-inspect@^1.13.3, object-inspect@^1.13.4: - version "1.13.4" - resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.13.4.tgz#8375265e21bc20d0fa582c22e1b13485d6e00213" - integrity sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew== - -object-keys@^1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/object-keys/-/object-keys-1.1.1.tgz#1c47f272df277f3b1daf061677d9c82e2322c60e" - integrity sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA== - -object.assign@^4.1.4, object.assign@^4.1.7: - version "4.1.7" - resolved "https://registry.yarnpkg.com/object.assign/-/object.assign-4.1.7.tgz#8c14ca1a424c6a561b0bb2a22f66f5049a945d3d" - integrity sha512-nK28WOo+QIjBkDduTINE4JkF/UJJKyf2EJxvJKfblDpyg0Q+pkOHNTL0Qwy6NP6FhE/EnzV73BxxqcJaXY9anw== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.3" - define-properties "^1.2.1" - es-object-atoms "^1.0.0" - has-symbols "^1.1.0" - object-keys "^1.1.1" - -object.entries@^1.1.9: - version "1.1.9" - resolved "https://registry.yarnpkg.com/object.entries/-/object.entries-1.1.9.tgz#e4770a6a1444afb61bd39f984018b5bede25f8b3" - integrity sha512-8u/hfXFRBD1O0hPUjioLhoWFHRmt6tKA4/vZPyckBr18l1KE9uHrFaFaUi8MDRTpi4uak2goyPTSNJLXX2k2Hw== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.4" - define-properties "^1.2.1" - es-object-atoms "^1.1.1" - -object.fromentries@^2.0.8: - version "2.0.8" - resolved "https://registry.yarnpkg.com/object.fromentries/-/object.fromentries-2.0.8.tgz#f7195d8a9b97bd95cbc1999ea939ecd1a2b00c65" - integrity sha512-k6E21FzySsSK5a21KRADBd/NGneRegFO5pLHfdQLpRDETUNJueLXs3WCzyQ3tFRDYgbq3KHGXfTbi2bs8WQ6rQ== - dependencies: - call-bind "^1.0.7" - define-properties "^1.2.1" - es-abstract "^1.23.2" - es-object-atoms "^1.0.0" - -object.values@^1.1.6, object.values@^1.2.1: - version "1.2.1" - resolved 
"https://registry.yarnpkg.com/object.values/-/object.values-1.2.1.tgz#deed520a50809ff7f75a7cfd4bc64c7a038c6216" - integrity sha512-gXah6aZrcUxjWg2zR2MwouP2eHlCBzdV4pygudehaKXSGW4v2AsRQUK+lwwXhii6KFZcunEnmSUoYp5CXibxtA== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.3" - define-properties "^1.2.1" - es-object-atoms "^1.0.0" - -obug@^2.1.1: - version "2.1.1" - resolved "https://registry.yarnpkg.com/obug/-/obug-2.1.1.tgz#2cba74ff241beb77d63055ddf4cd1e9f90b538be" - integrity sha512-uTqF9MuPraAQ+IsnPf366RG4cP9RtUi7MLO1N3KEc+wb0a6yKpeL0lmk2IB1jY5KHPAlTc6T/JRdC/YqxHNwkQ== - -oidc-client-ts@3.5.0: - version "3.5.0" - resolved "https://registry.yarnpkg.com/oidc-client-ts/-/oidc-client-ts-3.5.0.tgz#222c145d35e8d654ea4e1dee71cc5dde5e1c91a2" - integrity sha512-l2q8l9CTCTOlbX+AnK4p3M+4CEpKpyQhle6blQkdFhm0IsBqsxm15bYaSa11G7pWdsYr6epdsRZxJpCyCRbT8A== - dependencies: - jwt-decode "^4.0.0" - -once@^1.3.0, once@^1.3.1, once@^1.4.0: - version "1.4.0" - resolved "https://registry.yarnpkg.com/once/-/once-1.4.0.tgz#583b1aa775961d4b113ac17d9c50baef9dd76bd1" - integrity sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w== - dependencies: - wrappy "1" - -optionator@^0.9.3: - version "0.9.4" - resolved "https://registry.yarnpkg.com/optionator/-/optionator-0.9.4.tgz#7ea1c1a5d91d764fb282139c88fe11e182a3a734" - integrity sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g== - dependencies: - deep-is "^0.1.3" - fast-levenshtein "^2.0.6" - levn "^0.4.1" - prelude-ls "^1.2.1" - type-check "^0.4.0" - word-wrap "^1.2.5" - -orderedmap@^2.0.0: - version "2.1.1" - resolved "https://registry.npmjs.org/orderedmap/-/orderedmap-2.1.1.tgz#61481269c44031c449915497bf5a4ad273c512d2" - integrity sha512-TvAWxi0nDe1j/rtMcWcIj94+Ffe6n7zhow33h40SKxmsmozs6dz/e+EajymfoFcHd7sxNn8yHM8839uixMOV6g== - -own-keys@^1.0.1: - version "1.0.1" - resolved 
"https://registry.yarnpkg.com/own-keys/-/own-keys-1.0.1.tgz#e4006910a2bf913585289676eebd6f390cf51358" - integrity sha512-qFOyK5PjiWZd+QQIh+1jhdb9LpxTF0qs7Pm8o5QHYZ0M3vKqSqzsZaEB6oWlxZ+q2sJBMI/Ktgd2N5ZwQoRHfg== - dependencies: - get-intrinsic "^1.2.6" - object-keys "^1.1.1" - safe-push-apply "^1.0.0" - -p-limit@^3.0.2: - version "3.1.0" - resolved "https://registry.yarnpkg.com/p-limit/-/p-limit-3.1.0.tgz#e1daccbe78d0d1388ca18c64fea38e3e57e3706b" - integrity sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ== - dependencies: - yocto-queue "^0.1.0" - -p-locate@^5.0.0: - version "5.0.0" - resolved "https://registry.yarnpkg.com/p-locate/-/p-locate-5.0.0.tgz#83c8315c6785005e3bd021839411c9e110e6d834" - integrity sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw== - dependencies: - p-limit "^3.0.2" - -pako@~1.0.2: - version "1.0.11" - resolved "https://registry.yarnpkg.com/pako/-/pako-1.0.11.tgz#6c9599d340d54dfd3946380252a35705a6b992bf" - integrity sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw== - -parent-module@^1.0.0: - version "1.0.1" - resolved "https://registry.yarnpkg.com/parent-module/-/parent-module-1.0.1.tgz#691d2709e78c79fae3a156622452d00762caaaa2" - integrity sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g== - dependencies: - callsites "^3.0.0" - -parse-json@^5.0.0: - version "5.2.0" - resolved "https://registry.yarnpkg.com/parse-json/-/parse-json-5.2.0.tgz#c76fc66dee54231c962b22bcc8a72cf2f99753cd" - integrity sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg== - dependencies: - "@babel/code-frame" "^7.0.0" - error-ex "^1.3.1" - json-parse-even-better-errors "^2.3.0" - lines-and-columns "^1.1.6" - -parse5@^8.0.0: - version "8.0.0" - resolved "https://registry.yarnpkg.com/parse5/-/parse5-8.0.0.tgz#aceb267f6b15f9b6e6ba9e35bfdd481fc2167b12" - 
integrity sha512-9m4m5GSgXjL4AjumKzq1Fgfp3Z8rsvjRNbnkVwfu2ImRqE5D0LnY2QfDen18FSY9C573YU5XxSapdHZTZ2WolA== - dependencies: - entities "^6.0.0" - -path-exists@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/path-exists/-/path-exists-4.0.0.tgz#513bdbe2d3b95d7762e8c1137efa195c6c61b5b3" - integrity sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w== - -path-is-absolute@^1.0.0: - version "1.0.1" - resolved "https://registry.yarnpkg.com/path-is-absolute/-/path-is-absolute-1.0.1.tgz#174b9268735534ffbc7ace6bf53a5a9e1b5c5f5f" - integrity sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg== - -path-key@^3.1.0: - version "3.1.1" - resolved "https://registry.yarnpkg.com/path-key/-/path-key-3.1.1.tgz#581f6ade658cbba65a0d3380de7753295054f375" - integrity sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q== - -path-parse@^1.0.7: - version "1.0.7" - resolved "https://registry.yarnpkg.com/path-parse/-/path-parse-1.0.7.tgz#fbc114b60ca42b30d9daf5858e4bd68bbedb6735" - integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw== - -path-type@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/path-type/-/path-type-4.0.0.tgz#84ed01c0a7ba380afe09d90a8c180dcd9d03043b" - integrity sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw== - -pathe@^2.0.3: - version "2.0.3" - resolved "https://registry.yarnpkg.com/pathe/-/pathe-2.0.3.tgz#3ecbec55421685b70a9da872b2cff3e1cbed1716" - integrity sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w== - -picocolors@1.1.1, picocolors@^1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/picocolors/-/picocolors-1.1.1.tgz#3d321af3eab939b083c8f929a1d12cda81c26b6b" - integrity 
sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA== - -picomatch@^4.0.3: - version "4.0.4" - resolved "https://registry.yarnpkg.com/picomatch/-/picomatch-4.0.4.tgz#fd6f5e00a143086e074dffe4c924b8fb293b0589" - integrity sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A== - -possible-typed-array-names@^1.0.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/possible-typed-array-names/-/possible-typed-array-names-1.1.0.tgz#93e3582bc0e5426586d9d07b79ee40fc841de4ae" - integrity sha512-/+5VFTchJDoVj3bhoqi6UeymcD00DAwb1nJwamzPvHEszJ4FpF6SNNbUbOS8yI56qHzdV8eK0qEfOSiodkTdxg== - -postcss@^8.4.43, postcss@^8.5.8: - version "8.5.8" - resolved "https://registry.yarnpkg.com/postcss/-/postcss-8.5.8.tgz#6230ecc8fb02e7a0f6982e53990937857e13f399" - integrity sha512-OW/rX8O/jXnm82Ey1k44pObPtdblfiuWnrd8X7GJ7emImCOstunGbXUpp7HdBrFQX6rJzn3sPT397Wp5aCwCHg== - dependencies: - nanoid "^3.3.11" - picocolors "^1.1.1" - source-map-js "^1.2.1" - -prebuild-install@^7.1.3: - version "7.1.3" - resolved "https://registry.yarnpkg.com/prebuild-install/-/prebuild-install-7.1.3.tgz#d630abad2b147443f20a212917beae68b8092eec" - integrity sha512-8Mf2cbV7x1cXPUILADGI3wuhfqWvtiLA1iclTDbFRZkgRQS0NqsPZphna9V+HyTEadheuPmjaJMsbzKQFOzLug== - dependencies: - detect-libc "^2.0.0" - expand-template "^2.0.3" - github-from-package "0.0.0" - minimist "^1.2.3" - mkdirp-classic "^0.5.3" - napi-build-utils "^2.0.0" - node-abi "^3.3.0" - pump "^3.0.0" - rc "^1.2.7" - simple-get "^4.0.0" - tar-fs "^2.0.0" - tunnel-agent "^0.6.0" - -prelude-ls@^1.2.1: - version "1.2.1" - resolved "https://registry.yarnpkg.com/prelude-ls/-/prelude-ls-1.2.1.tgz#debc6489d7a6e6b0e7611888cec880337d316396" - integrity sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g== - -prettier@^2.8.3: - version "2.8.8" - resolved 
"https://registry.yarnpkg.com/prettier/-/prettier-2.8.8.tgz#e8c5d7e98a4305ffe3de2e1fc4aca1a71c28b1da" - integrity sha512-tdN8qQGvNjw4CHbY+XXk0JgCXn9QiF21a55rBe5LJAU+kDyC4WQn4+awm2Xfk2lQMk5fKup9XgzTZtGkjBdP9Q== - -pretty-format@^27.0.2: - version "27.5.1" - resolved "https://registry.yarnpkg.com/pretty-format/-/pretty-format-27.5.1.tgz#2181879fdea51a7a5851fb39d920faa63f01d88e" - integrity sha512-Qb1gy5OrP5+zDf2Bvnzdl3jsTf1qXVMazbvCoKhtKqVs4/YK4ozX4gKQJJVyNe+cajNPn0KoC0MC3FUmaHWEmQ== - dependencies: - ansi-regex "^5.0.1" - ansi-styles "^5.0.0" - react-is "^17.0.1" - -prism-react-renderer@^1.3.5: - version "1.3.5" - resolved "https://registry.yarnpkg.com/prism-react-renderer/-/prism-react-renderer-1.3.5.tgz#786bb69aa6f73c32ba1ee813fbe17a0115435085" - integrity sha512-IJ+MSwBWKG+SM3b2SUfdrhC+gu01QkV2KmRQgREThBfSQRoufqRfxfHUxpG1WcaFjP+kojcFyO9Qqtpgt3qLCg== - -prismjs@^1.30.0: - version "1.30.0" - resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.30.0.tgz#d9709969d9d4e16403f6f348c63553b19f0975a9" - integrity sha512-DEvV2ZF2r2/63V+tK8hQvrR2ZGn10srHbXviTlcv7Kpzw8jWiNTqbVgjO3IY8RxrrOUF8VPMQQFysYYYv0YZxw== - -process-nextick-args@~2.0.0: - version "2.0.1" - resolved "https://registry.yarnpkg.com/process-nextick-args/-/process-nextick-args-2.0.1.tgz#7820d9b16120cc55ca9ae7792680ae7dba6d7fe2" - integrity sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag== - -prop-types@^15.6.2, prop-types@^15.8.1: - version "15.8.1" - resolved "https://registry.yarnpkg.com/prop-types/-/prop-types-15.8.1.tgz#67d87bf1a694f48435cf332c24af10214a3140b5" - integrity sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg== - dependencies: - loose-envify "^1.4.0" - object-assign "^4.1.1" - react-is "^16.13.1" - -prosemirror-changeset@^2.3.0: - version "2.4.0" - resolved "https://registry.npmjs.org/prosemirror-changeset/-/prosemirror-changeset-2.4.0.tgz#8d8ea0290cb9545c298ec427ac3a8f298c39170f" - integrity 
sha512-LvqH2v7Q2SF6yxatuPP2e8vSUKS/L+xAU7dPDC4RMyHMhZoGDfBC74mYuyYF4gLqOEG758wajtyhNnsTkuhvng== - dependencies: - prosemirror-transform "^1.0.0" - -prosemirror-collab@^1.3.1: - version "1.3.1" - resolved "https://registry.npmjs.org/prosemirror-collab/-/prosemirror-collab-1.3.1.tgz#0e8c91e76e009b53457eb3b3051fb68dad029a33" - integrity sha512-4SnynYR9TTYaQVXd/ieUvsVV4PDMBzrq2xPUWutHivDuOshZXqQ5rGbZM84HEaXKbLdItse7weMGOUdDVcLKEQ== - dependencies: - prosemirror-state "^1.0.0" - -prosemirror-commands@^1.0.0, prosemirror-commands@^1.6.2: - version "1.7.1" - resolved "https://registry.npmjs.org/prosemirror-commands/-/prosemirror-commands-1.7.1.tgz#d101fef85618b1be53d5b99ea17bee5600781b38" - integrity sha512-rT7qZnQtx5c0/y/KlYaGvtG411S97UaL6gdp6RIZ23DLHanMYLyfGBV5DtSnZdthQql7W+lEVbpSfwtO8T+L2w== - dependencies: - prosemirror-model "^1.0.0" - prosemirror-state "^1.0.0" - prosemirror-transform "^1.10.2" - -prosemirror-dropcursor@^1.8.1: - version "1.8.2" - resolved "https://registry.npmjs.org/prosemirror-dropcursor/-/prosemirror-dropcursor-1.8.2.tgz#2ed30c4796109ddeb1cf7282372b3850528b7228" - integrity sha512-CCk6Gyx9+Tt2sbYk5NK0nB1ukHi2ryaRgadV/LvyNuO3ena1payM2z6Cg0vO1ebK8cxbzo41ku2DE5Axj1Zuiw== - dependencies: - prosemirror-state "^1.0.0" - prosemirror-transform "^1.1.0" - prosemirror-view "^1.1.0" - -prosemirror-gapcursor@^1.3.2: - version "1.4.1" - resolved "https://registry.npmjs.org/prosemirror-gapcursor/-/prosemirror-gapcursor-1.4.1.tgz#da33c905fece147df577342c06f4929b25d365ee" - integrity sha512-pMdYaEnjNMSwl11yjEGtgTmLkR08m/Vl+Jj443167p9eB3HVQKhYCc4gmHVDsLPODfZfjr/MmirsdyZziXbQKw== - dependencies: - prosemirror-keymap "^1.0.0" - prosemirror-model "^1.0.0" - prosemirror-state "^1.0.0" - prosemirror-view "^1.0.0" - -prosemirror-history@^1.0.0, prosemirror-history@^1.4.1: - version "1.5.0" - resolved "https://registry.npmjs.org/prosemirror-history/-/prosemirror-history-1.5.0.tgz#ee21fc5de85a1473e3e3752015ffd6d649a06859" - integrity 
sha512-zlzTiH01eKA55UAf1MEjtssJeHnGxO0j4K4Dpx+gnmX9n+SHNlDqI2oO1Kv1iPN5B1dm5fsljCfqKF9nFL6HRg== - dependencies: - prosemirror-state "^1.2.2" - prosemirror-transform "^1.0.0" - prosemirror-view "^1.31.0" - rope-sequence "^1.3.0" - -prosemirror-inputrules@^1.4.0: - version "1.5.1" - resolved "https://registry.npmjs.org/prosemirror-inputrules/-/prosemirror-inputrules-1.5.1.tgz#d2e935f6086e3801486b09222638f61dae89a570" - integrity sha512-7wj4uMjKaXWAQ1CDgxNzNtR9AlsuwzHfdFH1ygEHA2KHF2DOEaXl1CJfNPAKCg9qNEh4rum975QLaCiQPyY6Fw== - dependencies: - prosemirror-state "^1.0.0" - prosemirror-transform "^1.0.0" - -prosemirror-keymap@^1.0.0, prosemirror-keymap@^1.2.2, prosemirror-keymap@^1.2.3: - version "1.2.3" - resolved "https://registry.npmjs.org/prosemirror-keymap/-/prosemirror-keymap-1.2.3.tgz#c0f6ab95f75c0b82c97e44eb6aaf29cbfc150472" - integrity sha512-4HucRlpiLd1IPQQXNqeo81BGtkY8Ai5smHhKW9jjPKRc2wQIxksg7Hl1tTI2IfT2B/LgX6bfYvXxEpJl7aKYKw== - dependencies: - prosemirror-state "^1.0.0" - w3c-keyname "^2.2.0" - -prosemirror-markdown@^1.11.1, prosemirror-markdown@^1.13.1: - version "1.13.4" - resolved "https://registry.npmjs.org/prosemirror-markdown/-/prosemirror-markdown-1.13.4.tgz#4620e6a0580cd52b5fc8e352c7e04830cd4b3048" - integrity sha512-D98dm4cQ3Hs6EmjK500TdAOew4Z03EV71ajEFiWra3Upr7diytJsjF4mPV2dW+eK5uNectiRj0xFxYI9NLXDbw== - dependencies: - "@types/markdown-it" "^14.0.0" - markdown-it "^14.0.0" - prosemirror-model "^1.25.0" - -prosemirror-menu@^1.2.4: - version "1.3.0" - resolved "https://registry.npmjs.org/prosemirror-menu/-/prosemirror-menu-1.3.0.tgz#f51e25259b91d7c35ad7b65fc0c92d838404e177" - integrity sha512-TImyPXCHPcDsSka2/lwJ6WjTASr4re/qWq1yoTTuLOqfXucwF6VcRa2LWCkM/EyTD1UO3CUwiH8qURJoWJRxwg== - dependencies: - crelt "^1.0.0" - prosemirror-commands "^1.0.0" - prosemirror-history "^1.0.0" - prosemirror-state "^1.0.0" - -prosemirror-model@^1.0.0, prosemirror-model@^1.20.0, prosemirror-model@^1.21.0, prosemirror-model@^1.24.1, prosemirror-model@^1.25.0, 
prosemirror-model@^1.25.4: - version "1.25.4" - resolved "https://registry.npmjs.org/prosemirror-model/-/prosemirror-model-1.25.4.tgz#8ebfbe29ecbee9e5e2e4048c4fe8e363fcd56e7c" - integrity sha512-PIM7E43PBxKce8OQeezAs9j4TP+5yDpZVbuurd1h5phUxEKIu+G2a+EUZzIC5nS1mJktDJWzbqS23n1tsAf5QA== - dependencies: - orderedmap "^2.0.0" - -prosemirror-schema-basic@^1.2.3: - version "1.2.4" - resolved "https://registry.npmjs.org/prosemirror-schema-basic/-/prosemirror-schema-basic-1.2.4.tgz#389ce1ec09b8a30ea9bbb92c58569cb690c2d695" - integrity sha512-ELxP4TlX3yr2v5rM7Sb70SqStq5NvI15c0j9j/gjsrO5vaw+fnnpovCLEGIcpeGfifkuqJwl4fon6b+KdrODYQ== - dependencies: - prosemirror-model "^1.25.0" - -prosemirror-schema-list@^1.5.0: - version "1.5.1" - resolved "https://registry.npmjs.org/prosemirror-schema-list/-/prosemirror-schema-list-1.5.1.tgz#5869c8f749e8745c394548bb11820b0feb1e32f5" - integrity sha512-927lFx/uwyQaGwJxLWCZRkjXG0p48KpMj6ueoYiu4JX05GGuGcgzAy62dfiV8eFZftgyBUvLx76RsMe20fJl+Q== - dependencies: - prosemirror-model "^1.0.0" - prosemirror-state "^1.0.0" - prosemirror-transform "^1.7.3" - -prosemirror-state@^1.0.0, prosemirror-state@^1.2.2, prosemirror-state@^1.4.3, prosemirror-state@^1.4.4: - version "1.4.4" - resolved "https://registry.npmjs.org/prosemirror-state/-/prosemirror-state-1.4.4.tgz#72b5e926f9e92dcee12b62a05fcc8a2de3bf5b39" - integrity sha512-6jiYHH2CIGbCfnxdHbXZ12gySFY/fz/ulZE333G6bPqIZ4F+TXo9ifiR86nAHpWnfoNjOb3o5ESi7J8Uz1jXHw== - dependencies: - prosemirror-model "^1.0.0" - prosemirror-transform "^1.0.0" - prosemirror-view "^1.27.0" - -prosemirror-tables@^1.6.4: - version "1.8.5" - resolved "https://registry.npmjs.org/prosemirror-tables/-/prosemirror-tables-1.8.5.tgz#104427012e5a5da1d2a38c122efee8d66bdd5104" - integrity sha512-V/0cDCsHKHe/tfWkeCmthNUcEp1IVO3p6vwN8XtwE9PZQLAZJigbw3QoraAdfJPir4NKJtNvOB8oYGKRl+t0Dw== - dependencies: - prosemirror-keymap "^1.2.3" - prosemirror-model "^1.25.4" - prosemirror-state "^1.4.4" - prosemirror-transform "^1.10.5" - prosemirror-view 
"^1.41.4" - -prosemirror-trailing-node@^3.0.0: - version "3.0.0" - resolved "https://registry.npmjs.org/prosemirror-trailing-node/-/prosemirror-trailing-node-3.0.0.tgz#5bc223d4fc1e8d9145e4079ec77a932b54e19e04" - integrity sha512-xiun5/3q0w5eRnGYfNlW1uU9W6x5MoFKWwq/0TIRgt09lv7Hcser2QYV8t4muXbEr+Fwo0geYn79Xs4GKywrRQ== - dependencies: - "@remirror/core-constants" "3.0.0" - escape-string-regexp "^4.0.0" - -prosemirror-transform@^1.0.0, prosemirror-transform@^1.1.0, prosemirror-transform@^1.10.2, prosemirror-transform@^1.10.5, prosemirror-transform@^1.7.3: - version "1.12.0" - resolved "https://registry.npmjs.org/prosemirror-transform/-/prosemirror-transform-1.12.0.tgz#0239288d0e98d91e6af3dd269a8968466be406d7" - integrity sha512-GxboyN4AMIsoHNtz5uf2r2Ru551i5hWeCMD6E2Ib4Eogqoub0NflniaBPVQ4MrGE5yZ8JV9tUHg9qcZTTrcN4w== - dependencies: - prosemirror-model "^1.21.0" - -prosemirror-view@^1.0.0, prosemirror-view@^1.1.0, prosemirror-view@^1.27.0, prosemirror-view@^1.31.0, prosemirror-view@^1.38.1, prosemirror-view@^1.41.4: - version "1.41.8" - resolved "https://registry.npmjs.org/prosemirror-view/-/prosemirror-view-1.41.8.tgz#bfb48d9dc328f1aa2a0eea1600b0828818be03f1" - integrity sha512-TnKDdohEatgyZNGCDWIdccOHXhYloJwbwU+phw/a23KBvJIR9lWQWW7WHHK3vBdOLDNuF7TaX98GObUZOWkOnA== - dependencies: - prosemirror-model "^1.20.0" - prosemirror-state "^1.0.0" - prosemirror-transform "^1.1.0" - -pump@^3.0.0: - version "3.0.4" - resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.4.tgz#1f313430527fa8b905622ebd22fe1444e757ab3c" - integrity sha512-VS7sjc6KR7e1ukRFhQSY5LM2uBWAUPiOPa/A3mkKmiMwSmRFUITt0xuj+/lesgnCv+dPIEYlkzrcyXgquIHMcA== - dependencies: - end-of-stream "^1.1.0" - once "^1.3.1" - -punycode.js@^2.3.1: - version "2.3.1" - resolved "https://registry.npmjs.org/punycode.js/-/punycode.js-2.3.1.tgz#6b53e56ad75588234e79f4affa90972c7dd8cdb7" - integrity sha512-uxFIHU0YlHYhDQtV4R9J6a52SLx28BCjT+4ieh7IGbgwVJWO+km431c4yRlREUAsAmt/uMjQUyQHNEPf0M39CA== - -punycode@^2.1.0, punycode@^2.3.1: - 
version "2.3.1" - resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.3.1.tgz#027422e2faec0b25e1549c3e1bd8309b9133b6e5" - integrity sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg== - -rc@^1.2.7: - version "1.2.8" - resolved "https://registry.yarnpkg.com/rc/-/rc-1.2.8.tgz#cd924bf5200a075b83c188cd6b9e211b7fc0d3ed" - integrity sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw== - dependencies: - deep-extend "^0.6.0" - ini "~1.3.0" - minimist "^1.2.0" - strip-json-comments "~2.0.1" - -react-animate-height@^3.0.4: - version "3.2.3" - resolved "https://registry.yarnpkg.com/react-animate-height/-/react-animate-height-3.2.3.tgz#90929aadac1bd1851cb6a685acc105b50ccfda8c" - integrity sha512-R6DSvr7ud07oeCixScyvXWEMJY/Mt2+GyOWC1KMaRc69gOBw+SsCg4TJmrp4rKUM1hyd6p+YKw90brjPH93Y2A== - -react-animate-on-change@^2.2.0: - version "2.2.0" - resolved "https://registry.yarnpkg.com/react-animate-on-change/-/react-animate-on-change-2.2.0.tgz#862d3d5a66d09d6b5c32308acdcd587c1bee47a4" - integrity sha512-cM0YHbsxIh8fshX/U24+pk4nDG7Ike9NsEy21reqJPqVt6xRA+6oYkaQHEggINKjYEMbztwK40Ro0/EHZ5naVQ== - -react-dnd-html5-backend@^16.0.1: - version "16.0.1" - resolved "https://registry.yarnpkg.com/react-dnd-html5-backend/-/react-dnd-html5-backend-16.0.1.tgz#87faef15845d512a23b3c08d29ecfd34871688b6" - integrity sha512-Wu3dw5aDJmOGw8WjH1I1/yTH+vlXEL4vmjk5p+MHxP8HuHJS1lAGeIdG/hze1AvNeXWo/JgULV87LyQOr+r5jw== - dependencies: - dnd-core "^16.0.1" - -react-dnd@^16.0.1: - version "16.0.1" - resolved "https://registry.yarnpkg.com/react-dnd/-/react-dnd-16.0.1.tgz#2442a3ec67892c60d40a1559eef45498ba26fa37" - integrity sha512-QeoM/i73HHu2XF9aKksIUuamHPDvRglEwdHL4jsp784BgUuWcg6mzfxT0QDdQz8Wj0qyRKx2eMg8iZtWvU4E2Q== - dependencies: - "@react-dnd/invariant" "^4.0.1" - "@react-dnd/shallowequal" "^4.0.1" - dnd-core "^16.0.1" - fast-deep-equal "^3.1.3" - hoist-non-react-statics "^3.3.2" - -react-dom@^18.2.0: - version 
"18.3.1" - resolved "https://registry.yarnpkg.com/react-dom/-/react-dom-18.3.1.tgz#c2265d79511b57d479b3dd3fdfa51536494c5cb4" - integrity sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw== - dependencies: - loose-envify "^1.1.0" - scheduler "^0.23.2" - -react-i18next@^16.5.4: - version "16.6.6" - resolved "https://registry.yarnpkg.com/react-i18next/-/react-i18next-16.6.6.tgz#046bbf651ab4837ba505a5995189c1d42b8956c7" - integrity sha512-ZgL2HUoW34UKUkOV7uSQFE1CDnRPD+tCR3ywSuWH7u2iapnz86U8Bi3Vrs620qNDzCf1F47NxglCEkchCTDOHw== - dependencies: - "@babel/runtime" "^7.29.2" - html-parse-stringify "^3.0.1" - use-sync-external-store "^1.6.0" - -react-is@^16.13.1, react-is@^16.7.0: - version "16.13.1" - resolved "https://registry.yarnpkg.com/react-is/-/react-is-16.13.1.tgz#789729a4dc36de2999dc156dd6c1d9c18cea56a4" - integrity sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ== - -react-is@^17.0.1: - version "17.0.2" - resolved "https://registry.yarnpkg.com/react-is/-/react-is-17.0.2.tgz#e691d4a8e9c789365655539ab372762b0efb54f0" - integrity sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w== - -react-is@^18.0.0: - version "18.3.1" - resolved "https://registry.yarnpkg.com/react-is/-/react-is-18.3.1.tgz#e83557dc12eae63a99e003a46388b1dcbb44db7e" - integrity sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg== - -react-is@^19.2.3: - version "19.2.4" - resolved "https://registry.yarnpkg.com/react-is/-/react-is-19.2.4.tgz#a080758243c572ccd4a63386537654298c99d135" - integrity sha512-W+EWGn2v0ApPKgKKCy/7s7WHXkboGcsrXE+2joLyVxkbyVQfO3MUEaUQDHoSmb8TFFrSKYa9mw64WZHNHSDzYA== - -react-katex@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/react-katex/-/react-katex-3.1.0.tgz#825559010b599f9a6ba8c035982771fe2e2a3347" - integrity 
sha512-At9uLOkC75gwn2N+ZXc5HD8TlATsB+3Hkp9OGs6uA8tM3dwZ3Wljn74Bk3JyHFPgSnesY/EMrIAB1WJwqZqejA== - dependencies: - katex "^0.16.0" - -react-redux@^8.0.4: - version "8.1.3" - resolved "https://registry.yarnpkg.com/react-redux/-/react-redux-8.1.3.tgz#4fdc0462d0acb59af29a13c27ffef6f49ab4df46" - integrity sha512-n0ZrutD7DaX/j9VscF+uTALI3oUPa/pO4Z3soOBIjuRn/FzVu6aehhysxZCLi6y7duMf52WNZGMl7CtuK5EnRw== - dependencies: - "@babel/runtime" "^7.12.1" - "@types/hoist-non-react-statics" "^3.3.1" - "@types/use-sync-external-store" "^0.0.3" - hoist-non-react-statics "^3.3.2" - react-is "^18.0.0" - use-sync-external-store "^1.0.0" - -react-router-dom@^6.22.0: - version "6.30.3" - resolved "https://registry.yarnpkg.com/react-router-dom/-/react-router-dom-6.30.3.tgz#42ae6dc4c7158bfb0b935f162b9621b29dddf740" - integrity sha512-pxPcv1AczD4vso7G4Z3TKcvlxK7g7TNt3/FNGMhfqyntocvYKj+GCatfigGDjbLozC4baguJ0ReCigoDJXb0ag== - dependencies: - "@remix-run/router" "1.23.2" - react-router "6.30.3" - -react-router@6.30.3: - version "6.30.3" - resolved "https://registry.yarnpkg.com/react-router/-/react-router-6.30.3.tgz#994b3ccdbe0e81fe84d4f998100f62584dfbf1cf" - integrity sha512-XRnlbKMTmktBkjCLE8/XcZFlnHvr2Ltdr1eJX4idL55/9BbORzyZEaIkBFDhFGCEWBBItsVrDxwx3gnisMitdw== - dependencies: - "@remix-run/router" "1.23.2" - -react-selectable-fast@^3.4.0: - version "3.4.0" - resolved "https://registry.yarnpkg.com/react-selectable-fast/-/react-selectable-fast-3.4.0.tgz#fb3e6490ebd3f91309b5a58ca17f629baab6e625" - integrity sha512-4DVrX6eTCLqt+GVtSNAEcL3S9ODUvtcPrzUL1ObjSL507D+i+HE4tCokSxUn4PqLtEsrWxXJU+CVC43XmwIVyw== - -react-simple-code-editor@^0.13.1: - version "0.13.1" - resolved "https://registry.yarnpkg.com/react-simple-code-editor/-/react-simple-code-editor-0.13.1.tgz#4514553fa132dcaffec33a6612c58f1613c52416" - integrity sha512-XYeVwRZwgyKtjNIYcAEgg2FaQcCZwhbarnkJIV20U2wkCU9q/CPFBo8nRXrK4GXUz3AvbqZFsZRrpUTkqqEYyQ== - -react-transition-group@^4.4.5: - version "4.4.5" - resolved 
"https://registry.yarnpkg.com/react-transition-group/-/react-transition-group-4.4.5.tgz#e53d4e3f3344da8521489fbef8f2581d42becdd1" - integrity sha512-pZcd1MCJoiKiBR2NRxeCRg13uCXbydPnmB4EOeRrY7480qNWO8IIgQG6zlDkm6uRMsURXPuKq0GWtiM59a5Q6g== - dependencies: - "@babel/runtime" "^7.5.5" - dom-helpers "^5.0.1" - loose-envify "^1.4.0" - prop-types "^15.6.2" - -react-vega@^7.6.0: - version "7.7.1" - resolved "https://registry.yarnpkg.com/react-vega/-/react-vega-7.7.1.tgz#80f2a41a50c0b225f65503444040f77e29dda70b" - integrity sha512-Dj7n1LkfJEkY/FdwQfOZqIQ+wGUcJNwlTuWhYcuQtbBpTgvtI4wwqOvJ0QWBE19nXMU7t9HmP8sqQO5v6soOlg== - dependencies: - "@types/react" "*" - fast-deep-equal "^3.1.1" - prop-types "^15.8.1" - vega-embed "6.5.1" - -react-virtuoso@^4.3.10: - version "4.18.3" - resolved "https://registry.yarnpkg.com/react-virtuoso/-/react-virtuoso-4.18.3.tgz#12e69600c258bc6e6bd31c2516942ef08700deac" - integrity sha512-fLz/peHAx4Eu0DLHurFEEI7Y6n5CqEoxBh04rgJM9yMuOJah2a9zWg/MUOmZLcp7zuWYorXq5+5bf3IRgkNvWg== - -react@^18.2.0: - version "18.3.1" - resolved "https://registry.yarnpkg.com/react/-/react-18.3.1.tgz#49ab892009c53933625bd16b2533fc754cab2891" - integrity sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ== - dependencies: - loose-envify "^1.1.0" - -readable-stream@^2.0.0, readable-stream@^2.0.2, readable-stream@^2.0.5, readable-stream@~2.3.6: - version "2.3.8" - resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-2.3.8.tgz#91125e8042bba1b9887f49345f6277027ce8be9b" - integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== - dependencies: - core-util-is "~1.0.0" - inherits "~2.0.3" - isarray "~1.0.0" - process-nextick-args "~2.0.0" - safe-buffer "~5.1.1" - string_decoder "~1.1.1" - util-deprecate "~1.0.1" - -readable-stream@^3.1.1, readable-stream@^3.4.0, readable-stream@^3.6.0: - version "3.6.2" - resolved 
"https://registry.yarnpkg.com/readable-stream/-/readable-stream-3.6.2.tgz#56a9b36ea965c00c5a93ef31eb111a0f11056967" - integrity sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA== - dependencies: - inherits "^2.0.3" - string_decoder "^1.1.1" - util-deprecate "^1.0.1" - -readdir-glob@^1.1.2: - version "1.1.3" - resolved "https://registry.yarnpkg.com/readdir-glob/-/readdir-glob-1.1.3.tgz#c3d831f51f5e7bfa62fa2ffbe4b508c640f09584" - integrity sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA== - dependencies: - minimatch "^5.1.0" - -readdirp@^4.0.1: - version "4.1.2" - resolved "https://registry.yarnpkg.com/readdirp/-/readdirp-4.1.2.tgz#eb85801435fbf2a7ee58f19e0921b068fc69948d" - integrity sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg== - -redent@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/redent/-/redent-3.0.0.tgz#e557b7998316bb53c9f1f56fa626352c6963059f" - integrity sha512-6tDA8g98We0zd0GvVeMT9arEOnTw9qM03L9cJXaCjrip1OO764RDBLBfrB4cwzNGDj5OA5ioymC9GkizgWJDUg== - dependencies: - indent-string "^4.0.0" - strip-indent "^3.0.0" - -redux-persist@^6.0.0: - version "6.0.0" - resolved "https://registry.yarnpkg.com/redux-persist/-/redux-persist-6.0.0.tgz#b4d2972f9859597c130d40d4b146fecdab51b3a8" - integrity sha512-71LLMbUq2r02ng2We9S215LtPu3fY0KgaGE0k8WRgl6RkqxtGfl7HUozz1Dftwsb0D/5mZ8dwAaPbtnzfvbEwQ== - -redux-thunk@^2.4.2: - version "2.4.2" - resolved "https://registry.yarnpkg.com/redux-thunk/-/redux-thunk-2.4.2.tgz#b9d05d11994b99f7a91ea223e8b04cf0afa5ef3b" - integrity sha512-+P3TjtnP0k/FEjcBL5FZpoovtvrTNT/UXd4/sluaSyrURlSlhLSzEdfsTBW7WsKB6yPvgd7q/iZPICFjW4o57Q== - -redux@^4.2.0, redux@^4.2.1: - version "4.2.1" - resolved "https://registry.yarnpkg.com/redux/-/redux-4.2.1.tgz#c08f4306826c49b5e9dc901dee0452ea8fce6197" - integrity 
sha512-LAUYz4lc+Do8/g7aeRa8JkyDErK6ekstQaqWQrNRW//MY1TvCEpMtpTWvlQ+FPbWCx+Xixu/6SHt5N0HR+SB4w== - dependencies: - "@babel/runtime" "^7.9.2" - -reflect.getprototypeof@^1.0.6, reflect.getprototypeof@^1.0.9: - version "1.0.10" - resolved "https://registry.yarnpkg.com/reflect.getprototypeof/-/reflect.getprototypeof-1.0.10.tgz#c629219e78a3316d8b604c765ef68996964e7bf9" - integrity sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw== - dependencies: - call-bind "^1.0.8" - define-properties "^1.2.1" - es-abstract "^1.23.9" - es-errors "^1.3.0" - es-object-atoms "^1.0.0" - get-intrinsic "^1.2.7" - get-proto "^1.0.1" - which-builtin-type "^1.2.1" - -regexp.prototype.flags@^1.5.3, regexp.prototype.flags@^1.5.4: - version "1.5.4" - resolved "https://registry.yarnpkg.com/regexp.prototype.flags/-/regexp.prototype.flags-1.5.4.tgz#1ad6c62d44a259007e55b3970e00f746efbcaa19" - integrity sha512-dYqgNSZbDwkaJ2ceRd9ojCGjBq+mOm9LmtXnAnEGyHhN/5R7iDW2TRw3h+o/jCFxus3P2LfWIIiwowAjANm7IA== - dependencies: - call-bind "^1.0.8" - define-properties "^1.2.1" - es-errors "^1.3.0" - get-proto "^1.0.1" - gopd "^1.2.0" - set-function-name "^2.0.2" - -require-from-string@^2.0.2: - version "2.0.2" - resolved "https://registry.yarnpkg.com/require-from-string/-/require-from-string-2.0.2.tgz#89a7fdd938261267318eafe14f9c32e598c36909" - integrity sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw== - -reselect@^4.1.8: - version "4.1.8" - resolved "https://registry.yarnpkg.com/reselect/-/reselect-4.1.8.tgz#3f5dc671ea168dccdeb3e141236f69f02eaec524" - integrity sha512-ab9EmR80F/zQTMNeneUr4cv+jSwPJgIlvEmVwLerwrWVbpLlBuls9XHzIeTFy4cegU2NHBp3va0LKOzU5qFEYQ== - -resolve-from@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/resolve-from/-/resolve-from-4.0.0.tgz#4abcd852ad32dd7baabfe9b40e00a36db5f392e6" - integrity sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g== - 
-resolve@^1.19.0: - version "1.22.11" - resolved "https://registry.yarnpkg.com/resolve/-/resolve-1.22.11.tgz#aad857ce1ffb8bfa9b0b1ac29f1156383f68c262" - integrity sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ== - dependencies: - is-core-module "^2.16.1" - path-parse "^1.0.7" - supports-preserve-symlinks-flag "^1.0.0" - -resolve@^2.0.0-next.5: - version "2.0.0-next.6" - resolved "https://registry.yarnpkg.com/resolve/-/resolve-2.0.0-next.6.tgz#b3961812be69ace7b3bc35d5bf259434681294af" - integrity sha512-3JmVl5hMGtJ3kMmB3zi3DL25KfkCEyy3Tw7Gmw7z5w8M9WlwoPFnIvwChzu1+cF3iaK3sp18hhPz8ANeimdJfA== - dependencies: - es-errors "^1.3.0" - is-core-module "^2.16.1" - node-exports-info "^1.6.0" - object-keys "^1.1.1" - path-parse "^1.0.7" - supports-preserve-symlinks-flag "^1.0.0" - -rimraf@2: - version "2.7.1" - resolved "https://registry.yarnpkg.com/rimraf/-/rimraf-2.7.1.tgz#35797f13a7fdadc566142c29d4f07ccad483e3ec" - integrity sha512-uWjbaKIK3T1OSVptzX7Nl6PvQ3qAGtKEtVRjRuazjfL3Bx5eI409VZSqgND+4UNnmzLVdPj9FqFJNPqBZFve4w== - dependencies: - glob "^7.1.3" - -robust-predicates@^3.0.2: - version "3.0.3" - resolved "https://registry.yarnpkg.com/robust-predicates/-/robust-predicates-3.0.3.tgz#1099061b3349e2c5abec6c2ab0acd440d24d4062" - integrity sha512-NS3levdsRIUOmiJ8FZWCP7LG3QpJyrs/TE0Zpf1yvZu8cAJJ6QMW92H1c7kWpdIHo8RvmLxN/o2JXTKHp74lUA== - -rolldown@1.0.0-rc.11: - version "1.0.0-rc.11" - resolved "https://registry.yarnpkg.com/rolldown/-/rolldown-1.0.0-rc.11.tgz#6eaf091b1bbb5ed92e5302171a3d59f0d026d9c0" - integrity sha512-NRjoKMusSjfRbSYiH3VSumlkgFe7kYAa3pzVOsVYVFY3zb5d7nS+a3KGQ7hJKXuYWbzJKPVQ9Wxq2UvyK+ENpw== - dependencies: - "@oxc-project/types" "=0.122.0" - "@rolldown/pluginutils" "1.0.0-rc.11" - optionalDependencies: - "@rolldown/binding-android-arm64" "1.0.0-rc.11" - "@rolldown/binding-darwin-arm64" "1.0.0-rc.11" - "@rolldown/binding-darwin-x64" "1.0.0-rc.11" - "@rolldown/binding-freebsd-x64" "1.0.0-rc.11" - 
"@rolldown/binding-linux-arm-gnueabihf" "1.0.0-rc.11" - "@rolldown/binding-linux-arm64-gnu" "1.0.0-rc.11" - "@rolldown/binding-linux-arm64-musl" "1.0.0-rc.11" - "@rolldown/binding-linux-ppc64-gnu" "1.0.0-rc.11" - "@rolldown/binding-linux-s390x-gnu" "1.0.0-rc.11" - "@rolldown/binding-linux-x64-gnu" "1.0.0-rc.11" - "@rolldown/binding-linux-x64-musl" "1.0.0-rc.11" - "@rolldown/binding-openharmony-arm64" "1.0.0-rc.11" - "@rolldown/binding-wasm32-wasi" "1.0.0-rc.11" - "@rolldown/binding-win32-arm64-msvc" "1.0.0-rc.11" - "@rolldown/binding-win32-x64-msvc" "1.0.0-rc.11" - -rollup@^4.20.0: - version "4.60.0" - resolved "https://registry.yarnpkg.com/rollup/-/rollup-4.60.0.tgz#d7d68c8cda873e96e08b2443505609b7e7be9eb8" - integrity sha512-yqjxruMGBQJ2gG4HtjZtAfXArHomazDHoFwFFmZZl0r7Pdo7qCIXKqKHZc8yeoMgzJJ+pO6pEEHa+V7uzWlrAQ== - dependencies: - "@types/estree" "1.0.8" - optionalDependencies: - "@rollup/rollup-android-arm-eabi" "4.60.0" - "@rollup/rollup-android-arm64" "4.60.0" - "@rollup/rollup-darwin-arm64" "4.60.0" - "@rollup/rollup-darwin-x64" "4.60.0" - "@rollup/rollup-freebsd-arm64" "4.60.0" - "@rollup/rollup-freebsd-x64" "4.60.0" - "@rollup/rollup-linux-arm-gnueabihf" "4.60.0" - "@rollup/rollup-linux-arm-musleabihf" "4.60.0" - "@rollup/rollup-linux-arm64-gnu" "4.60.0" - "@rollup/rollup-linux-arm64-musl" "4.60.0" - "@rollup/rollup-linux-loong64-gnu" "4.60.0" - "@rollup/rollup-linux-loong64-musl" "4.60.0" - "@rollup/rollup-linux-ppc64-gnu" "4.60.0" - "@rollup/rollup-linux-ppc64-musl" "4.60.0" - "@rollup/rollup-linux-riscv64-gnu" "4.60.0" - "@rollup/rollup-linux-riscv64-musl" "4.60.0" - "@rollup/rollup-linux-s390x-gnu" "4.60.0" - "@rollup/rollup-linux-x64-gnu" "4.60.0" - "@rollup/rollup-linux-x64-musl" "4.60.0" - "@rollup/rollup-openbsd-x64" "4.60.0" - "@rollup/rollup-openharmony-arm64" "4.60.0" - "@rollup/rollup-win32-arm64-msvc" "4.60.0" - "@rollup/rollup-win32-ia32-msvc" "4.60.0" - "@rollup/rollup-win32-x64-gnu" "4.60.0" - "@rollup/rollup-win32-x64-msvc" "4.60.0" - 
fsevents "~2.3.2" - -rope-sequence@^1.3.0: - version "1.3.4" - resolved "https://registry.npmjs.org/rope-sequence/-/rope-sequence-1.3.4.tgz#df85711aaecd32f1e756f76e43a415171235d425" - integrity sha512-UT5EDe2cu2E/6O4igUr5PSFs23nvvukicWHx6GnOPlHAiiYbzNuCRQCuiUdHJQcqKalLKlrYJnjY0ySGsXNQXQ== - -rw@1: - version "1.3.3" - resolved "https://registry.yarnpkg.com/rw/-/rw-1.3.3.tgz#3f862dfa91ab766b14885ef4d01124bfda074fb4" - integrity sha512-PdhdWy89SiZogBLaw42zdeqtRJ//zFd2PgQavcICDUgJT5oW10QCRKbJ6bg4r0/UY2M6BWd5tkxuGFRvCkgfHQ== - -rybitten@^0.22.0: - version "0.22.0" - resolved "https://registry.yarnpkg.com/rybitten/-/rybitten-0.22.0.tgz#007a0b0b4487a29ad8257d9291c5f4fc085ebc9a" - integrity sha512-w9aWDjaIo3YWLTBFiPDLzWWbdiKDkghLKzCeXDsXTqs64Ai0Dw8mHn9d/nnMvnds93GVpRwqjVM5VH+SDJsIsQ== - -safe-array-concat@^1.1.3: - version "1.1.3" - resolved "https://registry.yarnpkg.com/safe-array-concat/-/safe-array-concat-1.1.3.tgz#c9e54ec4f603b0bbb8e7e5007a5ee7aecd1538c3" - integrity sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.2" - get-intrinsic "^1.2.6" - has-symbols "^1.1.0" - isarray "^2.0.5" - -safe-buffer@^5.0.1, safe-buffer@~5.2.0: - version "5.2.1" - resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.2.1.tgz#1eaf9fa9bdb1fdd4ec75f58f9cdb4e6b7827eec6" - integrity sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== - -safe-buffer@~5.1.0, safe-buffer@~5.1.1: - version "5.1.2" - resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.1.2.tgz#991ec69d296e0313747d59bdfd2b745c35f8828d" - integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g== - -safe-push-apply@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/safe-push-apply/-/safe-push-apply-1.0.0.tgz#01850e981c1602d398c85081f360e4e6d03d27f5" - integrity 
sha512-iKE9w/Z7xCzUMIZqdBsp6pEQvwuEebH4vdpjcDWnyzaI6yl6O9FHvVpmGelvEHNsoY6wGblkxR6Zty/h00WiSA== - dependencies: - es-errors "^1.3.0" - isarray "^2.0.5" - -safe-regex-test@^1.0.3, safe-regex-test@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/safe-regex-test/-/safe-regex-test-1.1.0.tgz#7f87dfb67a3150782eaaf18583ff5d1711ac10c1" - integrity sha512-x/+Cz4YrimQxQccJf5mKEbIa1NzeCRNI5Ecl/ekmlYaampdNLPalVyIcCZNNH3MvmqBugV5TMYZXv0ljslUlaw== - dependencies: - call-bound "^1.0.2" - es-errors "^1.3.0" - is-regex "^1.2.1" - -"safer-buffer@>= 2.1.2 < 3.0.0": - version "2.1.2" - resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a" - integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg== - -sass@^1.77.6: - version "1.98.0" - resolved "https://registry.yarnpkg.com/sass/-/sass-1.98.0.tgz#924ce85a3745ccaccd976262fdc1bc0c13aa8e57" - integrity sha512-+4N/u9dZ4PrgzGgPlKnaaRQx64RO0JBKs9sDhQ2pLgN6JQZ25uPQZKQYaBJU48Kd5BxgXoJ4e09Dq7nMcOUW3A== - dependencies: - chokidar "^4.0.0" - immutable "^5.1.5" - source-map-js ">=0.6.2 <2.0.0" - optionalDependencies: - "@parcel/watcher" "^2.4.1" - -saxes@^5.0.1: - version "5.0.1" - resolved "https://registry.yarnpkg.com/saxes/-/saxes-5.0.1.tgz#eebab953fa3b7608dbe94e5dadb15c888fa6696d" - integrity sha512-5LBh1Tls8c9xgGjw3QrMwETmTMVk0oFgvrFSvWx62llR2hcEInrKNZ2GZCCuuy2lvWrdl5jhbpeqc5hRYKFOcw== - dependencies: - xmlchars "^2.2.0" - -saxes@^6.0.0: - version "6.0.0" - resolved "https://registry.yarnpkg.com/saxes/-/saxes-6.0.0.tgz#fe5b4a4768df4f14a201b1ba6a65c1f3d9988cc5" - integrity sha512-xAg7SOnEhrm5zI3puOOKyy1OMcMlIJZYNJY7xLBwSze0UjhPLnWfj2GF2EpT0jmzaJKIWKHLsaSSajf35bcYnA== - dependencies: - xmlchars "^2.2.0" - -scheduler@^0.23.2: - version "0.23.2" - resolved "https://registry.yarnpkg.com/scheduler/-/scheduler-0.23.2.tgz#414ba64a3b282892e944cf2108ecc078d115cdc3" - integrity 
sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ== - dependencies: - loose-envify "^1.1.0" - -semver@^6.3.1: - version "6.3.1" - resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.1.tgz#556d2ef8689146e46dcea4bfdd095f3434dffcb4" - integrity sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA== - -semver@^7.1.3, semver@^7.3.5, semver@^7.6.3, semver@^7.7.3: - version "7.7.4" - resolved "https://registry.yarnpkg.com/semver/-/semver-7.7.4.tgz#28464e36060e991fa7a11d0279d2d3f3b57a7e8a" - integrity sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA== - -seroval-plugins@~1.5.0: - version "1.5.1" - resolved "https://registry.yarnpkg.com/seroval-plugins/-/seroval-plugins-1.5.1.tgz#a0ebcaf43cb06ae0f33c777ebbd5e97223887ec6" - integrity sha512-4FbuZ/TMl02sqv0RTFexu0SP6V+ywaIe5bAWCCEik0fk17BhALgwvUDVF7e3Uvf9pxmwCEJsRPmlkUE6HdzLAw== - -seroval@~1.5.0: - version "1.5.1" - resolved "https://registry.yarnpkg.com/seroval/-/seroval-1.5.1.tgz#e35a01bcb8172ddcef12ef424a170f3ad93f64f0" - integrity sha512-OwrZRZAfhHww0WEnKHDY8OM0U/Qs8OTfIDWhUD4BLpNJUfXK4cGmjiagGze086m+mhI+V2nD0gfbHEnJjb9STA== - -set-function-length@^1.2.2: - version "1.2.2" - resolved "https://registry.yarnpkg.com/set-function-length/-/set-function-length-1.2.2.tgz#aac72314198eaed975cf77b2c3b6b880695e5449" - integrity sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg== - dependencies: - define-data-property "^1.1.4" - es-errors "^1.3.0" - function-bind "^1.1.2" - get-intrinsic "^1.2.4" - gopd "^1.0.1" - has-property-descriptors "^1.0.2" - -set-function-name@^2.0.2: - version "2.0.2" - resolved "https://registry.yarnpkg.com/set-function-name/-/set-function-name-2.0.2.tgz#16a705c5a0dc2f5e638ca96d8a8cd4e1c2b90985" - integrity sha512-7PGFlmtwsEADb0WYyvCMa1t+yke6daIG4Wirafur5kcf+MhUnPms1UeR0CKQdTZD81yESwMHbtn+TR+dMviakQ== - dependencies: - 
define-data-property "^1.1.4" - es-errors "^1.3.0" - functions-have-names "^1.2.3" - has-property-descriptors "^1.0.2" - -set-proto@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/set-proto/-/set-proto-1.0.0.tgz#0760dbcff30b2d7e801fd6e19983e56da337565e" - integrity sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw== - dependencies: - dunder-proto "^1.0.1" - es-errors "^1.3.0" - es-object-atoms "^1.0.0" - -setimmediate@^1.0.5, setimmediate@~1.0.4: - version "1.0.5" - resolved "https://registry.yarnpkg.com/setimmediate/-/setimmediate-1.0.5.tgz#290cbb232e306942d7d7ea9b83732ab7856f8285" - integrity sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA== - -shebang-command@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/shebang-command/-/shebang-command-2.0.0.tgz#ccd0af4f8835fbdc265b82461aaf0c36663f34ea" - integrity sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA== - dependencies: - shebang-regex "^3.0.0" - -shebang-regex@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/shebang-regex/-/shebang-regex-3.0.0.tgz#ae16f1644d873ecad843b0307b143362d4c42172" - integrity sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A== - -side-channel-list@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/side-channel-list/-/side-channel-list-1.0.0.tgz#10cb5984263115d3b7a0e336591e290a830af8ad" - integrity sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA== - dependencies: - es-errors "^1.3.0" - object-inspect "^1.13.3" - -side-channel-map@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/side-channel-map/-/side-channel-map-1.0.1.tgz#d6bb6b37902c6fef5174e5f533fab4c732a26f42" - integrity sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA== - 
dependencies: - call-bound "^1.0.2" - es-errors "^1.3.0" - get-intrinsic "^1.2.5" - object-inspect "^1.13.3" - -side-channel-weakmap@^1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz#11dda19d5368e40ce9ec2bdc1fb0ecbc0790ecea" - integrity sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A== - dependencies: - call-bound "^1.0.2" - es-errors "^1.3.0" - get-intrinsic "^1.2.5" - object-inspect "^1.13.3" - side-channel-map "^1.0.1" - -side-channel@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/side-channel/-/side-channel-1.1.0.tgz#c3fcff9c4da932784873335ec9765fa94ff66bc9" - integrity sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw== - dependencies: - es-errors "^1.3.0" - object-inspect "^1.13.3" - side-channel-list "^1.0.0" - side-channel-map "^1.0.1" - side-channel-weakmap "^1.0.2" - -siginfo@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/siginfo/-/siginfo-2.0.0.tgz#32e76c70b79724e3bb567cb9d543eb858ccfaf30" - integrity sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g== - -simple-concat@^1.0.0: - version "1.0.1" - resolved "https://registry.yarnpkg.com/simple-concat/-/simple-concat-1.0.1.tgz#f46976082ba35c2263f1c8ab5edfe26c41c9552f" - integrity sha512-cSFtAPtRhljv69IK0hTVZQ+OfE9nePi/rtJmw5UjHeVyVroEqJXP1sFztKUy1qU+xvz3u/sfYJLa947b7nAN2Q== - -simple-get@^4.0.0: - version "4.0.1" - resolved "https://registry.yarnpkg.com/simple-get/-/simple-get-4.0.1.tgz#4a39db549287c979d352112fa03fd99fd6bc3543" - integrity sha512-brv7p5WgH0jmQJr1ZDDfKDOSeWWg+OVypG99A/5vYGPqJ6pxiaHLy8nxtFjBA7oMa01ebA9gfh1uMCFqOuXxvA== - dependencies: - decompress-response "^6.0.0" - once "^1.3.1" - simple-concat "^1.0.0" - -solid-js@^1.9.5: - version "1.9.12" - resolved 
"https://registry.yarnpkg.com/solid-js/-/solid-js-1.9.12.tgz#9bc014fe9fbfa590681dc9b4597927153e6bc713" - integrity sha512-QzKaSJq2/iDrWR1As6MHZQ8fQkdOBf8GReYb7L5iKwMGceg7HxDcaOHk0at66tNgn9U2U7dXo8ZZpLIAmGMzgw== - dependencies: - csstype "^3.1.0" - seroval "~1.5.0" - seroval-plugins "~1.5.0" - -"source-map-js@>=0.6.2 <2.0.0", source-map-js@^1.2.1: - version "1.2.1" - resolved "https://registry.yarnpkg.com/source-map-js/-/source-map-js-1.2.1.tgz#1ce5650fddd87abc099eda37dcff024c2667ae46" - integrity sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA== - -source-map@^0.5.7: - version "0.5.7" - resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.5.7.tgz#8a039d2d1021d22d1ea14c80d8ea468ba2ef3fcc" - integrity sha512-LbrmJOMUSdEVxIKvdcJzQC+nQhe8FUZQTXQy6+I75skNgn3OoQ0DZA8YnFa7gp8tqtL3KPf1kmo0R5DoApeSGQ== - -spectral.js@^2.0.2: - version "2.0.2" - resolved "https://registry.yarnpkg.com/spectral.js/-/spectral.js-2.0.2.tgz#afa4686226afc7e69ac4ac764cf0fe0569f4c5a3" - integrity sha512-g7NA/GMc2C50ez/foALJW8DcwvwbMgW5WF0/1fmAib5AN8NkJwMVyWgkPeSGAm4D6XAFXdtz9KM4AreuV+hJsg== - -stackback@0.0.2: - version "0.0.2" - resolved "https://registry.yarnpkg.com/stackback/-/stackback-0.0.2.tgz#1ac8a0d9483848d1695e418b6d031a3c3ce68e3b" - integrity sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw== - -std-env@^4.0.0-rc.1: - version "4.0.0" - resolved "https://registry.yarnpkg.com/std-env/-/std-env-4.0.0.tgz#ba3dc31c3a46bc5ba21138aa20a6a4ceb5bb9b7e" - integrity sha512-zUMPtQ/HBY3/50VbpkupYHbRroTRZJPRLvreamgErJVys0ceuzMkD44J/QjqhHjOzK42GQ3QZIeFG1OYfOtKqQ== - -stop-iteration-iterator@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/stop-iteration-iterator/-/stop-iteration-iterator-1.1.0.tgz#f481ff70a548f6124d0312c3aa14cbfa7aa542ad" - integrity sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ== - dependencies: - es-errors "^1.3.0" - 
internal-slot "^1.1.0" - -string-width@^7.0.0, string-width@^7.2.0: - version "7.2.0" - resolved "https://registry.yarnpkg.com/string-width/-/string-width-7.2.0.tgz#b5bb8e2165ce275d4d43476dd2700ad9091db6dc" - integrity sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ== - dependencies: - emoji-regex "^10.3.0" - get-east-asian-width "^1.0.0" - strip-ansi "^7.1.0" - -string.prototype.includes@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/string.prototype.includes/-/string.prototype.includes-2.0.1.tgz#eceef21283640761a81dbe16d6c7171a4edf7d92" - integrity sha512-o7+c9bW6zpAdJHTtujeePODAhkuicdAryFsfVKwA+wGw89wJ4GTY484WTucM9hLtDEOpOvI+aHnzqnC5lHp4Rg== - dependencies: - call-bind "^1.0.7" - define-properties "^1.2.1" - es-abstract "^1.23.3" - -string.prototype.matchall@^4.0.12: - version "4.0.12" - resolved "https://registry.yarnpkg.com/string.prototype.matchall/-/string.prototype.matchall-4.0.12.tgz#6c88740e49ad4956b1332a911e949583a275d4c0" - integrity sha512-6CC9uyBL+/48dYizRf7H7VAYCMCNTBeM78x/VTUe9bFEaxBepPJDa1Ow99LqI/1yF7kuy7Q3cQsYMrcjGUcskA== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.3" - define-properties "^1.2.1" - es-abstract "^1.23.6" - es-errors "^1.3.0" - es-object-atoms "^1.0.0" - get-intrinsic "^1.2.6" - gopd "^1.2.0" - has-symbols "^1.1.0" - internal-slot "^1.1.0" - regexp.prototype.flags "^1.5.3" - set-function-name "^2.0.2" - side-channel "^1.1.0" - -string.prototype.repeat@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/string.prototype.repeat/-/string.prototype.repeat-1.0.0.tgz#e90872ee0308b29435aa26275f6e1b762daee01a" - integrity sha512-0u/TldDbKD8bFCQ/4f5+mNRrXwZ8hg2w7ZR8wa16e8z9XpePWl3eGEcUD0OXpEH/VJH/2G3gjUtR3ZOiBe2S/w== - dependencies: - define-properties "^1.1.3" - es-abstract "^1.17.5" - -string.prototype.trim@^1.2.10: - version "1.2.10" - resolved 
"https://registry.yarnpkg.com/string.prototype.trim/-/string.prototype.trim-1.2.10.tgz#40b2dd5ee94c959b4dcfb1d65ce72e90da480c81" - integrity sha512-Rs66F0P/1kedk5lyYyH9uBzuiI/kNRmwJAR9quK6VOtIpZ2G+hMZd+HQbbv25MgCA6gEffoMZYxlTod4WcdrKA== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.2" - define-data-property "^1.1.4" - define-properties "^1.2.1" - es-abstract "^1.23.5" - es-object-atoms "^1.0.0" - has-property-descriptors "^1.0.2" - -string.prototype.trimend@^1.0.9: - version "1.0.9" - resolved "https://registry.yarnpkg.com/string.prototype.trimend/-/string.prototype.trimend-1.0.9.tgz#62e2731272cd285041b36596054e9f66569b6942" - integrity sha512-G7Ok5C6E/j4SGfyLCloXTrngQIQU3PWtXGst3yM7Bea9FRURf1S42ZHlZZtsNque2FN2PoUhfZXYLNWwEr4dLQ== - dependencies: - call-bind "^1.0.8" - call-bound "^1.0.2" - define-properties "^1.2.1" - es-object-atoms "^1.0.0" - -string.prototype.trimstart@^1.0.8: - version "1.0.8" - resolved "https://registry.yarnpkg.com/string.prototype.trimstart/-/string.prototype.trimstart-1.0.8.tgz#7ee834dda8c7c17eff3118472bb35bfedaa34dde" - integrity sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg== - dependencies: - call-bind "^1.0.7" - define-properties "^1.2.1" - es-object-atoms "^1.0.0" - -string_decoder@^1.1.1: - version "1.3.0" - resolved "https://registry.yarnpkg.com/string_decoder/-/string_decoder-1.3.0.tgz#42f114594a46cf1a8e30b0a84f56c78c3edac21e" - integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA== - dependencies: - safe-buffer "~5.2.0" - -string_decoder@~1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/string_decoder/-/string_decoder-1.1.1.tgz#9cf1611ba62685d7030ae9e4ba34149c3af03fc8" - integrity sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg== - dependencies: - safe-buffer "~5.1.0" - -strip-ansi@^7.1.0: - version "7.2.0" - resolved 
"https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-7.2.0.tgz#d22a269522836a627af8d04b5c3fd2c7fa3e32e3" - integrity sha512-yDPMNjp4WyfYBkHnjIRLfca1i6KMyGCtsVgoKe/z1+6vukgaENdgGBZt+ZmKPc4gavvEZ5OgHfHdrazhgNyG7w== - dependencies: - ansi-regex "^6.2.2" - -strip-indent@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/strip-indent/-/strip-indent-3.0.0.tgz#c32e1cee940b6b3432c771bc2c54bcce73cd3001" - integrity sha512-laJTa3Jb+VQpaC6DseHhF7dXVqHTfJPCRDaEbid/drOhgitgYku/letMUqOXFoWV0zIIUbjpdH2t+tYj4bQMRQ== - dependencies: - min-indent "^1.0.0" - -strip-json-comments@^3.1.1: - version "3.1.1" - resolved "https://registry.yarnpkg.com/strip-json-comments/-/strip-json-comments-3.1.1.tgz#31f1281b3832630434831c310c01cccda8cbe006" - integrity sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig== - -strip-json-comments@~2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/strip-json-comments/-/strip-json-comments-2.0.1.tgz#3c531942e908c2697c0ec344858c286c7ca0a60a" - integrity sha512-4gB8na07fecVVkOI6Rs4e7T6NOTki5EmL7TUduTs6bu3EdnSycntVJ4re8kgZA+wx9IueI2Y11bfbgwtzuE0KQ== - -stylis@4.2.0: - version "4.2.0" - resolved "https://registry.yarnpkg.com/stylis/-/stylis-4.2.0.tgz#79daee0208964c8fe695a42fcffcac633a211a51" - integrity sha512-Orov6g6BB1sDfYgzWfTHDOxamtX1bE/zo104Dh9e6fqJ3PooipYyfJ0pUmrZO2wAvO8YbEyeFrkV91XTsGMSrw== - -supports-color@^7.1.0: - version "7.2.0" - resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-7.2.0.tgz#1b7dcdcb32b8138801b3e478ba6a51caa89648da" - integrity sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw== - dependencies: - has-flag "^4.0.0" - -supports-preserve-symlinks-flag@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz#6eda4bd344a3c94aea376d4cc31bc77311039e09" - integrity 
sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w== - -symbol-tree@^3.2.4: - version "3.2.4" - resolved "https://registry.yarnpkg.com/symbol-tree/-/symbol-tree-3.2.4.tgz#430637d248ba77e078883951fb9aa0eed7c63fa2" - integrity sha512-9QNk5KwDF+Bvz+PyObkmSYjI5ksVUYtjW7AU22r2NKcfLJcXp96hkDWU3+XndOsUb+AQ9QhfzfCT2O+CNWT5Tw== - -tar-fs@^2.0.0: - version "2.1.4" - resolved "https://registry.yarnpkg.com/tar-fs/-/tar-fs-2.1.4.tgz#800824dbf4ef06ded9afea4acafe71c67c76b930" - integrity sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ== - dependencies: - chownr "^1.1.1" - mkdirp-classic "^0.5.2" - pump "^3.0.0" - tar-stream "^2.1.4" - -tar-stream@^2.1.4, tar-stream@^2.2.0: - version "2.2.0" - resolved "https://registry.yarnpkg.com/tar-stream/-/tar-stream-2.2.0.tgz#acad84c284136b060dc3faa64474aa9aebd77287" - integrity sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ== - dependencies: - bl "^4.0.3" - end-of-stream "^1.4.1" - fs-constants "^1.0.0" - inherits "^2.0.3" - readable-stream "^3.1.1" - -text-segmentation@^1.0.3: - version "1.0.3" - resolved "https://registry.yarnpkg.com/text-segmentation/-/text-segmentation-1.0.3.tgz#52a388159efffe746b24a63ba311b6ac9f2d7943" - integrity sha512-iOiPUo/BGnZ6+54OsWxZidGCsdU8YbE4PSpdPinp7DeMtUJNJBoJ/ouUSTJjHkh1KntHaltHl/gDs2FC4i5+Nw== - dependencies: - utrie "^1.0.2" - -tinybench@^2.9.0: - version "2.9.0" - resolved "https://registry.yarnpkg.com/tinybench/-/tinybench-2.9.0.tgz#103c9f8ba6d7237a47ab6dd1dcff77251863426b" - integrity sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg== - -tinyexec@^1.0.2: - version "1.0.4" - resolved "https://registry.yarnpkg.com/tinyexec/-/tinyexec-1.0.4.tgz#6c60864fe1d01331b2f17c6890f535d7e5385408" - integrity sha512-u9r3uZC0bdpGOXtlxUIdwf9pkmvhqJdrVCH9fapQtgy/OeTTMZ1nqH7agtvEfmGui6e1XxjcdrlxvxJvc3sMqw== - -tinyglobby@^0.2.15: - version 
"0.2.15" - resolved "https://registry.yarnpkg.com/tinyglobby/-/tinyglobby-0.2.15.tgz#e228dd1e638cea993d2fdb4fcd2d4602a79951c2" - integrity sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ== - dependencies: - fdir "^6.5.0" - picomatch "^4.0.3" - -tinyrainbow@^3.0.3: - version "3.1.0" - resolved "https://registry.yarnpkg.com/tinyrainbow/-/tinyrainbow-3.1.0.tgz#1d8a623893f95cf0a2ddb9e5d11150e191409421" - integrity sha512-Bf+ILmBgretUrdJxzXM0SgXLZ3XfiaUuOj/IKQHuTXip+05Xn+uyEYdVg0kYDipTBcLrCVyUzAPz7QmArb0mmw== - -tiptap-markdown@^0.9.0: - version "0.9.0" - resolved "https://registry.npmjs.org/tiptap-markdown/-/tiptap-markdown-0.9.0.tgz#bbecae2eab01234e4ebb11502042ceef0fef4569" - integrity sha512-dKLQ9iiuGNgrlGVjrNauF/UBzWu4LYOx5pkD0jNkmQt/GOwfCJsBuzZTsf1jZ204ANHOm572mZ9PYvGh1S7tpQ== - dependencies: - "@types/markdown-it" "^13.0.7" - markdown-it "^14.1.0" - markdown-it-task-lists "^2.1.1" - prosemirror-markdown "^1.11.1" - -tldts-core@^7.0.27: - version "7.0.27" - resolved "https://registry.yarnpkg.com/tldts-core/-/tldts-core-7.0.27.tgz#4be95bd03b318f2232ea4c1554c4ae9980c77f69" - integrity sha512-YQ7uPjgWUibIK6DW5lrKujGwUKhLevU4hcGbP5O6TcIUb+oTjJYJVWPS4nZsIHrEEEG6myk/oqAJUEQmpZrHsg== - -tldts@^7.0.5: - version "7.0.27" - resolved "https://registry.yarnpkg.com/tldts/-/tldts-7.0.27.tgz#43c3fc6123eb07a3e12ae1868a9f2d1a5889028c" - integrity sha512-I4FZcVFcqCRuT0ph6dCDpPuO4Xgzvh+spkcTr1gK7peIvxWauoloVO0vuy1FQnijT63ss6AsHB6+OIM4aXHbPg== - dependencies: - tldts-core "^7.0.27" - -tmp@^0.2.0: - version "0.2.5" - resolved "https://registry.yarnpkg.com/tmp/-/tmp-0.2.5.tgz#b06bcd23f0f3c8357b426891726d16015abfd8f8" - integrity sha512-voyz6MApa1rQGUxT3E+BK7/ROe8itEx7vD8/HEvt4xwXucvQ5G5oeEiHkmHZJuBO21RpOf+YYm9MOivj709jow== - -topojson-client@^3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/topojson-client/-/topojson-client-3.1.0.tgz#22e8b1ed08a2b922feeb4af6f53b6ef09a467b99" - integrity 
sha512-605uxS6bcYxGXw9qi62XyrV6Q3xwbndjachmNxu8HWTtVPxZfEJN9fd/SZS1Q54Sn2y0TMyMxFj/cJINqGHrKw== - dependencies: - commander "2" - -tough-cookie@^6.0.1: - version "6.0.1" - resolved "https://registry.yarnpkg.com/tough-cookie/-/tough-cookie-6.0.1.tgz#a495f833836609ed983c19bc65639cfbceb54c76" - integrity sha512-LktZQb3IeoUWB9lqR5EWTHgW/VTITCXg4D21M+lvybRVdylLrRMnqaIONLVb5mav8vM19m44HIcGq4qASeu2Qw== - dependencies: - tldts "^7.0.5" - -tr46@^6.0.0: - version "6.0.0" - resolved "https://registry.yarnpkg.com/tr46/-/tr46-6.0.0.tgz#f5a1ae546a0adb32a277a2278d0d17fa2f9093e6" - integrity sha512-bLVMLPtstlZ4iMQHpFHTR7GAGj2jxi8Dg0s2h2MafAE4uSWF98FC/3MomU51iQAMf8/qDUbKWf5GxuvvVcXEhw== - dependencies: - punycode "^2.3.1" - -"traverse@>=0.3.0 <0.4": - version "0.3.9" - resolved "https://registry.yarnpkg.com/traverse/-/traverse-0.3.9.tgz#717b8f220cc0bb7b44e40514c22b2e8bbc70d8b9" - integrity sha512-iawgk0hLP3SxGKDfnDJf8wTz4p2qImnyihM5Hh/sGvQ3K37dPi/w8sRhdNIxYA1TwFwc5mDhIJq+O0RsvXBKdQ== - -ts-api-utils@^2.4.0: - version "2.5.0" - resolved "https://registry.yarnpkg.com/ts-api-utils/-/ts-api-utils-2.5.0.tgz#4acd4a155e22734990a5ed1fe9e97f113bcb37c1" - integrity sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA== - -tslib@2.3.0: - version "2.3.0" - resolved "https://registry.yarnpkg.com/tslib/-/tslib-2.3.0.tgz#803b8cdab3e12ba581a4ca41c8839bbb0dacb09e" - integrity sha512-N82ooyxVNm6h1riLCoyS9e3fuJ3AMG2zIZs2Gd1ATcSFjSA23Q0fzjjZeh0jbJvWVDZ0cJT8yaNNaaXHzueNjg== - -tslib@^2.4.0, tslib@^2.8.1, tslib@~2.8.1: - version "2.8.1" - resolved "https://registry.yarnpkg.com/tslib/-/tslib-2.8.1.tgz#612efe4ed235d567e8aba5f2a5fab70280ade83f" - integrity sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w== - -tunnel-agent@^0.6.0: - version "0.6.0" - resolved "https://registry.yarnpkg.com/tunnel-agent/-/tunnel-agent-0.6.0.tgz#27a5dea06b36b04a0a9966774b290868f0fc40fd" - integrity 
-  integrity sha512-McnNiV1l8RYeY8tBgEpuodCC1mLUdbSN+CYBL7kJsJNInOP8UjDDEwdk6Mw60vdLLrr5NHKZhMAOSrR2NZuQ+w==
-  dependencies:
-    safe-buffer "^5.0.1"
-
-type-check@^0.4.0, type-check@~0.4.0:
-  version "0.4.0"
-  resolved "https://registry.yarnpkg.com/type-check/-/type-check-0.4.0.tgz#07b8203bfa7056c0657050e3ccd2c37730bab8f1"
-  integrity sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==
-  dependencies:
-    prelude-ls "^1.2.1"
-
-typed-array-buffer@^1.0.3:
-  version "1.0.3"
-  resolved "https://registry.yarnpkg.com/typed-array-buffer/-/typed-array-buffer-1.0.3.tgz#a72395450a4869ec033fd549371b47af3a2ee536"
-  integrity sha512-nAYYwfY3qnzX30IkA6AQZjVbtK6duGontcQm1WSG1MD94YLqK0515GNApXkoxKOWMusVssAHWLh9SeaoefYFGw==
-  dependencies:
-    call-bound "^1.0.3"
-    es-errors "^1.3.0"
-    is-typed-array "^1.1.14"
-
-typed-array-byte-length@^1.0.3:
-  version "1.0.3"
-  resolved "https://registry.yarnpkg.com/typed-array-byte-length/-/typed-array-byte-length-1.0.3.tgz#8407a04f7d78684f3d252aa1a143d2b77b4160ce"
-  integrity sha512-BaXgOuIxz8n8pIq3e7Atg/7s+DpiYrxn4vdot3w9KbnBhcRQq6o3xemQdIfynqSeXeDrF32x+WvfzmOjPiY9lg==
-  dependencies:
-    call-bind "^1.0.8"
-    for-each "^0.3.3"
-    gopd "^1.2.0"
-    has-proto "^1.2.0"
-    is-typed-array "^1.1.14"
-
-typed-array-byte-offset@^1.0.4:
-  version "1.0.4"
-  resolved "https://registry.yarnpkg.com/typed-array-byte-offset/-/typed-array-byte-offset-1.0.4.tgz#ae3698b8ec91a8ab945016108aef00d5bff12355"
-  integrity sha512-bTlAFB/FBYMcuX81gbL4OcpH5PmlFHqlCCpAl8AlEzMz5k53oNDvN8p1PNOWLEmI2x4orp3raOFB51tv9X+MFQ==
-  dependencies:
-    available-typed-arrays "^1.0.7"
-    call-bind "^1.0.8"
-    for-each "^0.3.3"
-    gopd "^1.2.0"
-    has-proto "^1.2.0"
-    is-typed-array "^1.1.15"
-    reflect.getprototypeof "^1.0.9"
-
-typed-array-length@^1.0.7:
-  version "1.0.7"
-  resolved "https://registry.yarnpkg.com/typed-array-length/-/typed-array-length-1.0.7.tgz#ee4deff984b64be1e118b0de8c9c877d5ce73d3d"
-  integrity sha512-3KS2b+kL7fsuk/eJZ7EQdnEmQoaho/r6KUef7hxvltNA5DR8NAUM+8wJMbJyZ4G9/7i3v5zPBIMN5aybAh2/Jg==
-  dependencies:
-    call-bind "^1.0.7"
-    for-each "^0.3.3"
-    gopd "^1.0.1"
-    is-typed-array "^1.1.13"
-    possible-typed-array-names "^1.0.0"
-    reflect.getprototypeof "^1.0.6"
-
-typescript-eslint@^8.16.0:
-  version "8.57.2"
-  resolved "https://registry.yarnpkg.com/typescript-eslint/-/typescript-eslint-8.57.2.tgz#d64c6648dda5b15176708701537ab0b55ba3c83d"
-  integrity sha512-VEPQ0iPgWO/sBaZOU1xo4nuNdODVOajPnTIbog2GKYr31nIlZ0fWPoCQgGfF3ETyBl1vn63F/p50Um9Z4J8O8A==
-  dependencies:
-    "@typescript-eslint/eslint-plugin" "8.57.2"
-    "@typescript-eslint/parser" "8.57.2"
-    "@typescript-eslint/typescript-estree" "8.57.2"
-    "@typescript-eslint/utils" "8.57.2"
-
-typescript@^4.9.5:
-  version "4.9.5"
-  resolved "https://registry.yarnpkg.com/typescript/-/typescript-4.9.5.tgz#095979f9bcc0d09da324d58d03ce8f8374cbe65a"
-  integrity sha512-1FXk9E2Hm+QzZQ7z+McJiHL4NW1F2EzMu9Nq9i3zAaGqibafqYwCVU6WyWAuyQRRzOlxou8xZSyXLEN8oKj24g==
-
-uc.micro@^2.0.0, uc.micro@^2.1.0:
-  version "2.1.0"
-  resolved "https://registry.npmjs.org/uc.micro/-/uc.micro-2.1.0.tgz#f8d3f7d0ec4c3dea35a7e3c8efa4cb8b45c9e7ee"
-  integrity sha512-ARDJmphmdvUk6Glw7y9DQ2bFkKBHwQHLi2lsaH6PPmz/Ka9sFOBsBluozhDltWmnv9u/cF6Rt87znRTPV+yp/A==
-
-unbox-primitive@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/unbox-primitive/-/unbox-primitive-1.1.0.tgz#8d9d2c9edeea8460c7f35033a88867944934d1e2"
-  integrity sha512-nWJ91DjeOkej/TA8pXQ3myruKpKEYgqvpw9lz4OPHj/NWFNluYrjbz9j01CJ8yKQd2g4jFoOkINCTW2I5LEEyw==
-  dependencies:
-    call-bound "^1.0.3"
-    has-bigints "^1.0.2"
-    has-symbols "^1.1.0"
-    which-boxed-primitive "^1.1.1"
-
-undici-types@~6.21.0:
-  version "6.21.0"
-  resolved "https://registry.yarnpkg.com/undici-types/-/undici-types-6.21.0.tgz#691d00af3909be93a7faa13be61b3a5b50ef12cb"
-  integrity sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==
-
-undici@^7.24.5:
-  version "7.24.5"
-  resolved "https://registry.yarnpkg.com/undici/-/undici-7.24.5.tgz#7debcf5623df2d1cb469b6face01645d9c852ae2"
-  integrity sha512-3IWdCpjgxp15CbJnsi/Y9TCDE7HWVN19j1hmzVhoAkY/+CJx449tVxT5wZc1Gwg8J+P0LWvzlBzxYRnHJ+1i7Q==
-
-unzipper@^0.10.11:
-  version "0.10.14"
-  resolved "https://registry.yarnpkg.com/unzipper/-/unzipper-0.10.14.tgz#d2b33c977714da0fbc0f82774ad35470a7c962b1"
-  integrity sha512-ti4wZj+0bQTiX2KmKWuwj7lhV+2n//uXEotUmGuQqrbVZSEGFMbI68+c6JCQ8aAmUWYvtHEz2A8K6wXvueR/6g==
-  dependencies:
-    big-integer "^1.6.17"
-    binary "~0.3.0"
-    bluebird "~3.4.1"
-    buffer-indexof-polyfill "~1.0.0"
-    duplexer2 "~0.1.4"
-    fstream "^1.0.12"
-    graceful-fs "^4.2.2"
-    listenercount "~1.0.1"
-    readable-stream "~2.3.6"
-    setimmediate "~1.0.4"
-
-uri-js@^4.2.2:
-  version "4.4.1"
-  resolved "https://registry.yarnpkg.com/uri-js/-/uri-js-4.4.1.tgz#9b1a52595225859e55f669d928f88c6c57f2a77e"
-  integrity sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==
-  dependencies:
-    punycode "^2.1.0"
-
-use-sync-external-store@^1.0.0, use-sync-external-store@^1.4.0, use-sync-external-store@^1.6.0:
-  version "1.6.0"
-  resolved "https://registry.yarnpkg.com/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz#b174bfa65cb2b526732d9f2ac0a408027876f32d"
-  integrity sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==
-
-usehooks-ts@^3.1.1:
-  version "3.1.1"
-  resolved "https://registry.yarnpkg.com/usehooks-ts/-/usehooks-ts-3.1.1.tgz#0bb7f38f36f8219ee4509cc5e944ae610fb97656"
-  integrity sha512-I4diPp9Cq6ieSUH2wu+fDAVQO43xwtulo+fKEidHUwZPnYImbtkTjzIJYcDcJqxgmX31GVqNFURodvcgHcW0pA==
-  dependencies:
-    lodash.debounce "^4.0.8"
-
-util-deprecate@^1.0.1, util-deprecate@~1.0.1:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/util-deprecate/-/util-deprecate-1.0.2.tgz#450d4dc9fa70de732762fbd2d4a28981419a0ccf"
-  integrity sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==
-
-utrie@^1.0.2:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/utrie/-/utrie-1.0.2.tgz#d42fe44de9bc0119c25de7f564a6ed1b2c87a645"
-  integrity sha512-1MLa5ouZiOmQzUbjbu9VmjLzn1QLXBhwpUa7kdLUQK+KQ5KA9I1vk5U4YHe/X2Ch7PYnJfWuWT+VbuxbGwljhw==
-  dependencies:
-    base64-arraybuffer "^1.0.2"
-
-uuid@^8.3.0:
-  version "8.3.2"
-  resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2"
-  integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==
-
-validator@^13.15.20:
-  version "13.15.26"
-  resolved "https://registry.yarnpkg.com/validator/-/validator-13.15.26.tgz#36c3deeab30e97806a658728a155c66fcaa5b944"
-  integrity sha512-spH26xU080ydGggxRyR1Yhcbgx+j3y5jbNXk/8L+iRvdIEQ4uTRH2Sgf2dokud6Q4oAtsbNvJ1Ft+9xmm6IZcA==
-
-vega-canvas@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/vega-canvas/-/vega-canvas-2.0.0.tgz#4709deb68f9b4fd7475957bed99f16c38dbc07b8"
-  integrity sha512-9x+4TTw/USYST5nx4yN272sy9WcqSRjAR0tkQYZJ4cQIeon7uVsnohvoPQK1JZu7K1QXGUqzj08z0u/UegBVMA==
-
-vega-crossfilter@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-crossfilter/-/vega-crossfilter-5.1.0.tgz#f4c56d9e0c31705cae41cd0e35abcdee20c0c483"
-  integrity sha512-EmVhfP3p6AM7o/lPan/QAoqjblI19BxWUlvl2TSs0xjQd8KbaYYbS4Ixt3cmEvl0QjRdBMF6CdJJ/cy9DTS4Fw==
-  dependencies:
-    d3-array "^3.2.4"
-    vega-dataflow "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-dataflow@^6.1.0, vega-dataflow@~6.1.0:
-  version "6.1.0"
-  resolved "https://registry.yarnpkg.com/vega-dataflow/-/vega-dataflow-6.1.0.tgz#1fc48ea6bbbe002d45a1a48eee67aea097a57c55"
-  integrity sha512-JxumGlODtFbzoQ4c/jQK8Tb/68ih0lrexlCozcMfTAwQ12XhTqCvlafh7MAKKTMBizjOfaQTHm4Jkyb1H5CfyQ==
-  dependencies:
-    vega-format "^2.1.0"
-    vega-loader "^5.1.0"
-    vega-util "^2.1.0"
-
-vega-embed@6.5.1:
-  version "6.5.1"
-  resolved "https://registry.yarnpkg.com/vega-embed/-/vega-embed-6.5.1.tgz#0eece1e5a616d37a479e8c9c00337e058b710aaa"
-  integrity sha512-yz/L1bN3+fLOpgXVb/8sCRv4GlZpD2/ngeKJAFRiHTIRm5zK6W0KuqZZvyGaO7E4s7RuYjW1TWhRIOqh5rS5hA==
-  dependencies:
-    fast-json-patch "^3.0.0-1"
-    json-stringify-pretty-compact "^2.0.0"
-    semver "^7.1.3"
-    vega-schema-url-parser "^1.1.0"
-    vega-themes "^2.8.2"
-    vega-tooltip "^0.22.0"
-
-vega-embed@^6.21.0:
-  version "6.29.0"
-  resolved "https://registry.yarnpkg.com/vega-embed/-/vega-embed-6.29.0.tgz#bf7d2e82b70cef0d56d1beb3202d8085e6653bac"
-  integrity sha512-PmlshTLtLFLgWtF/b23T1OwX53AugJ9RZ3qPE2c01VFAbgt3/GSNI/etzA/GzdrkceXFma+FDHNXUppKuM0U6Q==
-  dependencies:
-    fast-json-patch "^3.1.1"
-    json-stringify-pretty-compact "^4.0.0"
-    semver "^7.6.3"
-    tslib "^2.8.1"
-    vega-interpreter "^1.0.5"
-    vega-schema-url-parser "^2.2.0"
-    vega-themes "^2.15.0"
-    vega-tooltip "^0.35.2"
-
-vega-encode@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-encode/-/vega-encode-5.1.0.tgz#05f56b898822e09df96a5ca7f1017b9f9a1c4d3b"
-  integrity sha512-q26oI7B+MBQYcTQcr5/c1AMsX3FvjZLQOBi7yI0vV+GEn93fElDgvhQiYrgeYSD4Exi/jBPeUXuN6p4bLz16kA==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-interpolate "^3.0.1"
-    vega-dataflow "^6.1.0"
-    vega-scale "^8.1.0"
-    vega-util "^2.1.0"
-
-vega-event-selector@^4.0.0, vega-event-selector@~4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/vega-event-selector/-/vega-event-selector-4.0.0.tgz#425e9f2671e858a1a45b4b6a7fc452ca0b22abbf"
-  integrity sha512-CcWF4m4KL/al1Oa5qSzZ5R776q8lRxCj3IafCHs5xipoEHrkgu1BWa7F/IH5HrDNXeIDnqOpSV1pFsAWRak4gQ==
-
-vega-expression@^6.1.0, vega-expression@~6.1.0:
-  version "6.1.0"
-  resolved "https://registry.yarnpkg.com/vega-expression/-/vega-expression-6.1.0.tgz#6ce358a39b9b953806bff200f6f84f44163c9e38"
-  integrity sha512-hHgNx/fQ1Vn1u6vHSamH7lRMsOa/yQeHGGcWVmh8fZafLdwdhCM91kZD9p7+AleNpgwiwzfGogtpATFaMmDFYg==
-  dependencies:
-    "@types/estree" "^1.0.8"
-    vega-util "^2.1.0"
-
-vega-force@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-force/-/vega-force-5.1.0.tgz#aa7cf8edbe2ae3bada070f343565dfb841e501a9"
-  integrity sha512-wdnchOSeXpF9Xx8Yp0s6Do9F7YkFeOn/E/nENtsI7NOcyHpICJ5+UkgjUo9QaQ/Yu+dIDU+sP/4NXsUtq6SMaQ==
-  dependencies:
-    d3-force "^3.0.0"
-    vega-dataflow "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-format@^2.1.0, vega-format@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-format/-/vega-format-2.1.0.tgz#4652c7ec9fb1b7ff9a2c50dcd498a36ba6146fda"
-  integrity sha512-i9Ht33IgqG36+S1gFDpAiKvXCPz+q+1vDhDGKK8YsgMxGOG4PzinKakI66xd7SdV4q97FgpR7odAXqtDN2wKqw==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-format "^3.1.0"
-    d3-time-format "^4.1.0"
-    vega-time "^3.1.0"
-    vega-util "^2.1.0"
-
-vega-functions@^6.1.0, vega-functions@~6.1.0:
-  version "6.1.1"
-  resolved "https://registry.yarnpkg.com/vega-functions/-/vega-functions-6.1.1.tgz#5d4e9746aadde2b3b70d8da3e6338892d05cd9d2"
-  integrity sha512-Due6jP0y0FfsGMTrHnzUGnEwXPu7VwE+9relfo+LjL/tRPYnnKqwWvzt7n9JkeBuZqjkgYjMzm/WucNn6Hkw5A==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-color "^3.1.0"
-    d3-geo "^3.1.1"
-    vega-dataflow "^6.1.0"
-    vega-expression "^6.1.0"
-    vega-scale "^8.1.0"
-    vega-scenegraph "^5.1.0"
-    vega-selections "^6.1.0"
-    vega-statistics "^2.0.0"
-    vega-time "^3.1.0"
-    vega-util "^2.1.0"
-
-vega-geo@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-geo/-/vega-geo-5.1.0.tgz#d8fe6ae912ad27cd2b1c21f545a74c07da093589"
-  integrity sha512-H8aBBHfthc3rzDbz/Th18+Nvp00J73q3uXGAPDQqizioDm/CoXCK8cX4pMePydBY9S6ikBiGJrLKFDa80wI20g==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-color "^3.1.0"
-    d3-geo "^3.1.1"
-    vega-canvas "^2.0.0"
-    vega-dataflow "^6.1.0"
-    vega-projection "^2.1.0"
-    vega-statistics "^2.0.0"
-    vega-util "^2.1.0"
-
-vega-hierarchy@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-hierarchy/-/vega-hierarchy-5.1.0.tgz#423770dd1cb4684370f23a688dc5b6dad1399dbf"
-  integrity sha512-rZlU8QJNETlB6o73lGCPybZtw2fBBsRIRuFE77aCLFHdGsh6wIifhplVarqE9icBqjUHRRUOmcEYfzwVIPr65g==
-  dependencies:
-    d3-hierarchy "^3.1.2"
-    vega-dataflow "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-interpreter@^1.0.5:
-  version "1.2.1"
-  resolved "https://registry.yarnpkg.com/vega-interpreter/-/vega-interpreter-1.2.1.tgz#54d6af77f45c0a8c0f0b3d11914a64b68a7f0dc9"
-  integrity sha512-EMHLGxJ+SWfh1K/fHDRlHEZtLA/2ZNAXItYb5e8CxuAIm/Ha/3DHX/8VlvbTGIciUpuwmcKx4tVhJWlKreQ/Yw==
-  dependencies:
-    vega-util "^1.17.4"
-
-vega-label@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-label/-/vega-label-2.1.0.tgz#bd977cd14e9b062fce31593a2db2819aa9efb2c9"
-  integrity sha512-/hgf+zoA3FViDBehrQT42Lta3t8In6YwtMnwjYlh72zNn1p3c7E3YUBwqmAqTM1x+tudgzMRGLYig+bX1ewZxQ==
-  dependencies:
-    vega-canvas "^2.0.0"
-    vega-dataflow "^6.1.0"
-    vega-scenegraph "^5.1.0"
-    vega-util "^2.1.0"
-
-vega-lite@6.4.1:
-  version "6.4.1"
-  resolved "https://registry.yarnpkg.com/vega-lite/-/vega-lite-6.4.1.tgz#549634ecaefd46d00f17e7922577d0c97a4663c5"
-  integrity sha512-KO3ybHNouRK4A0al/+2fN9UqgTEfxrd/ntGLY933Hg5UOYotDVQdshR3zn7OfXwQ7uj0W96Vfa5R+QxO8am3IQ==
-  dependencies:
-    json-stringify-pretty-compact "~4.0.0"
-    tslib "~2.8.1"
-    vega-event-selector "~4.0.0"
-    vega-expression "~6.1.0"
-    vega-util "~2.1.0"
-    yargs "~18.0.0"
-
-vega-loader@^5.1.0, vega-loader@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-loader/-/vega-loader-5.1.0.tgz#69378fc4d46e8d4573ad308f76464e66b02579e6"
-  integrity sha512-GaY3BdSPbPNdtrBz8SYUBNmNd8mdPc3mtdZfdkFazQ0RD9m+Toz5oR8fKnTamNSk9fRTJX0Lp3uEqxrAlQVreg==
-  dependencies:
-    d3-dsv "^3.0.1"
-    topojson-client "^3.1.0"
-    vega-format "^2.1.0"
-    vega-util "^2.1.0"
-
-vega-parser@~7.1.0:
-  version "7.1.0"
-  resolved "https://registry.yarnpkg.com/vega-parser/-/vega-parser-7.1.0.tgz#20ee0e70a6ecdb8cb34ef16deed484ad68c40850"
-  integrity sha512-g0lrYxtmYVW8G6yXpIS4J3Uxt9OUSkc0bLu5afoYDo4rZmoOOdll3x3ebActp5LHPW+usZIE+p5nukRS2vEc7Q==
-  dependencies:
-    vega-dataflow "^6.1.0"
-    vega-event-selector "^4.0.0"
-    vega-functions "^6.1.0"
-    vega-scale "^8.1.0"
-    vega-util "^2.1.0"
-
-vega-projection@^2.1.0, vega-projection@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-projection/-/vega-projection-2.1.0.tgz#ce46291ef78a7418c75679103296d62f49afac14"
-  integrity sha512-EjRjVSoMR5ibrU7q8LaOQKP327NcOAM1+eZ+NO4ANvvAutwmbNVTmfA1VpPH+AD0AlBYc39ND/wnRk7SieDiXA==
-  dependencies:
-    d3-geo "^3.1.1"
-    d3-geo-projection "^4.0.0"
-    vega-scale "^8.1.0"
-
-vega-regression@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-regression/-/vega-regression-2.1.0.tgz#d3fd103e97a0aee55ae2a78ed81588fb5dcb9e03"
-  integrity sha512-HzC7MuoEwG1rIxRaNTqgcaYF03z/ZxYkQR2D5BN0N45kLnHY1HJXiEcZkcffTsqXdspLjn47yLi44UoCwF5fxQ==
-  dependencies:
-    d3-array "^3.2.4"
-    vega-dataflow "^6.1.0"
-    vega-statistics "^2.0.0"
-    vega-util "^2.1.0"
-
-vega-runtime@^7.1.0, vega-runtime@~7.1.0:
-  version "7.1.0"
-  resolved "https://registry.yarnpkg.com/vega-runtime/-/vega-runtime-7.1.0.tgz#1959d6168638f85bdce4d157117aca6ad1f69fac"
-  integrity sha512-mItI+WHimyEcZlZrQ/zYR3LwHVeyHCWwp7MKaBjkU8EwkSxEEGVceyGUY9X2YuJLiOgkLz/6juYDbMv60pfwYA==
-  dependencies:
-    vega-dataflow "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-scale@^8.1.0, vega-scale@~8.1.0:
-  version "8.1.0"
-  resolved "https://registry.yarnpkg.com/vega-scale/-/vega-scale-8.1.0.tgz#a06b3aa8d60ae46ad8f3d89eae0e74eb3d1200e3"
-  integrity sha512-VEgDuEcOec8+C8+FzLcnAmcXrv2gAJKqQifCdQhkgnsLa978vYUgVfCut/mBSMMHbH8wlUV1D0fKZTjRukA1+A==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-interpolate "^3.0.1"
-    d3-scale "^4.0.2"
-    d3-scale-chromatic "^3.1.0"
-    vega-time "^3.1.0"
-    vega-util "^2.1.0"
-
-vega-scenegraph@^5.1.0, vega-scenegraph@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-scenegraph/-/vega-scenegraph-5.1.0.tgz#3b3c0d871799fe84bc563256d7b9d54bc2e13368"
-  integrity sha512-4gA89CFIxkZX+4Nvl8SZF2MBOqnlj9J5zgdPh/HPx+JOwtzSlUqIhxFpFj7GWYfwzr/PyZnguBLPihPw1Og/cA==
-  dependencies:
-    d3-path "^3.1.0"
-    d3-shape "^3.2.0"
-    vega-canvas "^2.0.0"
-    vega-loader "^5.1.0"
-    vega-scale "^8.1.0"
-    vega-util "^2.1.0"
-
-vega-schema-url-parser@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/vega-schema-url-parser/-/vega-schema-url-parser-1.1.0.tgz#39168ec04e5468ce278a06c16ec0d126035a85b5"
-  integrity sha512-Tc85J2ofMZZOsxiqDM9sbvfsa+Vdo3GwNLjEEsPOsCDeYqsUHKAlc1IpbbhPLZ6jusyM9Lk0e1izF64GGklFDg==
-
-vega-schema-url-parser@^2.2.0:
-  version "2.2.0"
-  resolved "https://registry.yarnpkg.com/vega-schema-url-parser/-/vega-schema-url-parser-2.2.0.tgz#a0d1e02915adfbfcb1fd517c8c2ebe2419985c1e"
-  integrity sha512-yAtdBnfYOhECv9YC70H2gEiqfIbVkq09aaE4y/9V/ovEFmH9gPKaEgzIZqgT7PSPQjKhsNkb6jk6XvSoboxOBw==
-
-vega-selections@^6.1.0:
-  version "6.1.2"
-  resolved "https://registry.yarnpkg.com/vega-selections/-/vega-selections-6.1.2.tgz#3646db6a9fc1d725969b8b5841e5d333c1f0f803"
-  integrity sha512-xJ+V4qdd46nk2RBdwIRrQm2iSTMHdlu/omhLz1pqRL3jZDrkqNBXimrisci2kIKpH2WBpA1YVagwuZEKBmF2Qw==
-  dependencies:
-    d3-array "3.2.4"
-    vega-expression "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-statistics@^2.0.0, vega-statistics@~2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/vega-statistics/-/vega-statistics-2.0.0.tgz#9c9636c20682ae98e8887f8fab0e82c2466a736a"
-  integrity sha512-dGPfDXnBlgXbZF3oxtkb8JfeRXd5TYHx25Z/tIoaa9jWua4Vf/AoW2wwh8J1qmMy8J03/29aowkp1yk4DOPazQ==
-  dependencies:
-    d3-array "^3.2.4"
-
-vega-themes@^2.15.0, vega-themes@^2.8.2:
-  version "2.15.0"
-  resolved "https://registry.yarnpkg.com/vega-themes/-/vega-themes-2.15.0.tgz#cf7592efb45406957e9beb67d7033ee5f7b7a511"
-  integrity sha512-DicRAKG9z+23A+rH/3w3QjJvKnlGhSbbUXGjBvYGseZ1lvj9KQ0BXZ2NS/+MKns59LNpFNHGi9us/wMlci4TOA==
-
-vega-time@^3.1.0, vega-time@~3.1.0:
-  version "3.1.0"
-  resolved "https://registry.yarnpkg.com/vega-time/-/vega-time-3.1.0.tgz#4e20c5d60e3f7e827a33db29bd4855f40a0ae3cb"
-  integrity sha512-G93mWzPwNa6UYQRkr8Ujur9uqxbBDjDT/WpXjbDY0yygdSkRT+zXF+Sb4gjhW0nPaqdiwkn0R6kZcSPMj1bMNA==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-time "^3.1.0"
-    vega-util "^2.1.0"
-
-vega-tooltip@^0.22.0:
-  version "0.22.1"
-  resolved "https://registry.yarnpkg.com/vega-tooltip/-/vega-tooltip-0.22.1.tgz#231d6c8a105b6ac531bf8275cd0950c30373e558"
-  integrity sha512-mPmzxwvi6+2ZgbZ/+mNC7XbSu5I6Ckon8zdgUfH9neb+vV7CKlV/FYypMdVN/9iDMFUqGzybYdqNOiSPPIxFEQ==
-  dependencies:
-    vega-util "^1.13.1"
-
-vega-tooltip@^0.35.2:
-  version "0.35.2"
-  resolved "https://registry.yarnpkg.com/vega-tooltip/-/vega-tooltip-0.35.2.tgz#a019133d481ce1e0876c0dc948a0a13703d48bcc"
-  integrity sha512-kuYcsAAKYn39ye5wKf2fq1BAxVcjoz0alvKp/G+7BWfIb94J0PHmwrJ5+okGefeStZnbXxINZEOKo7INHaj9GA==
-  dependencies:
-    vega-util "^1.17.2"
-  optionalDependencies:
-    "@rollup/rollup-linux-x64-gnu" "^4.24.4"
-
-vega-transforms@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-transforms/-/vega-transforms-5.1.0.tgz#4e95cd7c4773aa560928d10385a0d33ea2748caa"
-  integrity sha512-mj/sO2tSuzzpiXX8JSl4DDlhEmVwM/46MTAzTNQUQzJPMI/n4ChCjr/SdEbfEyzlD4DPm1bjohZGjLc010yuMg==
-  dependencies:
-    d3-array "^3.2.4"
-    vega-dataflow "^6.1.0"
-    vega-statistics "^2.0.0"
-    vega-time "^3.1.0"
-    vega-util "^2.1.0"
-
-vega-typings@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-typings/-/vega-typings-2.1.0.tgz#1c1fe548c0f00997820246ade0d3d813b87bfd76"
-  integrity sha512-zdis4Fg4gv37yEvTTSZEVMNhp8hwyEl7GZ4X4HHddRVRKxWFsbyKvZx/YW5Z9Ox4sjxVA2qHzEbod4Fdx+SEJA==
-  dependencies:
-    "@types/geojson" "7946.0.16"
-    vega-event-selector "^4.0.0"
-    vega-expression "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-util@^1.13.1, vega-util@^1.17.2, vega-util@^1.17.4:
-  version "1.17.4"
-  resolved "https://registry.yarnpkg.com/vega-util/-/vega-util-1.17.4.tgz#b35781fe8e8d030e6519746682843d7ef9ff6d27"
-  integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA==
-
-vega-util@^2.1.0, vega-util@~2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/vega-util/-/vega-util-2.1.0.tgz#54f42d6a80e5904ea9ac6c0327e6ac57601ce85f"
-  integrity sha512-PGfp0m0QCufDmcxKJCWQy4Ov23FoF8DSXmoJwSezi3itQaa2hbxK0+xwsTMP2vy4PR16Pu25HMzgMwXVW1+33w==
-
-vega-view-transforms@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-view-transforms/-/vega-view-transforms-5.1.0.tgz#1f31f75efcf99b38969e750043adb922fcec6f3e"
-  integrity sha512-fpigh/xn/32t+An1ShoY3MLeGzNdlbAp2+HvFKzPpmpMTZqJEWkk/J/wHU7Swyc28Ta7W1z3fO+8dZkOYO5TWQ==
-  dependencies:
-    vega-dataflow "^6.1.0"
-    vega-scenegraph "^5.1.0"
-    vega-util "^2.1.0"
-
-vega-view@~6.1.0:
-  version "6.1.0"
-  resolved "https://registry.yarnpkg.com/vega-view/-/vega-view-6.1.0.tgz#5596f78c5ebca8dcb57feca40fd31cb8265fd04e"
-  integrity sha512-hmHDm/zC65lb23mb9Tr9Gx0wkxP0TMS31LpMPYxIZpvInxvUn7TYitkOtz1elr63k2YZrgmF7ztdGyQ4iCQ5fQ==
-  dependencies:
-    d3-array "^3.2.4"
-    d3-timer "^3.0.1"
-    vega-dataflow "^6.1.0"
-    vega-format "^2.1.0"
-    vega-functions "^6.1.0"
-    vega-runtime "^7.1.0"
-    vega-scenegraph "^5.1.0"
-    vega-util "^2.1.0"
-
-vega-voronoi@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-voronoi/-/vega-voronoi-5.1.0.tgz#92956b9d78f06e3918970fc84d06974e24b9f52f"
-  integrity sha512-uKdsoR9x60mz7eYtVG+NhlkdQXeVdMr6jHNAHxs+W+i6kawkUp5S9jp1xf1FmW/uZvtO1eqinHQNwATcDRsiUg==
-  dependencies:
-    d3-delaunay "^6.0.4"
-    vega-dataflow "^6.1.0"
-    vega-util "^2.1.0"
-
-vega-wordcloud@~5.1.0:
-  version "5.1.0"
-  resolved "https://registry.yarnpkg.com/vega-wordcloud/-/vega-wordcloud-5.1.0.tgz#7aa8dcbf6c83b193fe71fb6410be15ad2c7285e6"
-  integrity sha512-sSdNmT8y2D7xXhM2h76dKyaYn3PA4eV49WUUkfYfqHz/vpcu10GSAoFxLhQQTkbZXR+q5ZB63tFUow9W2IFo6g==
-  dependencies:
-    vega-canvas "^2.0.0"
-    vega-dataflow "^6.1.0"
-    vega-scale "^8.1.0"
-    vega-statistics "^2.0.0"
-    vega-util "^2.1.0"
-
-vega@^6.2.0:
-  version "6.2.0"
-  resolved "https://registry.yarnpkg.com/vega/-/vega-6.2.0.tgz#34c2de83b00e701e040029738b26f1ec992f327f"
-  integrity sha512-BIwalIcEGysJdQDjeVUmMWB3e50jPDNAMfLJscjEvpunU9bSt7X1OYnQxkg3uBwuRRI4nWfFZO9uIW910nLeGw==
-  dependencies:
-    vega-crossfilter "~5.1.0"
-    vega-dataflow "~6.1.0"
-    vega-encode "~5.1.0"
-    vega-event-selector "~4.0.0"
-    vega-expression "~6.1.0"
-    vega-force "~5.1.0"
-    vega-format "~2.1.0"
-    vega-functions "~6.1.0"
-    vega-geo "~5.1.0"
-    vega-hierarchy "~5.1.0"
-    vega-label "~2.1.0"
-    vega-loader "~5.1.0"
-    vega-parser "~7.1.0"
-    vega-projection "~2.1.0"
-    vega-regression "~2.1.0"
-    vega-runtime "~7.1.0"
-    vega-scale "~8.1.0"
-    vega-scenegraph "~5.1.0"
-    vega-statistics "~2.0.0"
-    vega-time "~3.1.0"
-    vega-transforms "~5.1.0"
-    vega-typings "~2.1.0"
-    vega-util "~2.1.0"
-    vega-view "~6.1.0"
-    vega-view-transforms "~5.1.0"
-    vega-voronoi "~5.1.0"
-    vega-wordcloud "~5.1.0"
-
-vite@^5.4.21:
-  version "5.4.21"
-  resolved "https://registry.yarnpkg.com/vite/-/vite-5.4.21.tgz#84a4f7c5d860b071676d39ba513c0d598fdc7027"
-  integrity sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw==
-  dependencies:
-    esbuild "^0.21.3"
-    postcss "^8.4.43"
-    rollup "^4.20.0"
-  optionalDependencies:
-    fsevents "~2.3.3"
-
-"vite@^6.0.0 || ^7.0.0 || ^8.0.0":
-  version "8.0.2"
-  resolved "https://registry.yarnpkg.com/vite/-/vite-8.0.2.tgz#fcee428eb0ad3d4aa9843d7f7ba981679bbe5edc"
-  integrity sha512-1gFhNi+bHhRE/qKZOJXACm6tX4bA3Isy9KuKF15AgSRuRazNBOJfdDemPBU16/mpMxApDPrWvZ08DcLPEoRnuA==
-  dependencies:
-    lightningcss "^1.32.0"
-    picomatch "^4.0.3"
-    postcss "^8.5.8"
-    rolldown "1.0.0-rc.11"
-    tinyglobby "^0.2.15"
-  optionalDependencies:
-    fsevents "~2.3.3"
-
-vitest@^4.1.0:
-  version "4.1.1"
-  resolved "https://registry.yarnpkg.com/vitest/-/vitest-4.1.1.tgz#04700de9cb16898640ebfb4613abecfa83fac4fc"
-  integrity sha512-yF+o4POL41rpAzj5KVILUxm1GCjKnELvaqmU9TLLUbMfDzuN0UpUR9uaDs+mCtjPe+uYPksXDRLQGGPvj1cTmA==
-  dependencies:
-    "@vitest/expect" "4.1.1"
-    "@vitest/mocker" "4.1.1"
-    "@vitest/pretty-format" "4.1.1"
-    "@vitest/runner" "4.1.1"
-    "@vitest/snapshot" "4.1.1"
-    "@vitest/spy" "4.1.1"
-    "@vitest/utils" "4.1.1"
-    es-module-lexer "^2.0.0"
-    expect-type "^1.3.0"
-    magic-string "^0.30.21"
-    obug "^2.1.1"
-    pathe "^2.0.3"
-    picomatch "^4.0.3"
-    std-env "^4.0.0-rc.1"
-    tinybench "^2.9.0"
-    tinyexec "^1.0.2"
-    tinyglobby "^0.2.15"
-    tinyrainbow "^3.0.3"
-    vite "^6.0.0 || ^7.0.0 || ^8.0.0"
-    why-is-node-running "^2.3.0"
-
-vm-browserify@^1.1.2:
-  version "1.1.2"
-  resolved "https://registry.yarnpkg.com/vm-browserify/-/vm-browserify-1.1.2.tgz#78641c488b8e6ca91a75f511e7a3b32a86e5dda0"
-  integrity sha512-2ham8XPWTONajOR0ohOKOHXkm3+gaBmGut3SRuu75xLd/RRaY6vqgh8NBYYk7+RW3u5AtzPQZG8F10LHkl0lAQ==
-
-void-elements@3.1.0:
-  version "3.1.0"
-  resolved "https://registry.yarnpkg.com/void-elements/-/void-elements-3.1.0.tgz#614f7fbf8d801f0bb5f0661f5b2f5785750e4f09"
-  integrity sha512-Dhxzh5HZuiHQhbvTW9AMetFfBHDMYpo23Uo9btPXgdYP+3T5S+p+jgNy7spra+veYhBP2dCSgxR/i2Y02h5/6w==
-
-w3c-keyname@^2.2.0:
-  version "2.2.8"
-  resolved "https://registry.npmjs.org/w3c-keyname/-/w3c-keyname-2.2.8.tgz#7b17c8c6883d4e8b86ac8aba79d39e880f8869c5"
-  integrity sha512-dpojBhNsCNN7T82Tm7k26A6G9ML3NkhDsnw9n/eoxSRlVBB4CEtIQ/KTCLI2Fwf3ataSXRhYFkQi3SlnFwPvPQ==
-
-w3c-xmlserializer@^5.0.0:
-  version "5.0.0"
-  resolved "https://registry.yarnpkg.com/w3c-xmlserializer/-/w3c-xmlserializer-5.0.0.tgz#f925ba26855158594d907313cedd1476c5967f6c"
-  integrity sha512-o8qghlI8NZHU1lLPrpi2+Uq7abh4GGPpYANlalzWxyWteJOCsr/P+oPBA49TOLu5FTZO4d3F9MnWJfiMo4BkmA==
-  dependencies:
-    xml-name-validator "^5.0.0"
-
-webidl-conversions@^8.0.1:
-  version "8.0.1"
-  resolved "https://registry.yarnpkg.com/webidl-conversions/-/webidl-conversions-8.0.1.tgz#0657e571fe6f06fcb15ca50ed1fdbcb495cd1686"
-  integrity sha512-BMhLD/Sw+GbJC21C/UgyaZX41nPt8bUTg+jWyDeg7e7YN4xOM05YPSIXceACnXVtqyEw/LMClUQMtMZ+PGGpqQ==
-
-whatwg-mimetype@^5.0.0:
-  version "5.0.0"
-  resolved "https://registry.yarnpkg.com/whatwg-mimetype/-/whatwg-mimetype-5.0.0.tgz#d8232895dbd527ceaee74efd4162008fb8a8cf48"
-  integrity sha512-sXcNcHOC51uPGF0P/D4NVtrkjSU2fNsm9iog4ZvZJsL3rjoDAzXZhkm2MWt1y+PUdggKAYVoMAIYcs78wJ51Cw==
-
-whatwg-url@^16.0.0, whatwg-url@^16.0.1:
-  version "16.0.1"
-  resolved "https://registry.yarnpkg.com/whatwg-url/-/whatwg-url-16.0.1.tgz#047f7f4bd36ef76b7198c172d1b1cebc66f764dd"
-  integrity sha512-1to4zXBxmXHV3IiSSEInrreIlu02vUOvrhxJJH5vcxYTBDAx51cqZiKdyTxlecdKNSjj8EcxGBxNf6Vg+945gw==
-  dependencies:
-    "@exodus/bytes" "^1.11.0"
-    tr46 "^6.0.0"
-    webidl-conversions "^8.0.1"
-
-which-boxed-primitive@^1.1.0, which-boxed-primitive@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/which-boxed-primitive/-/which-boxed-primitive-1.1.1.tgz#d76ec27df7fa165f18d5808374a5fe23c29b176e"
-  integrity sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA==
-  dependencies:
-    is-bigint "^1.1.0"
-    is-boolean-object "^1.2.1"
-    is-number-object "^1.1.1"
-    is-string "^1.1.1"
-    is-symbol "^1.1.1"
-
-which-builtin-type@^1.2.1:
-  version "1.2.1"
-  resolved "https://registry.yarnpkg.com/which-builtin-type/-/which-builtin-type-1.2.1.tgz#89183da1b4907ab089a6b02029cc5d8d6574270e"
-  integrity sha512-6iBczoX+kDQ7a3+YJBnh3T+KZRxM/iYNPXicqk66/Qfm1b93iu+yOImkg0zHbj5LNOcNv1TEADiZ0xa34B4q6Q==
-  dependencies:
-    call-bound "^1.0.2"
-    function.prototype.name "^1.1.6"
-    has-tostringtag "^1.0.2"
-    is-async-function "^2.0.0"
-    is-date-object "^1.1.0"
-    is-finalizationregistry "^1.1.0"
-    is-generator-function "^1.0.10"
-    is-regex "^1.2.1"
-    is-weakref "^1.0.2"
-    isarray "^2.0.5"
-    which-boxed-primitive "^1.1.0"
-    which-collection "^1.0.2"
-    which-typed-array "^1.1.16"
-
-which-collection@^1.0.2:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/which-collection/-/which-collection-1.0.2.tgz#627ef76243920a107e7ce8e96191debe4b16c2a0"
-  integrity sha512-K4jVyjnBdgvc86Y6BkaLZEN933SwYOuBFkdmBu9ZfkcAbdVbpITnDmjvZ/aQjRXQrv5EPkTnD1s39GiiqbngCw==
-  dependencies:
-    is-map "^2.0.3"
-    is-set "^2.0.3"
-    is-weakmap "^2.0.2"
-    is-weakset "^2.0.3"
-
-which-typed-array@^1.1.16, which-typed-array@^1.1.19:
-  version "1.1.20"
-  resolved "https://registry.yarnpkg.com/which-typed-array/-/which-typed-array-1.1.20.tgz#3fdb7adfafe0ea69157b1509f3a1cd892bd1d122"
-  integrity sha512-LYfpUkmqwl0h9A2HL09Mms427Q1RZWuOHsukfVcKRq9q95iQxdw0ix1JQrqbcDR9PH1QDwf5Qo8OZb5lksZ8Xg==
-  dependencies:
-    available-typed-arrays "^1.0.7"
-    call-bind "^1.0.8"
-    call-bound "^1.0.4"
-    for-each "^0.3.5"
-    get-proto "^1.0.1"
-    gopd "^1.2.0"
-    has-tostringtag "^1.0.2"
-
-which@^2.0.1:
-  version "2.0.2"
-  resolved "https://registry.yarnpkg.com/which/-/which-2.0.2.tgz#7c6a8dd0a636a0327e10b59c9286eee93f3f51b1"
-  integrity sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==
-  dependencies:
-    isexe "^2.0.0"
-
-why-is-node-running@^2.3.0:
-  version "2.3.0"
-  resolved "https://registry.yarnpkg.com/why-is-node-running/-/why-is-node-running-2.3.0.tgz#a3f69a97107f494b3cdc3bdddd883a7d65cebf04"
-  integrity sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w==
-  dependencies:
-    siginfo "^2.0.0"
-    stackback "0.0.2"
-
-word-wrap@^1.2.5:
-  version "1.2.5"
-  resolved "https://registry.yarnpkg.com/word-wrap/-/word-wrap-1.2.5.tgz#d2c45c6dd4fbce621a66f136cbe328afd0410b34"
-  integrity sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==
-
-wrap-ansi@^9.0.0:
-  version "9.0.2"
-  resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-9.0.2.tgz#956832dea9494306e6d209eb871643bb873d7c98"
-  integrity sha512-42AtmgqjV+X1VpdOfyTGOYRi0/zsoLqtXQckTmqTeybT+BDIbM/Guxo7x3pE2vtpr1ok6xRqM9OpBe+Jyoqyww==
-  dependencies:
-    ansi-styles "^6.2.1"
-    string-width "^7.0.0"
-    strip-ansi "^7.1.0"
-
-wrappy@1:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
-  integrity sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==
-
-xml-name-validator@^5.0.0:
-  version "5.0.0"
-  resolved "https://registry.yarnpkg.com/xml-name-validator/-/xml-name-validator-5.0.0.tgz#82be9b957f7afdacf961e5980f1bf227c0bf7673"
-  integrity sha512-EvGK8EJ3DhaHfbRlETOWAS5pO9MZITeauHKJyb8wyajUfQUenkIg2MvLDTZ4T/TgIcm3HU0TFBgWWboAZ30UHg==
-
-xmlchars@^2.2.0:
-  version "2.2.0"
-  resolved "https://registry.yarnpkg.com/xmlchars/-/xmlchars-2.2.0.tgz#060fe1bcb7f9c76fe2a17db86a9bc3ab894210cb"
-  integrity sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==
-
-y18n@^5.0.5:
-  version "5.0.8"
-  resolved "https://registry.yarnpkg.com/y18n/-/y18n-5.0.8.tgz#7f4934d0f7ca8c56f95314939ddcd2dd91ce1d55"
-  integrity sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==
-
-yaml@^1.10.0:
-  version "1.10.3"
-  resolved "https://registry.yarnpkg.com/yaml/-/yaml-1.10.3.tgz#76e407ed95c42684fb8e14641e5de62fe65bbcb3"
-  integrity sha512-vIYeF1u3CjlhAFekPPAk2h/Kv4T3mAkMox5OymRiJQB0spDP10LHvt+K7G9Ny6NuuMAb25/6n1qyUjAcGNf/AA==
-
-yargs-parser@^22.0.0:
-  version "22.0.0"
-  resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-22.0.0.tgz#87b82094051b0567717346ecd00fd14804b357c8"
-  integrity sha512-rwu/ClNdSMpkSrUb+d6BRsSkLUq1fmfsY6TOpYzTwvwkg1/NRG85KBy3kq++A8LKQwX6lsu+aWad+2khvuXrqw==
-
-yargs@~18.0.0:
-  version "18.0.0"
-  resolved "https://registry.yarnpkg.com/yargs/-/yargs-18.0.0.tgz#6c84259806273a746b09f579087b68a3c2d25bd1"
-  integrity sha512-4UEqdc2RYGHZc7Doyqkrqiln3p9X2DZVxaGbwhn2pi7MrRagKaOcIKe8L3OxYcbhXLgLFUS3zAYuQjKBQgmuNg==
-  dependencies:
-    cliui "^9.0.1"
-    escalade "^3.1.1"
-    get-caller-file "^2.0.5"
-    string-width "^7.2.0"
-    y18n "^5.0.5"
-    yargs-parser "^22.0.0"
-
-yocto-queue@^0.1.0:
-  version "0.1.0"
-  resolved "https://registry.yarnpkg.com/yocto-queue/-/yocto-queue-0.1.0.tgz#0294eb3dee05028d31ee1a5fa2c556a6aaf10a1b"
-  integrity sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==
-
-zip-stream@^4.1.0:
-  version "4.1.1"
-  resolved "https://registry.yarnpkg.com/zip-stream/-/zip-stream-4.1.1.tgz#1337fe974dbaffd2fa9a1ba09662a66932bd7135"
-  integrity sha512-9qv4rlDiopXg4E69k+vMHjNN63YFMe9sZMrdlvKnCjlCRWeCBswPPMPUfx+ipsAWq1LXHe70RcbaHdJJpS6hyQ==
-  dependencies:
-    archiver-utils "^3.0.4"
-    compress-commons "^4.1.2"
-    readable-stream "^3.6.0"
-
-zrender@6.0.0:
-  version "6.0.0"
-  resolved "https://registry.yarnpkg.com/zrender/-/zrender-6.0.0.tgz#947077bc69cdea744134984927f132f3727f8079"
-  integrity sha512-41dFXEEXuJpNecuUQq6JlbybmnHaqqpGlbH1yxnA5V9MMP4SbohSVZsJIwz+zdjQXSSlR1Vc34EgH1zxyTDvhg==
-  dependencies:
-    tslib "2.3.0"
+# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
+# yarn lockfile v1
+
+
+"@adobe/css-tools@^4.4.0":
+  version "4.4.4"
+  resolved "https://registry.npmjs.org/@adobe/css-tools/-/css-tools-4.4.4.tgz"
+  integrity sha512-Elp+iwUx5rN5+Y8xLt5/GRoG20WGoDCQ/1Fb+1LiGtvwbDavuSk0jhD/eZdckHAuzcDzccnkv+rEjyWfRx18gg==
+
+"@asamuzakjp/css-color@^5.0.1":
+  version "5.0.1"
+  resolved "https://registry.npmjs.org/@asamuzakjp/css-color/-/css-color-5.0.1.tgz"
+  integrity sha512-2SZFvqMyvboVV1d15lMf7XiI3m7SDqXUuKaTymJYLN6dSGadqp+fVojqJlVoMlbZnlTmu3S0TLwLTJpvBMO1Aw==
+  dependencies:
+    "@csstools/css-calc" "^3.1.1"
+    "@csstools/css-color-parser" "^4.0.2"
+    "@csstools/css-parser-algorithms" "^4.0.0"
+    "@csstools/css-tokenizer" "^4.0.0"
+    lru-cache "^11.2.6"
+
+"@asamuzakjp/dom-selector@^7.0.3":
+  version "7.0.4"
+  resolved "https://registry.npmjs.org/@asamuzakjp/dom-selector/-/dom-selector-7.0.4.tgz"
+  integrity sha512-jXR6x4AcT3eIrS2fSNAwJpwirOkGcd+E7F7CP3zjdTqz9B/2huHOL8YJZBgekKwLML+u7qB/6P1LXQuMScsx0w==
+  dependencies:
+    "@asamuzakjp/nwsapi" "^2.3.9"
+    bidi-js "^1.0.3"
+    css-tree "^3.2.1"
+    is-potential-custom-element-name "^1.0.1"
+    lru-cache "^11.2.7"
+
+"@asamuzakjp/nwsapi@^2.3.9":
+  version "2.3.9"
+  resolved "https://registry.npmjs.org/@asamuzakjp/nwsapi/-/nwsapi-2.3.9.tgz"
+  integrity sha512-n8GuYSrI9bF7FFZ/SjhwevlHc8xaVlb/7HmHelnc/PZXBD2ZR49NnN9sMMuDdEGPeeRQ5d0hqlSlEpgCX3Wl0Q==
+
+"@azure/msal-browser@^5.6.3":
+  version "5.6.3"
+  resolved "https://registry.npmjs.org/@azure/msal-browser/-/msal-browser-5.6.3.tgz"
+  integrity sha512-sTjMtUm+bJpENU/1WlRzHEsgEHppZDZ1EtNyaOODg/sQBtMxxJzGB+MOCM+T2Q5Qe1fKBrdxUmjyRxm0r7Ez9w==
+  dependencies:
+    "@azure/msal-common" "16.4.1"
+
+"@azure/msal-common@16.4.1":
+  version "16.4.1"
+  resolved "https://registry.npmjs.org/@azure/msal-common/-/msal-common-16.4.1.tgz"
+  integrity sha512-Bl8f+w37xkXsYh7QRkAKCFGYtWMYuOVO7Lv+BxILrvGz3HbIEF22Pt0ugyj0QPOl6NLrHcnNUQ9yeew98P/5iw==
+
+"@babel/code-frame@^7.0.0", "@babel/code-frame@^7.10.4", "@babel/code-frame@^7.28.6", "@babel/code-frame@^7.29.0":
+  version "7.29.0"
+  resolved "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.29.0.tgz"
+  integrity sha512-9NhCeYjq9+3uxgdtp20LSiJXJvN0FeCtNGpJxuMFZ1Kv3cWUNb6DOhJwUvcVCzKGR66cw4njwM6hrJLqgOwbcw==
+  dependencies:
+    "@babel/helper-validator-identifier" "^7.28.5"
+    js-tokens "^4.0.0"
+    picocolors "^1.1.1"
+
+"@babel/generator@^7.29.0":
+  version "7.29.1"
+  resolved "https://registry.npmjs.org/@babel/generator/-/generator-7.29.1.tgz"
+  integrity sha512-qsaF+9Qcm2Qv8SRIMMscAvG4O3lJ0F1GuMo5HR/Bp02LopNgnZBC/EkbevHFeGs4ls/oPz9v+Bsmzbkbe+0dUw==
+  dependencies:
+    "@babel/parser" "^7.29.0"
+    "@babel/types" "^7.29.0"
+    "@jridgewell/gen-mapping" "^0.3.12"
+    "@jridgewell/trace-mapping" "^0.3.28"
+    jsesc "^3.0.2"
+
+"@babel/helper-globals@^7.28.0":
+  version "7.28.0"
+  resolved "https://registry.npmjs.org/@babel/helper-globals/-/helper-globals-7.28.0.tgz"
+  integrity sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw==
+
+"@babel/helper-module-imports@^7.16.7":
+  version "7.28.6"
+  resolved "https://registry.npmjs.org/@babel/helper-module-imports/-/helper-module-imports-7.28.6.tgz"
+  integrity sha512-l5XkZK7r7wa9LucGw9LwZyyCUscb4x37JWTPz7swwFE/0FMQAGpiWUZn8u9DzkSBWEcK25jmvubfpw2dnAMdbw==
+  dependencies:
+    "@babel/traverse" "^7.28.6"
+    "@babel/types" "^7.28.6"
+
+"@babel/helper-string-parser@^7.27.1":
+  version "7.27.1"
+  resolved "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz"
+  integrity sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==
+
+"@babel/helper-validator-identifier@^7.28.5":
+  version "7.28.5"
+  resolved "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.28.5.tgz"
+  integrity sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==
+
+"@babel/parser@^7.28.6", "@babel/parser@^7.29.0":
+  version "7.29.2"
+  resolved "https://registry.npmjs.org/@babel/parser/-/parser-7.29.2.tgz"
+  integrity sha512-4GgRzy/+fsBa72/RZVJmGKPmZu9Byn8o4MoLpmNe1m8ZfYnz5emHLQz3U4gLud6Zwl0RZIcgiLD7Uq7ySFuDLA==
+  dependencies:
+    "@babel/types" "^7.29.0"
+
+"@babel/runtime@^7.12.1", "@babel/runtime@^7.12.5", "@babel/runtime@^7.18.3", "@babel/runtime@^7.23.2", "@babel/runtime@^7.28.6", "@babel/runtime@^7.29.2", "@babel/runtime@^7.5.5", "@babel/runtime@^7.8.7", "@babel/runtime@^7.9.2":
+  version "7.29.2"
+  resolved "https://registry.npmjs.org/@babel/runtime/-/runtime-7.29.2.tgz"
+  integrity sha512-JiDShH45zKHWyGe4ZNVRrCjBz8Nh9TMmZG1kh4QTK8hCBTWBi8Da+i7s1fJw7/lYpM4ccepSNfqzZ/QvABBi5g==
+
+"@babel/template@^7.28.6":
+  version "7.28.6"
+  resolved "https://registry.npmjs.org/@babel/template/-/template-7.28.6.tgz"
+  integrity sha512-YA6Ma2KsCdGb+WC6UpBVFJGXL58MDA6oyONbjyF/+5sBgxY/dwkhLogbMT2GXXyU84/IhRw/2D1Os1B/giz+BQ==
+  dependencies:
+    "@babel/code-frame" "^7.28.6"
+    "@babel/parser" "^7.28.6"
+    "@babel/types" "^7.28.6"
+
+"@babel/traverse@^7.28.6":
+  version "7.29.0"
+  resolved "https://registry.npmjs.org/@babel/traverse/-/traverse-7.29.0.tgz"
+  integrity sha512-4HPiQr0X7+waHfyXPZpWPfWL/J7dcN1mx9gL6WdQVMbPnF3+ZhSMs8tCxN7oHddJE9fhNE7+lxdnlyemKfJRuA==
+  dependencies:
+    "@babel/code-frame" "^7.29.0"
+    "@babel/generator" "^7.29.0"
+    "@babel/helper-globals" "^7.28.0"
+    "@babel/parser" "^7.29.0"
+    "@babel/template" "^7.28.6"
+    "@babel/types" "^7.29.0"
+    debug "^4.3.1"
+
+"@babel/types@^7.28.6", "@babel/types@^7.29.0":
+  version "7.29.0"
+  resolved "https://registry.npmjs.org/@babel/types/-/types-7.29.0.tgz"
+  integrity sha512-LwdZHpScM4Qz8Xw2iKSzS+cfglZzJGvofQICy7W7v4caru4EaAmyUuO6BGrbyQ2mYV11W0U8j5mBhd14dd3B0A==
+  dependencies:
+    "@babel/helper-string-parser" "^7.27.1"
+    "@babel/helper-validator-identifier" "^7.28.5"
+
+"@bramus/specificity@^2.4.2":
+  version "2.4.2"
+  resolved "https://registry.npmjs.org/@bramus/specificity/-/specificity-2.4.2.tgz"
+  integrity sha512-ctxtJ/eA+t+6q2++vj5j7FYX3nRu311q1wfYH3xjlLOsczhlhxAg2FWNUXhpGvAw3BWo1xBcvOV6/YLc2r5FJw==
+  dependencies:
+    css-tree "^3.0.0"
+
+"@csstools/color-helpers@^6.0.2":
+  version "6.0.2"
+  resolved "https://registry.npmjs.org/@csstools/color-helpers/-/color-helpers-6.0.2.tgz"
+  integrity sha512-LMGQLS9EuADloEFkcTBR3BwV/CGHV7zyDxVRtVDTwdI2Ca4it0CCVTT9wCkxSgokjE5Ho41hEPgb8OEUwoXr6Q==
+
+"@csstools/css-calc@^3.1.1":
+  version "3.1.1"
+  resolved "https://registry.npmjs.org/@csstools/css-calc/-/css-calc-3.1.1.tgz"
+  integrity sha512-HJ26Z/vmsZQqs/o3a6bgKslXGFAungXGbinULZO3eMsOyNJHeBBZfup5FiZInOghgoM4Hwnmw+OgbJCNg1wwUQ==
+
+"@csstools/css-color-parser@^4.0.2":
+  version "4.0.2"
+  resolved "https://registry.npmjs.org/@csstools/css-color-parser/-/css-color-parser-4.0.2.tgz"
+  integrity sha512-0GEfbBLmTFf0dJlpsNU7zwxRIH0/BGEMuXLTCvFYxuL1tNhqzTbtnFICyJLTNK4a+RechKP75e7w42ClXSnJQw==
+  dependencies:
+    "@csstools/color-helpers" "^6.0.2"
+    "@csstools/css-calc" "^3.1.1"
+
+"@csstools/css-parser-algorithms@^4.0.0":
+  version "4.0.0"
+  resolved "https://registry.npmjs.org/@csstools/css-parser-algorithms/-/css-parser-algorithms-4.0.0.tgz"
+  integrity sha512-+B87qS7fIG3L5h3qwJ/IFbjoVoOe/bpOdh9hAjXbvx0o8ImEmUsGXN0inFOnk2ChCFgqkkGFQ+TpM5rbhkKe4w==
+
+"@csstools/css-syntax-patches-for-csstree@^1.1.1":
+  version "1.1.1"
+  resolved "https://registry.npmjs.org/@csstools/css-syntax-patches-for-csstree/-/css-syntax-patches-for-csstree-1.1.1.tgz"
+  integrity sha512-BvqN0AMWNAnLk9G8jnUT77D+mUbY/H2b3uDTvg2isJkHaOufUE2R3AOwxWo7VBQKT1lOdwdvorddo2B/lk64+w==
+
+"@csstools/css-tokenizer@^4.0.0":
+  version "4.0.0"
+  resolved "https://registry.npmjs.org/@csstools/css-tokenizer/-/css-tokenizer-4.0.0.tgz"
+  integrity sha512-QxULHAm7cNu72w97JUNCBFODFaXpbDg+dP8b/oWFAZ2MTRppA3U00Y2L1HqaS4J6yBqxwa/Y3nMBaxVKbB/NsA==
+
+"@emotion/babel-plugin@^11.13.5":
+  version "11.13.5"
+  resolved "https://registry.npmjs.org/@emotion/babel-plugin/-/babel-plugin-11.13.5.tgz"
+  integrity sha512-pxHCpT2ex+0q+HH91/zsdHkw/lXd468DIN2zvfvLtPKLLMo6gQj7oLObq8PhkrxOZb/gGCq03S3Z7PDhS8pduQ==
+  dependencies:
+    "@babel/helper-module-imports" "^7.16.7"
+    "@babel/runtime" "^7.18.3"
+    "@emotion/hash" "^0.9.2"
+    "@emotion/memoize" "^0.9.0"
+    "@emotion/serialize" "^1.3.3"
+    babel-plugin-macros "^3.1.0"
+    convert-source-map "^1.5.0"
+    escape-string-regexp "^4.0.0"
+    find-root "^1.1.0"
+    source-map "^0.5.7"
+    stylis "4.2.0"
+
+"@emotion/cache@^11.14.0":
+  version "11.14.0"
+  resolved "https://registry.npmjs.org/@emotion/cache/-/cache-11.14.0.tgz"
+  integrity sha512-L/B1lc/TViYk4DcpGxtAVbx0ZyiKM5ktoIyafGkH6zg/tj+mA+NE//aPYKG0k8kCHSHVJrpLpcAlOBEXQ3SavA==
+  dependencies:
+    "@emotion/memoize" "^0.9.0"
+    "@emotion/sheet" "^1.4.0"
+    "@emotion/utils" "^1.4.2"
+    "@emotion/weak-memoize" "^0.4.0"
+    stylis "4.2.0"
+
+"@emotion/hash@^0.9.2":
+  version "0.9.2"
+  resolved "https://registry.npmjs.org/@emotion/hash/-/hash-0.9.2.tgz"
+  integrity sha512-MyqliTZGuOm3+5ZRSaaBGP3USLw6+EGykkwZns2EPC5g8jJ4z9OrdZY9apkl3+UP9+sdz76YYkwCKP5gh8iY3g==
+
+"@emotion/is-prop-valid@^1.3.0":
version "1.4.0" + resolved "https://registry.npmjs.org/@emotion/is-prop-valid/-/is-prop-valid-1.4.0.tgz" + integrity sha512-QgD4fyscGcbbKwJmqNvUMSE02OsHUa+lAWKdEUIJKgqe5IwRSKd7+KhibEWdaKwgjLj0DRSHA9biAIqGBk05lw== + dependencies: + "@emotion/memoize" "^0.9.0" + +"@emotion/memoize@^0.9.0": + version "0.9.0" + resolved "https://registry.npmjs.org/@emotion/memoize/-/memoize-0.9.0.tgz" + integrity sha512-30FAj7/EoJ5mwVPOWhAyCX+FPfMDrVecJAM+Iw9NRoSl4BBAQeqj4cApHHUXOVvIPgLVDsCFoz/hGD+5QQD1GQ== + +"@emotion/react@^11.14.0": + version "11.14.0" + resolved "https://registry.npmjs.org/@emotion/react/-/react-11.14.0.tgz" + integrity sha512-O000MLDBDdk/EohJPFUqvnp4qnHeYkVP5B0xEG0D/L7cOKP9kefu2DXn8dj74cQfsEzUqh+sr1RzFqiL1o+PpA== + dependencies: + "@babel/runtime" "^7.18.3" + "@emotion/babel-plugin" "^11.13.5" + "@emotion/cache" "^11.14.0" + "@emotion/serialize" "^1.3.3" + "@emotion/use-insertion-effect-with-fallbacks" "^1.2.0" + "@emotion/utils" "^1.4.2" + "@emotion/weak-memoize" "^0.4.0" + hoist-non-react-statics "^3.3.1" + +"@emotion/serialize@^1.3.3": + version "1.3.3" + resolved "https://registry.npmjs.org/@emotion/serialize/-/serialize-1.3.3.tgz" + integrity sha512-EISGqt7sSNWHGI76hC7x1CksiXPahbxEOrC5RjmFRJTqLyEK9/9hZvBbiYn70dw4wuwMKiEMCUlR6ZXTSWQqxA== + dependencies: + "@emotion/hash" "^0.9.2" + "@emotion/memoize" "^0.9.0" + "@emotion/unitless" "^0.10.0" + "@emotion/utils" "^1.4.2" + csstype "^3.0.2" + +"@emotion/sheet@^1.4.0": + version "1.4.0" + resolved "https://registry.npmjs.org/@emotion/sheet/-/sheet-1.4.0.tgz" + integrity sha512-fTBW9/8r2w3dXWYM4HCB1Rdp8NLibOw2+XELH5m5+AkWiL/KqYX6dc0kKYlaYyKjrQ6ds33MCdMPEwgs2z1rqg== + +"@emotion/styled@^11.14.0": + version "11.14.1" + resolved "https://registry.npmjs.org/@emotion/styled/-/styled-11.14.1.tgz" + integrity sha512-qEEJt42DuToa3gurlH4Qqc1kVpNq8wO8cJtDzU46TjlzWjDlsVyevtYCRijVq3SrHsROS+gVQ8Fnea108GnKzw== + dependencies: + "@babel/runtime" "^7.18.3" + "@emotion/babel-plugin" "^11.13.5" + "@emotion/is-prop-valid" "^1.3.0" + 
"@emotion/serialize" "^1.3.3" + "@emotion/use-insertion-effect-with-fallbacks" "^1.2.0" + "@emotion/utils" "^1.4.2" + +"@emotion/unitless@^0.10.0": + version "0.10.0" + resolved "https://registry.npmjs.org/@emotion/unitless/-/unitless-0.10.0.tgz" + integrity sha512-dFoMUuQA20zvtVTuxZww6OHoJYgrzfKM1t52mVySDJnMSEa08ruEvdYQbhvyu6soU+NeLVd3yKfTfT0NeV6qGg== + +"@emotion/use-insertion-effect-with-fallbacks@^1.2.0": + version "1.2.0" + resolved "https://registry.npmjs.org/@emotion/use-insertion-effect-with-fallbacks/-/use-insertion-effect-with-fallbacks-1.2.0.tgz" + integrity sha512-yJMtVdH59sxi/aVJBpk9FQq+OR8ll5GT8oWd57UpeaKEVGab41JWaCFA7FRLoMLloOZF/c/wsPoe+bfGmRKgDg== + +"@emotion/utils@^1.4.2": + version "1.4.2" + resolved "https://registry.npmjs.org/@emotion/utils/-/utils-1.4.2.tgz" + integrity sha512-3vLclRofFziIa3J2wDh9jjbkUz9qk5Vi3IZ/FSTKViB0k+ef0fPV7dYrUIugbgupYDx7v9ud/SjrtEP8Y4xLoA== + +"@emotion/weak-memoize@^0.4.0": + version "0.4.0" + resolved "https://registry.npmjs.org/@emotion/weak-memoize/-/weak-memoize-0.4.0.tgz" + integrity sha512-snKqtPW01tN0ui7yu9rGv69aJXr/a/Ywvl11sUjNtEcRc+ng/mQriFL0wLXMef74iHa/EkftbDzU9F8iFbH+zg== + +"@esbuild/darwin-arm64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.21.5.tgz" + integrity sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ== + +"@eslint-community/eslint-utils@^4.8.0", "@eslint-community/eslint-utils@^4.9.1": + version "4.9.1" + resolved "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz" + integrity sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ== + dependencies: + eslint-visitor-keys "^3.4.3" + +"@eslint-community/regexpp@^4.12.1", "@eslint-community/regexpp@^4.12.2": + version "4.12.2" + resolved "https://registry.npmjs.org/@eslint-community/regexpp/-/regexpp-4.12.2.tgz" + integrity 
sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew== + +"@eslint/config-array@^0.21.2": + version "0.21.2" + resolved "https://registry.npmjs.org/@eslint/config-array/-/config-array-0.21.2.tgz" + integrity sha512-nJl2KGTlrf9GjLimgIru+V/mzgSK0ABCDQRvxw5BjURL7WfH5uoWmizbH7QB6MmnMBd8cIC9uceWnezL1VZWWw== + dependencies: + "@eslint/object-schema" "^2.1.7" + debug "^4.3.1" + minimatch "^3.1.5" + +"@eslint/config-helpers@^0.4.2": + version "0.4.2" + resolved "https://registry.npmjs.org/@eslint/config-helpers/-/config-helpers-0.4.2.tgz" + integrity sha512-gBrxN88gOIf3R7ja5K9slwNayVcZgK6SOUORm2uBzTeIEfeVaIhOpCtTox3P6R7o2jLFwLFTLnC7kU/RGcYEgw== + dependencies: + "@eslint/core" "^0.17.0" + +"@eslint/core@^0.17.0": + version "0.17.0" + resolved "https://registry.npmjs.org/@eslint/core/-/core-0.17.0.tgz" + integrity sha512-yL/sLrpmtDaFEiUj1osRP4TI2MDz1AddJL+jZ7KSqvBuliN4xqYY54IfdN8qD8Toa6g1iloph1fxQNkjOxrrpQ== + dependencies: + "@types/json-schema" "^7.0.15" + +"@eslint/eslintrc@^3.3.5": + version "3.3.5" + resolved "https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-3.3.5.tgz" + integrity sha512-4IlJx0X0qftVsN5E+/vGujTRIFtwuLbNsVUe7TO6zYPDR1O6nFwvwhIKEKSrl6dZchmYBITazxKoUYOjdtjlRg== + dependencies: + ajv "^6.14.0" + debug "^4.3.2" + espree "^10.0.1" + globals "^14.0.0" + ignore "^5.2.0" + import-fresh "^3.2.1" + js-yaml "^4.1.1" + minimatch "^3.1.5" + strip-json-comments "^3.1.1" + +"@eslint/js@^9.15.0", "@eslint/js@9.39.4": + version "9.39.4" + resolved "https://registry.npmjs.org/@eslint/js/-/js-9.39.4.tgz" + integrity sha512-nE7DEIchvtiFTwBw4Lfbu59PG+kCofhjsKaCWzxTpt4lfRjRMqG6uMBzKXuEcyXhOHoUp9riAm7/aWYGhXZ9cw== + +"@eslint/object-schema@^2.1.7": + version "2.1.7" + resolved "https://registry.npmjs.org/@eslint/object-schema/-/object-schema-2.1.7.tgz" + integrity sha512-VtAOaymWVfZcmZbp6E2mympDIHvyjXs/12LqWYjVw6qjrfF+VK+fyG33kChz3nnK+SU5/NeHOqrTEHS8sXO3OA== + +"@eslint/plugin-kit@^0.4.1": + version "0.4.1" + resolved 
"https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.4.1.tgz" + integrity sha512-43/qtrDUokr7LJqoF2c3+RInu/t4zfrpYdoSDfYyhg52rwLV6TnOvdG4fXm7IkSB3wErkcmJS9iEhjVtOSEjjA== + dependencies: + "@eslint/core" "^0.17.0" + levn "^0.4.1" + +"@exodus/bytes@^1.11.0", "@exodus/bytes@^1.15.0", "@exodus/bytes@^1.6.0": + version "1.15.0" + resolved "https://registry.npmjs.org/@exodus/bytes/-/bytes-1.15.0.tgz" + integrity sha512-UY0nlA+feH81UGSHv92sLEPLCeZFjXOuHhrIo0HQydScuQc8s0A7kL/UdgwgDq8g8ilksmuoF35YVTNphV2aBQ== + +"@fast-csv/format@4.3.5": + version "4.3.5" + resolved "https://registry.npmjs.org/@fast-csv/format/-/format-4.3.5.tgz" + integrity sha512-8iRn6QF3I8Ak78lNAa+Gdl5MJJBM5vRHivFtMRUWINdevNo00K7OXxS2PshawLKTejVwieIlPmK5YlLu6w4u8A== + dependencies: + "@types/node" "^14.0.1" + lodash.escaperegexp "^4.1.2" + lodash.isboolean "^3.0.3" + lodash.isequal "^4.5.0" + lodash.isfunction "^3.0.9" + lodash.isnil "^4.0.0" + +"@fast-csv/parse@4.3.6": + version "4.3.6" + resolved "https://registry.npmjs.org/@fast-csv/parse/-/parse-4.3.6.tgz" + integrity sha512-uRsLYksqpbDmWaSmzvJcuApSEe38+6NQZBUsuAyMZKqHxH0g1wcJgsKUvN3WC8tewaqFjBMMGrkHmC+T7k8LvA== + dependencies: + "@types/node" "^14.0.1" + lodash.escaperegexp "^4.1.2" + lodash.groupby "^4.6.0" + lodash.isfunction "^3.0.9" + lodash.isnil "^4.0.0" + lodash.isundefined "^3.0.1" + lodash.uniq "^4.5.0" + +"@floating-ui/core@^1.7.5": + version "1.7.5" + resolved "https://registry.npmjs.org/@floating-ui/core/-/core-1.7.5.tgz" + integrity sha512-1Ih4WTWyw0+lKyFMcBHGbb5U5FtuHJuujoyyr5zTaWS5EYMeT6Jb2AuDeftsCsEuchO+mM2ij5+q9crhydzLhQ== + dependencies: + "@floating-ui/utils" "^0.2.11" + +"@floating-ui/dom@^1.0.0": + version "1.7.6" + resolved "https://registry.npmjs.org/@floating-ui/dom/-/dom-1.7.6.tgz" + integrity sha512-9gZSAI5XM36880PPMm//9dfiEngYoC6Am2izES1FF406YFsjvyBMmeJ2g4SAju3xWwtuynNRFL2s9hgxpLI5SQ== + dependencies: + "@floating-ui/core" "^1.7.5" + "@floating-ui/utils" "^0.2.11" + +"@floating-ui/utils@^0.2.11": + version 
"0.2.11" + resolved "https://registry.npmjs.org/@floating-ui/utils/-/utils-0.2.11.tgz" + integrity sha512-RiB/yIh78pcIxl6lLMG0CgBXAZ2Y0eVHqMPYugu+9U0AeT6YBeiJpf7lbdJNIugFP5SIjwNRgo4DhR1Qxi26Gg== + +"@fontsource/roboto@^4.5.5": + version "4.5.8" + resolved "https://registry.npmjs.org/@fontsource/roboto/-/roboto-4.5.8.tgz" + integrity sha512-CnD7zLItIzt86q4Sj3kZUiLcBk1dSk81qcqgMGaZe7SQ1P8hFNxhMl5AZthK1zrDM5m74VVhaOpuMGIL4gagaA== + +"@humanfs/core@^0.19.1": + version "0.19.1" + resolved "https://registry.npmjs.org/@humanfs/core/-/core-0.19.1.tgz" + integrity sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA== + +"@humanfs/node@^0.16.6": + version "0.16.7" + resolved "https://registry.npmjs.org/@humanfs/node/-/node-0.16.7.tgz" + integrity sha512-/zUx+yOsIrG4Y43Eh2peDeKCxlRt/gET6aHfaKpuq267qXdYDFViVHfMaLyygZOnl0kGWxFIgsBy8QFuTLUXEQ== + dependencies: + "@humanfs/core" "^0.19.1" + "@humanwhocodes/retry" "^0.4.0" + +"@humanwhocodes/module-importer@^1.0.1": + version "1.0.1" + resolved "https://registry.npmjs.org/@humanwhocodes/module-importer/-/module-importer-1.0.1.tgz" + integrity sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA== + +"@humanwhocodes/retry@^0.4.0", "@humanwhocodes/retry@^0.4.2": + version "0.4.3" + resolved "https://registry.npmjs.org/@humanwhocodes/retry/-/retry-0.4.3.tgz" + integrity sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ== + +"@jridgewell/gen-mapping@^0.3.12": + version "0.3.13" + resolved "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz" + integrity sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA== + dependencies: + "@jridgewell/sourcemap-codec" "^1.5.0" + "@jridgewell/trace-mapping" "^0.3.24" + +"@jridgewell/resolve-uri@^3.1.0": + version "3.1.2" + resolved "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz" + 
integrity sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw== + +"@jridgewell/sourcemap-codec@^1.4.14", "@jridgewell/sourcemap-codec@^1.5.0", "@jridgewell/sourcemap-codec@^1.5.5": + version "1.5.5" + resolved "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz" + integrity sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og== + +"@jridgewell/trace-mapping@^0.3.24", "@jridgewell/trace-mapping@^0.3.28": + version "0.3.31" + resolved "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz" + integrity sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw== + dependencies: + "@jridgewell/resolve-uri" "^3.1.0" + "@jridgewell/sourcemap-codec" "^1.4.14" + +"@kurkle/color@^0.3.0": + version "0.3.4" + resolved "https://registry.npmjs.org/@kurkle/color/-/color-0.3.4.tgz" + integrity sha512-M5UknZPHRu3DEDWoipU6sE8PdkZ6Z/S+v4dD+Ke8IaNlpdSQah50lz1KtcFBa2vsdOnwbbnxJwVM4wty6udA5w== + +"@mui/core-downloads-tracker@^7.3.9": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/core-downloads-tracker/-/core-downloads-tracker-7.3.9.tgz" + integrity sha512-MOkOCTfbMJwLshlBCKJ59V2F/uaLYfmKnN76kksj6jlGUVdI25A9Hzs08m+zjBRdLv+sK7Rqdsefe8X7h/6PCw== + +"@mui/icons-material@^7.1.1": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/icons-material/-/icons-material-7.3.9.tgz" + integrity sha512-BT+zPJXss8Hg/oEMRmHl17Q97bPACG4ufFSfGEdhiE96jOyR5Dz1ty7ZWt1fVGR0y1p+sSgEwQT/MNZQmoWDCw== + dependencies: + "@babel/runtime" "^7.28.6" + +"@mui/lab@^7.0.1-beta.18": + version "7.0.1-beta.23" + resolved "https://registry.npmjs.org/@mui/lab/-/lab-7.0.1-beta.23.tgz" + integrity sha512-661LhBtL33DWeRk7DXXu4LvbHUmTRkoybiVgKkdLx6gA4Nbr1r6B1U+yZGcTm5GfY25nrtS083aoy3P0wuuJ3A== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/system" "^7.3.9" + "@mui/types" "^7.4.12" + "@mui/utils" "^7.3.9" + clsx "^2.1.1" + 
prop-types "^15.8.1" + +"@mui/material@^7.1.1": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/material/-/material-7.3.9.tgz" + integrity sha512-I8yO3t4T0y7bvDiR1qhIN6iBWZOTBfVOnmLlM7K6h3dx5YX2a7rnkuXzc2UkZaqhxY9NgTnEbdPlokR1RxCNRQ== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/core-downloads-tracker" "^7.3.9" + "@mui/system" "^7.3.9" + "@mui/types" "^7.4.12" + "@mui/utils" "^7.3.9" + "@popperjs/core" "^2.11.8" + "@types/react-transition-group" "^4.4.12" + clsx "^2.1.1" + csstype "^3.2.3" + prop-types "^15.8.1" + react-is "^19.2.3" + react-transition-group "^4.4.5" + +"@mui/private-theming@^7.3.9": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/private-theming/-/private-theming-7.3.9.tgz" + integrity sha512-ErIyRQvsiQEq7Yvcvfw9UDHngaqjMy9P3JDPnRAaKG5qhpl2C4tX/W1S4zJvpu+feihmZJStjIyvnv6KDbIrlw== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/utils" "^7.3.9" + prop-types "^15.8.1" + +"@mui/styled-engine@^7.3.9": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/styled-engine/-/styled-engine-7.3.9.tgz" + integrity sha512-JqujWt5bX4okjUPGpVof/7pvgClqh7HvIbsIBIOOlCh2u3wG/Bwp4+E1bc1dXSwkrkp9WUAoNdI5HEC+5HKvMw== + dependencies: + "@babel/runtime" "^7.28.6" + "@emotion/cache" "^11.14.0" + "@emotion/serialize" "^1.3.3" + "@emotion/sheet" "^1.4.0" + csstype "^3.2.3" + prop-types "^15.8.1" + +"@mui/system@^7.3.9": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/system/-/system-7.3.9.tgz" + integrity sha512-aL1q9am8XpRrSabv9qWf5RHhJICJql34wnrc1nz0MuOglPRYF/liN+c8VqZdTvUn9qg+ZjRVbKf4sJVFfIDtmg== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/private-theming" "^7.3.9" + "@mui/styled-engine" "^7.3.9" + "@mui/types" "^7.4.12" + "@mui/utils" "^7.3.9" + clsx "^2.1.1" + csstype "^3.2.3" + prop-types "^15.8.1" + +"@mui/types@^7.4.12": + version "7.4.12" + resolved "https://registry.npmjs.org/@mui/types/-/types-7.4.12.tgz" + integrity 
sha512-iKNAF2u9PzSIj40CjvKJWxFXJo122jXVdrmdh0hMYd+FR+NuJMkr/L88XwWLCRiJ5P1j+uyac25+Kp6YC4hu6w== + dependencies: + "@babel/runtime" "^7.28.6" + +"@mui/utils@^7.3.9": + version "7.3.9" + resolved "https://registry.npmjs.org/@mui/utils/-/utils-7.3.9.tgz" + integrity sha512-U6SdZaGbfb65fqTsH3V5oJdFj9uYwyLE2WVuNvmbggTSDBb8QHrFsqY8BN3taK9t3yJ8/BPHD/kNvLNyjwM7Yw== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/types" "^7.4.12" + "@types/prop-types" "^15.7.15" + clsx "^2.1.1" + prop-types "^15.8.1" + react-is "^19.2.3" + +"@oxc-project/types@=0.122.0": + version "0.122.0" + resolved "https://registry.npmjs.org/@oxc-project/types/-/types-0.122.0.tgz" + integrity sha512-oLAl5kBpV4w69UtFZ9xqcmTi+GENWOcPF7FCrczTiBbmC0ibXxCwyvZGbO39rCVEuLGAZM84DH0pUIyyv/YJzA== + +"@parcel/watcher-darwin-arm64@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-darwin-arm64/-/watcher-darwin-arm64-2.5.6.tgz" + integrity sha512-Z2ZdrnwyXvvvdtRHLmM4knydIdU9adO3D4n/0cVipF3rRiwP+3/sfzpAwA/qKFL6i1ModaabkU7IbpeMBgiVEA== + +"@parcel/watcher@^2.4.1": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher/-/watcher-2.5.6.tgz" + integrity sha512-tmmZ3lQxAe/k/+rNnXQRawJ4NjxO2hqiOLTHvWchtGZULp4RyFeh6aU4XdOYBFe2KE1oShQTv4AblOs2iOrNnQ== + dependencies: + detect-libc "^2.0.3" + is-glob "^4.0.3" + node-addon-api "^7.0.0" + picomatch "^4.0.3" + optionalDependencies: + "@parcel/watcher-android-arm64" "2.5.6" + "@parcel/watcher-darwin-arm64" "2.5.6" + "@parcel/watcher-darwin-x64" "2.5.6" + "@parcel/watcher-freebsd-x64" "2.5.6" + "@parcel/watcher-linux-arm-glibc" "2.5.6" + "@parcel/watcher-linux-arm-musl" "2.5.6" + "@parcel/watcher-linux-arm64-glibc" "2.5.6" + "@parcel/watcher-linux-arm64-musl" "2.5.6" + "@parcel/watcher-linux-x64-glibc" "2.5.6" + "@parcel/watcher-linux-x64-musl" "2.5.6" + "@parcel/watcher-win32-arm64" "2.5.6" + "@parcel/watcher-win32-ia32" "2.5.6" + "@parcel/watcher-win32-x64" "2.5.6" + +"@popperjs/core@^2.11.8": + version "2.11.8" + resolved 
"https://registry.npmjs.org/@popperjs/core/-/core-2.11.8.tgz" + integrity sha512-P1st0aksCrn9sGZhp8GMYwBnQsbvAWsZAX44oXNNvLHGqAOcoVxmjZiohstwQ7SqKnbR47akdNi+uleWD8+g6A== + +"@react-dnd/asap@^5.0.1": + version "5.0.2" + resolved "https://registry.npmjs.org/@react-dnd/asap/-/asap-5.0.2.tgz" + integrity sha512-WLyfoHvxhs0V9U+GTsGilGgf2QsPl6ZZ44fnv0/b8T3nQyvzxidxsg/ZltbWssbsRDlYW8UKSQMTGotuTotZ6A== + +"@react-dnd/invariant@^4.0.1": + version "4.0.2" + resolved "https://registry.npmjs.org/@react-dnd/invariant/-/invariant-4.0.2.tgz" + integrity sha512-xKCTqAK/FFauOM9Ta2pswIyT3D8AQlfrYdOi/toTPEhqCuAs1v5tcJ3Y08Izh1cJ5Jchwy9SeAXmMg6zrKs2iw== + +"@react-dnd/shallowequal@^4.0.1": + version "4.0.2" + resolved "https://registry.npmjs.org/@react-dnd/shallowequal/-/shallowequal-4.0.2.tgz" + integrity sha512-/RVXdLvJxLg4QKvMoM5WlwNR9ViO9z8B/qPcc+C0Sa/teJY7QG7kJ441DwzOjMYEY7GmU4dj5EcGHIkKZiQZCA== + +"@reduxjs/toolkit@^1.8.6": + version "1.9.7" + resolved "https://registry.npmjs.org/@reduxjs/toolkit/-/toolkit-1.9.7.tgz" + integrity sha512-t7v8ZPxhhKgOKtU+uyJT13lu4vL7az5aFi4IdoDs/eS548edn2M8Ik9h8fxgvMjGoAUVFSt6ZC1P5cWmQ014QQ== + dependencies: + immer "^9.0.21" + redux "^4.2.1" + redux-thunk "^2.4.2" + reselect "^4.1.8" + +"@remirror/core-constants@3.0.0": + version "3.0.0" + resolved "https://registry.npmjs.org/@remirror/core-constants/-/core-constants-3.0.0.tgz" + integrity sha512-42aWfPrimMfDKDi4YegyS7x+/0tlzaqwPQCULLanv3DMIlu96KTJR0fM5isWX2UViOqlGnX6YFgqWepcX+XMNg== + +"@remix-run/router@1.23.2": + version "1.23.2" + resolved "https://registry.npmjs.org/@remix-run/router/-/router-1.23.2.tgz" + integrity sha512-Ic6m2U/rMjTkhERIa/0ZtXJP17QUi2CbWE7cqx4J58M8aA3QTfW+2UlQ4psvTX9IO1RfNVhK3pcpdjej7L+t2w== + +"@rolldown/binding-darwin-arm64@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-darwin-arm64/-/binding-darwin-arm64-1.0.0-rc.11.tgz" + integrity 
sha512-7WQgR8SfOPwmDZGFkThUvsmd/nwAWv91oCO4I5LS7RKrssPZmOt7jONN0cW17ydGC1n/+puol1IpoieKqQidmg== + +"@rolldown/pluginutils@1.0.0-beta.27": + version "1.0.0-beta.27" + resolved "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz" + integrity sha512-+d0F4MKMCbeVUJwG96uQ4SgAznZNSq93I3V+9NHA4OpvqG8mRCpGdKmK8l/dl02h2CCDHwW2FqilnTyDcAnqjA== + +"@rolldown/pluginutils@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-rc.11.tgz" + integrity sha512-xQO9vbwBecJRv9EUcQ/y0dzSTJgA7Q6UVN7xp6B81+tBGSLVAK03yJ9NkJaUA7JFD91kbjxRSC/mDnmvXzbHoQ== + +"@rollup/rollup-darwin-arm64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.60.0.tgz" + integrity sha512-qEF7CsKKzSRc20Ciu2Zw1wRrBz4g56F7r/vRwY430UPp/nt1x21Q/fpJ9N5l47WWvJlkNCPJz3QRVw008fi7yA== + +"@standard-schema/spec@^1.1.0": + version "1.1.0" + resolved "https://registry.npmjs.org/@standard-schema/spec/-/spec-1.1.0.tgz" + integrity sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w== + +"@swc/core-darwin-arm64@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-darwin-arm64/-/core-darwin-arm64-1.15.21.tgz" + integrity sha512-SA8SFg9dp0qKRH8goWsax6bptFE2EdmPf2YRAQW9WoHGf3XKM1bX0nd5UdwxmC5hXsBUZAYf7xSciCler6/oyA== + +"@swc/core@^1.12.11": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core/-/core-1.15.21.tgz" + integrity sha512-fkk7NJcBscrR3/F8jiqlMptRHP650NxqDnspBMrRe5d8xOoCy9MLL5kOBLFXjFLfMo3KQQHhk+/jUULOMlR1uQ== + dependencies: + "@swc/counter" "^0.1.3" + "@swc/types" "^0.1.25" + optionalDependencies: + "@swc/core-darwin-arm64" "1.15.21" + "@swc/core-darwin-x64" "1.15.21" + "@swc/core-linux-arm-gnueabihf" "1.15.21" + "@swc/core-linux-arm64-gnu" "1.15.21" + "@swc/core-linux-arm64-musl" "1.15.21" + "@swc/core-linux-ppc64-gnu" "1.15.21" + "@swc/core-linux-s390x-gnu" "1.15.21" + 
"@swc/core-linux-x64-gnu" "1.15.21" + "@swc/core-linux-x64-musl" "1.15.21" + "@swc/core-win32-arm64-msvc" "1.15.21" + "@swc/core-win32-ia32-msvc" "1.15.21" + "@swc/core-win32-x64-msvc" "1.15.21" + +"@swc/counter@^0.1.3": + version "0.1.3" + resolved "https://registry.npmjs.org/@swc/counter/-/counter-0.1.3.tgz" + integrity sha512-e2BR4lsJkkRlKZ/qCHPw9ZaSxc0MVUd7gtbtaB7aMvHeJVYe8sOB8DBZkP2DtISHGSku9sCK6T6cnY0CtXrOCQ== + +"@swc/types@^0.1.25": + version "0.1.26" + resolved "https://registry.npmjs.org/@swc/types/-/types-0.1.26.tgz" + integrity sha512-lyMwd7WGgG79RS7EERZV3T8wMdmPq3xwyg+1nmAM64kIhx5yl+juO2PYIHb7vTiPgPCj8LYjsNV2T5wiQHUEaw== + dependencies: + "@swc/counter" "^0.1.3" + +"@testing-library/dom@^10.4.1": + version "10.4.1" + resolved "https://registry.npmjs.org/@testing-library/dom/-/dom-10.4.1.tgz" + integrity sha512-o4PXJQidqJl82ckFaXUeoAW+XysPLauYI43Abki5hABd853iMhitooc6znOnczgbTYmEP6U6/y1ZyKAIsvMKGg== + dependencies: + "@babel/code-frame" "^7.10.4" + "@babel/runtime" "^7.12.5" + "@types/aria-query" "^5.0.1" + aria-query "5.3.0" + dom-accessibility-api "^0.5.9" + lz-string "^1.5.0" + picocolors "1.1.1" + pretty-format "^27.0.2" + +"@testing-library/jest-dom@^6.9.1": + version "6.9.1" + resolved "https://registry.npmjs.org/@testing-library/jest-dom/-/jest-dom-6.9.1.tgz" + integrity sha512-zIcONa+hVtVSSep9UT3jZ5rizo2BsxgyDYU7WFD5eICBE7no3881HGeb/QkGfsJs6JTkY1aQhT7rIPC7e+0nnA== + dependencies: + "@adobe/css-tools" "^4.4.0" + aria-query "^5.0.0" + css.escape "^1.5.1" + dom-accessibility-api "^0.6.3" + picocolors "^1.1.1" + redent "^3.0.0" + +"@testing-library/react@^16.3.2": + version "16.3.2" + resolved "https://registry.npmjs.org/@testing-library/react/-/react-16.3.2.tgz" + integrity sha512-XU5/SytQM+ykqMnAnvB2umaJNIOsLF3PVv//1Ew4CTcpz0/BRyy/af40qqrt7SjKpDdT1saBMc42CUok5gaw+g== + dependencies: + "@babel/runtime" "^7.12.5" + +"@tiptap/core@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/core/-/core-3.22.2.tgz" + integrity 
sha512-atq35NkpeEphH6vNYJ0pTLLBA73FAbvTV9Ovd3AaTC5s99/KF5Q86zVJXvml8xPRcMGM6dLp+eSSd06oTscMSA== + +"@tiptap/extension-blockquote@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-blockquote/-/extension-blockquote-3.22.2.tgz" + integrity sha512-iTdlmGFcgxi4LKaOW2Rc9/yD83qTXgRm5BN3vCHWy5+TbEnReYxYqU5qKsbtTbKy30sO8TJTdAXTZ29uomShQQ== + +"@tiptap/extension-bold@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-bold/-/extension-bold-3.22.2.tgz" + integrity sha512-bqsPJyKcT/RWse4e16U2EKhraR8a2+98TUuk1amG3yCyFJZStoO/j+pN0IqZdZZjr3WtxFyvwWp7Kc59UN+jUA== + +"@tiptap/extension-bubble-menu@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-bubble-menu/-/extension-bubble-menu-3.22.2.tgz" + integrity sha512-5hbyDOSkJwA2uh0v9Mm0Dd9bb9inx6tHBEDSH2tCB9Rm23poz3yOreB7SNX8xDMe5L0/PQesfWC14RitcmhKPg== + dependencies: + "@floating-ui/dom" "^1.0.0" + +"@tiptap/extension-bullet-list@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-bullet-list/-/extension-bullet-list-3.22.2.tgz" + integrity sha512-llrTJnA72RGcWLLO+ro0QN4sjHynhaCerhpV+GZE/ATd8BqV/ekQFdBLJrvC/09My2XQfCwLsyCh92NPXUdELA== + +"@tiptap/extension-code-block@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-code-block/-/extension-code-block-3.22.2.tgz" + integrity sha512-PEwFlDyvtKF19WCrOFg77qJV9WqhvjCY4ZoXlHP9Hx0KTcOA8W39mtw8d4NWU5pLRK94yHKF1DVVL8UUkEOnww== + +"@tiptap/extension-code@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-code/-/extension-code-3.22.2.tgz" + integrity sha512-iYFY+yzfYA9MKt7nupyW/PzqL9XC2D0mC8l1z2Y10i0/fGL8NbqIYjhNUAyXGqH3QWcI+DirI66842y2OadPOg== + +"@tiptap/extension-document@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-document/-/extension-document-3.22.2.tgz" + integrity 
sha512-yPw9pQeVC4QDh86TuyKCZxxM4g0NAw7mEtGnAo6EpxaBQr1wyBr9yFpys+QTsQpRTmyTf1VHp4iTTLuWHMljIw== + +"@tiptap/extension-dropcursor@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-dropcursor/-/extension-dropcursor-3.22.2.tgz" + integrity sha512-sDv3fv4LtX0X4nqwh9Gn3C/aZXT+C2JlK7tJovPOpaYP/a6hr03Sn35X5moAfgMCSiWFygEvlTriqwmCsJuxog== + +"@tiptap/extension-floating-menu@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-floating-menu/-/extension-floating-menu-3.22.2.tgz" + integrity sha512-r0ZTeh9rNtj9Api+G0YyaB+tAKPDn7aYWg+qSrmAC5EyUPee6Zjn3zlw0q4renCeQflvNRK20xHM8zokC41jOA== + +"@tiptap/extension-gapcursor@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-gapcursor/-/extension-gapcursor-3.22.2.tgz" + integrity sha512-rR2OLrl/k2kj7xehaZHq0Y7T+1wy2DOTabir9LsTrktTFEcklrh9qY1KC6rEBkwMKaWrmignR1l39kS6RlKFNw== + +"@tiptap/extension-hard-break@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-hard-break/-/extension-hard-break-3.22.2.tgz" + integrity sha512-ChsoqF4XRp6EWatTRlXL4LMFh/ggwRVCyt09brSfjJV5knFaXlECSa5/+rKLMLMULaj6dVlJqoAD15exgu2HHA== + +"@tiptap/extension-heading@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-heading/-/extension-heading-3.22.2.tgz" + integrity sha512-QPHLef+ikAyf7RVc4EdGeKxH4OEGb3ueCEwJ41RcYPtZ1BX9ueei7FC936guTdL1U7w3vQ65qfy86HznzkYgvw== + +"@tiptap/extension-horizontal-rule@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-horizontal-rule/-/extension-horizontal-rule-3.22.2.tgz" + integrity sha512-Oz8KN5KJAWV1mFNE9UIWXdMD6xa5zPf/0yLsT8V4sgaRm+VsdFKllN58BY9qCZf/kIZbaOez5KkaoeAcm0MAZg== + +"@tiptap/extension-image@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-image/-/extension-image-3.22.2.tgz" + integrity 
sha512-xFCgwreF6sn5mQ/hFDQKn41NIbbfks/Ou9j763Djf3pWsastgzdgwifQOpXVI3aSsqlKUO3o8/8R/yQczvZcwg== + +"@tiptap/extension-italic@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-italic/-/extension-italic-3.22.2.tgz" + integrity sha512-fmtQu2HDnV3sOZPdz0+1lOLI7UtrIhusohJj2UwOLQxG8qqhLwbvWx2OQTlfblgY0z+CjLRr6ANbNDxOTIblfg== + +"@tiptap/extension-link@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-link/-/extension-link-3.22.2.tgz" + integrity sha512-TXfSoKmng5pecvQUZqdsx6ICeob5V5hhYOj2vCEtjfcjWsyCndqFIl1w+Nt/yI5ehrFNOVPyj3ZvcELuuAW6pw== + dependencies: + linkifyjs "^4.3.2" + +"@tiptap/extension-list-item@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-list-item/-/extension-list-item-3.22.2.tgz" + integrity sha512-Mk+iiLIFh8Pfuarr6mWfTO7QJbd2ZQd0nGNhNWXlGAO7DJCb4BP9nj4bEIJ17SbcykGRjsi4WMqY50z4MHXqKQ== + +"@tiptap/extension-list-keymap@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-list-keymap/-/extension-list-keymap-3.22.2.tgz" + integrity sha512-TozU9V2vldMUPpTXnfLCO33EO06jLxn7uEJTMBnN4iX/dLV3cBVCbE4kHyDKS0sLd7joUeekS06vYP9uQb1hFw== + +"@tiptap/extension-list@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-list/-/extension-list-3.22.2.tgz" + integrity sha512-Vq9xScgkA2A3Zj9dQ4WUBKK7u7UCzeSFRz9FcKTQVZHRPbZoqFGnlRUVngqsE7JXrCOthXQ1dXxgk40nAsBFRw== + +"@tiptap/extension-ordered-list@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-ordered-list/-/extension-ordered-list-3.22.2.tgz" + integrity sha512-K7qxoBKmsVkAd3kW64ZRCUPFrDcNGpXRDUBx9YgAO/bTfsfxtH2oil+igsUWGXPczpP4yoHPKjTfhpBpLjGl6Q== + +"@tiptap/extension-paragraph@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-paragraph/-/extension-paragraph-3.22.2.tgz" + integrity sha512-EHZZzxVhvzEPDPWtRBF1YKhB+WCUjd1C2NhjHfL3Dl71PBqM3ZWA6qN7NDGPyNyGGWauui/NR/4X+5AfPqlHyA== 
+ +"@tiptap/extension-strike@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-strike/-/extension-strike-3.22.2.tgz" + integrity sha512-YFC3elKU1L8PiGbcB6tqd/7vWPF5IbydJz0POJpHzSjstX+VfT8VsvS7ubxVuSIWQ11kGkH3mzX6LX8JHsHZxg== + +"@tiptap/extension-text@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-text/-/extension-text-3.22.2.tgz" + integrity sha512-J1w7JwijfSD7ah0WfiwZ/DVWCIGT9x369RM4RJc57i44mIBElj7tl1dh+N5KPGOXKUup4gr7sSJAE38lgeaDMg== + +"@tiptap/extension-underline@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extension-underline/-/extension-underline-3.22.2.tgz" + integrity sha512-BaV6WOowxdkGTLWiU7DdZ3Twh633O4RGqwUM5dDas5LvaqL8AMWGTO8Wg9yAaaKXzd9MtKI1ZCqS/+MtzusgkQ== + +"@tiptap/extensions@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/extensions/-/extensions-3.22.2.tgz" + integrity sha512-s7MZmm2Xdq+8feIXgY3v7gVpQ5ClqBZi20KheouS7KSbBlrY4fu2irYR1EGc6r1UUVaHMxEa+cx5knhx+mIPUw== + +"@tiptap/pm@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/pm/-/pm-3.22.2.tgz" + integrity sha512-G2ENwIazoSKkAnN5MN5yN91TIZNFm6TxB74kPf3Empr2k9W51Hkcier70jHGpArhgcEaL4BVreuU1PRDRwCeGw== + dependencies: + prosemirror-changeset "^2.3.0" + prosemirror-collab "^1.3.1" + prosemirror-commands "^1.6.2" + prosemirror-dropcursor "^1.8.1" + prosemirror-gapcursor "^1.3.2" + prosemirror-history "^1.4.1" + prosemirror-inputrules "^1.4.0" + prosemirror-keymap "^1.2.2" + prosemirror-markdown "^1.13.1" + prosemirror-menu "^1.2.4" + prosemirror-model "^1.24.1" + prosemirror-schema-basic "^1.2.3" + prosemirror-schema-list "^1.5.0" + prosemirror-state "^1.4.3" + prosemirror-tables "^1.6.4" + prosemirror-trailing-node "^3.0.0" + prosemirror-transform "^1.10.2" + prosemirror-view "^1.38.1" + +"@tiptap/react@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/react/-/react-3.22.2.tgz" + integrity 
sha512-tyGKG69e/MkpoD/JTpVPz0XydEHxh1MSAYnLb3gRvyvBDv2r/veLea+cApkmjQaCfkKC/CWwTFXBYlOB0caSBA== + dependencies: + "@types/use-sync-external-store" "^0.0.6" + fast-equals "^5.3.3" + use-sync-external-store "^1.4.0" + optionalDependencies: + "@tiptap/extension-bubble-menu" "^3.22.2" + "@tiptap/extension-floating-menu" "^3.22.2" + +"@tiptap/starter-kit@^3.22.2": + version "3.22.2" + resolved "https://registry.npmjs.org/@tiptap/starter-kit/-/starter-kit-3.22.2.tgz" + integrity sha512-+CCKX8tOQ/ZPb2k/z6em4AQCFYAcdd8+0TOzPWiuLxRyCHRPBBVhnPsXOKgKwE4OO3E8BsezquuYRYRwsyzCqg== + dependencies: + "@tiptap/core" "^3.22.2" + "@tiptap/extension-blockquote" "^3.22.2" + "@tiptap/extension-bold" "^3.22.2" + "@tiptap/extension-bullet-list" "^3.22.2" + "@tiptap/extension-code" "^3.22.2" + "@tiptap/extension-code-block" "^3.22.2" + "@tiptap/extension-document" "^3.22.2" + "@tiptap/extension-dropcursor" "^3.22.2" + "@tiptap/extension-gapcursor" "^3.22.2" + "@tiptap/extension-hard-break" "^3.22.2" + "@tiptap/extension-heading" "^3.22.2" + "@tiptap/extension-horizontal-rule" "^3.22.2" + "@tiptap/extension-italic" "^3.22.2" + "@tiptap/extension-link" "^3.22.2" + "@tiptap/extension-list" "^3.22.2" + "@tiptap/extension-list-item" "^3.22.2" + "@tiptap/extension-list-keymap" "^3.22.2" + "@tiptap/extension-ordered-list" "^3.22.2" + "@tiptap/extension-paragraph" "^3.22.2" + "@tiptap/extension-strike" "^3.22.2" + "@tiptap/extension-text" "^3.22.2" + "@tiptap/extension-underline" "^3.22.2" + "@tiptap/extensions" "^3.22.2" + "@tiptap/pm" "^3.22.2" + +"@types/aria-query@^5.0.1": + version "5.0.4" + resolved "https://registry.npmjs.org/@types/aria-query/-/aria-query-5.0.4.tgz" + integrity sha512-rfT93uj5s0PRL7EzccGMs3brplhcrghnDoV26NqKhCAS1hVo+WdNsPvE/yb6ilfr5hi2MEk6d5EWJTKdxg8jVw== + +"@types/chai@^5.2.2": + version "5.2.3" + resolved "https://registry.npmjs.org/@types/chai/-/chai-5.2.3.tgz" + integrity sha512-Mw558oeA9fFbv65/y4mHtXDs9bPnFMZAL/jxdPFUpOHHIXX91mcgEHbS5Lahr+pwZFR8A7GQleRWeI6cGFC2UA== + 
dependencies: + "@types/deep-eql" "*" + assertion-error "^2.0.1" + +"@types/d3-array@*", "@types/d3-array@^3.2.1": + version "3.2.2" + resolved "https://registry.npmjs.org/@types/d3-array/-/d3-array-3.2.2.tgz" + integrity sha512-hOLWVbm7uRza0BYXpIIW5pxfrKe0W+D5lrFiAEYR+pb6w3N2SwSMaJbXdUfSEv+dT4MfHBLtn5js0LAWaO6otw== + +"@types/d3-axis@*": + version "3.0.6" + resolved "https://registry.npmjs.org/@types/d3-axis/-/d3-axis-3.0.6.tgz" + integrity sha512-pYeijfZuBd87T0hGn0FO1vQ/cgLk6E1ALJjfkC0oJ8cbwkZl3TpgS8bVBLZN+2jjGgg38epgxb2zmoGtSfvgMw== + dependencies: + "@types/d3-selection" "*" + +"@types/d3-brush@*": + version "3.0.6" + resolved "https://registry.npmjs.org/@types/d3-brush/-/d3-brush-3.0.6.tgz" + integrity sha512-nH60IZNNxEcrh6L1ZSMNA28rj27ut/2ZmI3r96Zd+1jrZD++zD3LsMIjWlvg4AYrHn/Pqz4CF3veCxGjtbqt7A== + dependencies: + "@types/d3-selection" "*" + +"@types/d3-chord@*": + version "3.0.6" + resolved "https://registry.npmjs.org/@types/d3-chord/-/d3-chord-3.0.6.tgz" + integrity sha512-LFYWWd8nwfwEmTZG9PfQxd17HbNPksHBiJHaKuY1XeqscXacsS2tyoo6OdRsjf+NQYeB6XrNL3a25E3gH69lcg== + +"@types/d3-color@*": + version "3.1.3" + resolved "https://registry.npmjs.org/@types/d3-color/-/d3-color-3.1.3.tgz" + integrity sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A== + +"@types/d3-contour@*": + version "3.0.6" + resolved "https://registry.npmjs.org/@types/d3-contour/-/d3-contour-3.0.6.tgz" + integrity sha512-BjzLgXGnCWjUSYGfH1cpdo41/hgdWETu4YxpezoztawmqsvCeep+8QGfiY6YbDvfgHz/DkjeIkkZVJavB4a3rg== + dependencies: + "@types/d3-array" "*" + "@types/geojson" "*" + +"@types/d3-delaunay@*": + version "6.0.4" + resolved "https://registry.npmjs.org/@types/d3-delaunay/-/d3-delaunay-6.0.4.tgz" + integrity sha512-ZMaSKu4THYCU6sV64Lhg6qjf1orxBthaC161plr5KuPHo3CNm8DTHiLw/5Eq2b6TsNP0W0iJrUOFscY6Q450Hw== + +"@types/d3-dispatch@*": + version "3.0.7" + resolved "https://registry.npmjs.org/@types/d3-dispatch/-/d3-dispatch-3.0.7.tgz" + integrity 
sha512-5o9OIAdKkhN1QItV2oqaE5KMIiXAvDWBDPrD85e58Qlz1c1kI/J0NcqbEG88CoTwJrYe7ntUCVfeUl2UJKbWgA== + +"@types/d3-drag@*": + version "3.0.7" + resolved "https://registry.npmjs.org/@types/d3-drag/-/d3-drag-3.0.7.tgz" + integrity sha512-HE3jVKlzU9AaMazNufooRJ5ZpWmLIoc90A37WU2JMmeq28w1FQqCZswHZ3xR+SuxYftzHq6WU6KJHvqxKzTxxQ== + dependencies: + "@types/d3-selection" "*" + +"@types/d3-dsv@*": + version "3.0.7" + resolved "https://registry.npmjs.org/@types/d3-dsv/-/d3-dsv-3.0.7.tgz" + integrity sha512-n6QBF9/+XASqcKK6waudgL0pf/S5XHPPI8APyMLLUHd8NqouBGLsU8MgtO7NINGtPBtk9Kko/W4ea0oAspwh9g== + +"@types/d3-ease@*": + version "3.0.2" + resolved "https://registry.npmjs.org/@types/d3-ease/-/d3-ease-3.0.2.tgz" + integrity sha512-NcV1JjO5oDzoK26oMzbILE6HW7uVXOHLQvHshBUW4UMdZGfiY6v5BeQwh9a9tCzv+CeefZQHJt5SRgK154RtiA== + +"@types/d3-fetch@*": + version "3.0.7" + resolved "https://registry.npmjs.org/@types/d3-fetch/-/d3-fetch-3.0.7.tgz" + integrity sha512-fTAfNmxSb9SOWNB9IoG5c8Hg6R+AzUHDRlsXsDZsNp6sxAEOP0tkP3gKkNSO/qmHPoBFTxNrjDprVHDQDvo5aA== + dependencies: + "@types/d3-dsv" "*" + +"@types/d3-force@*": + version "3.0.10" + resolved "https://registry.npmjs.org/@types/d3-force/-/d3-force-3.0.10.tgz" + integrity sha512-ZYeSaCF3p73RdOKcjj+swRlZfnYpK1EbaDiYICEEp5Q6sUiqFaFQ9qgoshp5CzIyyb/yD09kD9o2zEltCexlgw== + +"@types/d3-format@*": + version "3.0.4" + resolved "https://registry.npmjs.org/@types/d3-format/-/d3-format-3.0.4.tgz" + integrity sha512-fALi2aI6shfg7vM5KiR1wNJnZ7r6UuggVqtDA+xiEdPZQwy/trcQaHnwShLuLdta2rTymCNpxYTiMZX/e09F4g== + +"@types/d3-geo@*": + version "3.1.0" + resolved "https://registry.npmjs.org/@types/d3-geo/-/d3-geo-3.1.0.tgz" + integrity sha512-856sckF0oP/diXtS4jNsiQw/UuK5fQG8l/a9VVLeSouf1/PPbBE1i1W852zVwKwYCBkFJJB7nCFTbk6UMEXBOQ== + dependencies: + "@types/geojson" "*" + +"@types/d3-hierarchy@*": + version "3.1.7" + resolved "https://registry.npmjs.org/@types/d3-hierarchy/-/d3-hierarchy-3.1.7.tgz" + integrity 
sha512-tJFtNoYBtRtkNysX1Xq4sxtjK8YgoWUNpIiUee0/jHGRwqvzYxkq0hGVbbOGSz+JgFxxRu4K8nb3YpG3CMARtg== + +"@types/d3-interpolate@*": + version "3.0.4" + resolved "https://registry.npmjs.org/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz" + integrity sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA== + dependencies: + "@types/d3-color" "*" + +"@types/d3-path@*": + version "3.1.1" + resolved "https://registry.npmjs.org/@types/d3-path/-/d3-path-3.1.1.tgz" + integrity sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg== + +"@types/d3-polygon@*": + version "3.0.2" + resolved "https://registry.npmjs.org/@types/d3-polygon/-/d3-polygon-3.0.2.tgz" + integrity sha512-ZuWOtMaHCkN9xoeEMr1ubW2nGWsp4nIql+OPQRstu4ypeZ+zk3YKqQT0CXVe/PYqrKpZAi+J9mTs05TKwjXSRA== + +"@types/d3-quadtree@*": + version "3.0.6" + resolved "https://registry.npmjs.org/@types/d3-quadtree/-/d3-quadtree-3.0.6.tgz" + integrity sha512-oUzyO1/Zm6rsxKRHA1vH0NEDG58HrT5icx/azi9MF1TWdtttWl0UIUsjEQBBh+SIkrpd21ZjEv7ptxWys1ncsg== + +"@types/d3-random@*": + version "3.0.3" + resolved "https://registry.npmjs.org/@types/d3-random/-/d3-random-3.0.3.tgz" + integrity sha512-Imagg1vJ3y76Y2ea0871wpabqp613+8/r0mCLEBfdtqC7xMSfj9idOnmBYyMoULfHePJyxMAw3nWhJxzc+LFwQ== + +"@types/d3-scale-chromatic@*": + version "3.1.0" + resolved "https://registry.npmjs.org/@types/d3-scale-chromatic/-/d3-scale-chromatic-3.1.0.tgz" + integrity sha512-iWMJgwkK7yTRmWqRB5plb1kadXyQ5Sj8V/zYlFGMUBbIPKQScw+Dku9cAAMgJG+z5GYDoMjWGLVOvjghDEFnKQ== + +"@types/d3-scale@*": + version "4.0.9" + resolved "https://registry.npmjs.org/@types/d3-scale/-/d3-scale-4.0.9.tgz" + integrity sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw== + dependencies: + "@types/d3-time" "*" + +"@types/d3-selection@*": + version "3.0.11" + resolved "https://registry.npmjs.org/@types/d3-selection/-/d3-selection-3.0.11.tgz" + integrity 
sha512-bhAXu23DJWsrI45xafYpkQ4NtcKMwWnAC/vKrd2l+nxMFuvOT3XMYTIj2opv8vq8AO5Yh7Qac/nSeP/3zjTK0w== + +"@types/d3-shape@*": + version "3.1.8" + resolved "https://registry.npmjs.org/@types/d3-shape/-/d3-shape-3.1.8.tgz" + integrity sha512-lae0iWfcDeR7qt7rA88BNiqdvPS5pFVPpo5OfjElwNaT2yyekbM0C9vK+yqBqEmHr6lDkRnYNoTBYlAgJa7a4w== + dependencies: + "@types/d3-path" "*" + +"@types/d3-time-format@*": + version "4.0.3" + resolved "https://registry.npmjs.org/@types/d3-time-format/-/d3-time-format-4.0.3.tgz" + integrity sha512-5xg9rC+wWL8kdDj153qZcsJ0FWiFt0J5RB6LYUNZjwSnesfblqrI/bJ1wBdJ8OQfncgbJG5+2F+qfqnqyzYxyg== + +"@types/d3-time@*": + version "3.0.4" + resolved "https://registry.npmjs.org/@types/d3-time/-/d3-time-3.0.4.tgz" + integrity sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g== + +"@types/d3-timer@*": + version "3.0.2" + resolved "https://registry.npmjs.org/@types/d3-timer/-/d3-timer-3.0.2.tgz" + integrity sha512-Ps3T8E8dZDam6fUyNiMkekK3XUsaUEik+idO9/YjPtfj2qruF8tFBXS7XhtE4iIXBLxhmLjP3SXpLhVf21I9Lw== + +"@types/d3-transition@*": + version "3.0.9" + resolved "https://registry.npmjs.org/@types/d3-transition/-/d3-transition-3.0.9.tgz" + integrity sha512-uZS5shfxzO3rGlu0cC3bjmMFKsXv+SmZZcgp0KD22ts4uGXp5EVYGzu/0YdwZeKmddhcAccYtREJKkPfXkZuCg== + dependencies: + "@types/d3-selection" "*" + +"@types/d3-zoom@*": + version "3.0.8" + resolved "https://registry.npmjs.org/@types/d3-zoom/-/d3-zoom-3.0.8.tgz" + integrity sha512-iqMC4/YlFCSlO8+2Ii1GGGliCAY4XdeG748w5vQUbevlbDu0zSjH/+jojorQVBK/se0j6DUFNPBGSqD3YWYnDw== + dependencies: + "@types/d3-interpolate" "*" + "@types/d3-selection" "*" + +"@types/d3@^7.4.3": + version "7.4.3" + resolved "https://registry.npmjs.org/@types/d3/-/d3-7.4.3.tgz" + integrity sha512-lZXZ9ckh5R8uiFVt8ogUNf+pIrK4EsWrx2Np75WvF/eTpJ0FMHNhjXk8CKEx/+gpHbNQyJWehbFaTvqmHWB3ww== + dependencies: + "@types/d3-array" "*" + "@types/d3-axis" "*" + "@types/d3-brush" "*" + "@types/d3-chord" "*" + "@types/d3-color" "*" + 
"@types/d3-contour" "*" + "@types/d3-delaunay" "*" + "@types/d3-dispatch" "*" + "@types/d3-drag" "*" + "@types/d3-dsv" "*" + "@types/d3-ease" "*" + "@types/d3-fetch" "*" + "@types/d3-force" "*" + "@types/d3-format" "*" + "@types/d3-geo" "*" + "@types/d3-hierarchy" "*" + "@types/d3-interpolate" "*" + "@types/d3-path" "*" + "@types/d3-polygon" "*" + "@types/d3-quadtree" "*" + "@types/d3-random" "*" + "@types/d3-scale" "*" + "@types/d3-scale-chromatic" "*" + "@types/d3-selection" "*" + "@types/d3-shape" "*" + "@types/d3-time" "*" + "@types/d3-time-format" "*" + "@types/d3-timer" "*" + "@types/d3-transition" "*" + "@types/d3-zoom" "*" + +"@types/deep-eql@*": + version "4.0.2" + resolved "https://registry.npmjs.org/@types/deep-eql/-/deep-eql-4.0.2.tgz" + integrity sha512-c9h9dVVMigMPc4bwTvC5dxqtqJZwQPePsWjPlpSOnojbor6pGqdk541lfA7AqFQr5pB1BRdq0juY9db81BwyFw== + +"@types/dompurify@^3.0.5": + version "3.2.0" + resolved "https://registry.npmjs.org/@types/dompurify/-/dompurify-3.2.0.tgz" + integrity sha512-Fgg31wv9QbLDA0SpTOXO3MaxySc4DKGLi8sna4/Utjo4r3ZRPdCt4UQee8BWr+Q5z21yifghREPJGYaEOEIACg== + dependencies: + dompurify "*" + +"@types/estree@^1.0.0", "@types/estree@^1.0.6", "@types/estree@^1.0.8", "@types/estree@1.0.8": + version "1.0.8" + resolved "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz" + integrity sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w== + +"@types/geojson@*", "@types/geojson@7946.0.16": + version "7946.0.16" + resolved "https://registry.npmjs.org/@types/geojson/-/geojson-7946.0.16.tgz" + integrity sha512-6C8nqWur3j98U6+lXDfTUWIfgvZU+EumvpHKcYjujKH7woYyLj2sUmff0tRhrqM7BohUw7Pz3ZB1jj2gW9Fvmg== + +"@types/hoist-non-react-statics@^3.3.1": + version "3.3.7" + resolved "https://registry.npmjs.org/@types/hoist-non-react-statics/-/hoist-non-react-statics-3.3.7.tgz" + integrity sha512-PQTyIulDkIDro8P+IHbKCsw7U2xxBYflVzW/FgWdCAePD9xGSidgA76/GeJ6lBKoblyhf9pBY763gbrN+1dI8g== + dependencies: + 
hoist-non-react-statics "^3.3.0" + +"@types/json-schema@^7.0.15": + version "7.0.15" + resolved "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz" + integrity sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA== + +"@types/linkify-it@^3": + version "3.0.5" + resolved "https://registry.npmjs.org/@types/linkify-it/-/linkify-it-3.0.5.tgz" + integrity sha512-yg6E+u0/+Zjva+buc3EIb+29XEg4wltq7cSmd4Uc2EE/1nUVmxyzpX6gUXD0V8jIrG0r7YeOGVIbYRkxeooCtw== + +"@types/linkify-it@^5": + version "5.0.0" + resolved "https://registry.npmjs.org/@types/linkify-it/-/linkify-it-5.0.0.tgz" + integrity sha512-sVDA58zAw4eWAffKOaQH5/5j3XeayukzDk+ewSsnv3p4yJEZHCCzMDiZM8e0OUrRvmpGZ85jf4yDHkHsgBNr9Q== + +"@types/lodash@^4.17.7": + version "4.17.24" + resolved "https://registry.npmjs.org/@types/lodash/-/lodash-4.17.24.tgz" + integrity sha512-gIW7lQLZbue7lRSWEFql49QJJWThrTFFeIMJdp3eH4tKoxm1OvEPg02rm4wCCSHS0cL3/Fizimb35b7k8atwsQ== + +"@types/markdown-it@^13.0.7": + version "13.0.9" + resolved "https://registry.npmjs.org/@types/markdown-it/-/markdown-it-13.0.9.tgz" + integrity sha512-1XPwR0+MgXLWfTn9gCsZ55AHOKW1WN+P9vr0PaQh5aerR9LLQXUbjfEAFhjmEmyoYFWAyuN2Mqkn40MZ4ukjBw== + dependencies: + "@types/linkify-it" "^3" + "@types/mdurl" "^1" + +"@types/markdown-it@^14.0.0": + version "14.1.2" + resolved "https://registry.npmjs.org/@types/markdown-it/-/markdown-it-14.1.2.tgz" + integrity sha512-promo4eFwuiW+TfGxhi+0x3czqTYJkG8qB17ZUJiVF10Xm7NLVRSLUsfRTU/6h1e24VvRnXCx+hG7li58lkzog== + dependencies: + "@types/linkify-it" "^5" + "@types/mdurl" "^2" + +"@types/mdurl@^1": + version "1.0.5" + resolved "https://registry.npmjs.org/@types/mdurl/-/mdurl-1.0.5.tgz" + integrity sha512-6L6VymKTzYSrEf4Nev4Xa1LCHKrlTlYCBMTlQKFuddo1CvQcE52I0mwfOJayueUC7MJuXOeHTcIU683lzd0cUA== + +"@types/mdurl@^2": + version "2.0.0" + resolved "https://registry.npmjs.org/@types/mdurl/-/mdurl-2.0.0.tgz" + integrity 
sha512-RGdgjQUZba5p6QEFAVx2OGb8rQDL/cPRG7GiedRzMcJ1tYnUANBncjbSB1NRGwbvjcPeikRABz2nshyPk1bhWg== + +"@types/node@^14.0.1": + version "14.18.63" + resolved "https://registry.npmjs.org/@types/node/-/node-14.18.63.tgz" + integrity sha512-fAtCfv4jJg+ExtXhvCkCqUKZ+4ok/JQk01qDKhL5BDDoS3AxKXhV5/MAVUZyQnSEd2GT92fkgZl0pz0Q0AzcIQ== + +"@types/node@^20.14.10": + version "20.19.37" + resolved "https://registry.npmjs.org/@types/node/-/node-20.19.37.tgz" + integrity sha512-8kzdPJ3FsNsVIurqBs7oodNnCEVbni9yUEkaHbgptDACOPW04jimGagZ51E6+lXUwJjgnBw+hyko/lkFWCldqw== + dependencies: + undici-types "~6.21.0" + +"@types/parse-json@^4.0.0": + version "4.0.2" + resolved "https://registry.npmjs.org/@types/parse-json/-/parse-json-4.0.2.tgz" + integrity sha512-dISoDXWWQwUquiKsyZ4Ng+HX2KsPL7LyHKHQwgGFEA3IaKac4Obd+h2a/a6waisAoepJlBcx9paWqjA8/HVjCw== + +"@types/prismjs@^1.26.0": + version "1.26.6" + resolved "https://registry.npmjs.org/@types/prismjs/-/prismjs-1.26.6.tgz" + integrity sha512-vqlvI7qlMvcCBbVe0AKAb4f97//Hy0EBTaiW8AalRnG/xAN5zOiWWyrNqNXeq8+KAuvRewjCVY1+IPxk4RdNYw== + +"@types/prop-types@*", "@types/prop-types@^15.7.15": + version "15.7.15" + resolved "https://registry.npmjs.org/@types/prop-types/-/prop-types-15.7.15.tgz" + integrity sha512-F6bEyamV9jKGAFBEmlQnesRPGOQqS2+Uwi0Em15xenOxHaf2hv6L8YCVn3rPdPJOiJfPiCnLIRyvwVaqMY3MIw== + +"@types/react-dom@^18.3.0": + version "18.3.7" + resolved "https://registry.npmjs.org/@types/react-dom/-/react-dom-18.3.7.tgz" + integrity sha512-MEe3UeoENYVFXzoXEWsvcpg6ZvlrFNlOQ7EOsvhI3CfAXwzPfO8Qwuxd40nepsYKqyyVQnTdEfv68q91yLcKrQ== + +"@types/react-katex@^3.0.4": + version "3.0.4" + resolved "https://registry.npmjs.org/@types/react-katex/-/react-katex-3.0.4.tgz" + integrity sha512-aLkykKzSKLpXI6REJ3uClao6P47HAFfR1gcHOZwDeTuALsyjgMhz+oynLV4gX0kiJVnvHrBKF/TLXqyNTpHDUg== + dependencies: + "@types/react" "*" + +"@types/react-transition-group@^4.4.12": + version "4.4.12" + resolved 
"https://registry.npmjs.org/@types/react-transition-group/-/react-transition-group-4.4.12.tgz" + integrity sha512-8TV6R3h2j7a91c+1DXdJi3Syo69zzIZbz7Lg5tORM5LEJG7X/E6a1V3drRyBRZq7/utz7A+c4OgYLiLcYGHG6w== + +"@types/react@*": + version "19.2.14" + resolved "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz" + integrity sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w== + dependencies: + csstype "^3.2.2" + +"@types/react@^18.3.3": + version "18.3.28" + resolved "https://registry.npmjs.org/@types/react/-/react-18.3.28.tgz" + integrity sha512-z9VXpC7MWrhfWipitjNdgCauoMLRdIILQsAEV+ZesIzBq/oUlxk0m3ApZuMFCXdnS4U7KrI+l3WRUEGQ8K1QKw== + dependencies: + "@types/prop-types" "*" + csstype "^3.2.2" + +"@types/trusted-types@^2.0.7": + version "2.0.7" + resolved "https://registry.npmjs.org/@types/trusted-types/-/trusted-types-2.0.7.tgz" + integrity sha512-ScaPdn1dQczgbl0QFTeTOmVHFULt394XJgOQNoyVhZ6r2vLnMLJfBPd53SB52T/3G36VI1/g2MZaX0cwDuXsfw== + +"@types/use-sync-external-store@^0.0.3": + version "0.0.3" + resolved "https://registry.npmjs.org/@types/use-sync-external-store/-/use-sync-external-store-0.0.3.tgz" + integrity sha512-EwmlvuaxPNej9+T4v5AuBPJa2x2UOJVdjCtDHgcDqitUeOtjnJKJ+apYjVcAoBEMjKW1VVFGZLUb5+qqa09XFA== + +"@types/use-sync-external-store@^0.0.6": + version "0.0.6" + resolved "https://registry.npmjs.org/@types/use-sync-external-store/-/use-sync-external-store-0.0.6.tgz" + integrity sha512-zFDAD+tlpf2r4asuHEj0XH6pY6i0g5NeAHPn+15wk3BV6JA69eERFXC1gyGThDkVa1zCyKr5jox1+2LbV/AMLg== + +"@types/validator@^13.12.2": + version "13.15.10" + resolved "https://registry.npmjs.org/@types/validator/-/validator-13.15.10.tgz" + integrity sha512-T8L6i7wCuyoK8A/ZeLYt1+q0ty3Zb9+qbSSvrIVitzT3YjZqkTZ40IbRsPanlB4h1QB3JVL1SYCdR6ngtFYcuA== + +"@typescript-eslint/eslint-plugin@^8.16.0", "@typescript-eslint/eslint-plugin@8.57.2": + version "8.57.2" + resolved 
"https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.57.2.tgz" + integrity sha512-NZZgp0Fm2IkD+La5PR81sd+g+8oS6JwJje+aRWsDocxHkjyRw0J5L5ZTlN3LI1LlOcGL7ph3eaIUmTXMIjLk0w== + dependencies: + "@eslint-community/regexpp" "^4.12.2" + "@typescript-eslint/scope-manager" "8.57.2" + "@typescript-eslint/type-utils" "8.57.2" + "@typescript-eslint/utils" "8.57.2" + "@typescript-eslint/visitor-keys" "8.57.2" + ignore "^7.0.5" + natural-compare "^1.4.0" + ts-api-utils "^2.4.0" + +"@typescript-eslint/parser@^8.16.0", "@typescript-eslint/parser@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/parser/-/parser-8.57.2.tgz" + integrity sha512-30ScMRHIAD33JJQkgfGW1t8CURZtjc2JpTrq5n2HFhOefbAhb7ucc7xJwdWcrEtqUIYJ73Nybpsggii6GtAHjA== + dependencies: + "@typescript-eslint/scope-manager" "8.57.2" + "@typescript-eslint/types" "8.57.2" + "@typescript-eslint/typescript-estree" "8.57.2" + "@typescript-eslint/visitor-keys" "8.57.2" + debug "^4.4.3" + +"@typescript-eslint/project-service@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/project-service/-/project-service-8.57.2.tgz" + integrity sha512-FuH0wipFywXRTHf+bTTjNyuNQQsQC3qh/dYzaM4I4W0jrCqjCVuUh99+xd9KamUfmCGPvbO8NDngo/vsnNVqgw== + dependencies: + "@typescript-eslint/tsconfig-utils" "^8.57.2" + "@typescript-eslint/types" "^8.57.2" + debug "^4.4.3" + +"@typescript-eslint/scope-manager@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/scope-manager/-/scope-manager-8.57.2.tgz" + integrity sha512-snZKH+W4WbWkrBqj4gUNRIGb/jipDW3qMqVJ4C9rzdFc+wLwruxk+2a5D+uoFcKPAqyqEnSb4l2ULuZf95eSkw== + dependencies: + "@typescript-eslint/types" "8.57.2" + "@typescript-eslint/visitor-keys" "8.57.2" + +"@typescript-eslint/tsconfig-utils@^8.57.2", "@typescript-eslint/tsconfig-utils@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.57.2.tgz" + integrity 
sha512-3Lm5DSM+DCowsUOJC+YqHHnKEfFh5CoGkj5Z31NQSNF4l5wdOwqGn99wmwN/LImhfY3KJnmordBq/4+VDe2eKw== + +"@typescript-eslint/type-utils@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/type-utils/-/type-utils-8.57.2.tgz" + integrity sha512-Co6ZCShm6kIbAM/s+oYVpKFfW7LBc6FXoPXjTRQ449PPNBY8U0KZXuevz5IFuuUj2H9ss40atTaf9dlGLzbWZg== + dependencies: + "@typescript-eslint/types" "8.57.2" + "@typescript-eslint/typescript-estree" "8.57.2" + "@typescript-eslint/utils" "8.57.2" + debug "^4.4.3" + ts-api-utils "^2.4.0" + +"@typescript-eslint/types@^8.57.2", "@typescript-eslint/types@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/types/-/types-8.57.2.tgz" + integrity sha512-/iZM6FnM4tnx9csuTxspMW4BOSegshwX5oBDznJ7S4WggL7Vczz5d2W11ecc4vRrQMQHXRSxzrCsyG5EsPPTbA== + +"@typescript-eslint/typescript-estree@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-8.57.2.tgz" + integrity sha512-2MKM+I6g8tJxfSmFKOnHv2t8Sk3T6rF20A1Puk0svLK+uVapDZB/4pfAeB7nE83uAZrU6OxW+HmOd5wHVdXwXA== + dependencies: + "@typescript-eslint/project-service" "8.57.2" + "@typescript-eslint/tsconfig-utils" "8.57.2" + "@typescript-eslint/types" "8.57.2" + "@typescript-eslint/visitor-keys" "8.57.2" + debug "^4.4.3" + minimatch "^10.2.2" + semver "^7.7.3" + tinyglobby "^0.2.15" + ts-api-utils "^2.4.0" + +"@typescript-eslint/utils@8.57.2": + version "8.57.2" + resolved "https://registry.npmjs.org/@typescript-eslint/utils/-/utils-8.57.2.tgz" + integrity sha512-krRIbvPK1ju1WBKIefiX+bngPs+odIQUtR7kymzPfo1POVw3jlF+nLkmexdSSd4UCbDcQn+wMBATOOmpBbqgKg== + dependencies: + "@eslint-community/eslint-utils" "^4.9.1" + "@typescript-eslint/scope-manager" "8.57.2" + "@typescript-eslint/types" "8.57.2" + "@typescript-eslint/typescript-estree" "8.57.2" + +"@typescript-eslint/visitor-keys@8.57.2": + version "8.57.2" + resolved 
"https://registry.npmjs.org/@typescript-eslint/visitor-keys/-/visitor-keys-8.57.2.tgz" + integrity sha512-zhahknjobV2FiD6Ee9iLbS7OV9zi10rG26odsQdfBO/hjSzUQbkIYgda+iNKK1zNiW2ey+Lf8MU5btN17V3dUw== + dependencies: + "@typescript-eslint/types" "8.57.2" + eslint-visitor-keys "^5.0.0" + +"@vitejs/plugin-react-swc@^3.7.0": + version "3.11.0" + resolved "https://registry.npmjs.org/@vitejs/plugin-react-swc/-/plugin-react-swc-3.11.0.tgz" + integrity sha512-YTJCGFdNMHCMfjODYtxRNVAYmTWQ1Lb8PulP/2/f/oEEtglw8oKxKIZmmRkyXrVrHfsKOaVkAc3NT9/dMutO5w== + dependencies: + "@rolldown/pluginutils" "1.0.0-beta.27" + "@swc/core" "^1.12.11" + +"@vitest/expect@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/expect/-/expect-4.1.1.tgz" + integrity sha512-xAV0fqBTk44Rn6SjJReEQkHP3RrqbJo6JQ4zZ7/uVOiJZRarBtblzrOfFIZeYUrukp2YD6snZG6IBqhOoHTm+A== + dependencies: + "@standard-schema/spec" "^1.1.0" + "@types/chai" "^5.2.2" + "@vitest/spy" "4.1.1" + "@vitest/utils" "4.1.1" + chai "^6.2.2" + tinyrainbow "^3.0.3" + +"@vitest/mocker@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/mocker/-/mocker-4.1.1.tgz" + integrity sha512-h3BOylsfsCLPeceuCPAAJ+BvNwSENgJa4hXoXu4im0bs9Lyp4URc4JYK4pWLZ4pG/UQn7AT92K6IByi6rE6g3A== + dependencies: + "@vitest/spy" "4.1.1" + estree-walker "^3.0.3" + magic-string "^0.30.21" + +"@vitest/pretty-format@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/pretty-format/-/pretty-format-4.1.1.tgz" + integrity sha512-GM+TEQN5WhOygr1lp7skeVjdLPqqWMHsfzXrcHAqZJi/lIVh63H0kaRCY8MDhNWikx19zBUK8ceaLB7X5AH9NQ== + dependencies: + tinyrainbow "^3.0.3" + +"@vitest/runner@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/runner/-/runner-4.1.1.tgz" + integrity sha512-f7+FPy75vN91QGWsITueq0gedwUZy1fLtHOCMeQpjs8jTekAHeKP80zfDEnhrleviLHzVSDXIWuCIOFn3D3f8A== + dependencies: + "@vitest/utils" "4.1.1" + pathe "^2.0.3" + +"@vitest/snapshot@4.1.1": + version "4.1.1" + resolved 
"https://registry.npmjs.org/@vitest/snapshot/-/snapshot-4.1.1.tgz" + integrity sha512-kMVSgcegWV2FibXEx9p9WIKgje58lcTbXgnJixfcg15iK8nzCXhmalL0ZLtTWLW9PH1+1NEDShiFFedB3tEgWg== + dependencies: + "@vitest/pretty-format" "4.1.1" + "@vitest/utils" "4.1.1" + magic-string "^0.30.21" + pathe "^2.0.3" + +"@vitest/spy@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/spy/-/spy-4.1.1.tgz" + integrity sha512-6Ti/KT5OVaiupdIZEuZN7l3CZcR0cxnxt70Z0//3CtwgObwA6jZhmVBA3yrXSVN3gmwjgd7oDNLlsXz526gpRA== + +"@vitest/utils@4.1.1": + version "4.1.1" + resolved "https://registry.npmjs.org/@vitest/utils/-/utils-4.1.1.tgz" + integrity sha512-cNxAlaB3sHoCdL6pj6yyUXv9Gry1NHNg0kFTXdvSIZXLHsqKH7chiWOkwJ5s5+d/oMwcoG9T0bKU38JZWKusrQ== + dependencies: + "@vitest/pretty-format" "4.1.1" + convert-source-map "^2.0.0" + tinyrainbow "^3.0.3" + +acorn-jsx@^5.3.2: + version "5.3.2" + resolved "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.3.2.tgz" + integrity sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ== + +acorn@^8.15.0: + version "8.16.0" + resolved "https://registry.npmjs.org/acorn/-/acorn-8.16.0.tgz" + integrity sha512-UVJyE9MttOsBQIDKw1skb9nAwQuR5wuGD3+82K6JgJlm/Y+KI92oNsMNGZCYdDsVtRHSak0pcV5Dno5+4jh9sw== + +ajv@^6.14.0: + version "6.14.0" + resolved "https://registry.npmjs.org/ajv/-/ajv-6.14.0.tgz" + integrity sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw== + dependencies: + fast-deep-equal "^3.1.1" + fast-json-stable-stringify "^2.0.0" + json-schema-traverse "^0.4.1" + uri-js "^4.2.2" + +allotment@^1.20.4: + version "1.20.5" + resolved "https://registry.npmjs.org/allotment/-/allotment-1.20.5.tgz" + integrity sha512-7i4NT7ieXEyAd5lBrXmE7WHz/e7hRuo97+j+TwrPE85ha6kyFURoc76nom0dWSZ1pTKVEAMJy/+f3/Isfu/41A== + dependencies: + classnames "^2.3.0" + eventemitter3 "^5.0.0" + fast-deep-equal "^3.1.3" + lodash.clamp "^4.0.0" + lodash.debounce "^4.0.0" + usehooks-ts "^3.1.1" + 
+ansi-regex@^5.0.1: + version "5.0.1" + resolved "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz" + integrity sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ== + +ansi-regex@^6.2.2: + version "6.2.2" + resolved "https://registry.npmjs.org/ansi-regex/-/ansi-regex-6.2.2.tgz" + integrity sha512-Bq3SmSpyFHaWjPk8If9yc6svM8c56dB5BAtW4Qbw5jHTwwXXcTLoRMkpDJp6VL0XzlWaCHTXrkFURMYmD0sLqg== + +ansi-styles@^4.1.0: + version "4.3.0" + resolved "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz" + integrity sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg== + dependencies: + color-convert "^2.0.1" + +ansi-styles@^5.0.0: + version "5.2.0" + resolved "https://registry.npmjs.org/ansi-styles/-/ansi-styles-5.2.0.tgz" + integrity sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA== + +ansi-styles@^6.2.1: + version "6.2.3" + resolved "https://registry.npmjs.org/ansi-styles/-/ansi-styles-6.2.3.tgz" + integrity sha512-4Dj6M28JB+oAH8kFkTLUo+a2jwOFkuqb3yucU0CANcRRUbxS0cP0nZYCGjcc3BNXwRIsUVmDGgzawme7zvJHvg== + +archiver-utils@^2.1.0: + version "2.1.0" + resolved "https://registry.npmjs.org/archiver-utils/-/archiver-utils-2.1.0.tgz" + integrity sha512-bEL/yUb/fNNiNTuUz979Z0Yg5L+LzLxGJz8x79lYmR54fmTIb6ob/hNQgkQnIUDWIFjZVQwl9Xs356I6BAMHfw== + dependencies: + glob "^7.1.4" + graceful-fs "^4.2.0" + lazystream "^1.0.0" + lodash.defaults "^4.2.0" + lodash.difference "^4.5.0" + lodash.flatten "^4.4.0" + lodash.isplainobject "^4.0.6" + lodash.union "^4.6.0" + normalize-path "^3.0.0" + readable-stream "^2.0.0" + +archiver-utils@^3.0.4: + version "3.0.4" + resolved "https://registry.npmjs.org/archiver-utils/-/archiver-utils-3.0.4.tgz" + integrity sha512-KVgf4XQVrTjhyWmx6cte4RxonPLR9onExufI1jhvw/MQ4BB6IsZD5gT8Lq+u/+pRkWna/6JoHpiQioaqFP5Rzw== + dependencies: + glob "^7.2.3" + graceful-fs "^4.2.0" + lazystream "^1.0.0" + lodash.defaults "^4.2.0" + 
lodash.difference "^4.5.0" + lodash.flatten "^4.4.0" + lodash.isplainobject "^4.0.6" + lodash.union "^4.6.0" + normalize-path "^3.0.0" + readable-stream "^3.6.0" + +archiver@^5.0.0: + version "5.3.2" + resolved "https://registry.npmjs.org/archiver/-/archiver-5.3.2.tgz" + integrity sha512-+25nxyyznAXF7Nef3y0EbBeqmGZgeN/BxHX29Rs39djAfaFalmQ89SE6CWyDCHzGL0yt/ycBtNOmGTW0FyGWNw== + dependencies: + archiver-utils "^2.1.0" + async "^3.2.4" + buffer-crc32 "^0.2.1" + readable-stream "^3.6.0" + readdir-glob "^1.1.2" + tar-stream "^2.2.0" + zip-stream "^4.1.0" + +argparse@^2.0.1: + version "2.0.1" + resolved "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz" + integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q== + +aria-query@^5.0.0, aria-query@^5.3.2: + version "5.3.2" + resolved "https://registry.npmjs.org/aria-query/-/aria-query-5.3.2.tgz" + integrity sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw== + +aria-query@5.3.0: + version "5.3.0" + resolved "https://registry.npmjs.org/aria-query/-/aria-query-5.3.0.tgz" + integrity sha512-b0P0sZPKtyu8HkeRAfCq0IfURZK+SuwMjY1UXGBU27wpAiTwQAIlq56IbIO+ytk/JjS1fMR14ee5WBBfKi5J6A== + dependencies: + dequal "^2.0.3" + +array-buffer-byte-length@^1.0.1, array-buffer-byte-length@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/array-buffer-byte-length/-/array-buffer-byte-length-1.0.2.tgz" + integrity sha512-LHE+8BuR7RYGDKvnrmcuSq3tDcKv9OFEXQt/HpbZhY7V6h0zlUXutnAD82GiFx9rdieCMjkvtcsPqBwgUl1Iiw== + dependencies: + call-bound "^1.0.3" + is-array-buffer "^3.0.5" + +array-includes@^3.1.6, array-includes@^3.1.8: + version "3.1.9" + resolved "https://registry.npmjs.org/array-includes/-/array-includes-3.1.9.tgz" + integrity sha512-FmeCCAenzH0KH381SPT5FZmiA/TmpndpcaShhfgEN9eCVjnFBqq3l1xrI42y8+PPLI6hypzou4GXw00WHmPBLQ== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.4" + define-properties "^1.2.1" + es-abstract "^1.24.0" 
+ es-object-atoms "^1.1.1" + get-intrinsic "^1.3.0" + is-string "^1.1.1" + math-intrinsics "^1.1.0" + +array.prototype.findlast@^1.2.5: + version "1.2.5" + resolved "https://registry.npmjs.org/array.prototype.findlast/-/array.prototype.findlast-1.2.5.tgz" + integrity sha512-CVvd6FHg1Z3POpBLxO6E6zr+rSKEQ9L6rZHAaY7lLfhKsWYUBBOuMs0e9o24oopj6H+geRCX0YJ+TJLBK2eHyQ== + dependencies: + call-bind "^1.0.7" + define-properties "^1.2.1" + es-abstract "^1.23.2" + es-errors "^1.3.0" + es-object-atoms "^1.0.0" + es-shim-unscopables "^1.0.2" + +array.prototype.flat@^1.3.1: + version "1.3.3" + resolved "https://registry.npmjs.org/array.prototype.flat/-/array.prototype.flat-1.3.3.tgz" + integrity sha512-rwG/ja1neyLqCuGZ5YYrznA62D4mZXg0i1cIskIUKSiqF3Cje9/wXAls9B9s1Wa2fomMsIv8czB8jZcPmxCXFg== + dependencies: + call-bind "^1.0.8" + define-properties "^1.2.1" + es-abstract "^1.23.5" + es-shim-unscopables "^1.0.2" + +array.prototype.flatmap@^1.3.2, array.prototype.flatmap@^1.3.3: + version "1.3.3" + resolved "https://registry.npmjs.org/array.prototype.flatmap/-/array.prototype.flatmap-1.3.3.tgz" + integrity sha512-Y7Wt51eKJSyi80hFrJCePGGNo5ktJCslFuboqJsbf57CCPcm5zztluPlc4/aD8sWsKvlwatezpV4U1efk8kpjg== + dependencies: + call-bind "^1.0.8" + define-properties "^1.2.1" + es-abstract "^1.23.5" + es-shim-unscopables "^1.0.2" + +array.prototype.tosorted@^1.1.4: + version "1.1.4" + resolved "https://registry.npmjs.org/array.prototype.tosorted/-/array.prototype.tosorted-1.1.4.tgz" + integrity sha512-p6Fx8B7b7ZhL/gmUsAy0D15WhvDccw3mnGNbZpi3pmeJdxtWsj2jEaI4Y6oo3XiHfzuSgPwKc04MYt6KgvC/wA== + dependencies: + call-bind "^1.0.7" + define-properties "^1.2.1" + es-abstract "^1.23.3" + es-errors "^1.3.0" + es-shim-unscopables "^1.0.2" + +arraybuffer.prototype.slice@^1.0.4: + version "1.0.4" + resolved "https://registry.npmjs.org/arraybuffer.prototype.slice/-/arraybuffer.prototype.slice-1.0.4.tgz" + integrity sha512-BNoCY6SXXPQ7gF2opIP4GBE+Xw7U+pHMYKuzjgCN3GwiaIR09UUeKfheyIry77QtrCBlC0KK0q5/TER/tYh3PQ== 
+ dependencies: + array-buffer-byte-length "^1.0.1" + call-bind "^1.0.8" + define-properties "^1.2.1" + es-abstract "^1.23.5" + es-errors "^1.3.0" + get-intrinsic "^1.2.6" + is-array-buffer "^3.0.4" + +assertion-error@^2.0.1: + version "2.0.1" + resolved "https://registry.npmjs.org/assertion-error/-/assertion-error-2.0.1.tgz" + integrity sha512-Izi8RQcffqCeNVgFigKli1ssklIbpHnCYc6AknXGYoB6grJqyeby7jv12JUQgmTAnIDnbck1uxksT4dzN3PWBA== + +ast-types-flow@^0.0.8: + version "0.0.8" + resolved "https://registry.npmjs.org/ast-types-flow/-/ast-types-flow-0.0.8.tgz" + integrity sha512-OH/2E5Fg20h2aPrbe+QL8JZQFko0YZaF+j4mnQ7BGhfavO7OpSLa8a0y9sBwomHdSbkhTS8TQNayBfnW5DwbvQ== + +async-function@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/async-function/-/async-function-1.0.0.tgz" + integrity sha512-hsU18Ae8CDTR6Kgu9DYf0EbCr/a5iGL0rytQDobUcdpYOKokk8LEjVphnXkDkgpi0wYVsqrXuP0bZxJaTqdgoA== + +async@^3.2.4: + version "3.2.6" + resolved "https://registry.npmjs.org/async/-/async-3.2.6.tgz" + integrity sha512-htCUDlxyyCLMgaM3xXg0C0LW2xqfuQ6p05pCEIsXuyQ+a1koYKTuBMzRNwmybfLgvJDMd0r1LTn4+E0Ti6C2AA== + +available-typed-arrays@^1.0.7: + version "1.0.7" + resolved "https://registry.npmjs.org/available-typed-arrays/-/available-typed-arrays-1.0.7.tgz" + integrity sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ== + dependencies: + possible-typed-array-names "^1.0.0" + +axe-core@^4.10.0: + version "4.11.1" + resolved "https://registry.npmjs.org/axe-core/-/axe-core-4.11.1.tgz" + integrity sha512-BASOg+YwO2C+346x3LZOeoovTIoTrRqEsqMa6fmfAV0P+U9mFr9NsyOEpiYvFjbc64NMrSswhV50WdXzdb/Z5A== + +axobject-query@^4.1.0: + version "4.1.0" + resolved "https://registry.npmjs.org/axobject-query/-/axobject-query-4.1.0.tgz" + integrity sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ== + +babel-plugin-macros@^3.1.0: + version "3.1.0" + resolved 
"https://registry.npmjs.org/babel-plugin-macros/-/babel-plugin-macros-3.1.0.tgz" + integrity sha512-Cg7TFGpIr01vOQNODXOOaGz2NpCU5gl8x1qJFbb6hbZxR7XrcE2vtbAsTAbJ7/xwJtUuJEw8K8Zr/AE0LHlesg== + dependencies: + "@babel/runtime" "^7.12.5" + cosmiconfig "^7.0.0" + resolve "^1.19.0" + +balanced-match@^1.0.0: + version "1.0.2" + resolved "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz" + integrity sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw== + +balanced-match@^4.0.2: + version "4.0.4" + resolved "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.4.tgz" + integrity sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA== + +base64-arraybuffer@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/base64-arraybuffer/-/base64-arraybuffer-1.0.2.tgz" + integrity sha512-I3yl4r9QB5ZRY3XuJVEPfc2XhZO6YweFPI+UovAzn+8/hb3oJ6lnysaFcjVpkCPfVWFUDvoZ8kmVDP7WyRtYtQ== + +base64-js@^1.3.1: + version "1.5.1" + resolved "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz" + integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA== + +bidi-js@^1.0.3: + version "1.0.3" + resolved "https://registry.npmjs.org/bidi-js/-/bidi-js-1.0.3.tgz" + integrity sha512-RKshQI1R3YQ+n9YJz2QQ147P66ELpa1FQEg20Dk8oW9t2KgLbpDLLp9aGZ7y8WHSshDknG0bknqGw5/tyCs5tw== + dependencies: + require-from-string "^2.0.2" + +big-integer@^1.6.17: + version "1.6.52" + resolved "https://registry.npmjs.org/big-integer/-/big-integer-1.6.52.tgz" + integrity sha512-QxD8cf2eVqJOOz63z6JIN9BzvVs/dlySa5HGSBH5xtR8dPteIRQnBxxKqkNTiT6jbDTF6jAfrd4oMcND9RGbQg== + +binary@~0.3.0: + version "0.3.0" + resolved "https://registry.npmjs.org/binary/-/binary-0.3.0.tgz" + integrity sha512-D4H1y5KYwpJgK8wk1Cue5LLPgmwHKYSChkbspQg5JtVuR5ulGckxfR62H3AE9UDkdMC8yyXlqYihuz3Aqg2XZg== + dependencies: + buffers "~0.1.1" + chainsaw "~0.1.0" + +bl@^4.0.3: + version "4.1.0" + 
resolved "https://registry.npmjs.org/bl/-/bl-4.1.0.tgz" + integrity sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w== + dependencies: + buffer "^5.5.0" + inherits "^2.0.4" + readable-stream "^3.4.0" + +bluebird@~3.4.1: + version "3.4.7" + resolved "https://registry.npmjs.org/bluebird/-/bluebird-3.4.7.tgz" + integrity sha512-iD3898SR7sWVRHbiQv+sHUtHnMvC1o3nW5rAcqnq3uOn07DSAppZYUkIGslDz6gXC7HfunPe7YVBgoEJASPcHA== + +brace-expansion@^1.1.7: + version "1.1.12" + resolved "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz" + integrity sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg== + dependencies: + balanced-match "^1.0.0" + concat-map "0.0.1" + +brace-expansion@^2.0.1: + version "2.0.2" + resolved "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.2.tgz" + integrity sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ== + dependencies: + balanced-match "^1.0.0" + +brace-expansion@^5.0.2: + version "5.0.5" + resolved "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.5.tgz" + integrity sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ== + dependencies: + balanced-match "^4.0.2" + +bubblesets-js@^3.0.0: + version "3.0.1" + resolved "https://registry.npmjs.org/bubblesets-js/-/bubblesets-js-3.0.1.tgz" + integrity sha512-EKPfysvIU5+u5RLW3mOr94wxzA3nKzqMBX0F95L95BPBDZPVgLBUnT0kJNz4UK/TXbGs8G7yEgl5MvibRBCQoQ== + +buffer-crc32@^0.2.1, buffer-crc32@^0.2.13: + version "0.2.13" + resolved "https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.13.tgz" + integrity sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ== + +buffer-indexof-polyfill@~1.0.0: + version "1.0.2" + resolved "https://registry.npmjs.org/buffer-indexof-polyfill/-/buffer-indexof-polyfill-1.0.2.tgz" + integrity 
sha512-I7wzHwA3t1/lwXQh+A5PbNvJxgfo5r3xulgpYDB5zckTu/Z9oUK9biouBKQUjEqzaz3HnAT6TYoovmE+GqSf7A== + +buffer@^5.5.0: + version "5.7.1" + resolved "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz" + integrity sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ== + dependencies: + base64-js "^1.3.1" + ieee754 "^1.1.13" + +buffers@~0.1.1: + version "0.1.1" + resolved "https://registry.npmjs.org/buffers/-/buffers-0.1.1.tgz" + integrity sha512-9q/rDEGSb/Qsvv2qvzIzdluL5k7AaJOTrw23z9reQthrbF7is4CtlT0DXyO1oei2DCp4uojjzQ7igaSHp1kAEQ== + +call-bind-apply-helpers@^1.0.0, call-bind-apply-helpers@^1.0.1, call-bind-apply-helpers@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz" + integrity sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ== + dependencies: + es-errors "^1.3.0" + function-bind "^1.1.2" + +call-bind@^1.0.7, call-bind@^1.0.8: + version "1.0.8" + resolved "https://registry.npmjs.org/call-bind/-/call-bind-1.0.8.tgz" + integrity sha512-oKlSFMcMwpUg2ednkhQ454wfWiU/ul3CkJe/PEHcTKuiX6RpbehUiFMXu13HalGZxfUwCQzZG747YXBn1im9ww== + dependencies: + call-bind-apply-helpers "^1.0.0" + es-define-property "^1.0.0" + get-intrinsic "^1.2.4" + set-function-length "^1.2.2" + +call-bound@^1.0.2, call-bound@^1.0.3, call-bound@^1.0.4: + version "1.0.4" + resolved "https://registry.npmjs.org/call-bound/-/call-bound-1.0.4.tgz" + integrity sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg== + dependencies: + call-bind-apply-helpers "^1.0.2" + get-intrinsic "^1.3.0" + +callsites@^3.0.0: + version "3.1.0" + resolved "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz" + integrity sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ== + +canvas@^3.2.1: + version "3.2.2" + resolved "https://registry.npmjs.org/canvas/-/canvas-3.2.2.tgz" + 
integrity sha512-duEt4h1HHu9sJZyVKfLRXR6tsKPY7cEELzxSRJkwddOXYvQT3P/+es98SV384JA0zMOZ5s+9gatnGfM6sL4Drg== + dependencies: + node-addon-api "^7.0.0" + prebuild-install "^7.1.3" + +chai@^6.2.2: + version "6.2.2" + resolved "https://registry.npmjs.org/chai/-/chai-6.2.2.tgz" + integrity sha512-NUPRluOfOiTKBKvWPtSD4PhFvWCqOi0BGStNWs57X9js7XGTprSmFoz5F0tWhR4WPjNeR9jXqdC7/UpSJTnlRg== + +chainsaw@~0.1.0: + version "0.1.0" + resolved "https://registry.npmjs.org/chainsaw/-/chainsaw-0.1.0.tgz" + integrity sha512-75kWfWt6MEKNC8xYXIdRpDehRYY/tNSgwKaJq+dbbDcxORuVrrQ+SEHoWsniVn9XPYfP4gmdWIeDk/4YNp1rNQ== + dependencies: + traverse ">=0.3.0 <0.4" + +chalk@^4.0.0: + version "4.1.2" + resolved "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz" + integrity sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA== + dependencies: + ansi-styles "^4.1.0" + supports-color "^7.1.0" + +chart.js@^4.5.1: + version "4.5.1" + resolved "https://registry.npmjs.org/chart.js/-/chart.js-4.5.1.tgz" + integrity sha512-GIjfiT9dbmHRiYi6Nl2yFCq7kkwdkp1W/lp2J99rX0yo9tgJGn3lKQATztIjb5tVtevcBtIdICNWqlq5+E8/Pw== + dependencies: + "@kurkle/color" "^0.3.0" + +chokidar@^4.0.0: + version "4.0.3" + resolved "https://registry.npmjs.org/chokidar/-/chokidar-4.0.3.tgz" + integrity sha512-Qgzu8kfBvo+cA4962jnP1KkS6Dop5NS6g7R5LFYJr4b8Ub94PPQXUksCw9PvXoeXPRRddRNC5C1JQUR2SMGtnA== + dependencies: + readdirp "^4.0.1" + +chownr@^1.1.1: + version "1.1.4" + resolved "https://registry.npmjs.org/chownr/-/chownr-1.1.4.tgz" + integrity sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg== + +chroma-js@^3.1.2: + version "3.2.0" + resolved "https://registry.npmjs.org/chroma-js/-/chroma-js-3.2.0.tgz" + integrity sha512-os/OippSlX1RlWWr+QDPcGUZs0uoqr32urfxESG9U93lhUfbnlyckte84Q8P1UQY/qth983AS1JONKmLS4T0nw== + +classnames@^2.3.0: + version "2.5.1" + resolved "https://registry.npmjs.org/classnames/-/classnames-2.5.1.tgz" + integrity 
sha512-saHYOzhIQs6wy2sVxTM6bUDsQO4F50V9RQ22qBpEdCW+I+/Wmke2HOl6lS6dTpdxVhb88/I6+Hs+438c3lfUow== + +cliui@^9.0.1: + version "9.0.1" + resolved "https://registry.npmjs.org/cliui/-/cliui-9.0.1.tgz" + integrity sha512-k7ndgKhwoQveBL+/1tqGJYNz097I7WOvwbmmU2AR5+magtbjPWQTS1C5vzGkBC8Ym8UWRzfKUzUUqFLypY4Q+w== + dependencies: + string-width "^7.2.0" + strip-ansi "^7.1.0" + wrap-ansi "^9.0.0" + +clsx@^2.1.1: + version "2.1.1" + resolved "https://registry.npmjs.org/clsx/-/clsx-2.1.1.tgz" + integrity sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA== + +color-convert@^2.0.1: + version "2.0.1" + resolved "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz" + integrity sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ== + dependencies: + color-name "~1.1.4" + +color-name@~1.1.4: + version "1.1.4" + resolved "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz" + integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA== + +commander@^8.3.0: + version "8.3.0" + resolved "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz" + integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww== + +commander@2: + version "2.20.3" + resolved "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz" + integrity sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ== + +commander@7: + version "7.2.0" + resolved "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz" + integrity sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw== + +compress-commons@^4.1.2: + version "4.1.2" + resolved "https://registry.npmjs.org/compress-commons/-/compress-commons-4.1.2.tgz" + integrity sha512-D3uMHtGc/fcO1Gt1/L7i1e33VOvD4A9hfQLP+6ewd+BvG/gQ84Yh4oftEhAdjSMgBgwGL+jsppT7JYNpo6MHHg== + dependencies: + 
buffer-crc32 "^0.2.13" + crc32-stream "^4.0.2" + normalize-path "^3.0.0" + readable-stream "^3.6.0" + +concat-map@0.0.1: + version "0.0.1" + resolved "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" + integrity sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg== + +convert-source-map@^1.5.0: + version "1.9.0" + resolved "https://registry.npmjs.org/convert-source-map/-/convert-source-map-1.9.0.tgz" + integrity sha512-ASFBup0Mz1uyiIjANan1jzLQami9z1PoYSZCiiYW2FczPbenXc45FZdBZLzOT+r6+iciuEModtmCti+hjaAk0A== + +convert-source-map@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/convert-source-map/-/convert-source-map-2.0.0.tgz" + integrity sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg== + +core-util-is@~1.0.0: + version "1.0.3" + resolved "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz" + integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ== + +cosmiconfig@^7.0.0: + version "7.1.0" + resolved "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-7.1.0.tgz" + integrity sha512-AdmX6xUzdNASswsFtmwSt7Vj8po9IuqXm0UXz7QKPuEUmPB4XyjGfaAr2PSuELMwkRMVH1EpIkX5bTZGRB3eCA== + dependencies: + "@types/parse-json" "^4.0.0" + import-fresh "^3.2.1" + parse-json "^5.0.0" + path-type "^4.0.0" + yaml "^1.10.0" + +crc-32@^1.2.0: + version "1.2.2" + resolved "https://registry.npmjs.org/crc-32/-/crc-32-1.2.2.tgz" + integrity sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ== + +crc32-stream@^4.0.2: + version "4.0.3" + resolved "https://registry.npmjs.org/crc32-stream/-/crc32-stream-4.0.3.tgz" + integrity sha512-NT7w2JVU7DFroFdYkeq8cywxrgjPHWkdX1wjpRQXPX5Asews3tA+Ght6lddQO5Mkumffp3X7GEqku3epj2toIw== + dependencies: + crc-32 "^1.2.0" + readable-stream "^3.4.0" + +crelt@^1.0.0: + version "1.0.6" + resolved "https://registry.npmjs.org/crelt/-/crelt-1.0.6.tgz" + 
integrity sha512-VQ2MBenTq1fWZUH9DJNGti7kKv6EeAuYr3cLwxUWhIu1baTaXh4Ib5W2CqHVqib4/MqbYGJqiL3Zb8GJZr3l4g== + +cross-spawn@^7.0.6: + version "7.0.6" + resolved "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz" + integrity sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA== + dependencies: + path-key "^3.1.0" + shebang-command "^2.0.0" + which "^2.0.1" + +css-line-break@^2.1.0: + version "2.1.0" + resolved "https://registry.npmjs.org/css-line-break/-/css-line-break-2.1.0.tgz" + integrity sha512-FHcKFCZcAha3LwfVBhCQbW2nCNbkZXn7KVUJcsT5/P8YmfsVja0FMPJr0B903j/E69HUphKiV9iQArX8SDYA4w== + dependencies: + utrie "^1.0.2" + +css-tree@^3.0.0, css-tree@^3.2.1: + version "3.2.1" + resolved "https://registry.npmjs.org/css-tree/-/css-tree-3.2.1.tgz" + integrity sha512-X7sjQzceUhu1u7Y/ylrRZFU2FS6LRiFVp6rKLPg23y3x3c3DOKAwuXGDp+PAGjh6CSnCjYeAul8pcT8bAl+lSA== + dependencies: + mdn-data "2.27.1" + source-map-js "^1.2.1" + +css.escape@^1.5.1: + version "1.5.1" + resolved "https://registry.npmjs.org/css.escape/-/css.escape-1.5.1.tgz" + integrity sha512-YUifsXXuknHlUsmlgyY0PKzgPOr7/FjCePfHNt0jxm83wHZi44VDMQ7/fGNkjY3/jV1MC+1CmZbaHzugyeRtpg== + +csstype@^3.0.2, csstype@^3.1.0, csstype@^3.2.2, csstype@^3.2.3: + version "3.2.3" + resolved "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz" + integrity sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ== + +culori@^4.0.2: + version "4.0.2" + resolved "https://registry.npmjs.org/culori/-/culori-4.0.2.tgz" + integrity sha512-1+BhOB8ahCn4O0cep0Sh2l9KCOfOdY+BXJnKMHFFzDEouSr/el18QwXEMRlOj9UY5nCeA8UN3a/82rUWRBeyBw== + +d3-array@^3.2.0, d3-array@^3.2.4, "d3-array@1 - 3", "d3-array@2 - 3", "d3-array@2.10.0 - 3", "d3-array@2.5.0 - 3", d3-array@3, d3-array@3.2.4: + version "3.2.4" + resolved "https://registry.npmjs.org/d3-array/-/d3-array-3.2.4.tgz" + integrity 
sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg== + dependencies: + internmap "1 - 2" + +d3-axis@3: + version "3.0.0" + resolved "https://registry.npmjs.org/d3-axis/-/d3-axis-3.0.0.tgz" + integrity sha512-IH5tgjV4jE/GhHkRV0HiVYPDtvfjHQlQfJHs0usq7M30XcSBvOotpmH1IgkcXsO/5gEQZD43B//fc7SRT5S+xw== + +d3-brush@3: + version "3.0.0" + resolved "https://registry.npmjs.org/d3-brush/-/d3-brush-3.0.0.tgz" + integrity sha512-ALnjWlVYkXsVIGlOsuWH1+3udkYFI48Ljihfnh8FZPF2QS9o+PzGLBslO0PjzVoHLZ2KCVgAM8NVkXPJB2aNnQ== + dependencies: + d3-dispatch "1 - 3" + d3-drag "2 - 3" + d3-interpolate "1 - 3" + d3-selection "3" + d3-transition "3" + +d3-chord@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-chord/-/d3-chord-3.0.1.tgz" + integrity sha512-VE5S6TNa+j8msksl7HwjxMHDM2yNK3XCkusIlpX5kwauBfXuyLAtNg9jCp/iHH61tgI4sb6R/EIMWCqEIdjT/g== + dependencies: + d3-path "1 - 3" + +d3-color@^3.1.0, "d3-color@1 - 3", d3-color@3: + version "3.1.0" + resolved "https://registry.npmjs.org/d3-color/-/d3-color-3.1.0.tgz" + integrity sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA== + +d3-contour@4: + version "4.0.2" + resolved "https://registry.npmjs.org/d3-contour/-/d3-contour-4.0.2.tgz" + integrity sha512-4EzFTRIikzs47RGmdxbeUvLWtGedDUNkTcmzoeyg4sP/dvCexO47AaQL7VKy/gul85TOxw+IBgA8US2xwbToNA== + dependencies: + d3-array "^3.2.0" + +d3-delaunay@^6.0.4, d3-delaunay@6: + version "6.0.4" + resolved "https://registry.npmjs.org/d3-delaunay/-/d3-delaunay-6.0.4.tgz" + integrity sha512-mdjtIZ1XLAM8bm/hx3WwjfHt6Sggek7qH043O8KEjDXN40xi3vx/6pYSVTwLjEgiXQTbvaouWKynLBiUZ6SK6A== + dependencies: + delaunator "5" + +"d3-dispatch@1 - 3", d3-dispatch@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-dispatch/-/d3-dispatch-3.0.1.tgz" + integrity sha512-rzUyPU/S7rwUflMyLc1ETDeBj0NRuHKKAcvukozwhshr6g6c5d8zh4c2gQjY2bZ0dXeGLWc1PF174P2tVvKhfg== + +"d3-drag@2 - 3", d3-drag@3: + version "3.0.0" + resolved 
"https://registry.npmjs.org/d3-drag/-/d3-drag-3.0.0.tgz" + integrity sha512-pWbUJLdETVA8lQNJecMxoXfH6x+mO2UQo8rSmZ+QqxcbyA3hfeprFgIT//HW2nlHChWeIIMwS2Fq+gEARkhTkg== + dependencies: + d3-dispatch "1 - 3" + d3-selection "3" + +d3-dsv@^3.0.1, "d3-dsv@1 - 3", d3-dsv@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-dsv/-/d3-dsv-3.0.1.tgz" + integrity sha512-UG6OvdI5afDIFP9w4G0mNq50dSOsXHJaRE8arAS5o9ApWnIElp8GZw1Dun8vP8OyHOZ/QJUKUJwxiiCCnUwm+Q== + dependencies: + commander "7" + iconv-lite "0.6" + rw "1" + +"d3-ease@1 - 3", d3-ease@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-ease/-/d3-ease-3.0.1.tgz" + integrity sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w== + +d3-fetch@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-fetch/-/d3-fetch-3.0.1.tgz" + integrity sha512-kpkQIM20n3oLVBKGg6oHrUchHM3xODkTzjMoj7aWQFq5QEM+R6E4WkzT5+tojDY7yjez8KgCBRoj4aEr99Fdqw== + dependencies: + d3-dsv "1 - 3" + +d3-force@^3.0.0, d3-force@3: + version "3.0.0" + resolved "https://registry.npmjs.org/d3-force/-/d3-force-3.0.0.tgz" + integrity sha512-zxV/SsA+U4yte8051P4ECydjD/S+qeYtnaIyAs9tgHCqfguma/aAQDjo85A9Z6EKhBirHRJHXIgJUlffT4wdLg== + dependencies: + d3-dispatch "1 - 3" + d3-quadtree "1 - 3" + d3-timer "1 - 3" + +d3-format@^3.1.0, "d3-format@1 - 3", d3-format@3: + version "3.1.2" + resolved "https://registry.npmjs.org/d3-format/-/d3-format-3.1.2.tgz" + integrity sha512-AJDdYOdnyRDV5b6ArilzCPPwc1ejkHcoyFarqlPqT7zRYjhavcT3uSrqcMvsgh2CgoPbK3RCwyHaVyxYcP2Arg== + +d3-geo-projection@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/d3-geo-projection/-/d3-geo-projection-4.0.0.tgz" + integrity sha512-p0bK60CEzph1iqmnxut7d/1kyTmm3UWtPlwdkM31AU+LW+BXazd5zJdoCn7VFxNCHXRngPHRnsNn5uGjLRGndg== + dependencies: + commander "7" + d3-array "1 - 3" + d3-geo "1.12.0 - 3" + +d3-geo@^3.1.1, "d3-geo@1.12.0 - 3", d3-geo@3: + version "3.1.1" + resolved "https://registry.npmjs.org/d3-geo/-/d3-geo-3.1.1.tgz" + 
integrity sha512-637ln3gXKXOwhalDzinUgY83KzNWZRKbYubaG+fGVuc/dxO64RRljtCTnf5ecMyE1RIdtqpkVcq0IbtU2S8j2Q== + dependencies: + d3-array "2.5.0 - 3" + +d3-hierarchy@^3.1.2, d3-hierarchy@3: + version "3.1.2" + resolved "https://registry.npmjs.org/d3-hierarchy/-/d3-hierarchy-3.1.2.tgz" + integrity sha512-FX/9frcub54beBdugHjDCdikxThEqjnR93Qt7PvQTOHxyiNCAlvMrHhclk3cD5VeAaq9fxmfRp+CnWw9rEMBuA== + +d3-interpolate@^3.0.1, "d3-interpolate@1 - 3", "d3-interpolate@1.2.0 - 3", d3-interpolate@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-interpolate/-/d3-interpolate-3.0.1.tgz" + integrity sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g== + dependencies: + d3-color "1 - 3" + +d3-path@^3.1.0, "d3-path@1 - 3", d3-path@3: + version "3.1.0" + resolved "https://registry.npmjs.org/d3-path/-/d3-path-3.1.0.tgz" + integrity sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ== + +d3-polygon@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-polygon/-/d3-polygon-3.0.1.tgz" + integrity sha512-3vbA7vXYwfe1SYhED++fPUQlWSYTTGmFmQiany/gdbiWgU/iEyQzyymwL9SkJjFFuCS4902BSzewVGsHHmHtXg== + +"d3-quadtree@1 - 3", d3-quadtree@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-quadtree/-/d3-quadtree-3.0.1.tgz" + integrity sha512-04xDrxQTDTCFwP5H6hRhsRcb9xxv2RzkcsygFzmkSIOJy3PeRJP7sNk3VRIbKXcog561P9oU0/rVH6vDROAgUw== + +d3-random@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-random/-/d3-random-3.0.1.tgz" + integrity sha512-FXMe9GfxTxqd5D6jFsQ+DJ8BJS4E/fT5mqqdjovykEB2oFbTMDVdg1MGFxfQW+FBOGoB++k8swBrgwSHT1cUXQ== + +d3-scale-chromatic@^3.1.0, d3-scale-chromatic@3: + version "3.1.0" + resolved "https://registry.npmjs.org/d3-scale-chromatic/-/d3-scale-chromatic-3.1.0.tgz" + integrity sha512-A3s5PWiZ9YCXFye1o246KoscMWqf8BsD9eRiJ3He7C9OBaxKhAd5TFCdEx/7VbKtxxTsu//1mMJFrEt572cEyQ== + dependencies: + d3-color "1 - 3" + d3-interpolate "1 - 3" + +d3-scale@^4.0.2, 
d3-scale@4: + version "4.0.2" + resolved "https://registry.npmjs.org/d3-scale/-/d3-scale-4.0.2.tgz" + integrity sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ== + dependencies: + d3-array "2.10.0 - 3" + d3-format "1 - 3" + d3-interpolate "1.2.0 - 3" + d3-time "2.1.1 - 3" + d3-time-format "2 - 4" + +"d3-selection@2 - 3", d3-selection@3: + version "3.0.0" + resolved "https://registry.npmjs.org/d3-selection/-/d3-selection-3.0.0.tgz" + integrity sha512-fmTRWbNMmsmWq6xJV8D19U/gw/bwrHfNXxrIN+HfZgnzqTHp9jOmKMhsTUjXOJnZOdZY9Q28y4yebKzqDKlxlQ== + +d3-shape@^3.2.0, d3-shape@3: + version "3.2.0" + resolved "https://registry.npmjs.org/d3-shape/-/d3-shape-3.2.0.tgz" + integrity sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA== + dependencies: + d3-path "^3.1.0" + +d3-time-format@^4.1.0, "d3-time-format@2 - 4", d3-time-format@4: + version "4.1.0" + resolved "https://registry.npmjs.org/d3-time-format/-/d3-time-format-4.1.0.tgz" + integrity sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg== + dependencies: + d3-time "1 - 3" + +d3-time@^3.1.0, "d3-time@1 - 3", "d3-time@2.1.1 - 3", d3-time@3: + version "3.1.0" + resolved "https://registry.npmjs.org/d3-time/-/d3-time-3.1.0.tgz" + integrity sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q== + dependencies: + d3-array "2 - 3" + +d3-timer@^3.0.1, "d3-timer@1 - 3", d3-timer@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-timer/-/d3-timer-3.0.1.tgz" + integrity sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA== + +"d3-transition@2 - 3", d3-transition@3: + version "3.0.1" + resolved "https://registry.npmjs.org/d3-transition/-/d3-transition-3.0.1.tgz" + integrity sha512-ApKvfjsSR6tg06xrL434C0WydLr7JewBB3V+/39RMHsaXTOG0zmt/OAXeng5M5LBm0ojmxJrpomQVZ1aPvBL4w== + dependencies: + d3-color "1 - 3" + 
d3-dispatch "1 - 3" + d3-ease "1 - 3" + d3-interpolate "1 - 3" + d3-timer "1 - 3" + +d3-zoom@3: + version "3.0.0" + resolved "https://registry.npmjs.org/d3-zoom/-/d3-zoom-3.0.0.tgz" + integrity sha512-b8AmV3kfQaqWAuacbPuNbL6vahnOJflOhexLzMMNLga62+/nh0JzvJ0aO/5a5MVgUFGS7Hu1P9P03o3fJkDCyw== + dependencies: + d3-dispatch "1 - 3" + d3-drag "2 - 3" + d3-interpolate "1 - 3" + d3-selection "2 - 3" + d3-transition "2 - 3" + +d3@^7.3.0: + version "7.9.0" + resolved "https://registry.npmjs.org/d3/-/d3-7.9.0.tgz" + integrity sha512-e1U46jVP+w7Iut8Jt8ri1YsPOvFpg46k+K8TpCb0P+zjCkjkPnV7WzfDJzMHy1LnA+wj5pLT1wjO901gLXeEhA== + dependencies: + d3-array "3" + d3-axis "3" + d3-brush "3" + d3-chord "3" + d3-color "3" + d3-contour "4" + d3-delaunay "6" + d3-dispatch "3" + d3-drag "3" + d3-dsv "3" + d3-ease "3" + d3-fetch "3" + d3-force "3" + d3-format "3" + d3-geo "3" + d3-hierarchy "3" + d3-interpolate "3" + d3-path "3" + d3-polygon "3" + d3-quadtree "3" + d3-random "3" + d3-scale "4" + d3-scale-chromatic "3" + d3-selection "3" + d3-shape "3" + d3-time "3" + d3-time-format "4" + d3-timer "3" + d3-transition "3" + d3-zoom "3" + +damerau-levenshtein@^1.0.8: + version "1.0.8" + resolved "https://registry.npmjs.org/damerau-levenshtein/-/damerau-levenshtein-1.0.8.tgz" + integrity sha512-sdQSFB7+llfUcQHUQO3+B8ERRj0Oa4w9POWMI/puGtuf7gFywGmkaLCElnudfTiKZV+NvHqL0ifzdrI8Ro7ESA== + +data-urls@^7.0.0: + version "7.0.0" + resolved "https://registry.npmjs.org/data-urls/-/data-urls-7.0.0.tgz" + integrity sha512-23XHcCF+coGYevirZceTVD7NdJOqVn+49IHyxgszm+JIiHLoB2TkmPtsYkNWT1pvRSGkc35L6NHs0yHkN2SumA== + dependencies: + whatwg-mimetype "^5.0.0" + whatwg-url "^16.0.0" + +data-view-buffer@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/data-view-buffer/-/data-view-buffer-1.0.2.tgz" + integrity sha512-EmKO5V3OLXh1rtK2wgXRansaK1/mtVdTUEiEI0W8RkvgT05kfxaH29PliLnpLP73yYO6142Q72QNa8Wx/A5CqQ== + dependencies: + call-bound "^1.0.3" + es-errors "^1.3.0" + is-data-view "^1.0.2" + 
+data-view-byte-length@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/data-view-byte-length/-/data-view-byte-length-1.0.2.tgz" + integrity sha512-tuhGbE6CfTM9+5ANGf+oQb72Ky/0+s3xKUpHvShfiz2RxMFgFPjsXuRLBVMtvMs15awe45SRb83D6wH4ew6wlQ== + dependencies: + call-bound "^1.0.3" + es-errors "^1.3.0" + is-data-view "^1.0.2" + +data-view-byte-offset@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/data-view-byte-offset/-/data-view-byte-offset-1.0.1.tgz" + integrity sha512-BS8PfmtDGnrgYdOonGZQdLZslWIeCGFP9tpan0hi1Co2Zr2NKADsvGYA8XxuG/4UWgJ6Cjtv+YJnB6MM69QGlQ== + dependencies: + call-bound "^1.0.2" + es-errors "^1.3.0" + is-data-view "^1.0.1" + +dayjs@^1.8.34: + version "1.11.20" + resolved "https://registry.npmjs.org/dayjs/-/dayjs-1.11.20.tgz" + integrity sha512-YbwwqR/uYpeoP4pu043q+LTDLFBLApUP6VxRihdfNTqu4ubqMlGDLd6ErXhEgsyvY0K6nCs7nggYumAN+9uEuQ== + +debug@^4.3.1, debug@^4.3.2, debug@^4.4.3: + version "4.4.3" + resolved "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz" + integrity sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA== + dependencies: + ms "^2.1.3" + +decimal.js@^10.6.0: + version "10.6.0" + resolved "https://registry.npmjs.org/decimal.js/-/decimal.js-10.6.0.tgz" + integrity sha512-YpgQiITW3JXGntzdUmyUR1V812Hn8T1YVXhCu+wO3OpS4eU9l4YdD3qjyiKdV6mvV29zapkMeD390UVEf2lkUg== + +decompress-response@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/decompress-response/-/decompress-response-6.0.0.tgz" + integrity sha512-aW35yZM6Bb/4oJlZncMH2LCoZtJXTRxES17vE3hoRiowU2kWHaJKFkSBDnDR+cm9J+9QhXmREyIfv0pji9ejCQ== + dependencies: + mimic-response "^3.1.0" + +deep-extend@^0.6.0: + version "0.6.0" + resolved "https://registry.npmjs.org/deep-extend/-/deep-extend-0.6.0.tgz" + integrity sha512-LOHxIOaPYdHlJRtCQfDIVZtfw/ufM8+rVj649RIHzcm/vGwQRXFt6OPqIFWsm2XEMrNIEtWR64sY1LEKD2vAOA== + +deep-is@^0.1.3: + version "0.1.4" + resolved "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz" + 
integrity sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ== + +define-data-property@^1.0.1, define-data-property@^1.1.4: + version "1.1.4" + resolved "https://registry.npmjs.org/define-data-property/-/define-data-property-1.1.4.tgz" + integrity sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A== + dependencies: + es-define-property "^1.0.0" + es-errors "^1.3.0" + gopd "^1.0.1" + +define-properties@^1.1.3, define-properties@^1.2.1: + version "1.2.1" + resolved "https://registry.npmjs.org/define-properties/-/define-properties-1.2.1.tgz" + integrity sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg== + dependencies: + define-data-property "^1.0.1" + has-property-descriptors "^1.0.0" + object-keys "^1.1.1" + +delaunator@5: + version "5.1.0" + resolved "https://registry.npmjs.org/delaunator/-/delaunator-5.1.0.tgz" + integrity sha512-AGrQ4QSgssa1NGmWmLPqN5NY2KajF5MqxetNEO+o0n3ZwZZeTmt7bBnvzHWrmkZFxGgr4HdyFgelzgi06otLuQ== + dependencies: + robust-predicates "^3.0.2" + +dequal@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/dequal/-/dequal-2.0.3.tgz" + integrity sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA== + +detect-libc@^2.0.0, detect-libc@^2.0.3: + version "2.1.2" + resolved "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz" + integrity sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ== + +dnd-core@^16.0.1: + version "16.0.1" + resolved "https://registry.npmjs.org/dnd-core/-/dnd-core-16.0.1.tgz" + integrity sha512-HK294sl7tbw6F6IeuK16YSBUoorvHpY8RHO+9yFfaJyCDVb6n7PRcezrOEOa2SBCqiYpemh5Jx20ZcjKdFAVng== + dependencies: + "@react-dnd/asap" "^5.0.1" + "@react-dnd/invariant" "^4.0.1" + redux "^4.2.0" + +doctrine@^2.1.0: + version "2.1.0" + resolved "https://registry.npmjs.org/doctrine/-/doctrine-2.1.0.tgz" + integrity 
sha512-35mSku4ZXK0vfCuHEDAwt55dg2jNajHZ1odvF+8SSr82EsZY4QmXfuWso8oEd8zRhVObSN18aM0CjSdoBX7zIw== + dependencies: + esutils "^2.0.2" + +dom-accessibility-api@^0.5.9: + version "0.5.16" + resolved "https://registry.npmjs.org/dom-accessibility-api/-/dom-accessibility-api-0.5.16.tgz" + integrity sha512-X7BJ2yElsnOJ30pZF4uIIDfBEVgF4XEBxL9Bxhy6dnrm5hkzqmsWHGTiHqRiITNhMyFLyAiWndIJP7Z1NTteDg== + +dom-accessibility-api@^0.6.3: + version "0.6.3" + resolved "https://registry.npmjs.org/dom-accessibility-api/-/dom-accessibility-api-0.6.3.tgz" + integrity sha512-7ZgogeTnjuHbo+ct10G9Ffp0mif17idi0IyWNVA/wcwcm7NPOD/WEHVP3n7n3MhXqxoIYm8d6MuZohYWIZ4T3w== + +dom-helpers@^5.0.1: + version "5.2.1" + resolved "https://registry.npmjs.org/dom-helpers/-/dom-helpers-5.2.1.tgz" + integrity sha512-nRCa7CK3VTrM2NmGkIy4cbK7IZlgBE/PYMn55rrXefr5xXDP0LdtfPnblFDoVdcAfslJ7or6iqAUnx0CCGIWQA== + dependencies: + "@babel/runtime" "^7.8.7" + csstype "^3.0.2" + +dompurify@*, dompurify@^3.2.4: + version "3.3.3" + resolved "https://registry.npmjs.org/dompurify/-/dompurify-3.3.3.tgz" + integrity sha512-Oj6pzI2+RqBfFG+qOaOLbFXLQ90ARpcGG6UePL82bJLtdsa6CYJD7nmiU8MW9nQNOtCHV3lZ/Bzq1X0QYbBZCA== + optionalDependencies: + "@types/trusted-types" "^2.0.7" + +dunder-proto@^1.0.0, dunder-proto@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz" + integrity sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A== + dependencies: + call-bind-apply-helpers "^1.0.1" + es-errors "^1.3.0" + gopd "^1.2.0" + +duplexer2@~0.1.4: + version "0.1.4" + resolved "https://registry.npmjs.org/duplexer2/-/duplexer2-0.1.4.tgz" + integrity sha512-asLFVfWWtJ90ZyOUHMqk7/S2w2guQKxUI2itj3d92ADHhxUSbCMGi1f1cBcJ7xM1To+pE/Khbwo1yuNbMEPKeA== + dependencies: + readable-stream "^2.0.2" + +echarts@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/echarts/-/echarts-6.0.0.tgz" + integrity 
sha512-Tte/grDQRiETQP4xz3iZWSvoHrkCQtwqd6hs+mifXcjrCuo2iKWbajFObuLJVBlDIJlOzgQPd1hsaKt/3+OMkQ== + dependencies: + tslib "2.3.0" + zrender "6.0.0" + +emoji-regex@^10.3.0: + version "10.6.0" + resolved "https://registry.npmjs.org/emoji-regex/-/emoji-regex-10.6.0.tgz" + integrity sha512-toUI84YS5YmxW219erniWD0CIVOo46xGKColeNQRgOzDorgBi1v4D71/OFzgD9GO2UGKIv1C3Sp8DAn0+j5w7A== + +emoji-regex@^9.2.2: + version "9.2.2" + resolved "https://registry.npmjs.org/emoji-regex/-/emoji-regex-9.2.2.tgz" + integrity sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg== + +end-of-stream@^1.1.0, end-of-stream@^1.4.1: + version "1.4.5" + resolved "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.5.tgz" + integrity sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg== + dependencies: + once "^1.4.0" + +entities@^4.4.0: + version "4.5.0" + resolved "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz" + integrity sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw== + +entities@^6.0.0: + version "6.0.1" + resolved "https://registry.npmjs.org/entities/-/entities-6.0.1.tgz" + integrity sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g== + +error-ex@^1.3.1: + version "1.3.4" + resolved "https://registry.npmjs.org/error-ex/-/error-ex-1.3.4.tgz" + integrity sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ== + dependencies: + is-arrayish "^0.2.1" + +es-abstract@^1.17.5, es-abstract@^1.23.2, es-abstract@^1.23.3, es-abstract@^1.23.5, es-abstract@^1.23.6, es-abstract@^1.23.9, es-abstract@^1.24.0, es-abstract@^1.24.1: + version "1.24.1" + resolved "https://registry.npmjs.org/es-abstract/-/es-abstract-1.24.1.tgz" + integrity sha512-zHXBLhP+QehSSbsS9Pt23Gg964240DPd6QCf8WpkqEXxQ7fhdZzYsocOr5u7apWonsS5EjZDmTF+/slGMyasvw== + dependencies: + array-buffer-byte-length "^1.0.2" + 
arraybuffer.prototype.slice "^1.0.4" + available-typed-arrays "^1.0.7" + call-bind "^1.0.8" + call-bound "^1.0.4" + data-view-buffer "^1.0.2" + data-view-byte-length "^1.0.2" + data-view-byte-offset "^1.0.1" + es-define-property "^1.0.1" + es-errors "^1.3.0" + es-object-atoms "^1.1.1" + es-set-tostringtag "^2.1.0" + es-to-primitive "^1.3.0" + function.prototype.name "^1.1.8" + get-intrinsic "^1.3.0" + get-proto "^1.0.1" + get-symbol-description "^1.1.0" + globalthis "^1.0.4" + gopd "^1.2.0" + has-property-descriptors "^1.0.2" + has-proto "^1.2.0" + has-symbols "^1.1.0" + hasown "^2.0.2" + internal-slot "^1.1.0" + is-array-buffer "^3.0.5" + is-callable "^1.2.7" + is-data-view "^1.0.2" + is-negative-zero "^2.0.3" + is-regex "^1.2.1" + is-set "^2.0.3" + is-shared-array-buffer "^1.0.4" + is-string "^1.1.1" + is-typed-array "^1.1.15" + is-weakref "^1.1.1" + math-intrinsics "^1.1.0" + object-inspect "^1.13.4" + object-keys "^1.1.1" + object.assign "^4.1.7" + own-keys "^1.0.1" + regexp.prototype.flags "^1.5.4" + safe-array-concat "^1.1.3" + safe-push-apply "^1.0.0" + safe-regex-test "^1.1.0" + set-proto "^1.0.0" + stop-iteration-iterator "^1.1.0" + string.prototype.trim "^1.2.10" + string.prototype.trimend "^1.0.9" + string.prototype.trimstart "^1.0.8" + typed-array-buffer "^1.0.3" + typed-array-byte-length "^1.0.3" + typed-array-byte-offset "^1.0.4" + typed-array-length "^1.0.7" + unbox-primitive "^1.1.0" + which-typed-array "^1.1.19" + +es-define-property@^1.0.0, es-define-property@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz" + integrity sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g== + +es-errors@^1.3.0: + version "1.3.0" + resolved "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz" + integrity sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw== + +es-iterator-helpers@^1.2.1: + version "1.3.1" + 
resolved "https://registry.npmjs.org/es-iterator-helpers/-/es-iterator-helpers-1.3.1.tgz" + integrity sha512-zWwRvqWiuBPr0muUG/78cW3aHROFCNIQ3zpmYDpwdbnt2m+xlNyRWpHBpa2lJjSBit7BQ+RXA1iwbSmu5yJ/EQ== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.4" + define-properties "^1.2.1" + es-abstract "^1.24.1" + es-errors "^1.3.0" + es-set-tostringtag "^2.1.0" + function-bind "^1.1.2" + get-intrinsic "^1.3.0" + globalthis "^1.0.4" + gopd "^1.2.0" + has-property-descriptors "^1.0.2" + has-proto "^1.2.0" + has-symbols "^1.1.0" + internal-slot "^1.1.0" + iterator.prototype "^1.1.5" + math-intrinsics "^1.1.0" + safe-array-concat "^1.1.3" + +es-module-lexer@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/es-module-lexer/-/es-module-lexer-2.0.0.tgz" + integrity sha512-5POEcUuZybH7IdmGsD8wlf0AI55wMecM9rVBTI/qEAy2c1kTOm3DjFYjrBdI2K3BaJjJYfYFeRtM0t9ssnRuxw== + +es-object-atoms@^1.0.0, es-object-atoms@^1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz" + integrity sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA== + dependencies: + es-errors "^1.3.0" + +es-set-tostringtag@^2.1.0: + version "2.1.0" + resolved "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz" + integrity sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA== + dependencies: + es-errors "^1.3.0" + get-intrinsic "^1.2.6" + has-tostringtag "^1.0.2" + hasown "^2.0.2" + +es-shim-unscopables@^1.0.2: + version "1.1.0" + resolved "https://registry.npmjs.org/es-shim-unscopables/-/es-shim-unscopables-1.1.0.tgz" + integrity sha512-d9T8ucsEhh8Bi1woXCf+TIKDIROLG5WCkxg8geBCbvk22kzwC5G2OnXVMO6FUsvQlgUUXQ2itephWDLqDzbeCw== + dependencies: + hasown "^2.0.2" + +es-to-primitive@^1.3.0: + version "1.3.0" + resolved "https://registry.npmjs.org/es-to-primitive/-/es-to-primitive-1.3.0.tgz" + integrity 
sha512-w+5mJ3GuFL+NjVtJlvydShqE1eN3h3PbI7/5LAsYJP/2qtuMXjfL2LpHSRqo4b4eSF5K/DH1JXKUAHSB2UW50g== + dependencies: + is-callable "^1.2.7" + is-date-object "^1.0.5" + is-symbol "^1.0.4" + +esbuild@^0.21.3: + version "0.21.5" + resolved "https://registry.npmjs.org/esbuild/-/esbuild-0.21.5.tgz" + integrity sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw== + optionalDependencies: + "@esbuild/aix-ppc64" "0.21.5" + "@esbuild/android-arm" "0.21.5" + "@esbuild/android-arm64" "0.21.5" + "@esbuild/android-x64" "0.21.5" + "@esbuild/darwin-arm64" "0.21.5" + "@esbuild/darwin-x64" "0.21.5" + "@esbuild/freebsd-arm64" "0.21.5" + "@esbuild/freebsd-x64" "0.21.5" + "@esbuild/linux-arm" "0.21.5" + "@esbuild/linux-arm64" "0.21.5" + "@esbuild/linux-ia32" "0.21.5" + "@esbuild/linux-loong64" "0.21.5" + "@esbuild/linux-mips64el" "0.21.5" + "@esbuild/linux-ppc64" "0.21.5" + "@esbuild/linux-riscv64" "0.21.5" + "@esbuild/linux-s390x" "0.21.5" + "@esbuild/linux-x64" "0.21.5" + "@esbuild/netbsd-x64" "0.21.5" + "@esbuild/openbsd-x64" "0.21.5" + "@esbuild/sunos-x64" "0.21.5" + "@esbuild/win32-arm64" "0.21.5" + "@esbuild/win32-ia32" "0.21.5" + "@esbuild/win32-x64" "0.21.5" + +escalade@^3.1.1: + version "3.2.0" + resolved "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz" + integrity sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA== + +escape-string-regexp@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz" + integrity sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA== + +eslint-plugin-jsx-a11y@^6.10.2: + version "6.10.2" + resolved "https://registry.npmjs.org/eslint-plugin-jsx-a11y/-/eslint-plugin-jsx-a11y-6.10.2.tgz" + integrity sha512-scB3nz4WmG75pV8+3eRUQOHZlNSUhFNq37xnpgRkCCELU3XMvXAxLk1eqWWyE22Ki4Q01Fnsw9BA3cJHDPgn2Q== + dependencies: + aria-query "^5.3.2" + array-includes "^3.1.8" + 
array.prototype.flatmap "^1.3.2" + ast-types-flow "^0.0.8" + axe-core "^4.10.0" + axobject-query "^4.1.0" + damerau-levenshtein "^1.0.8" + emoji-regex "^9.2.2" + hasown "^2.0.2" + jsx-ast-utils "^3.3.5" + language-tags "^1.0.9" + minimatch "^3.1.2" + object.fromentries "^2.0.8" + safe-regex-test "^1.0.3" + string.prototype.includes "^2.0.1" + +eslint-plugin-react@^7.37.2: + version "7.37.5" + resolved "https://registry.npmjs.org/eslint-plugin-react/-/eslint-plugin-react-7.37.5.tgz" + integrity sha512-Qteup0SqU15kdocexFNAJMvCJEfa2xUKNV4CC1xsVMrIIqEy3SQ/rqyxCWNzfrd3/ldy6HMlD2e0JDVpDg2qIA== + dependencies: + array-includes "^3.1.8" + array.prototype.findlast "^1.2.5" + array.prototype.flatmap "^1.3.3" + array.prototype.tosorted "^1.1.4" + doctrine "^2.1.0" + es-iterator-helpers "^1.2.1" + estraverse "^5.3.0" + hasown "^2.0.2" + jsx-ast-utils "^2.4.1 || ^3.0.0" + minimatch "^3.1.2" + object.entries "^1.1.9" + object.fromentries "^2.0.8" + object.values "^1.2.1" + prop-types "^15.8.1" + resolve "^2.0.0-next.5" + semver "^6.3.1" + string.prototype.matchall "^4.0.12" + string.prototype.repeat "^1.0.0" + +eslint-scope@^8.4.0: + version "8.4.0" + resolved "https://registry.npmjs.org/eslint-scope/-/eslint-scope-8.4.0.tgz" + integrity sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg== + dependencies: + esrecurse "^4.3.0" + estraverse "^5.2.0" + +eslint-visitor-keys@^3.4.3: + version "3.4.3" + resolved "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-3.4.3.tgz" + integrity sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag== + +eslint-visitor-keys@^4.2.1: + version "4.2.1" + resolved "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz" + integrity sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ== + +eslint-visitor-keys@^5.0.0: + version "5.0.1" + resolved 
"https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-5.0.1.tgz" + integrity sha512-tD40eHxA35h0PEIZNeIjkHoDR4YjjJp34biM0mDvplBe//mB+IHCqHDGV7pxF+7MklTvighcCPPZC7ynWyjdTA== + +eslint@^9.15.0: + version "9.39.4" + resolved "https://registry.npmjs.org/eslint/-/eslint-9.39.4.tgz" + integrity sha512-XoMjdBOwe/esVgEvLmNsD3IRHkm7fbKIUGvrleloJXUZgDHig2IPWNniv+GwjyJXzuNqVjlr5+4yVUZjycJwfQ== + dependencies: + "@eslint-community/eslint-utils" "^4.8.0" + "@eslint-community/regexpp" "^4.12.1" + "@eslint/config-array" "^0.21.2" + "@eslint/config-helpers" "^0.4.2" + "@eslint/core" "^0.17.0" + "@eslint/eslintrc" "^3.3.5" + "@eslint/js" "9.39.4" + "@eslint/plugin-kit" "^0.4.1" + "@humanfs/node" "^0.16.6" + "@humanwhocodes/module-importer" "^1.0.1" + "@humanwhocodes/retry" "^0.4.2" + "@types/estree" "^1.0.6" + ajv "^6.14.0" + chalk "^4.0.0" + cross-spawn "^7.0.6" + debug "^4.3.2" + escape-string-regexp "^4.0.0" + eslint-scope "^8.4.0" + eslint-visitor-keys "^4.2.1" + espree "^10.4.0" + esquery "^1.5.0" + esutils "^2.0.2" + fast-deep-equal "^3.1.3" + file-entry-cache "^8.0.0" + find-up "^5.0.0" + glob-parent "^6.0.2" + ignore "^5.2.0" + imurmurhash "^0.1.4" + is-glob "^4.0.0" + json-stable-stringify-without-jsonify "^1.0.1" + lodash.merge "^4.6.2" + minimatch "^3.1.5" + natural-compare "^1.4.0" + optionator "^0.9.3" + +espree@^10.0.1, espree@^10.4.0: + version "10.4.0" + resolved "https://registry.npmjs.org/espree/-/espree-10.4.0.tgz" + integrity sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ== + dependencies: + acorn "^8.15.0" + acorn-jsx "^5.3.2" + eslint-visitor-keys "^4.2.1" + +esquery@^1.5.0: + version "1.7.0" + resolved "https://registry.npmjs.org/esquery/-/esquery-1.7.0.tgz" + integrity sha512-Ap6G0WQwcU/LHsvLwON1fAQX9Zp0A2Y6Y/cJBl9r/JbW90Zyg4/zbG6zzKa2OTALELarYHmKu0GhpM5EO+7T0g== + dependencies: + estraverse "^5.1.0" + +esrecurse@^4.3.0: + version "4.3.0" + resolved 
"https://registry.npmjs.org/esrecurse/-/esrecurse-4.3.0.tgz" + integrity sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag== + dependencies: + estraverse "^5.2.0" + +estraverse@^5.1.0, estraverse@^5.2.0, estraverse@^5.3.0: + version "5.3.0" + resolved "https://registry.npmjs.org/estraverse/-/estraverse-5.3.0.tgz" + integrity sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA== + +estree-walker@^3.0.3: + version "3.0.3" + resolved "https://registry.npmjs.org/estree-walker/-/estree-walker-3.0.3.tgz" + integrity sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g== + dependencies: + "@types/estree" "^1.0.0" + +esutils@^2.0.2: + version "2.0.3" + resolved "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz" + integrity sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g== + +eventemitter3@^5.0.0: + version "5.0.4" + resolved "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.4.tgz" + integrity sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw== + +exceljs@^4.4.0: + version "4.4.0" + resolved "https://registry.npmjs.org/exceljs/-/exceljs-4.4.0.tgz" + integrity sha512-XctvKaEMaj1Ii9oDOqbW/6e1gXknSY4g/aLCDicOXqBE4M0nRWkUu0PTp++UPNzoFY12BNHMfs/VadKIS6llvg== + dependencies: + archiver "^5.0.0" + dayjs "^1.8.34" + fast-csv "^4.3.1" + jszip "^3.10.1" + readable-stream "^3.6.0" + saxes "^5.0.1" + tmp "^0.2.0" + unzipper "^0.10.11" + uuid "^8.3.0" + +expand-template@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/expand-template/-/expand-template-2.0.3.tgz" + integrity sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg== + +expect-type@^1.3.0: + version "1.3.0" + resolved "https://registry.npmjs.org/expect-type/-/expect-type-1.3.0.tgz" + integrity 
sha512-knvyeauYhqjOYvQ66MznSMs83wmHrCycNEN6Ao+2AeYEfxUIkuiVxdEa1qlGEPK+We3n0THiDciYSsCcgW/DoA== + +fast-csv@^4.3.1: + version "4.3.6" + resolved "https://registry.npmjs.org/fast-csv/-/fast-csv-4.3.6.tgz" + integrity sha512-2RNSpuwwsJGP0frGsOmTb9oUF+VkFSM4SyLTDgwf2ciHWTarN0lQTC+F2f/t5J9QjW+c65VFIAAu85GsvMIusw== + dependencies: + "@fast-csv/format" "4.3.5" + "@fast-csv/parse" "4.3.6" + +fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3: + version "3.1.3" + resolved "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz" + integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q== + +fast-equals@^5.3.3: + version "5.4.0" + resolved "https://registry.npmjs.org/fast-equals/-/fast-equals-5.4.0.tgz" + integrity sha512-jt2DW/aNFNwke7AUd+Z+e6pz39KO5rzdbbFCg2sGafS4mk13MI7Z8O5z9cADNn5lhGODIgLwug6TZO2ctf7kcw== + +fast-json-patch@^3.0.0-1, fast-json-patch@^3.1.1: + version "3.1.1" + resolved "https://registry.npmjs.org/fast-json-patch/-/fast-json-patch-3.1.1.tgz" + integrity sha512-vf6IHUX2SBcA+5/+4883dsIjpBTqmfBjmYiWK1savxQmFk4JfBMLa7ynTYOs1Rolp/T1betJxHiGD3g1Mn8lUQ== + +fast-json-stable-stringify@^2.0.0: + version "2.1.0" + resolved "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz" + integrity sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw== + +fast-levenshtein@^2.0.6: + version "2.0.6" + resolved "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz" + integrity sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw== + +fdir@^6.5.0: + version "6.5.0" + resolved "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz" + integrity sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg== + +file-entry-cache@^8.0.0: + version "8.0.0" + resolved "https://registry.npmjs.org/file-entry-cache/-/file-entry-cache-8.0.0.tgz" + integrity 
sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ== + dependencies: + flat-cache "^4.0.0" + +find-root@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/find-root/-/find-root-1.1.0.tgz" + integrity sha512-NKfW6bec6GfKc0SGx1e07QZY9PE99u0Bft/0rzSD5k3sO/vwkVUpDUKVm5Gpp5Ue3YfShPFTX2070tDs5kB9Ng== + +find-up@^5.0.0: + version "5.0.0" + resolved "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz" + integrity sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng== + dependencies: + locate-path "^6.0.0" + path-exists "^4.0.0" + +flat-cache@^4.0.0: + version "4.0.1" + resolved "https://registry.npmjs.org/flat-cache/-/flat-cache-4.0.1.tgz" + integrity sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw== + dependencies: + flatted "^3.2.9" + keyv "^4.5.4" + +flatted@^3.2.9: + version "3.4.2" + resolved "https://registry.npmjs.org/flatted/-/flatted-3.4.2.tgz" + integrity sha512-PjDse7RzhcPkIJwy5t7KPWQSZ9cAbzQXcafsetQoD7sOJRQlGikNbx7yZp2OotDnJyrDcbyRq3Ttb18iYOqkxA== + +for-each@^0.3.3, for-each@^0.3.5: + version "0.3.5" + resolved "https://registry.npmjs.org/for-each/-/for-each-0.3.5.tgz" + integrity sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg== + dependencies: + is-callable "^1.2.7" + +fs-constants@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz" + integrity sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow== + +fs.realpath@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz" + integrity sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw== + +fsevents@~2.3.2, fsevents@~2.3.3: + version "2.3.3" + resolved "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz" + integrity 
sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw== + +fstream@^1.0.12: + version "1.0.12" + resolved "https://registry.npmjs.org/fstream/-/fstream-1.0.12.tgz" + integrity sha512-WvJ193OHa0GHPEL+AycEJgxvBEwyfRkN1vhjca23OaPVMCaLCXTd5qAu82AjTcgP1UJmytkOKb63Ypde7raDIg== + dependencies: + graceful-fs "^4.1.2" + inherits "~2.0.0" + mkdirp ">=0.5 0" + rimraf "2" + +function-bind@^1.1.2: + version "1.1.2" + resolved "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz" + integrity sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA== + +function.prototype.name@^1.1.6, function.prototype.name@^1.1.8: + version "1.1.8" + resolved "https://registry.npmjs.org/function.prototype.name/-/function.prototype.name-1.1.8.tgz" + integrity sha512-e5iwyodOHhbMr/yNrc7fDYG4qlbIvI5gajyzPnb5TCwyhjApznQh1BMFou9b30SevY43gCJKXycoCBjMbsuW0Q== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.3" + define-properties "^1.2.1" + functions-have-names "^1.2.3" + hasown "^2.0.2" + is-callable "^1.2.7" + +functions-have-names@^1.2.3: + version "1.2.3" + resolved "https://registry.npmjs.org/functions-have-names/-/functions-have-names-1.2.3.tgz" + integrity sha512-xckBUXyTIqT97tq2x2AMb+g163b5JFysYk0x4qxNFwbfQkmNZoiRHb6sPzI9/QV33WeuvVYBUIiD4NzNIyqaRQ== + +generator-function@^2.0.0: + version "2.0.1" + resolved "https://registry.npmjs.org/generator-function/-/generator-function-2.0.1.tgz" + integrity sha512-SFdFmIJi+ybC0vjlHN0ZGVGHc3lgE0DxPAT0djjVg+kjOnSqclqmj0KQ7ykTOLP6YxoqOvuAODGdcHJn+43q3g== + +get-caller-file@^2.0.5: + version "2.0.5" + resolved "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz" + integrity sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg== + +get-east-asian-width@^1.0.0: + version "1.5.0" + resolved "https://registry.npmjs.org/get-east-asian-width/-/get-east-asian-width-1.5.0.tgz" + integrity 
sha512-CQ+bEO+Tva/qlmw24dCejulK5pMzVnUOFOijVogd3KQs07HnRIgp8TGipvCCRT06xeYEbpbgwaCxglFyiuIcmA== + +get-intrinsic@^1.2.4, get-intrinsic@^1.2.5, get-intrinsic@^1.2.6, get-intrinsic@^1.2.7, get-intrinsic@^1.3.0: + version "1.3.0" + resolved "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz" + integrity sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ== + dependencies: + call-bind-apply-helpers "^1.0.2" + es-define-property "^1.0.1" + es-errors "^1.3.0" + es-object-atoms "^1.1.1" + function-bind "^1.1.2" + get-proto "^1.0.1" + gopd "^1.2.0" + has-symbols "^1.1.0" + hasown "^2.0.2" + math-intrinsics "^1.1.0" + +get-proto@^1.0.0, get-proto@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz" + integrity sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g== + dependencies: + dunder-proto "^1.0.1" + es-object-atoms "^1.0.0" + +get-symbol-description@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/get-symbol-description/-/get-symbol-description-1.1.0.tgz" + integrity sha512-w9UMqWwJxHNOvoNzSJ2oPF5wvYcvP7jUvYzhp67yEhTi17ZDBBC1z9pTdGuzjD+EFIqLSYRweZjqfiPzQ06Ebg== + dependencies: + call-bound "^1.0.3" + es-errors "^1.3.0" + get-intrinsic "^1.2.6" + +github-from-package@0.0.0: + version "0.0.0" + resolved "https://registry.npmjs.org/github-from-package/-/github-from-package-0.0.0.tgz" + integrity sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw== + +glob-parent@^6.0.2: + version "6.0.2" + resolved "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz" + integrity sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A== + dependencies: + is-glob "^4.0.3" + +glob@^7.1.3, glob@^7.1.4, glob@^7.2.3: + version "7.2.3" + resolved "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz" + integrity 
sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q== + dependencies: + fs.realpath "^1.0.0" + inflight "^1.0.4" + inherits "2" + minimatch "^3.1.1" + once "^1.3.0" + path-is-absolute "^1.0.0" + +globals@^14.0.0: + version "14.0.0" + resolved "https://registry.npmjs.org/globals/-/globals-14.0.0.tgz" + integrity sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ== + +globals@^15.12.0: + version "15.15.0" + resolved "https://registry.npmjs.org/globals/-/globals-15.15.0.tgz" + integrity sha512-7ACyT3wmyp3I61S4fG682L0VA2RGD9otkqGJIwNUMF1SWUombIIk+af1unuDYgMm082aHYwD+mzJvv9Iu8dsgg== + +globalthis@^1.0.4: + version "1.0.4" + resolved "https://registry.npmjs.org/globalthis/-/globalthis-1.0.4.tgz" + integrity sha512-DpLKbNU4WylpxJykQujfCcwYWiV/Jhm50Goo0wrVILAv5jOr9d+H+UR3PhSCD2rCCEIg0uc+G+muBTwD54JhDQ== + dependencies: + define-properties "^1.2.1" + gopd "^1.0.1" + +gofish-graphics@^0.0.22: + version "0.0.22" + resolved "https://registry.npmjs.org/gofish-graphics/-/gofish-graphics-0.0.22.tgz" + integrity sha512-zRDziOMXIJFAL8Z3mirXKaIC05ZhNd+yWQ3prKsX41MEty+ohs6xH/D43fMhwrKo5AxC8ohdBJ+bvu2m9Z+6cw== + dependencies: + "@types/d3-array" "^3.2.1" + bubblesets-js "^3.0.0" + chroma-js "^3.1.2" + culori "^4.0.2" + d3-array "^3.2.4" + lodash "^4.17.21" + rybitten "^0.22.0" + solid-js "^1.9.5" + spectral.js "^2.0.2" + +gopd@^1.0.1, gopd@^1.2.0: + version "1.2.0" + resolved "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz" + integrity sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg== + +graceful-fs@^4.1.2, graceful-fs@^4.2.0, graceful-fs@^4.2.2: + version "4.2.11" + resolved "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz" + integrity sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ== + +has-bigints@^1.0.2: + version "1.1.0" + resolved 
"https://registry.npmjs.org/has-bigints/-/has-bigints-1.1.0.tgz" + integrity sha512-R3pbpkcIqv2Pm3dUwgjclDRVmWpTJW2DcMzcIhEXEx1oh/CEMObMm3KLmRJOdvhM7o4uQBnwr8pzRK2sJWIqfg== + +has-flag@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz" + integrity sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ== + +has-property-descriptors@^1.0.0, has-property-descriptors@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/has-property-descriptors/-/has-property-descriptors-1.0.2.tgz" + integrity sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg== + dependencies: + es-define-property "^1.0.0" + +has-proto@^1.2.0: + version "1.2.0" + resolved "https://registry.npmjs.org/has-proto/-/has-proto-1.2.0.tgz" + integrity sha512-KIL7eQPfHQRC8+XluaIw7BHUwwqL19bQn4hzNgdr+1wXoU0KKj6rufu47lhY7KbJR2C6T6+PfyN0Ea7wkSS+qQ== + dependencies: + dunder-proto "^1.0.0" + +has-symbols@^1.0.3, has-symbols@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz" + integrity sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ== + +has-tostringtag@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz" + integrity sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw== + dependencies: + has-symbols "^1.0.3" + +hasown@^2.0.2: + version "2.0.2" + resolved "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz" + integrity sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ== + dependencies: + function-bind "^1.1.2" + +hoist-non-react-statics@^3.3.0, hoist-non-react-statics@^3.3.1, hoist-non-react-statics@^3.3.2: + version "3.3.2" + resolved "https://registry.npmjs.org/hoist-non-react-statics/-/hoist-non-react-statics-3.3.2.tgz" + integrity 
sha512-/gGivxi8JPKWNm/W0jSmzcMPpfpPLc3dY/6GxhX2hQ9iGj3aDfklV4ET7NjKpSinLpJ5vafa9iiGIEZg10SfBw== + dependencies: + react-is "^16.7.0" + +html-encoding-sniffer@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-6.0.0.tgz" + integrity sha512-CV9TW3Y3f8/wT0BRFc1/KAVQ3TUHiXmaAb6VW9vtiMFf7SLoMd1PdAc4W3KFOFETBJUb90KatHqlsZMWV+R9Gg== + dependencies: + "@exodus/bytes" "^1.6.0" + +html-parse-stringify@^3.0.1: + version "3.0.1" + resolved "https://registry.npmjs.org/html-parse-stringify/-/html-parse-stringify-3.0.1.tgz" + integrity sha512-KknJ50kTInJ7qIScF3jeaFRpMpE8/lfiTdzf/twXyPBLAGrLRTmkz3AdTnKeh40X8k9L2fdYwEp/42WGXIRGcg== + dependencies: + void-elements "3.1.0" + +html2canvas@^1.4.1: + version "1.4.1" + resolved "https://registry.npmjs.org/html2canvas/-/html2canvas-1.4.1.tgz" + integrity sha512-fPU6BHNpsyIhr8yyMpTLLxAbkaK8ArIBcmZIRiBLiDhjeqvXolaEmDGmELFuX9I4xDcaKKcJl+TKZLqruBbmWA== + dependencies: + css-line-break "^2.1.0" + text-segmentation "^1.0.3" + +i18next-browser-languagedetector@^8.2.1: + version "8.2.1" + resolved "https://registry.npmjs.org/i18next-browser-languagedetector/-/i18next-browser-languagedetector-8.2.1.tgz" + integrity sha512-bZg8+4bdmaOiApD7N7BPT9W8MLZG+nPTOFlLiJiT8uzKXFjhxw4v2ierCXOwB5sFDMtuA5G4kgYZ0AznZxQ/cw== + dependencies: + "@babel/runtime" "^7.23.2" + +i18next@^26.0.1: + version "26.0.1" + resolved "https://registry.npmjs.org/i18next/-/i18next-26.0.1.tgz" + integrity sha512-vtz5sXU4+nkCm8yEU+JJ6yYIx0mkg9e68W0G0PXpnOsmzLajNsW5o28DJMqbajxfsfq0gV3XdrBudsDQnwxfsQ== + dependencies: + "@babel/runtime" "^7.29.2" + +iconv-lite@0.6: + version "0.6.3" + resolved "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz" + integrity sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw== + dependencies: + safer-buffer ">= 2.1.2 < 3.0.0" + +ieee754@^1.1.13: + version "1.2.1" + resolved "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz" + integrity 
sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA== + +ignore@^5.2.0: + version "5.3.2" + resolved "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz" + integrity sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g== + +ignore@^7.0.5: + version "7.0.5" + resolved "https://registry.npmjs.org/ignore/-/ignore-7.0.5.tgz" + integrity sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg== + +immediate@~3.0.5: + version "3.0.6" + resolved "https://registry.npmjs.org/immediate/-/immediate-3.0.6.tgz" + integrity sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ== + +immer@^9.0.21: + version "9.0.21" + resolved "https://registry.npmjs.org/immer/-/immer-9.0.21.tgz" + integrity sha512-bc4NBHqOqSfRW7POMkHd51LvClaeMXpm8dx0e8oE2GORbq5aRK7Bxl4FyzVLdGtLmvLKL7BTDBG5ACQm4HWjTA== + +immutable@^5.1.5: + version "5.1.5" + resolved "https://registry.npmjs.org/immutable/-/immutable-5.1.5.tgz" + integrity sha512-t7xcm2siw+hlUM68I+UEOK+z84RzmN59as9DZ7P1l0994DKUWV7UXBMQZVxaoMSRQ+PBZbHCOoBt7a2wxOMt+A== + +import-fresh@^3.2.1: + version "3.3.1" + resolved "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.1.tgz" + integrity sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ== + dependencies: + parent-module "^1.0.0" + resolve-from "^4.0.0" + +imurmurhash@^0.1.4: + version "0.1.4" + resolved "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz" + integrity sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA== + +indent-string@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/indent-string/-/indent-string-4.0.0.tgz" + integrity sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg== + +inflight@^1.0.4: + version "1.0.6" + resolved 
"https://registry.npmjs.org/inflight/-/inflight-1.0.6.tgz" + integrity sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA== + dependencies: + once "^1.3.0" + wrappy "1" + +inherits@^2.0.3, inherits@^2.0.4, inherits@~2.0.0, inherits@~2.0.3, inherits@2: + version "2.0.4" + resolved "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz" + integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ== + +ini@~1.3.0: + version "1.3.8" + resolved "https://registry.npmjs.org/ini/-/ini-1.3.8.tgz" + integrity sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew== + +internal-slot@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/internal-slot/-/internal-slot-1.1.0.tgz" + integrity sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw== + dependencies: + es-errors "^1.3.0" + hasown "^2.0.2" + side-channel "^1.1.0" + +"internmap@1 - 2": + version "2.0.3" + resolved "https://registry.npmjs.org/internmap/-/internmap-2.0.3.tgz" + integrity sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg== + +is-array-buffer@^3.0.4, is-array-buffer@^3.0.5: + version "3.0.5" + resolved "https://registry.npmjs.org/is-array-buffer/-/is-array-buffer-3.0.5.tgz" + integrity sha512-DDfANUiiG2wC1qawP66qlTugJeL5HyzMpfr8lLK+jMQirGzNod0B12cFB/9q838Ru27sBwfw78/rdoU7RERz6A== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.3" + get-intrinsic "^1.2.6" + +is-arrayish@^0.2.1: + version "0.2.1" + resolved "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.2.1.tgz" + integrity sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg== + +is-async-function@^2.0.0: + version "2.1.1" + resolved "https://registry.npmjs.org/is-async-function/-/is-async-function-2.1.1.tgz" + integrity 
sha512-9dgM/cZBnNvjzaMYHVoxxfPj2QXt22Ev7SuuPrs+xav0ukGB0S6d4ydZdEiM48kLx5kDV+QBPrpVnFyefL8kkQ== + dependencies: + async-function "^1.0.0" + call-bound "^1.0.3" + get-proto "^1.0.1" + has-tostringtag "^1.0.2" + safe-regex-test "^1.1.0" + +is-bigint@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/is-bigint/-/is-bigint-1.1.0.tgz" + integrity sha512-n4ZT37wG78iz03xPRKJrHTdZbe3IicyucEtdRsV5yglwc3GyUfbAfpSeD0FJ41NbUNSt5wbhqfp1fS+BgnvDFQ== + dependencies: + has-bigints "^1.0.2" + +is-boolean-object@^1.2.1: + version "1.2.2" + resolved "https://registry.npmjs.org/is-boolean-object/-/is-boolean-object-1.2.2.tgz" + integrity sha512-wa56o2/ElJMYqjCjGkXri7it5FbebW5usLw/nPmCMs5DeZ7eziSYZhSmPRn0txqeW4LnAmQQU7FgqLpsEFKM4A== + dependencies: + call-bound "^1.0.3" + has-tostringtag "^1.0.2" + +is-callable@^1.2.7: + version "1.2.7" + resolved "https://registry.npmjs.org/is-callable/-/is-callable-1.2.7.tgz" + integrity sha512-1BC0BVFhS/p0qtw6enp8e+8OD0UrK0oFLztSjNzhcKA3WDuJxxAPXzPuPtKkjEY9UUoEWlX/8fgKeu2S8i9JTA== + +is-core-module@^2.16.1: + version "2.16.1" + resolved "https://registry.npmjs.org/is-core-module/-/is-core-module-2.16.1.tgz" + integrity sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w== + dependencies: + hasown "^2.0.2" + +is-data-view@^1.0.1, is-data-view@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/is-data-view/-/is-data-view-1.0.2.tgz" + integrity sha512-RKtWF8pGmS87i2D6gqQu/l7EYRlVdfzemCJN/P3UOs//x1QE7mfhvzHIApBTRf7axvT6DMGwSwBXYCT0nfB9xw== + dependencies: + call-bound "^1.0.2" + get-intrinsic "^1.2.6" + is-typed-array "^1.1.13" + +is-date-object@^1.0.5, is-date-object@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/is-date-object/-/is-date-object-1.1.0.tgz" + integrity sha512-PwwhEakHVKTdRNVOw+/Gyh0+MzlCl4R6qKvkhuvLtPMggI1WAHt9sOwZxQLSGpUaDnrdyDsomoRgNnCfKNSXXg== + dependencies: + call-bound "^1.0.2" + has-tostringtag "^1.0.2" + +is-extglob@^2.1.1: + version "2.1.1" + 
resolved "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz" + integrity sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ== + +is-finalizationregistry@^1.1.0: + version "1.1.1" + resolved "https://registry.npmjs.org/is-finalizationregistry/-/is-finalizationregistry-1.1.1.tgz" + integrity sha512-1pC6N8qWJbWoPtEjgcL2xyhQOP491EQjeUo3qTKcmV8YSDDJrOepfG8pcC7h/QgnQHYSv0mJ3Z/ZWxmatVrysg== + dependencies: + call-bound "^1.0.3" + +is-generator-function@^1.0.10: + version "1.1.2" + resolved "https://registry.npmjs.org/is-generator-function/-/is-generator-function-1.1.2.tgz" + integrity sha512-upqt1SkGkODW9tsGNG5mtXTXtECizwtS2kA161M+gJPc1xdb/Ax629af6YrTwcOeQHbewrPNlE5Dx7kzvXTizA== + dependencies: + call-bound "^1.0.4" + generator-function "^2.0.0" + get-proto "^1.0.1" + has-tostringtag "^1.0.2" + safe-regex-test "^1.1.0" + +is-glob@^4.0.0, is-glob@^4.0.3: + version "4.0.3" + resolved "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz" + integrity sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg== + dependencies: + is-extglob "^2.1.1" + +is-map@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/is-map/-/is-map-2.0.3.tgz" + integrity sha512-1Qed0/Hr2m+YqxnM09CjA2d/i6YZNfF6R2oRAOj36eUdS6qIV/huPJNSEpKbupewFs+ZsJlxsjjPbc0/afW6Lw== + +is-negative-zero@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/is-negative-zero/-/is-negative-zero-2.0.3.tgz" + integrity sha512-5KoIu2Ngpyek75jXodFvnafB6DJgr3u8uuK0LEZJjrU19DrMD3EVERaR8sjz8CCGgpZvxPl9SuE1GMVPFHx1mw== + +is-number-object@^1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/is-number-object/-/is-number-object-1.1.1.tgz" + integrity sha512-lZhclumE1G6VYD8VHe35wFaIif+CTy5SJIi5+3y4psDgWu4wPDoBhF8NxUOinEc7pHgiTsT6MaBb92rKhhD+Xw== + dependencies: + call-bound "^1.0.3" + has-tostringtag "^1.0.2" + +is-potential-custom-element-name@^1.0.1: + version "1.0.1" + resolved 
"https://registry.npmjs.org/is-potential-custom-element-name/-/is-potential-custom-element-name-1.0.1.tgz" + integrity sha512-bCYeRA2rVibKZd+s2625gGnGF/t7DSqDs4dP7CrLA1m7jKWz6pps0LpYLJN8Q64HtmPKJ1hrN3nzPNKFEKOUiQ== + +is-regex@^1.2.1: + version "1.2.1" + resolved "https://registry.npmjs.org/is-regex/-/is-regex-1.2.1.tgz" + integrity sha512-MjYsKHO5O7mCsmRGxWcLWheFqN9DJ/2TmngvjKXihe6efViPqc274+Fx/4fYj/r03+ESvBdTXK0V6tA3rgez1g== + dependencies: + call-bound "^1.0.2" + gopd "^1.2.0" + has-tostringtag "^1.0.2" + hasown "^2.0.2" + +is-set@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/is-set/-/is-set-2.0.3.tgz" + integrity sha512-iPAjerrse27/ygGLxw+EBR9agv9Y6uLeYVJMu+QNCoouJ1/1ri0mGrcWpfCqFZuzzx3WjtwxG098X+n4OuRkPg== + +is-shared-array-buffer@^1.0.4: + version "1.0.4" + resolved "https://registry.npmjs.org/is-shared-array-buffer/-/is-shared-array-buffer-1.0.4.tgz" + integrity sha512-ISWac8drv4ZGfwKl5slpHG9OwPNty4jOWPRIhBpxOoD+hqITiwuipOQ2bNthAzwA3B4fIjO4Nln74N0S9byq8A== + dependencies: + call-bound "^1.0.3" + +is-string@^1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/is-string/-/is-string-1.1.1.tgz" + integrity sha512-BtEeSsoaQjlSPBemMQIrY1MY0uM6vnS1g5fmufYOtnxLGUZM2178PKbhsk7Ffv58IX+ZtcvoGwccYsh0PglkAA== + dependencies: + call-bound "^1.0.3" + has-tostringtag "^1.0.2" + +is-symbol@^1.0.4, is-symbol@^1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/is-symbol/-/is-symbol-1.1.1.tgz" + integrity sha512-9gGx6GTtCQM73BgmHQXfDmLtfjjTUDSyoxTCbp5WtoixAhfgsDirWIcVQ/IHpvI5Vgd5i/J5F7B9cN/WlVbC/w== + dependencies: + call-bound "^1.0.2" + has-symbols "^1.1.0" + safe-regex-test "^1.1.0" + +is-typed-array@^1.1.13, is-typed-array@^1.1.14, is-typed-array@^1.1.15: + version "1.1.15" + resolved "https://registry.npmjs.org/is-typed-array/-/is-typed-array-1.1.15.tgz" + integrity sha512-p3EcsicXjit7SaskXHs1hA91QxgTw46Fv6EFKKGS5DRFLD8yKnohjF3hxoju94b/OcMZoQukzpPpBE9uLVKzgQ== + dependencies: + which-typed-array "^1.1.16" + +is-weakmap@^2.0.2: + 
version "2.0.2" + resolved "https://registry.npmjs.org/is-weakmap/-/is-weakmap-2.0.2.tgz" + integrity sha512-K5pXYOm9wqY1RgjpL3YTkF39tni1XajUIkawTLUo9EZEVUFga5gSQJF8nNS7ZwJQ02y+1YCNYcMh+HIf1ZqE+w== + +is-weakref@^1.0.2, is-weakref@^1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/is-weakref/-/is-weakref-1.1.1.tgz" + integrity sha512-6i9mGWSlqzNMEqpCp93KwRS1uUOodk2OJ6b+sq7ZPDSy2WuI5NFIxp/254TytR8ftefexkWn5xNiHUNpPOfSew== + dependencies: + call-bound "^1.0.3" + +is-weakset@^2.0.3: + version "2.0.4" + resolved "https://registry.npmjs.org/is-weakset/-/is-weakset-2.0.4.tgz" + integrity sha512-mfcwb6IzQyOKTs84CQMrOwW4gQcaTOAWJ0zzJCl2WSPDrWk/OzDaImWFH3djXhb24g4eudZfLRozAvPGw4d9hQ== + dependencies: + call-bound "^1.0.3" + get-intrinsic "^1.2.6" + +isarray@^2.0.5: + version "2.0.5" + resolved "https://registry.npmjs.org/isarray/-/isarray-2.0.5.tgz" + integrity sha512-xHjhDr3cNBK0BzdUJSPXZntQUx/mwMS5Rw4A7lPJ90XGAO6ISP/ePDNuo0vhqOZU+UD5JoodwCAAoZQd3FeAKw== + +isarray@~1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz" + integrity sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ== + +isexe@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz" + integrity sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw== + +iterator.prototype@^1.1.5: + version "1.1.5" + resolved "https://registry.npmjs.org/iterator.prototype/-/iterator.prototype-1.1.5.tgz" + integrity sha512-H0dkQoCa3b2VEeKQBOxFph+JAbcrQdE7KC0UkqwpLmv2EC4P41QXP+rqo9wYodACiG5/WM5s9oDApTU8utwj9g== + dependencies: + define-data-property "^1.1.4" + es-object-atoms "^1.0.0" + get-intrinsic "^1.2.6" + get-proto "^1.0.0" + has-symbols "^1.1.0" + set-function-name "^2.0.2" + +"js-tokens@^3.0.0 || ^4.0.0", js-tokens@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz" + integrity 
sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ== + +js-yaml@^4.1.1: + version "4.1.1" + resolved "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.1.tgz" + integrity sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA== + dependencies: + argparse "^2.0.1" + +jsdom@^29.0.1: + version "29.0.1" + resolved "https://registry.npmjs.org/jsdom/-/jsdom-29.0.1.tgz" + integrity sha512-z6JOK5gRO7aMybVq/y/MlIpKh8JIi68FBKMUtKkK2KH/wMSRlCxQ682d08LB9fYXplyY/UXG8P4XXTScmdjApg== + dependencies: + "@asamuzakjp/css-color" "^5.0.1" + "@asamuzakjp/dom-selector" "^7.0.3" + "@bramus/specificity" "^2.4.2" + "@csstools/css-syntax-patches-for-csstree" "^1.1.1" + "@exodus/bytes" "^1.15.0" + css-tree "^3.2.1" + data-urls "^7.0.0" + decimal.js "^10.6.0" + html-encoding-sniffer "^6.0.0" + is-potential-custom-element-name "^1.0.1" + lru-cache "^11.2.7" + parse5 "^8.0.0" + saxes "^6.0.0" + symbol-tree "^3.2.4" + tough-cookie "^6.0.1" + undici "^7.24.5" + w3c-xmlserializer "^5.0.0" + webidl-conversions "^8.0.1" + whatwg-mimetype "^5.0.0" + whatwg-url "^16.0.1" + xml-name-validator "^5.0.0" + +jsesc@^3.0.2: + version "3.1.0" + resolved "https://registry.npmjs.org/jsesc/-/jsesc-3.1.0.tgz" + integrity sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA== + +json-buffer@3.0.1: + version "3.0.1" + resolved "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz" + integrity sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ== + +json-parse-even-better-errors@^2.3.0: + version "2.3.1" + resolved "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz" + integrity sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w== + +json-schema-traverse@^0.4.1: + version "0.4.1" + resolved 
"https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz" + integrity sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg== + +json-stable-stringify-without-jsonify@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/json-stable-stringify-without-jsonify/-/json-stable-stringify-without-jsonify-1.0.1.tgz" + integrity sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw== + +json-stringify-pretty-compact@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/json-stringify-pretty-compact/-/json-stringify-pretty-compact-2.0.0.tgz" + integrity sha512-WRitRfs6BGq4q8gTgOy4ek7iPFXjbra0H3PmDLKm2xnZ+Gh1HUhiKGgCZkSPNULlP7mvfu6FV/mOLhCarspADQ== + +json-stringify-pretty-compact@^4.0.0, json-stringify-pretty-compact@~4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/json-stringify-pretty-compact/-/json-stringify-pretty-compact-4.0.0.tgz" + integrity sha512-3CNZ2DnrpByG9Nqj6Xo8vqbjT4F6N+tb4Gb28ESAZjYZ5yqvmc56J+/kuIwkaAMOyblTQhUW7PxMkUb8Q36N3Q== + +"jsx-ast-utils@^2.4.1 || ^3.0.0", jsx-ast-utils@^3.3.5: + version "3.3.5" + resolved "https://registry.npmjs.org/jsx-ast-utils/-/jsx-ast-utils-3.3.5.tgz" + integrity sha512-ZZow9HBI5O6EPgSJLUb8n2NKgmVWTwCvHGwFuJlMjvLFqlGG6pjirPhtdsseaLZjSibD8eegzmYpUZwoIlj2cQ== + dependencies: + array-includes "^3.1.6" + array.prototype.flat "^1.3.1" + object.assign "^4.1.4" + object.values "^1.1.6" + +jszip@^3.10.1: + version "3.10.1" + resolved "https://registry.npmjs.org/jszip/-/jszip-3.10.1.tgz" + integrity sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g== + dependencies: + lie "~3.3.0" + pako "~1.0.2" + readable-stream "~2.3.6" + setimmediate "^1.0.5" + +jwt-decode@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/jwt-decode/-/jwt-decode-4.0.0.tgz" + integrity 
sha512-+KJGIyHgkGuIq3IEBNftfhW/LfWhXUIY6OmyVWjliu5KH1y0fw7VQ8YndE2O4qZdMSd9SqbnC8GOcZEy0Om7sA== + +katex@^0.16.0, katex@^0.16.22: + version "0.16.42" + resolved "https://registry.npmjs.org/katex/-/katex-0.16.42.tgz" + integrity sha512-sZ4jqyEXfHTLEFK+qsFYToa3UZ0rtFcPGwKpyiRYh2NJn8obPWOQ+/u7ux0F6CAU/y78+Mksh1YkxTPXTh47TQ== + dependencies: + commander "^8.3.0" + +keyv@^4.5.4: + version "4.5.4" + resolved "https://registry.npmjs.org/keyv/-/keyv-4.5.4.tgz" + integrity sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw== + dependencies: + json-buffer "3.0.1" + +language-subtag-registry@^0.3.20: + version "0.3.23" + resolved "https://registry.npmjs.org/language-subtag-registry/-/language-subtag-registry-0.3.23.tgz" + integrity sha512-0K65Lea881pHotoGEa5gDlMxt3pctLi2RplBb7Ezh4rRdLEOtgi7n4EwK9lamnUCkKBqaeKRVebTq6BAxSkpXQ== + +language-tags@^1.0.9: + version "1.0.9" + resolved "https://registry.npmjs.org/language-tags/-/language-tags-1.0.9.tgz" + integrity sha512-MbjN408fEndfiQXbFQ1vnd+1NoLDsnQW41410oQBXiyXDMYH5z505juWa4KUE1LqxRC7DgOgZDbKLxHIwm27hA== + dependencies: + language-subtag-registry "^0.3.20" + +lazystream@^1.0.0: + version "1.0.1" + resolved "https://registry.npmjs.org/lazystream/-/lazystream-1.0.1.tgz" + integrity sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw== + dependencies: + readable-stream "^2.0.5" + +levn@^0.4.1: + version "0.4.1" + resolved "https://registry.npmjs.org/levn/-/levn-0.4.1.tgz" + integrity sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ== + dependencies: + prelude-ls "^1.2.1" + type-check "~0.4.0" + +lie@~3.3.0: + version "3.3.0" + resolved "https://registry.npmjs.org/lie/-/lie-3.3.0.tgz" + integrity sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ== + dependencies: + immediate "~3.0.5" + +lie@3.1.1: + version "3.1.1" + resolved 
"https://registry.npmjs.org/lie/-/lie-3.1.1.tgz" + integrity sha512-RiNhHysUjhrDQntfYSfY4MU24coXXdEOgw9WGcKHNeEwffDYbF//u87M1EWaMGzuFoSbqW0C9C6lEEhDOAswfw== + dependencies: + immediate "~3.0.5" + +lightningcss-darwin-arm64@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-darwin-arm64/-/lightningcss-darwin-arm64-1.32.0.tgz" + integrity sha512-RzeG9Ju5bag2Bv1/lwlVJvBE3q6TtXskdZLLCyfg5pt+HLz9BqlICO7LZM7VHNTTn/5PRhHFBSjk5lc4cmscPQ== + +lightningcss@^1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss/-/lightningcss-1.32.0.tgz" + integrity sha512-NXYBzinNrblfraPGyrbPoD19C1h9lfI/1mzgWYvXUTe414Gz/X1FD2XBZSZM7rRTrMA8JL3OtAaGifrIKhQ5yQ== + dependencies: + detect-libc "^2.0.3" + optionalDependencies: + lightningcss-android-arm64 "1.32.0" + lightningcss-darwin-arm64 "1.32.0" + lightningcss-darwin-x64 "1.32.0" + lightningcss-freebsd-x64 "1.32.0" + lightningcss-linux-arm-gnueabihf "1.32.0" + lightningcss-linux-arm64-gnu "1.32.0" + lightningcss-linux-arm64-musl "1.32.0" + lightningcss-linux-x64-gnu "1.32.0" + lightningcss-linux-x64-musl "1.32.0" + lightningcss-win32-arm64-msvc "1.32.0" + lightningcss-win32-x64-msvc "1.32.0" + +lines-and-columns@^1.1.6: + version "1.2.4" + resolved "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz" + integrity sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg== + +linkify-it@^5.0.0: + version "5.0.0" + resolved "https://registry.npmjs.org/linkify-it/-/linkify-it-5.0.0.tgz" + integrity sha512-5aHCbzQRADcdP+ATqnDuhhJ/MRIqDkZX5pyjFHRRysS8vZ5AbqGEoFIb6pYHPZ+L/OC2Lc+xT8uHVVR5CAK/wQ== + dependencies: + uc.micro "^2.0.0" + +linkifyjs@^4.3.2: + version "4.3.2" + resolved "https://registry.npmjs.org/linkifyjs/-/linkifyjs-4.3.2.tgz" + integrity sha512-NT1CJtq3hHIreOianA8aSXn6Cw0JzYOuDQbOrSPe7gqFnCpKP++MQe3ODgO3oh2GJFORkAAdqredOa60z63GbA== + +listenercount@~1.0.1: + version "1.0.1" + resolved 
"https://registry.npmjs.org/listenercount/-/listenercount-1.0.1.tgz" + integrity sha512-3mk/Zag0+IJxeDrxSgaDPy4zZ3w05PRZeJNnlWhzFz5OkX49J4krc+A8X2d2M69vGMBEX0uyl8M+W+8gH+kBqQ== + +localforage@^1.10.0: + version "1.10.0" + resolved "https://registry.npmjs.org/localforage/-/localforage-1.10.0.tgz" + integrity sha512-14/H1aX7hzBBmmh7sGPd+AOMkkIrHM3Z1PAyGgZigA1H1p5O5ANnMyWzvpAETtG68/dC4pC0ncy3+PPGzXZHPg== + dependencies: + lie "3.1.1" + +locate-path@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz" + integrity sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw== + dependencies: + p-locate "^5.0.0" + +lodash.clamp@^4.0.0: + version "4.0.3" + resolved "https://registry.npmjs.org/lodash.clamp/-/lodash.clamp-4.0.3.tgz" + integrity sha512-HvzRFWjtcguTW7yd8NJBshuNaCa8aqNFtnswdT7f/cMd/1YKy5Zzoq4W/Oxvnx9l7aeY258uSdDfM793+eLsVg== + +lodash.debounce@^4.0.0, lodash.debounce@^4.0.8: + version "4.0.8" + resolved "https://registry.npmjs.org/lodash.debounce/-/lodash.debounce-4.0.8.tgz" + integrity sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow== + +lodash.defaults@^4.2.0: + version "4.2.0" + resolved "https://registry.npmjs.org/lodash.defaults/-/lodash.defaults-4.2.0.tgz" + integrity sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ== + +lodash.difference@^4.5.0: + version "4.5.0" + resolved "https://registry.npmjs.org/lodash.difference/-/lodash.difference-4.5.0.tgz" + integrity sha512-dS2j+W26TQ7taQBGN8Lbbq04ssV3emRw4NY58WErlTO29pIqS0HmoT5aJ9+TUQ1N3G+JOZSji4eugsWwGp9yPA== + +lodash.escaperegexp@^4.1.2: + version "4.1.2" + resolved "https://registry.npmjs.org/lodash.escaperegexp/-/lodash.escaperegexp-4.1.2.tgz" + integrity sha512-TM9YBvyC84ZxE3rgfefxUWiQKLilstD6k7PTGt6wfbtXF8ixIJLOL3VYyV/z+ZiPLsVxAsKAFVwWlWeb2Y8Yyw== + +lodash.flatten@^4.4.0: + version "4.4.0" + resolved 
"https://registry.npmjs.org/lodash.flatten/-/lodash.flatten-4.4.0.tgz" + integrity sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g== + +lodash.groupby@^4.6.0: + version "4.6.0" + resolved "https://registry.npmjs.org/lodash.groupby/-/lodash.groupby-4.6.0.tgz" + integrity sha512-5dcWxm23+VAoz+awKmBaiBvzox8+RqMgFhi7UvX9DHZr2HdxHXM/Wrf8cfKpsW37RNrvtPn6hSwNqurSILbmJw== + +lodash.isboolean@^3.0.3: + version "3.0.3" + resolved "https://registry.npmjs.org/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz" + integrity sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg== + +lodash.isequal@^4.5.0: + version "4.5.0" + resolved "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz" + integrity sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ== + +lodash.isfunction@^3.0.9: + version "3.0.9" + resolved "https://registry.npmjs.org/lodash.isfunction/-/lodash.isfunction-3.0.9.tgz" + integrity sha512-AirXNj15uRIMMPihnkInB4i3NHeb4iBtNg9WRWuK2o31S+ePwwNmDPaTL3o7dTJ+VXNZim7rFs4rxN4YU1oUJw== + +lodash.isnil@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/lodash.isnil/-/lodash.isnil-4.0.0.tgz" + integrity sha512-up2Mzq3545mwVnMhTDMdfoG1OurpA/s5t88JmQX809eH3C8491iu2sfKhTfhQtKY78oPNhiaHJUpT/dUDAAtng== + +lodash.isplainobject@^4.0.6: + version "4.0.6" + resolved "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz" + integrity sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA== + +lodash.isundefined@^3.0.1: + version "3.0.1" + resolved "https://registry.npmjs.org/lodash.isundefined/-/lodash.isundefined-3.0.1.tgz" + integrity sha512-MXB1is3s899/cD8jheYYE2V9qTHwKvt+npCwpD+1Sxm3Q3cECXCiYHjeHWXNwr6Q0SOBPrYUDxendrO6goVTEA== + +lodash.merge@^4.6.2: + version "4.6.2" + resolved "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz" + integrity 
sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ== + +lodash.union@^4.6.0: + version "4.6.0" + resolved "https://registry.npmjs.org/lodash.union/-/lodash.union-4.6.0.tgz" + integrity sha512-c4pB2CdGrGdjMKYLA+XiRDO7Y0PRQbm/Gzg8qMj+QH+pFVAoTp5sBpO0odL3FjoPCGjK96p6qsP+yQoiLoOBcw== + +lodash.uniq@^4.5.0: + version "4.5.0" + resolved "https://registry.npmjs.org/lodash.uniq/-/lodash.uniq-4.5.0.tgz" + integrity sha512-xfBaXQd9ryd9dlSDvnvI0lvxfLJlYAZzXomUYzLKtUeOQvOP5piqAWuGtrhWeqaXK9hhoM/iyJc5AV+XfsX3HQ== + +lodash@^4.17.21, lodash@^4.17.23: + version "4.17.23" + resolved "https://registry.npmjs.org/lodash/-/lodash-4.17.23.tgz" + integrity sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w== + +loose-envify@^1.1.0, loose-envify@^1.4.0: + version "1.4.0" + resolved "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz" + integrity sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q== + dependencies: + js-tokens "^3.0.0 || ^4.0.0" + +lru-cache@^11.2.6, lru-cache@^11.2.7: + version "11.2.7" + resolved "https://registry.npmjs.org/lru-cache/-/lru-cache-11.2.7.tgz" + integrity sha512-aY/R+aEsRelme17KGQa/1ZSIpLpNYYrhcrepKTZgE+W3WM16YMCaPwOHLHsmopZHELU0Ojin1lPVxKR0MihncA== + +lz-string@^1.5.0: + version "1.5.0" + resolved "https://registry.npmjs.org/lz-string/-/lz-string-1.5.0.tgz" + integrity sha512-h5bgJWpxJNswbU7qCrV0tIKQCaS3blPDrqKWx+QxzuzL1zGUzij9XCWLrSLsJPu5t+eWA/ycetzYAO5IOMcWAQ== + +magic-string@^0.30.21: + version "0.30.21" + resolved "https://registry.npmjs.org/magic-string/-/magic-string-0.30.21.tgz" + integrity sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ== + dependencies: + "@jridgewell/sourcemap-codec" "^1.5.5" + +markdown-it-task-lists@^2.1.1: + version "2.1.1" + resolved "https://registry.npmjs.org/markdown-it-task-lists/-/markdown-it-task-lists-2.1.1.tgz" + integrity 
sha512-TxFAc76Jnhb2OUu+n3yz9RMu4CwGfaT788br6HhEDlvWfdeJcLUsxk1Hgw2yJio0OXsxv7pyIPmvECY7bMbluA== + +markdown-it@^14.0.0, markdown-it@^14.1.0: + version "14.1.1" + resolved "https://registry.npmjs.org/markdown-it/-/markdown-it-14.1.1.tgz" + integrity sha512-BuU2qnTti9YKgK5N+IeMubp14ZUKUUw7yeJbkjtosvHiP0AZ5c8IAgEMk79D0eC8F23r4Ac/q8cAIFdm2FtyoA== + dependencies: + argparse "^2.0.1" + entities "^4.4.0" + linkify-it "^5.0.0" + mdurl "^2.0.0" + punycode.js "^2.3.1" + uc.micro "^2.1.0" + +markdown-to-jsx@^7.4.0: + version "7.7.17" + resolved "https://registry.npmjs.org/markdown-to-jsx/-/markdown-to-jsx-7.7.17.tgz" + integrity sha512-7mG/1feQ0TX5I7YyMZVDgCC/y2I3CiEhIRQIhyov9nGBP5eoVrOXXHuL5ZP8GRfxVZKRiXWJgwXkb9It+nQZfQ== + +math-intrinsics@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz" + integrity sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g== + +mdn-data@2.27.1: + version "2.27.1" + resolved "https://registry.npmjs.org/mdn-data/-/mdn-data-2.27.1.tgz" + integrity sha512-9Yubnt3e8A0OKwxYSXyhLymGW4sCufcLG6VdiDdUGVkPhpqLxlvP5vl1983gQjJl3tqbrM731mjaZaP68AgosQ== + +mdurl@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/mdurl/-/mdurl-2.0.0.tgz" + integrity sha512-Lf+9+2r+Tdp5wXDXC4PcIBjTDtq4UKjCPMQhKIuzpJNW0b96kVqSwW0bT7FhRSfmAiFYgP+SCRvdrDozfh0U5w== + +mimic-response@^3.1.0: + version "3.1.0" + resolved "https://registry.npmjs.org/mimic-response/-/mimic-response-3.1.0.tgz" + integrity sha512-z0yWI+4FDrrweS8Zmt4Ej5HdJmky15+L2e6Wgn3+iK5fWzb6T3fhNFq2+MeTRb064c6Wr4N/wv0DzQTjNzHNGQ== + +min-indent@^1.0.0: + version "1.0.1" + resolved "https://registry.npmjs.org/min-indent/-/min-indent-1.0.1.tgz" + integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg== + +minimatch@^10.2.2: + version "10.2.4" + resolved "https://registry.npmjs.org/minimatch/-/minimatch-10.2.4.tgz" + integrity 
sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg== + dependencies: + brace-expansion "^5.0.2" + +minimatch@^3.1.1, minimatch@^3.1.2, minimatch@^3.1.5: + version "3.1.5" + resolved "https://registry.npmjs.org/minimatch/-/minimatch-3.1.5.tgz" + integrity sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w== + dependencies: + brace-expansion "^1.1.7" + +minimatch@^5.1.0: + version "5.1.9" + resolved "https://registry.npmjs.org/minimatch/-/minimatch-5.1.9.tgz" + integrity sha512-7o1wEA2RyMP7Iu7GNba9vc0RWWGACJOCZBJX2GJWip0ikV+wcOsgVuY9uE8CPiyQhkGFSlhuSkZPavN7u1c2Fw== + dependencies: + brace-expansion "^2.0.1" + +minimist@^1.2.0, minimist@^1.2.3, minimist@^1.2.6: + version "1.2.8" + resolved "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz" + integrity sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA== + +mkdirp-classic@^0.5.2, mkdirp-classic@^0.5.3: + version "0.5.3" + resolved "https://registry.npmjs.org/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz" + integrity sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A== + +"mkdirp@>=0.5 0": + version "0.5.6" + resolved "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.6.tgz" + integrity sha512-FP+p8RB8OWpF3YZBCrP5gtADmtXApB5AMLn+vdyA+PyxCjrCs00mjyUozssO33cwDeT3wNGdLxJ5M//YqtHAJw== + dependencies: + minimist "^1.2.6" + +ms@^2.1.3: + version "2.1.3" + resolved "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz" + integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA== + +nanoid@^3.3.11: + version "3.3.11" + resolved "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz" + integrity sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w== + +napi-build-utils@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/napi-build-utils/-/napi-build-utils-2.0.0.tgz" + 
integrity sha512-GEbrYkbfF7MoNaoh2iGG84Mnf/WZfB0GdGEsM8wz7Expx/LlWf5U8t9nvJKXSp3qr5IsEbK04cBGhol/KwOsWA== + +natural-compare@^1.4.0: + version "1.4.0" + resolved "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz" + integrity sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw== + +node-abi@^3.3.0: + version "3.89.0" + resolved "https://registry.npmjs.org/node-abi/-/node-abi-3.89.0.tgz" + integrity sha512-6u9UwL0HlAl21+agMN3YAMXcKByMqwGx+pq+P76vii5f7hTPtKDp08/H9py6DY+cfDw7kQNTGEj/rly3IgbNQA== + dependencies: + semver "^7.3.5" + +node-addon-api@^7.0.0: + version "7.1.1" + resolved "https://registry.npmjs.org/node-addon-api/-/node-addon-api-7.1.1.tgz" + integrity sha512-5m3bsyrjFWE1xf7nz7YXdN4udnVtXK6/Yfgn5qnahL6bCkf2yKt4k3nuTKAtT4r3IG8JNR2ncsIMdZuAzJjHQQ== + +node-exports-info@^1.6.0: + version "1.6.0" + resolved "https://registry.npmjs.org/node-exports-info/-/node-exports-info-1.6.0.tgz" + integrity sha512-pyFS63ptit/P5WqUkt+UUfe+4oevH+bFeIiPPdfb0pFeYEu/1ELnJu5l+5EcTKYL5M7zaAa7S8ddywgXypqKCw== + dependencies: + array.prototype.flatmap "^1.3.3" + es-errors "^1.3.0" + object.entries "^1.1.9" + semver "^6.3.1" + +normalize-path@^3.0.0: + version "3.0.0" + resolved "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz" + integrity sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA== + +object-assign@^4.1.1: + version "4.1.1" + resolved "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz" + integrity sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg== + +object-inspect@^1.13.3, object-inspect@^1.13.4: + version "1.13.4" + resolved "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.4.tgz" + integrity sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew== + +object-keys@^1.1.1: + version "1.1.1" + resolved 
"https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz" + integrity sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA== + +object.assign@^4.1.4, object.assign@^4.1.7: + version "4.1.7" + resolved "https://registry.npmjs.org/object.assign/-/object.assign-4.1.7.tgz" + integrity sha512-nK28WOo+QIjBkDduTINE4JkF/UJJKyf2EJxvJKfblDpyg0Q+pkOHNTL0Qwy6NP6FhE/EnzV73BxxqcJaXY9anw== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.3" + define-properties "^1.2.1" + es-object-atoms "^1.0.0" + has-symbols "^1.1.0" + object-keys "^1.1.1" + +object.entries@^1.1.9: + version "1.1.9" + resolved "https://registry.npmjs.org/object.entries/-/object.entries-1.1.9.tgz" + integrity sha512-8u/hfXFRBD1O0hPUjioLhoWFHRmt6tKA4/vZPyckBr18l1KE9uHrFaFaUi8MDRTpi4uak2goyPTSNJLXX2k2Hw== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.4" + define-properties "^1.2.1" + es-object-atoms "^1.1.1" + +object.fromentries@^2.0.8: + version "2.0.8" + resolved "https://registry.npmjs.org/object.fromentries/-/object.fromentries-2.0.8.tgz" + integrity sha512-k6E21FzySsSK5a21KRADBd/NGneRegFO5pLHfdQLpRDETUNJueLXs3WCzyQ3tFRDYgbq3KHGXfTbi2bs8WQ6rQ== + dependencies: + call-bind "^1.0.7" + define-properties "^1.2.1" + es-abstract "^1.23.2" + es-object-atoms "^1.0.0" + +object.values@^1.1.6, object.values@^1.2.1: + version "1.2.1" + resolved "https://registry.npmjs.org/object.values/-/object.values-1.2.1.tgz" + integrity sha512-gXah6aZrcUxjWg2zR2MwouP2eHlCBzdV4pygudehaKXSGW4v2AsRQUK+lwwXhii6KFZcunEnmSUoYp5CXibxtA== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.3" + define-properties "^1.2.1" + es-object-atoms "^1.0.0" + +obug@^2.1.1: + version "2.1.1" + resolved "https://registry.npmjs.org/obug/-/obug-2.1.1.tgz" + integrity sha512-uTqF9MuPraAQ+IsnPf366RG4cP9RtUi7MLO1N3KEc+wb0a6yKpeL0lmk2IB1jY5KHPAlTc6T/JRdC/YqxHNwkQ== + +oidc-client-ts@3.5.0: + version "3.5.0" + resolved "https://registry.npmjs.org/oidc-client-ts/-/oidc-client-ts-3.5.0.tgz" + 
integrity sha512-l2q8l9CTCTOlbX+AnK4p3M+4CEpKpyQhle6blQkdFhm0IsBqsxm15bYaSa11G7pWdsYr6epdsRZxJpCyCRbT8A== + dependencies: + jwt-decode "^4.0.0" + +once@^1.3.0, once@^1.3.1, once@^1.4.0: + version "1.4.0" + resolved "https://registry.npmjs.org/once/-/once-1.4.0.tgz" + integrity sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w== + dependencies: + wrappy "1" + +optionator@^0.9.3: + version "0.9.4" + resolved "https://registry.npmjs.org/optionator/-/optionator-0.9.4.tgz" + integrity sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g== + dependencies: + deep-is "^0.1.3" + fast-levenshtein "^2.0.6" + levn "^0.4.1" + prelude-ls "^1.2.1" + type-check "^0.4.0" + word-wrap "^1.2.5" + +orderedmap@^2.0.0: + version "2.1.1" + resolved "https://registry.npmjs.org/orderedmap/-/orderedmap-2.1.1.tgz" + integrity sha512-TvAWxi0nDe1j/rtMcWcIj94+Ffe6n7zhow33h40SKxmsmozs6dz/e+EajymfoFcHd7sxNn8yHM8839uixMOV6g== + +own-keys@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/own-keys/-/own-keys-1.0.1.tgz" + integrity sha512-qFOyK5PjiWZd+QQIh+1jhdb9LpxTF0qs7Pm8o5QHYZ0M3vKqSqzsZaEB6oWlxZ+q2sJBMI/Ktgd2N5ZwQoRHfg== + dependencies: + get-intrinsic "^1.2.6" + object-keys "^1.1.1" + safe-push-apply "^1.0.0" + +p-limit@^3.0.2: + version "3.1.0" + resolved "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz" + integrity sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ== + dependencies: + yocto-queue "^0.1.0" + +p-locate@^5.0.0: + version "5.0.0" + resolved "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz" + integrity sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw== + dependencies: + p-limit "^3.0.2" + +pako@~1.0.2: + version "1.0.11" + resolved "https://registry.npmjs.org/pako/-/pako-1.0.11.tgz" + integrity 
sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw== + +parent-module@^1.0.0: + version "1.0.1" + resolved "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz" + integrity sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g== + dependencies: + callsites "^3.0.0" + +parse-json@^5.0.0: + version "5.2.0" + resolved "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz" + integrity sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg== + dependencies: + "@babel/code-frame" "^7.0.0" + error-ex "^1.3.1" + json-parse-even-better-errors "^2.3.0" + lines-and-columns "^1.1.6" + +parse5@^8.0.0: + version "8.0.0" + resolved "https://registry.npmjs.org/parse5/-/parse5-8.0.0.tgz" + integrity sha512-9m4m5GSgXjL4AjumKzq1Fgfp3Z8rsvjRNbnkVwfu2ImRqE5D0LnY2QfDen18FSY9C573YU5XxSapdHZTZ2WolA== + dependencies: + entities "^6.0.0" + +path-exists@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz" + integrity sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w== + +path-is-absolute@^1.0.0: + version "1.0.1" + resolved "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz" + integrity sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg== + +path-key@^3.1.0: + version "3.1.1" + resolved "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz" + integrity sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q== + +path-parse@^1.0.7: + version "1.0.7" + resolved "https://registry.npmjs.org/path-parse/-/path-parse-1.0.7.tgz" + integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw== + +path-type@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz" + integrity 
sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw== + +pathe@^2.0.3: + version "2.0.3" + resolved "https://registry.npmjs.org/pathe/-/pathe-2.0.3.tgz" + integrity sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w== + +picocolors@^1.1.1, picocolors@1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz" + integrity sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA== + +picomatch@^4.0.3: + version "4.0.4" + resolved "https://registry.npmjs.org/picomatch/-/picomatch-4.0.4.tgz" + integrity sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A== + +possible-typed-array-names@^1.0.0: + version "1.1.0" + resolved "https://registry.npmjs.org/possible-typed-array-names/-/possible-typed-array-names-1.1.0.tgz" + integrity sha512-/+5VFTchJDoVj3bhoqi6UeymcD00DAwb1nJwamzPvHEszJ4FpF6SNNbUbOS8yI56qHzdV8eK0qEfOSiodkTdxg== + +postcss@^8.4.43, postcss@^8.5.8: + version "8.5.8" + resolved "https://registry.npmjs.org/postcss/-/postcss-8.5.8.tgz" + integrity sha512-OW/rX8O/jXnm82Ey1k44pObPtdblfiuWnrd8X7GJ7emImCOstunGbXUpp7HdBrFQX6rJzn3sPT397Wp5aCwCHg== + dependencies: + nanoid "^3.3.11" + picocolors "^1.1.1" + source-map-js "^1.2.1" + +prebuild-install@^7.1.3: + version "7.1.3" + resolved "https://registry.npmjs.org/prebuild-install/-/prebuild-install-7.1.3.tgz" + integrity sha512-8Mf2cbV7x1cXPUILADGI3wuhfqWvtiLA1iclTDbFRZkgRQS0NqsPZphna9V+HyTEadheuPmjaJMsbzKQFOzLug== + dependencies: + detect-libc "^2.0.0" + expand-template "^2.0.3" + github-from-package "0.0.0" + minimist "^1.2.3" + mkdirp-classic "^0.5.3" + napi-build-utils "^2.0.0" + node-abi "^3.3.0" + pump "^3.0.0" + rc "^1.2.7" + simple-get "^4.0.0" + tar-fs "^2.0.0" + tunnel-agent "^0.6.0" + +prelude-ls@^1.2.1: + version "1.2.1" + resolved "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz" + integrity 
sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g== + +prettier@^2.8.3: + version "2.8.8" + resolved "https://registry.npmjs.org/prettier/-/prettier-2.8.8.tgz" + integrity sha512-tdN8qQGvNjw4CHbY+XXk0JgCXn9QiF21a55rBe5LJAU+kDyC4WQn4+awm2Xfk2lQMk5fKup9XgzTZtGkjBdP9Q== + +pretty-format@^27.0.2: + version "27.5.1" + resolved "https://registry.npmjs.org/pretty-format/-/pretty-format-27.5.1.tgz" + integrity sha512-Qb1gy5OrP5+zDf2Bvnzdl3jsTf1qXVMazbvCoKhtKqVs4/YK4ozX4gKQJJVyNe+cajNPn0KoC0MC3FUmaHWEmQ== + dependencies: + ansi-regex "^5.0.1" + ansi-styles "^5.0.0" + react-is "^17.0.1" + +prism-react-renderer@^1.3.5: + version "1.3.5" + resolved "https://registry.npmjs.org/prism-react-renderer/-/prism-react-renderer-1.3.5.tgz" + integrity sha512-IJ+MSwBWKG+SM3b2SUfdrhC+gu01QkV2KmRQgREThBfSQRoufqRfxfHUxpG1WcaFjP+kojcFyO9Qqtpgt3qLCg== + +prismjs@^1.30.0: + version "1.30.0" + resolved "https://registry.npmjs.org/prismjs/-/prismjs-1.30.0.tgz" + integrity sha512-DEvV2ZF2r2/63V+tK8hQvrR2ZGn10srHbXviTlcv7Kpzw8jWiNTqbVgjO3IY8RxrrOUF8VPMQQFysYYYv0YZxw== + +process-nextick-args@~2.0.0: + version "2.0.1" + resolved "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz" + integrity sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag== + +prop-types@^15.6.2, prop-types@^15.8.1: + version "15.8.1" + resolved "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz" + integrity sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg== + dependencies: + loose-envify "^1.4.0" + object-assign "^4.1.1" + react-is "^16.13.1" + +prosemirror-changeset@^2.3.0: + version "2.4.0" + resolved "https://registry.npmjs.org/prosemirror-changeset/-/prosemirror-changeset-2.4.0.tgz" + integrity sha512-LvqH2v7Q2SF6yxatuPP2e8vSUKS/L+xAU7dPDC4RMyHMhZoGDfBC74mYuyYF4gLqOEG758wajtyhNnsTkuhvng== + dependencies: + prosemirror-transform "^1.0.0" + 
+prosemirror-collab@^1.3.1: + version "1.3.1" + resolved "https://registry.npmjs.org/prosemirror-collab/-/prosemirror-collab-1.3.1.tgz" + integrity sha512-4SnynYR9TTYaQVXd/ieUvsVV4PDMBzrq2xPUWutHivDuOshZXqQ5rGbZM84HEaXKbLdItse7weMGOUdDVcLKEQ== + dependencies: + prosemirror-state "^1.0.0" + +prosemirror-commands@^1.0.0, prosemirror-commands@^1.6.2: + version "1.7.1" + resolved "https://registry.npmjs.org/prosemirror-commands/-/prosemirror-commands-1.7.1.tgz" + integrity sha512-rT7qZnQtx5c0/y/KlYaGvtG411S97UaL6gdp6RIZ23DLHanMYLyfGBV5DtSnZdthQql7W+lEVbpSfwtO8T+L2w== + dependencies: + prosemirror-model "^1.0.0" + prosemirror-state "^1.0.0" + prosemirror-transform "^1.10.2" + +prosemirror-dropcursor@^1.8.1: + version "1.8.2" + resolved "https://registry.npmjs.org/prosemirror-dropcursor/-/prosemirror-dropcursor-1.8.2.tgz" + integrity sha512-CCk6Gyx9+Tt2sbYk5NK0nB1ukHi2ryaRgadV/LvyNuO3ena1payM2z6Cg0vO1ebK8cxbzo41ku2DE5Axj1Zuiw== + dependencies: + prosemirror-state "^1.0.0" + prosemirror-transform "^1.1.0" + prosemirror-view "^1.1.0" + +prosemirror-gapcursor@^1.3.2: + version "1.4.1" + resolved "https://registry.npmjs.org/prosemirror-gapcursor/-/prosemirror-gapcursor-1.4.1.tgz" + integrity sha512-pMdYaEnjNMSwl11yjEGtgTmLkR08m/Vl+Jj443167p9eB3HVQKhYCc4gmHVDsLPODfZfjr/MmirsdyZziXbQKw== + dependencies: + prosemirror-keymap "^1.0.0" + prosemirror-model "^1.0.0" + prosemirror-state "^1.0.0" + prosemirror-view "^1.0.0" + +prosemirror-history@^1.0.0, prosemirror-history@^1.4.1: + version "1.5.0" + resolved "https://registry.npmjs.org/prosemirror-history/-/prosemirror-history-1.5.0.tgz" + integrity sha512-zlzTiH01eKA55UAf1MEjtssJeHnGxO0j4K4Dpx+gnmX9n+SHNlDqI2oO1Kv1iPN5B1dm5fsljCfqKF9nFL6HRg== + dependencies: + prosemirror-state "^1.2.2" + prosemirror-transform "^1.0.0" + prosemirror-view "^1.31.0" + rope-sequence "^1.3.0" + +prosemirror-inputrules@^1.4.0: + version "1.5.1" + resolved "https://registry.npmjs.org/prosemirror-inputrules/-/prosemirror-inputrules-1.5.1.tgz" + integrity 
sha512-7wj4uMjKaXWAQ1CDgxNzNtR9AlsuwzHfdFH1ygEHA2KHF2DOEaXl1CJfNPAKCg9qNEh4rum975QLaCiQPyY6Fw== + dependencies: + prosemirror-state "^1.0.0" + prosemirror-transform "^1.0.0" + +prosemirror-keymap@^1.0.0, prosemirror-keymap@^1.2.2, prosemirror-keymap@^1.2.3: + version "1.2.3" + resolved "https://registry.npmjs.org/prosemirror-keymap/-/prosemirror-keymap-1.2.3.tgz" + integrity sha512-4HucRlpiLd1IPQQXNqeo81BGtkY8Ai5smHhKW9jjPKRc2wQIxksg7Hl1tTI2IfT2B/LgX6bfYvXxEpJl7aKYKw== + dependencies: + prosemirror-state "^1.0.0" + w3c-keyname "^2.2.0" + +prosemirror-markdown@^1.11.1, prosemirror-markdown@^1.13.1: + version "1.13.4" + resolved "https://registry.npmjs.org/prosemirror-markdown/-/prosemirror-markdown-1.13.4.tgz" + integrity sha512-D98dm4cQ3Hs6EmjK500TdAOew4Z03EV71ajEFiWra3Upr7diytJsjF4mPV2dW+eK5uNectiRj0xFxYI9NLXDbw== + dependencies: + "@types/markdown-it" "^14.0.0" + markdown-it "^14.0.0" + prosemirror-model "^1.25.0" + +prosemirror-menu@^1.2.4: + version "1.3.0" + resolved "https://registry.npmjs.org/prosemirror-menu/-/prosemirror-menu-1.3.0.tgz" + integrity sha512-TImyPXCHPcDsSka2/lwJ6WjTASr4re/qWq1yoTTuLOqfXucwF6VcRa2LWCkM/EyTD1UO3CUwiH8qURJoWJRxwg== + dependencies: + crelt "^1.0.0" + prosemirror-commands "^1.0.0" + prosemirror-history "^1.0.0" + prosemirror-state "^1.0.0" + +prosemirror-model@^1.0.0, prosemirror-model@^1.20.0, prosemirror-model@^1.21.0, prosemirror-model@^1.24.1, prosemirror-model@^1.25.0, prosemirror-model@^1.25.4: + version "1.25.4" + resolved "https://registry.npmjs.org/prosemirror-model/-/prosemirror-model-1.25.4.tgz" + integrity sha512-PIM7E43PBxKce8OQeezAs9j4TP+5yDpZVbuurd1h5phUxEKIu+G2a+EUZzIC5nS1mJktDJWzbqS23n1tsAf5QA== + dependencies: + orderedmap "^2.0.0" + +prosemirror-schema-basic@^1.2.3: + version "1.2.4" + resolved "https://registry.npmjs.org/prosemirror-schema-basic/-/prosemirror-schema-basic-1.2.4.tgz" + integrity sha512-ELxP4TlX3yr2v5rM7Sb70SqStq5NvI15c0j9j/gjsrO5vaw+fnnpovCLEGIcpeGfifkuqJwl4fon6b+KdrODYQ== + dependencies: + 
prosemirror-model "^1.25.0" + +prosemirror-schema-list@^1.5.0: + version "1.5.1" + resolved "https://registry.npmjs.org/prosemirror-schema-list/-/prosemirror-schema-list-1.5.1.tgz" + integrity sha512-927lFx/uwyQaGwJxLWCZRkjXG0p48KpMj6ueoYiu4JX05GGuGcgzAy62dfiV8eFZftgyBUvLx76RsMe20fJl+Q== + dependencies: + prosemirror-model "^1.0.0" + prosemirror-state "^1.0.0" + prosemirror-transform "^1.7.3" + +prosemirror-state@^1.0.0, prosemirror-state@^1.2.2, prosemirror-state@^1.4.3, prosemirror-state@^1.4.4: + version "1.4.4" + resolved "https://registry.npmjs.org/prosemirror-state/-/prosemirror-state-1.4.4.tgz" + integrity sha512-6jiYHH2CIGbCfnxdHbXZ12gySFY/fz/ulZE333G6bPqIZ4F+TXo9ifiR86nAHpWnfoNjOb3o5ESi7J8Uz1jXHw== + dependencies: + prosemirror-model "^1.0.0" + prosemirror-transform "^1.0.0" + prosemirror-view "^1.27.0" + +prosemirror-tables@^1.6.4: + version "1.8.5" + resolved "https://registry.npmjs.org/prosemirror-tables/-/prosemirror-tables-1.8.5.tgz" + integrity sha512-V/0cDCsHKHe/tfWkeCmthNUcEp1IVO3p6vwN8XtwE9PZQLAZJigbw3QoraAdfJPir4NKJtNvOB8oYGKRl+t0Dw== + dependencies: + prosemirror-keymap "^1.2.3" + prosemirror-model "^1.25.4" + prosemirror-state "^1.4.4" + prosemirror-transform "^1.10.5" + prosemirror-view "^1.41.4" + +prosemirror-trailing-node@^3.0.0: + version "3.0.0" + resolved "https://registry.npmjs.org/prosemirror-trailing-node/-/prosemirror-trailing-node-3.0.0.tgz" + integrity sha512-xiun5/3q0w5eRnGYfNlW1uU9W6x5MoFKWwq/0TIRgt09lv7Hcser2QYV8t4muXbEr+Fwo0geYn79Xs4GKywrRQ== + dependencies: + "@remirror/core-constants" "3.0.0" + escape-string-regexp "^4.0.0" + +prosemirror-transform@^1.0.0, prosemirror-transform@^1.1.0, prosemirror-transform@^1.10.2, prosemirror-transform@^1.10.5, prosemirror-transform@^1.7.3: + version "1.12.0" + resolved "https://registry.npmjs.org/prosemirror-transform/-/prosemirror-transform-1.12.0.tgz" + integrity sha512-GxboyN4AMIsoHNtz5uf2r2Ru551i5hWeCMD6E2Ib4Eogqoub0NflniaBPVQ4MrGE5yZ8JV9tUHg9qcZTTrcN4w== + dependencies: + 
prosemirror-model "^1.21.0" + +prosemirror-view@^1.0.0, prosemirror-view@^1.1.0, prosemirror-view@^1.27.0, prosemirror-view@^1.31.0, prosemirror-view@^1.38.1, prosemirror-view@^1.41.4: + version "1.41.8" + resolved "https://registry.npmjs.org/prosemirror-view/-/prosemirror-view-1.41.8.tgz" + integrity sha512-TnKDdohEatgyZNGCDWIdccOHXhYloJwbwU+phw/a23KBvJIR9lWQWW7WHHK3vBdOLDNuF7TaX98GObUZOWkOnA== + dependencies: + prosemirror-model "^1.20.0" + prosemirror-state "^1.0.0" + prosemirror-transform "^1.1.0" + +pump@^3.0.0: + version "3.0.4" + resolved "https://registry.npmjs.org/pump/-/pump-3.0.4.tgz" + integrity sha512-VS7sjc6KR7e1ukRFhQSY5LM2uBWAUPiOPa/A3mkKmiMwSmRFUITt0xuj+/lesgnCv+dPIEYlkzrcyXgquIHMcA== + dependencies: + end-of-stream "^1.1.0" + once "^1.3.1" + +punycode.js@^2.3.1: + version "2.3.1" + resolved "https://registry.npmjs.org/punycode.js/-/punycode.js-2.3.1.tgz" + integrity sha512-uxFIHU0YlHYhDQtV4R9J6a52SLx28BCjT+4ieh7IGbgwVJWO+km431c4yRlREUAsAmt/uMjQUyQHNEPf0M39CA== + +punycode@^2.1.0, punycode@^2.3.1: + version "2.3.1" + resolved "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz" + integrity sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg== + +rc@^1.2.7: + version "1.2.8" + resolved "https://registry.npmjs.org/rc/-/rc-1.2.8.tgz" + integrity sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw== + dependencies: + deep-extend "^0.6.0" + ini "~1.3.0" + minimist "^1.2.0" + strip-json-comments "~2.0.1" + +react-animate-height@^3.0.4: + version "3.2.3" + resolved "https://registry.npmjs.org/react-animate-height/-/react-animate-height-3.2.3.tgz" + integrity sha512-R6DSvr7ud07oeCixScyvXWEMJY/Mt2+GyOWC1KMaRc69gOBw+SsCg4TJmrp4rKUM1hyd6p+YKw90brjPH93Y2A== + +react-animate-on-change@^2.2.0: + version "2.2.0" + resolved "https://registry.npmjs.org/react-animate-on-change/-/react-animate-on-change-2.2.0.tgz" + integrity 
sha512-cM0YHbsxIh8fshX/U24+pk4nDG7Ike9NsEy21reqJPqVt6xRA+6oYkaQHEggINKjYEMbztwK40Ro0/EHZ5naVQ== + +react-dnd-html5-backend@^16.0.1: + version "16.0.1" + resolved "https://registry.npmjs.org/react-dnd-html5-backend/-/react-dnd-html5-backend-16.0.1.tgz" + integrity sha512-Wu3dw5aDJmOGw8WjH1I1/yTH+vlXEL4vmjk5p+MHxP8HuHJS1lAGeIdG/hze1AvNeXWo/JgULV87LyQOr+r5jw== + dependencies: + dnd-core "^16.0.1" + +react-dnd@^16.0.1: + version "16.0.1" + resolved "https://registry.npmjs.org/react-dnd/-/react-dnd-16.0.1.tgz" + integrity sha512-QeoM/i73HHu2XF9aKksIUuamHPDvRglEwdHL4jsp784BgUuWcg6mzfxT0QDdQz8Wj0qyRKx2eMg8iZtWvU4E2Q== + dependencies: + "@react-dnd/invariant" "^4.0.1" + "@react-dnd/shallowequal" "^4.0.1" + dnd-core "^16.0.1" + fast-deep-equal "^3.1.3" + hoist-non-react-statics "^3.3.2" + +react-dom@^18.2.0: + version "18.3.1" + resolved "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz" + integrity sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw== + dependencies: + loose-envify "^1.1.0" + scheduler "^0.23.2" + +react-i18next@^16.5.4: + version "16.6.6" + resolved "https://registry.npmjs.org/react-i18next/-/react-i18next-16.6.6.tgz" + integrity sha512-ZgL2HUoW34UKUkOV7uSQFE1CDnRPD+tCR3ywSuWH7u2iapnz86U8Bi3Vrs620qNDzCf1F47NxglCEkchCTDOHw== + dependencies: + "@babel/runtime" "^7.29.2" + html-parse-stringify "^3.0.1" + use-sync-external-store "^1.6.0" + +react-is@^16.13.1, react-is@^16.7.0: + version "16.13.1" + resolved "https://registry.npmjs.org/react-is/-/react-is-16.13.1.tgz" + integrity sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ== + +react-is@^17.0.1: + version "17.0.2" + resolved "https://registry.npmjs.org/react-is/-/react-is-17.0.2.tgz" + integrity sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w== + +react-is@^18.0.0: + version "18.3.1" + resolved "https://registry.npmjs.org/react-is/-/react-is-18.3.1.tgz" + 
integrity sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg== + +react-is@^19.2.3: + version "19.2.4" + resolved "https://registry.npmjs.org/react-is/-/react-is-19.2.4.tgz" + integrity sha512-W+EWGn2v0ApPKgKKCy/7s7WHXkboGcsrXE+2joLyVxkbyVQfO3MUEaUQDHoSmb8TFFrSKYa9mw64WZHNHSDzYA== + +react-katex@^3.1.0: + version "3.1.0" + resolved "https://registry.npmjs.org/react-katex/-/react-katex-3.1.0.tgz" + integrity sha512-At9uLOkC75gwn2N+ZXc5HD8TlATsB+3Hkp9OGs6uA8tM3dwZ3Wljn74Bk3JyHFPgSnesY/EMrIAB1WJwqZqejA== + dependencies: + katex "^0.16.0" + +react-redux@^8.0.4: + version "8.1.3" + resolved "https://registry.npmjs.org/react-redux/-/react-redux-8.1.3.tgz" + integrity sha512-n0ZrutD7DaX/j9VscF+uTALI3oUPa/pO4Z3soOBIjuRn/FzVu6aehhysxZCLi6y7duMf52WNZGMl7CtuK5EnRw== + dependencies: + "@babel/runtime" "^7.12.1" + "@types/hoist-non-react-statics" "^3.3.1" + "@types/use-sync-external-store" "^0.0.3" + hoist-non-react-statics "^3.3.2" + react-is "^18.0.0" + use-sync-external-store "^1.0.0" + +react-router-dom@^6.22.0: + version "6.30.3" + resolved "https://registry.npmjs.org/react-router-dom/-/react-router-dom-6.30.3.tgz" + integrity sha512-pxPcv1AczD4vso7G4Z3TKcvlxK7g7TNt3/FNGMhfqyntocvYKj+GCatfigGDjbLozC4baguJ0ReCigoDJXb0ag== + dependencies: + "@remix-run/router" "1.23.2" + react-router "6.30.3" + +react-router@6.30.3: + version "6.30.3" + resolved "https://registry.npmjs.org/react-router/-/react-router-6.30.3.tgz" + integrity sha512-XRnlbKMTmktBkjCLE8/XcZFlnHvr2Ltdr1eJX4idL55/9BbORzyZEaIkBFDhFGCEWBBItsVrDxwx3gnisMitdw== + dependencies: + "@remix-run/router" "1.23.2" + +react-selectable-fast@^3.4.0: + version "3.4.0" + resolved "https://registry.npmjs.org/react-selectable-fast/-/react-selectable-fast-3.4.0.tgz" + integrity sha512-4DVrX6eTCLqt+GVtSNAEcL3S9ODUvtcPrzUL1ObjSL507D+i+HE4tCokSxUn4PqLtEsrWxXJU+CVC43XmwIVyw== + +react-simple-code-editor@^0.13.1: + version "0.13.1" + resolved 
"https://registry.npmjs.org/react-simple-code-editor/-/react-simple-code-editor-0.13.1.tgz" + integrity sha512-XYeVwRZwgyKtjNIYcAEgg2FaQcCZwhbarnkJIV20U2wkCU9q/CPFBo8nRXrK4GXUz3AvbqZFsZRrpUTkqqEYyQ== + +react-transition-group@^4.4.5: + version "4.4.5" + resolved "https://registry.npmjs.org/react-transition-group/-/react-transition-group-4.4.5.tgz" + integrity sha512-pZcd1MCJoiKiBR2NRxeCRg13uCXbydPnmB4EOeRrY7480qNWO8IIgQG6zlDkm6uRMsURXPuKq0GWtiM59a5Q6g== + dependencies: + "@babel/runtime" "^7.5.5" + dom-helpers "^5.0.1" + loose-envify "^1.4.0" + prop-types "^15.6.2" + +react-vega@^7.6.0: + version "7.7.1" + resolved "https://registry.npmjs.org/react-vega/-/react-vega-7.7.1.tgz" + integrity sha512-Dj7n1LkfJEkY/FdwQfOZqIQ+wGUcJNwlTuWhYcuQtbBpTgvtI4wwqOvJ0QWBE19nXMU7t9HmP8sqQO5v6soOlg== + dependencies: + "@types/react" "*" + fast-deep-equal "^3.1.1" + prop-types "^15.8.1" + vega-embed "6.5.1" + +react-virtuoso@^4.3.10: + version "4.18.3" + resolved "https://registry.npmjs.org/react-virtuoso/-/react-virtuoso-4.18.3.tgz" + integrity sha512-fLz/peHAx4Eu0DLHurFEEI7Y6n5CqEoxBh04rgJM9yMuOJah2a9zWg/MUOmZLcp7zuWYorXq5+5bf3IRgkNvWg== + +react@^18.2.0: + version "18.3.1" + resolved "https://registry.npmjs.org/react/-/react-18.3.1.tgz" + integrity sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ== + dependencies: + loose-envify "^1.1.0" + +readable-stream@^2.0.0: + version "2.3.8" + resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" + integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== + dependencies: + core-util-is "~1.0.0" + inherits "~2.0.3" + isarray "~1.0.0" + process-nextick-args "~2.0.0" + safe-buffer "~5.1.1" + string_decoder "~1.1.1" + util-deprecate "~1.0.1" + +readable-stream@^2.0.2: + version "2.3.8" + resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" + integrity 
sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== + dependencies: + core-util-is "~1.0.0" + inherits "~2.0.3" + isarray "~1.0.0" + process-nextick-args "~2.0.0" + safe-buffer "~5.1.1" + string_decoder "~1.1.1" + util-deprecate "~1.0.1" + +readable-stream@^2.0.5: + version "2.3.8" + resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" + integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== + dependencies: + core-util-is "~1.0.0" + inherits "~2.0.3" + isarray "~1.0.0" + process-nextick-args "~2.0.0" + safe-buffer "~5.1.1" + string_decoder "~1.1.1" + util-deprecate "~1.0.1" + +readable-stream@^3.1.1, readable-stream@^3.4.0, readable-stream@^3.6.0: + version "3.6.2" + resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz" + integrity sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA== + dependencies: + inherits "^2.0.3" + string_decoder "^1.1.1" + util-deprecate "^1.0.1" + +readable-stream@~2.3.6: + version "2.3.8" + resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" + integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== + dependencies: + core-util-is "~1.0.0" + inherits "~2.0.3" + isarray "~1.0.0" + process-nextick-args "~2.0.0" + safe-buffer "~5.1.1" + string_decoder "~1.1.1" + util-deprecate "~1.0.1" + +readdir-glob@^1.1.2: + version "1.1.3" + resolved "https://registry.npmjs.org/readdir-glob/-/readdir-glob-1.1.3.tgz" + integrity sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA== + dependencies: + minimatch "^5.1.0" + +readdirp@^4.0.1: + version "4.1.2" + resolved "https://registry.npmjs.org/readdirp/-/readdirp-4.1.2.tgz" + integrity sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg== + +redent@^3.0.0: 
+ version "3.0.0" + resolved "https://registry.npmjs.org/redent/-/redent-3.0.0.tgz" + integrity sha512-6tDA8g98We0zd0GvVeMT9arEOnTw9qM03L9cJXaCjrip1OO764RDBLBfrB4cwzNGDj5OA5ioymC9GkizgWJDUg== + dependencies: + indent-string "^4.0.0" + strip-indent "^3.0.0" + +redux-persist@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/redux-persist/-/redux-persist-6.0.0.tgz" + integrity sha512-71LLMbUq2r02ng2We9S215LtPu3fY0KgaGE0k8WRgl6RkqxtGfl7HUozz1Dftwsb0D/5mZ8dwAaPbtnzfvbEwQ== + +redux-thunk@^2.4.2: + version "2.4.2" + resolved "https://registry.npmjs.org/redux-thunk/-/redux-thunk-2.4.2.tgz" + integrity sha512-+P3TjtnP0k/FEjcBL5FZpoovtvrTNT/UXd4/sluaSyrURlSlhLSzEdfsTBW7WsKB6yPvgd7q/iZPICFjW4o57Q== + +redux@^4.2.0, redux@^4.2.1: + version "4.2.1" + resolved "https://registry.npmjs.org/redux/-/redux-4.2.1.tgz" + integrity sha512-LAUYz4lc+Do8/g7aeRa8JkyDErK6ekstQaqWQrNRW//MY1TvCEpMtpTWvlQ+FPbWCx+Xixu/6SHt5N0HR+SB4w== + dependencies: + "@babel/runtime" "^7.9.2" + +reflect.getprototypeof@^1.0.6, reflect.getprototypeof@^1.0.9: + version "1.0.10" + resolved "https://registry.npmjs.org/reflect.getprototypeof/-/reflect.getprototypeof-1.0.10.tgz" + integrity sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw== + dependencies: + call-bind "^1.0.8" + define-properties "^1.2.1" + es-abstract "^1.23.9" + es-errors "^1.3.0" + es-object-atoms "^1.0.0" + get-intrinsic "^1.2.7" + get-proto "^1.0.1" + which-builtin-type "^1.2.1" + +regexp.prototype.flags@^1.5.3, regexp.prototype.flags@^1.5.4: + version "1.5.4" + resolved "https://registry.npmjs.org/regexp.prototype.flags/-/regexp.prototype.flags-1.5.4.tgz" + integrity sha512-dYqgNSZbDwkaJ2ceRd9ojCGjBq+mOm9LmtXnAnEGyHhN/5R7iDW2TRw3h+o/jCFxus3P2LfWIIiwowAjANm7IA== + dependencies: + call-bind "^1.0.8" + define-properties "^1.2.1" + es-errors "^1.3.0" + get-proto "^1.0.1" + gopd "^1.2.0" + set-function-name "^2.0.2" + +require-from-string@^2.0.2: + version "2.0.2" + resolved 
"https://registry.npmjs.org/require-from-string/-/require-from-string-2.0.2.tgz" + integrity sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw== + +reselect@^4.1.8: + version "4.1.8" + resolved "https://registry.npmjs.org/reselect/-/reselect-4.1.8.tgz" + integrity sha512-ab9EmR80F/zQTMNeneUr4cv+jSwPJgIlvEmVwLerwrWVbpLlBuls9XHzIeTFy4cegU2NHBp3va0LKOzU5qFEYQ== + +resolve-from@^4.0.0: + version "4.0.0" + resolved "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz" + integrity sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g== + +resolve@^1.19.0: + version "1.22.11" + resolved "https://registry.npmjs.org/resolve/-/resolve-1.22.11.tgz" + integrity sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ== + dependencies: + is-core-module "^2.16.1" + path-parse "^1.0.7" + supports-preserve-symlinks-flag "^1.0.0" + +resolve@^2.0.0-next.5: + version "2.0.0-next.6" + resolved "https://registry.npmjs.org/resolve/-/resolve-2.0.0-next.6.tgz" + integrity sha512-3JmVl5hMGtJ3kMmB3zi3DL25KfkCEyy3Tw7Gmw7z5w8M9WlwoPFnIvwChzu1+cF3iaK3sp18hhPz8ANeimdJfA== + dependencies: + es-errors "^1.3.0" + is-core-module "^2.16.1" + node-exports-info "^1.6.0" + object-keys "^1.1.1" + path-parse "^1.0.7" + supports-preserve-symlinks-flag "^1.0.0" + +rimraf@2: + version "2.7.1" + resolved "https://registry.npmjs.org/rimraf/-/rimraf-2.7.1.tgz" + integrity sha512-uWjbaKIK3T1OSVptzX7Nl6PvQ3qAGtKEtVRjRuazjfL3Bx5eI409VZSqgND+4UNnmzLVdPj9FqFJNPqBZFve4w== + dependencies: + glob "^7.1.3" + +robust-predicates@^3.0.2: + version "3.0.3" + resolved "https://registry.npmjs.org/robust-predicates/-/robust-predicates-3.0.3.tgz" + integrity sha512-NS3levdsRIUOmiJ8FZWCP7LG3QpJyrs/TE0Zpf1yvZu8cAJJ6QMW92H1c7kWpdIHo8RvmLxN/o2JXTKHp74lUA== + +rolldown@1.0.0-rc.11: + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/rolldown/-/rolldown-1.0.0-rc.11.tgz" + integrity 
sha512-NRjoKMusSjfRbSYiH3VSumlkgFe7kYAa3pzVOsVYVFY3zb5d7nS+a3KGQ7hJKXuYWbzJKPVQ9Wxq2UvyK+ENpw== + dependencies: + "@oxc-project/types" "=0.122.0" + "@rolldown/pluginutils" "1.0.0-rc.11" + optionalDependencies: + "@rolldown/binding-android-arm64" "1.0.0-rc.11" + "@rolldown/binding-darwin-arm64" "1.0.0-rc.11" + "@rolldown/binding-darwin-x64" "1.0.0-rc.11" + "@rolldown/binding-freebsd-x64" "1.0.0-rc.11" + "@rolldown/binding-linux-arm-gnueabihf" "1.0.0-rc.11" + "@rolldown/binding-linux-arm64-gnu" "1.0.0-rc.11" + "@rolldown/binding-linux-arm64-musl" "1.0.0-rc.11" + "@rolldown/binding-linux-ppc64-gnu" "1.0.0-rc.11" + "@rolldown/binding-linux-s390x-gnu" "1.0.0-rc.11" + "@rolldown/binding-linux-x64-gnu" "1.0.0-rc.11" + "@rolldown/binding-linux-x64-musl" "1.0.0-rc.11" + "@rolldown/binding-openharmony-arm64" "1.0.0-rc.11" + "@rolldown/binding-wasm32-wasi" "1.0.0-rc.11" + "@rolldown/binding-win32-arm64-msvc" "1.0.0-rc.11" + "@rolldown/binding-win32-x64-msvc" "1.0.0-rc.11" + +rollup@^4.20.0: + version "4.60.0" + resolved "https://registry.npmjs.org/rollup/-/rollup-4.60.0.tgz" + integrity sha512-yqjxruMGBQJ2gG4HtjZtAfXArHomazDHoFwFFmZZl0r7Pdo7qCIXKqKHZc8yeoMgzJJ+pO6pEEHa+V7uzWlrAQ== + dependencies: + "@types/estree" "1.0.8" + optionalDependencies: + "@rollup/rollup-android-arm-eabi" "4.60.0" + "@rollup/rollup-android-arm64" "4.60.0" + "@rollup/rollup-darwin-arm64" "4.60.0" + "@rollup/rollup-darwin-x64" "4.60.0" + "@rollup/rollup-freebsd-arm64" "4.60.0" + "@rollup/rollup-freebsd-x64" "4.60.0" + "@rollup/rollup-linux-arm-gnueabihf" "4.60.0" + "@rollup/rollup-linux-arm-musleabihf" "4.60.0" + "@rollup/rollup-linux-arm64-gnu" "4.60.0" + "@rollup/rollup-linux-arm64-musl" "4.60.0" + "@rollup/rollup-linux-loong64-gnu" "4.60.0" + "@rollup/rollup-linux-loong64-musl" "4.60.0" + "@rollup/rollup-linux-ppc64-gnu" "4.60.0" + "@rollup/rollup-linux-ppc64-musl" "4.60.0" + "@rollup/rollup-linux-riscv64-gnu" "4.60.0" + "@rollup/rollup-linux-riscv64-musl" "4.60.0" + "@rollup/rollup-linux-s390x-gnu" 
"4.60.0" + "@rollup/rollup-linux-x64-gnu" "4.60.0" + "@rollup/rollup-linux-x64-musl" "4.60.0" + "@rollup/rollup-openbsd-x64" "4.60.0" + "@rollup/rollup-openharmony-arm64" "4.60.0" + "@rollup/rollup-win32-arm64-msvc" "4.60.0" + "@rollup/rollup-win32-ia32-msvc" "4.60.0" + "@rollup/rollup-win32-x64-gnu" "4.60.0" + "@rollup/rollup-win32-x64-msvc" "4.60.0" + fsevents "~2.3.2" + +rope-sequence@^1.3.0: + version "1.3.4" + resolved "https://registry.npmjs.org/rope-sequence/-/rope-sequence-1.3.4.tgz" + integrity sha512-UT5EDe2cu2E/6O4igUr5PSFs23nvvukicWHx6GnOPlHAiiYbzNuCRQCuiUdHJQcqKalLKlrYJnjY0ySGsXNQXQ== + +rw@1: + version "1.3.3" + resolved "https://registry.npmjs.org/rw/-/rw-1.3.3.tgz" + integrity sha512-PdhdWy89SiZogBLaw42zdeqtRJ//zFd2PgQavcICDUgJT5oW10QCRKbJ6bg4r0/UY2M6BWd5tkxuGFRvCkgfHQ== + +rybitten@^0.22.0: + version "0.22.0" + resolved "https://registry.npmjs.org/rybitten/-/rybitten-0.22.0.tgz" + integrity sha512-w9aWDjaIo3YWLTBFiPDLzWWbdiKDkghLKzCeXDsXTqs64Ai0Dw8mHn9d/nnMvnds93GVpRwqjVM5VH+SDJsIsQ== + +safe-array-concat@^1.1.3: + version "1.1.3" + resolved "https://registry.npmjs.org/safe-array-concat/-/safe-array-concat-1.1.3.tgz" + integrity sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q== + dependencies: + call-bind "^1.0.8" + call-bound "^1.0.2" + get-intrinsic "^1.2.6" + has-symbols "^1.1.0" + isarray "^2.0.5" + +safe-buffer@^5.0.1: + version "5.2.1" + resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz" + integrity sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== + +safe-buffer@~5.1.0, safe-buffer@~5.1.1: + version "5.1.2" + resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz" + integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g== + +safe-buffer@~5.2.0: + version "5.2.1" + resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz" + integrity 
sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== + +safe-push-apply@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/safe-push-apply/-/safe-push-apply-1.0.0.tgz" + integrity sha512-iKE9w/Z7xCzUMIZqdBsp6pEQvwuEebH4vdpjcDWnyzaI6yl6O9FHvVpmGelvEHNsoY6wGblkxR6Zty/h00WiSA== + dependencies: + es-errors "^1.3.0" + isarray "^2.0.5" + +safe-regex-test@^1.0.3, safe-regex-test@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/safe-regex-test/-/safe-regex-test-1.1.0.tgz" + integrity sha512-x/+Cz4YrimQxQccJf5mKEbIa1NzeCRNI5Ecl/ekmlYaampdNLPalVyIcCZNNH3MvmqBugV5TMYZXv0ljslUlaw== + dependencies: + call-bound "^1.0.2" + es-errors "^1.3.0" + is-regex "^1.2.1" + +"safer-buffer@>= 2.1.2 < 3.0.0": + version "2.1.2" + resolved "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz" + integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg== + +sass@^1.77.6: + version "1.98.0" + resolved "https://registry.npmjs.org/sass/-/sass-1.98.0.tgz" + integrity sha512-+4N/u9dZ4PrgzGgPlKnaaRQx64RO0JBKs9sDhQ2pLgN6JQZ25uPQZKQYaBJU48Kd5BxgXoJ4e09Dq7nMcOUW3A== + dependencies: + chokidar "^4.0.0" + immutable "^5.1.5" + source-map-js ">=0.6.2 <2.0.0" + optionalDependencies: + "@parcel/watcher" "^2.4.1" + +saxes@^5.0.1: + version "5.0.1" + resolved "https://registry.npmjs.org/saxes/-/saxes-5.0.1.tgz" + integrity sha512-5LBh1Tls8c9xgGjw3QrMwETmTMVk0oFgvrFSvWx62llR2hcEInrKNZ2GZCCuuy2lvWrdl5jhbpeqc5hRYKFOcw== + dependencies: + xmlchars "^2.2.0" + +saxes@^6.0.0: + version "6.0.0" + resolved "https://registry.npmjs.org/saxes/-/saxes-6.0.0.tgz" + integrity sha512-xAg7SOnEhrm5zI3puOOKyy1OMcMlIJZYNJY7xLBwSze0UjhPLnWfj2GF2EpT0jmzaJKIWKHLsaSSajf35bcYnA== + dependencies: + xmlchars "^2.2.0" + +scheduler@^0.23.2: + version "0.23.2" + resolved "https://registry.npmjs.org/scheduler/-/scheduler-0.23.2.tgz" + integrity 
sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ== + dependencies: + loose-envify "^1.1.0" + +semver@^6.3.1: + version "6.3.1" + resolved "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz" + integrity sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA== + +semver@^7.1.3, semver@^7.3.5, semver@^7.6.3, semver@^7.7.3: + version "7.7.4" + resolved "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz" + integrity sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA== + +seroval-plugins@~1.5.0: + version "1.5.1" + resolved "https://registry.npmjs.org/seroval-plugins/-/seroval-plugins-1.5.1.tgz" + integrity sha512-4FbuZ/TMl02sqv0RTFexu0SP6V+ywaIe5bAWCCEik0fk17BhALgwvUDVF7e3Uvf9pxmwCEJsRPmlkUE6HdzLAw== + +seroval@~1.5.0: + version "1.5.1" + resolved "https://registry.npmjs.org/seroval/-/seroval-1.5.1.tgz" + integrity sha512-OwrZRZAfhHww0WEnKHDY8OM0U/Qs8OTfIDWhUD4BLpNJUfXK4cGmjiagGze086m+mhI+V2nD0gfbHEnJjb9STA== + +set-function-length@^1.2.2: + version "1.2.2" + resolved "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz" + integrity sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg== + dependencies: + define-data-property "^1.1.4" + es-errors "^1.3.0" + function-bind "^1.1.2" + get-intrinsic "^1.2.4" + gopd "^1.0.1" + has-property-descriptors "^1.0.2" + +set-function-name@^2.0.2: + version "2.0.2" + resolved "https://registry.npmjs.org/set-function-name/-/set-function-name-2.0.2.tgz" + integrity sha512-7PGFlmtwsEADb0WYyvCMa1t+yke6daIG4Wirafur5kcf+MhUnPms1UeR0CKQdTZD81yESwMHbtn+TR+dMviakQ== + dependencies: + define-data-property "^1.1.4" + es-errors "^1.3.0" + functions-have-names "^1.2.3" + has-property-descriptors "^1.0.2" + +set-proto@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/set-proto/-/set-proto-1.0.0.tgz" + integrity 
sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw== + dependencies: + dunder-proto "^1.0.1" + es-errors "^1.3.0" + es-object-atoms "^1.0.0" + +setimmediate@^1.0.5, setimmediate@~1.0.4: + version "1.0.5" + resolved "https://registry.npmjs.org/setimmediate/-/setimmediate-1.0.5.tgz" + integrity sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA== + +shebang-command@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz" + integrity sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA== + dependencies: + shebang-regex "^3.0.0" + +shebang-regex@^3.0.0: + version "3.0.0" + resolved "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz" + integrity sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A== + +side-channel-list@^1.0.0: + version "1.0.0" + resolved "https://registry.npmjs.org/side-channel-list/-/side-channel-list-1.0.0.tgz" + integrity sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA== + dependencies: + es-errors "^1.3.0" + object-inspect "^1.13.3" + +side-channel-map@^1.0.1: + version "1.0.1" + resolved "https://registry.npmjs.org/side-channel-map/-/side-channel-map-1.0.1.tgz" + integrity sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA== + dependencies: + call-bound "^1.0.2" + es-errors "^1.3.0" + get-intrinsic "^1.2.5" + object-inspect "^1.13.3" + +side-channel-weakmap@^1.0.2: + version "1.0.2" + resolved "https://registry.npmjs.org/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz" + integrity sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A== + dependencies: + call-bound "^1.0.2" + es-errors "^1.3.0" + get-intrinsic "^1.2.5" + object-inspect "^1.13.3" + side-channel-map "^1.0.1" + 
+side-channel@^1.1.0: + version "1.1.0" + resolved "https://registry.npmjs.org/side-channel/-/side-channel-1.1.0.tgz" + integrity sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw== + dependencies: + es-errors "^1.3.0" + object-inspect "^1.13.3" + side-channel-list "^1.0.0" + side-channel-map "^1.0.1" + side-channel-weakmap "^1.0.2" + +siginfo@^2.0.0: + version "2.0.0" + resolved "https://registry.npmjs.org/siginfo/-/siginfo-2.0.0.tgz" + integrity sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g== + +simple-concat@^1.0.0: + version "1.0.1" + resolved "https://registry.npmjs.org/simple-concat/-/simple-concat-1.0.1.tgz" + integrity sha512-cSFtAPtRhljv69IK0hTVZQ+OfE9nePi/rtJmw5UjHeVyVroEqJXP1sFztKUy1qU+xvz3u/sfYJLa947b7nAN2Q== + +simple-get@^4.0.0: + version "4.0.1" + resolved "https://registry.npmjs.org/simple-get/-/simple-get-4.0.1.tgz" + integrity sha512-brv7p5WgH0jmQJr1ZDDfKDOSeWWg+OVypG99A/5vYGPqJ6pxiaHLy8nxtFjBA7oMa01ebA9gfh1uMCFqOuXxvA== + dependencies: + decompress-response "^6.0.0" + once "^1.3.1" + simple-concat "^1.0.0" + +solid-js@^1.9.5: + version "1.9.12" + resolved "https://registry.npmjs.org/solid-js/-/solid-js-1.9.12.tgz" + integrity sha512-QzKaSJq2/iDrWR1As6MHZQ8fQkdOBf8GReYb7L5iKwMGceg7HxDcaOHk0at66tNgn9U2U7dXo8ZZpLIAmGMzgw== + dependencies: + csstype "^3.1.0" + seroval "~1.5.0" + seroval-plugins "~1.5.0" + +source-map-js@^1.2.1, "source-map-js@>=0.6.2 <2.0.0": + version "1.2.1" + resolved "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz" + integrity sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA== + +source-map@^0.5.7: + version "0.5.7" + resolved "https://registry.npmjs.org/source-map/-/source-map-0.5.7.tgz" + integrity sha512-LbrmJOMUSdEVxIKvdcJzQC+nQhe8FUZQTXQy6+I75skNgn3OoQ0DZA8YnFa7gp8tqtL3KPf1kmo0R5DoApeSGQ== + +spectral.js@^2.0.2: + version "2.0.2" + resolved 
"https://registry.npmjs.org/spectral.js/-/spectral.js-2.0.2.tgz"
+  integrity sha512-g7NA/GMc2C50ez/foALJW8DcwvwbMgW5WF0/1fmAib5AN8NkJwMVyWgkPeSGAm4D6XAFXdtz9KM4AreuV+hJsg==
+
+stackback@0.0.2:
+  version "0.0.2"
+  resolved "https://registry.npmjs.org/stackback/-/stackback-0.0.2.tgz"
+  integrity sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw==
+
+std-env@^4.0.0-rc.1:
+  version "4.0.0"
+  resolved "https://registry.npmjs.org/std-env/-/std-env-4.0.0.tgz"
+  integrity sha512-zUMPtQ/HBY3/50VbpkupYHbRroTRZJPRLvreamgErJVys0ceuzMkD44J/QjqhHjOzK42GQ3QZIeFG1OYfOtKqQ==
+
+stop-iteration-iterator@^1.1.0:
+  version "1.1.0"
+  resolved "https://registry.npmjs.org/stop-iteration-iterator/-/stop-iteration-iterator-1.1.0.tgz"
+  integrity sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ==
+  dependencies:
+    es-errors "^1.3.0"
+    internal-slot "^1.1.0"
+
+string_decoder@^1.1.1:
+  version "1.3.0"
+  resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz"
+  integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==
+  dependencies:
+    safe-buffer "~5.2.0"
+
+string_decoder@~1.1.1:
+  version "1.1.1"
+  resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz"
+  integrity sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==
+  dependencies:
+    safe-buffer "~5.1.0"
+
+string-width@^7.0.0, string-width@^7.2.0:
+  version "7.2.0"
+  resolved "https://registry.npmjs.org/string-width/-/string-width-7.2.0.tgz"
+  integrity sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ==
+  dependencies:
+    emoji-regex "^10.3.0"
+    get-east-asian-width "^1.0.0"
+    strip-ansi "^7.1.0"
+
+string.prototype.includes@^2.0.1:
+  version "2.0.1"
+  resolved "https://registry.npmjs.org/string.prototype.includes/-/string.prototype.includes-2.0.1.tgz"
+  integrity sha512-o7+c9bW6zpAdJHTtujeePODAhkuicdAryFsfVKwA+wGw89wJ4GTY484WTucM9hLtDEOpOvI+aHnzqnC5lHp4Rg==
+  dependencies:
+    call-bind "^1.0.7"
+    define-properties "^1.2.1"
+    es-abstract "^1.23.3"
+
+string.prototype.matchall@^4.0.12:
+  version "4.0.12"
+  resolved "https://registry.npmjs.org/string.prototype.matchall/-/string.prototype.matchall-4.0.12.tgz"
+  integrity sha512-6CC9uyBL+/48dYizRf7H7VAYCMCNTBeM78x/VTUe9bFEaxBepPJDa1Ow99LqI/1yF7kuy7Q3cQsYMrcjGUcskA==
+  dependencies:
+    call-bind "^1.0.8"
+    call-bound "^1.0.3"
+    define-properties "^1.2.1"
+    es-abstract "^1.23.6"
+    es-errors "^1.3.0"
+    es-object-atoms "^1.0.0"
+    get-intrinsic "^1.2.6"
+    gopd "^1.2.0"
+    has-symbols "^1.1.0"
+    internal-slot "^1.1.0"
+    regexp.prototype.flags "^1.5.3"
+    set-function-name "^2.0.2"
+    side-channel "^1.1.0"
+
+string.prototype.repeat@^1.0.0:
+  version "1.0.0"
+  resolved "https://registry.npmjs.org/string.prototype.repeat/-/string.prototype.repeat-1.0.0.tgz"
+  integrity sha512-0u/TldDbKD8bFCQ/4f5+mNRrXwZ8hg2w7ZR8wa16e8z9XpePWl3eGEcUD0OXpEH/VJH/2G3gjUtR3ZOiBe2S/w==
+  dependencies:
+    define-properties "^1.1.3"
+    es-abstract "^1.17.5"
+
+string.prototype.trim@^1.2.10:
+  version "1.2.10"
+  resolved "https://registry.npmjs.org/string.prototype.trim/-/string.prototype.trim-1.2.10.tgz"
+  integrity sha512-Rs66F0P/1kedk5lyYyH9uBzuiI/kNRmwJAR9quK6VOtIpZ2G+hMZd+HQbbv25MgCA6gEffoMZYxlTod4WcdrKA==
+  dependencies:
+    call-bind "^1.0.8"
+    call-bound "^1.0.2"
+    define-data-property "^1.1.4"
+    define-properties "^1.2.1"
+    es-abstract "^1.23.5"
+    es-object-atoms "^1.0.0"
+    has-property-descriptors "^1.0.2"
+
+string.prototype.trimend@^1.0.9:
+  version "1.0.9"
+  resolved "https://registry.npmjs.org/string.prototype.trimend/-/string.prototype.trimend-1.0.9.tgz"
+  integrity sha512-G7Ok5C6E/j4SGfyLCloXTrngQIQU3PWtXGst3yM7Bea9FRURf1S42ZHlZZtsNque2FN2PoUhfZXYLNWwEr4dLQ==
+  dependencies:
+    call-bind "^1.0.8"
+    call-bound "^1.0.2"
+    define-properties "^1.2.1"
+    es-object-atoms "^1.0.0"
+
+string.prototype.trimstart@^1.0.8:
+  version "1.0.8"
+  resolved "https://registry.npmjs.org/string.prototype.trimstart/-/string.prototype.trimstart-1.0.8.tgz"
+  integrity sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg==
+  dependencies:
+    call-bind "^1.0.7"
+    define-properties "^1.2.1"
+    es-object-atoms "^1.0.0"
+
+strip-ansi@^7.1.0:
+  version "7.2.0"
+  resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.2.0.tgz"
+  integrity sha512-yDPMNjp4WyfYBkHnjIRLfca1i6KMyGCtsVgoKe/z1+6vukgaENdgGBZt+ZmKPc4gavvEZ5OgHfHdrazhgNyG7w==
+  dependencies:
+    ansi-regex "^6.2.2"
+
+strip-indent@^3.0.0:
+  version "3.0.0"
+  resolved "https://registry.npmjs.org/strip-indent/-/strip-indent-3.0.0.tgz"
+  integrity sha512-laJTa3Jb+VQpaC6DseHhF7dXVqHTfJPCRDaEbid/drOhgitgYku/letMUqOXFoWV0zIIUbjpdH2t+tYj4bQMRQ==
+  dependencies:
+    min-indent "^1.0.0"
+
+strip-json-comments@^3.1.1:
+  version "3.1.1"
+  resolved "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz"
+  integrity sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==
+
+strip-json-comments@~2.0.1:
+  version "2.0.1"
+  resolved "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-2.0.1.tgz"
+  integrity sha512-4gB8na07fecVVkOI6Rs4e7T6NOTki5EmL7TUduTs6bu3EdnSycntVJ4re8kgZA+wx9IueI2Y11bfbgwtzuE0KQ==
+
+stylis@4.2.0:
+  version "4.2.0"
+  resolved "https://registry.npmjs.org/stylis/-/stylis-4.2.0.tgz"
+  integrity sha512-Orov6g6BB1sDfYgzWfTHDOxamtX1bE/zo104Dh9e6fqJ3PooipYyfJ0pUmrZO2wAvO8YbEyeFrkV91XTsGMSrw==
+
+supports-color@^7.1.0:
+  version "7.2.0"
+  resolved "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz"
+  integrity sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==
+  dependencies:
+    has-flag "^4.0.0"
+
+supports-preserve-symlinks-flag@^1.0.0:
+  version "1.0.0"
+  resolved "https://registry.npmjs.org/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz"
+  integrity sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==
+
+symbol-tree@^3.2.4:
+  version "3.2.4"
+  resolved "https://registry.npmjs.org/symbol-tree/-/symbol-tree-3.2.4.tgz"
+  integrity sha512-9QNk5KwDF+Bvz+PyObkmSYjI5ksVUYtjW7AU22r2NKcfLJcXp96hkDWU3+XndOsUb+AQ9QhfzfCT2O+CNWT5Tw==
+
+tar-fs@^2.0.0:
+  version "2.1.4"
+  resolved "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.4.tgz"
+  integrity sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ==
+  dependencies:
+    chownr "^1.1.1"
+    mkdirp-classic "^0.5.2"
+    pump "^3.0.0"
+    tar-stream "^2.1.4"
+
+tar-stream@^2.1.4, tar-stream@^2.2.0:
+  version "2.2.0"
+  resolved "https://registry.npmjs.org/tar-stream/-/tar-stream-2.2.0.tgz"
+  integrity sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==
+  dependencies:
+    bl "^4.0.3"
+    end-of-stream "^1.4.1"
+    fs-constants "^1.0.0"
+    inherits "^2.0.3"
+    readable-stream "^3.1.1"
+
+text-segmentation@^1.0.3:
+  version "1.0.3"
+  resolved "https://registry.npmjs.org/text-segmentation/-/text-segmentation-1.0.3.tgz"
+  integrity sha512-iOiPUo/BGnZ6+54OsWxZidGCsdU8YbE4PSpdPinp7DeMtUJNJBoJ/ouUSTJjHkh1KntHaltHl/gDs2FC4i5+Nw==
+  dependencies:
+    utrie "^1.0.2"
+
+tinybench@^2.9.0:
+  version "2.9.0"
+  resolved "https://registry.npmjs.org/tinybench/-/tinybench-2.9.0.tgz"
+  integrity sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==
+
+tinyexec@^1.0.2:
+  version "1.0.4"
+  resolved "https://registry.npmjs.org/tinyexec/-/tinyexec-1.0.4.tgz"
+  integrity sha512-u9r3uZC0bdpGOXtlxUIdwf9pkmvhqJdrVCH9fapQtgy/OeTTMZ1nqH7agtvEfmGui6e1XxjcdrlxvxJvc3sMqw==
+
+tinyglobby@^0.2.15:
+  version "0.2.15"
+  resolved "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.15.tgz"
+  integrity sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==
+  dependencies:
+    fdir "^6.5.0"
+    picomatch "^4.0.3"
+
+tinyrainbow@^3.0.3:
+  version "3.1.0"
+  resolved "https://registry.npmjs.org/tinyrainbow/-/tinyrainbow-3.1.0.tgz"
+  integrity sha512-Bf+ILmBgretUrdJxzXM0SgXLZ3XfiaUuOj/IKQHuTXip+05Xn+uyEYdVg0kYDipTBcLrCVyUzAPz7QmArb0mmw==
+
+tiptap-markdown@^0.9.0:
+  version "0.9.0"
+  resolved "https://registry.npmjs.org/tiptap-markdown/-/tiptap-markdown-0.9.0.tgz"
+  integrity sha512-dKLQ9iiuGNgrlGVjrNauF/UBzWu4LYOx5pkD0jNkmQt/GOwfCJsBuzZTsf1jZ204ANHOm572mZ9PYvGh1S7tpQ==
+  dependencies:
+    "@types/markdown-it" "^13.0.7"
+    markdown-it "^14.1.0"
+    markdown-it-task-lists "^2.1.1"
+    prosemirror-markdown "^1.11.1"
+
+tldts-core@^7.0.27:
+  version "7.0.27"
+  resolved "https://registry.npmjs.org/tldts-core/-/tldts-core-7.0.27.tgz"
+  integrity sha512-YQ7uPjgWUibIK6DW5lrKujGwUKhLevU4hcGbP5O6TcIUb+oTjJYJVWPS4nZsIHrEEEG6myk/oqAJUEQmpZrHsg==
+
+tldts@^7.0.5:
+  version "7.0.27"
+  resolved "https://registry.npmjs.org/tldts/-/tldts-7.0.27.tgz"
+  integrity sha512-I4FZcVFcqCRuT0ph6dCDpPuO4Xgzvh+spkcTr1gK7peIvxWauoloVO0vuy1FQnijT63ss6AsHB6+OIM4aXHbPg==
+  dependencies:
+    tldts-core "^7.0.27"
+
+tmp@^0.2.0:
+  version "0.2.5"
+  resolved "https://registry.npmjs.org/tmp/-/tmp-0.2.5.tgz"
+  integrity sha512-voyz6MApa1rQGUxT3E+BK7/ROe8itEx7vD8/HEvt4xwXucvQ5G5oeEiHkmHZJuBO21RpOf+YYm9MOivj709jow==
+
+topojson-client@^3.1.0:
+  version "3.1.0"
+  resolved "https://registry.npmjs.org/topojson-client/-/topojson-client-3.1.0.tgz"
+  integrity sha512-605uxS6bcYxGXw9qi62XyrV6Q3xwbndjachmNxu8HWTtVPxZfEJN9fd/SZS1Q54Sn2y0TMyMxFj/cJINqGHrKw==
+  dependencies:
+    commander "2"
+
+tough-cookie@^6.0.1:
+  version "6.0.1"
+  resolved "https://registry.npmjs.org/tough-cookie/-/tough-cookie-6.0.1.tgz"
+  integrity sha512-LktZQb3IeoUWB9lqR5EWTHgW/VTITCXg4D21M+lvybRVdylLrRMnqaIONLVb5mav8vM19m44HIcGq4qASeu2Qw==
+  dependencies:
+    tldts "^7.0.5"
+
+tr46@^6.0.0:
+  version "6.0.0"
+  resolved "https://registry.npmjs.org/tr46/-/tr46-6.0.0.tgz"
+  integrity sha512-bLVMLPtstlZ4iMQHpFHTR7GAGj2jxi8Dg0s2h2MafAE4uSWF98FC/3MomU51iQAMf8/qDUbKWf5GxuvvVcXEhw==
+  dependencies:
+    punycode "^2.3.1"
+
+"traverse@>=0.3.0 <0.4":
+  version "0.3.9"
+  resolved "https://registry.npmjs.org/traverse/-/traverse-0.3.9.tgz"
+  integrity sha512-iawgk0hLP3SxGKDfnDJf8wTz4p2qImnyihM5Hh/sGvQ3K37dPi/w8sRhdNIxYA1TwFwc5mDhIJq+O0RsvXBKdQ==
+
+ts-api-utils@^2.4.0:
+  version "2.5.0"
+  resolved "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.5.0.tgz"
+  integrity sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA==
+
+tslib@^2.8.1, tslib@~2.8.1:
+  version "2.8.1"
+  resolved "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz"
+  integrity sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==
+
+tslib@2.3.0:
+  version "2.3.0"
+  resolved "https://registry.npmjs.org/tslib/-/tslib-2.3.0.tgz"
+  integrity sha512-N82ooyxVNm6h1riLCoyS9e3fuJ3AMG2zIZs2Gd1ATcSFjSA23Q0fzjjZeh0jbJvWVDZ0cJT8yaNNaaXHzueNjg==
+
+tunnel-agent@^0.6.0:
+  version "0.6.0"
+  resolved "https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.6.0.tgz"
+  integrity sha512-McnNiV1l8RYeY8tBgEpuodCC1mLUdbSN+CYBL7kJsJNInOP8UjDDEwdk6Mw60vdLLrr5NHKZhMAOSrR2NZuQ+w==
+  dependencies:
+    safe-buffer "^5.0.1"
+
+type-check@^0.4.0, type-check@~0.4.0:
+  version "0.4.0"
+  resolved "https://registry.npmjs.org/type-check/-/type-check-0.4.0.tgz"
+  integrity sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==
+  dependencies:
+    prelude-ls "^1.2.1"
+
+typed-array-buffer@^1.0.3:
+  version "1.0.3"
+  resolved "https://registry.npmjs.org/typed-array-buffer/-/typed-array-buffer-1.0.3.tgz"
+  integrity sha512-nAYYwfY3qnzX30IkA6AQZjVbtK6duGontcQm1WSG1MD94YLqK0515GNApXkoxKOWMusVssAHWLh9SeaoefYFGw==
+  dependencies:
+    call-bound "^1.0.3"
+    es-errors "^1.3.0"
+    is-typed-array "^1.1.14"
+
+typed-array-byte-length@^1.0.3:
+  version "1.0.3"
+  resolved "https://registry.npmjs.org/typed-array-byte-length/-/typed-array-byte-length-1.0.3.tgz"
+  integrity sha512-BaXgOuIxz8n8pIq3e7Atg/7s+DpiYrxn4vdot3w9KbnBhcRQq6o3xemQdIfynqSeXeDrF32x+WvfzmOjPiY9lg==
+  dependencies:
+    call-bind "^1.0.8"
+    for-each "^0.3.3"
+    gopd "^1.2.0"
+    has-proto "^1.2.0"
+    is-typed-array "^1.1.14"
+
+typed-array-byte-offset@^1.0.4:
+  version "1.0.4"
+  resolved "https://registry.npmjs.org/typed-array-byte-offset/-/typed-array-byte-offset-1.0.4.tgz"
+  integrity sha512-bTlAFB/FBYMcuX81gbL4OcpH5PmlFHqlCCpAl8AlEzMz5k53oNDvN8p1PNOWLEmI2x4orp3raOFB51tv9X+MFQ==
+  dependencies:
+    available-typed-arrays "^1.0.7"
+    call-bind "^1.0.8"
+    for-each "^0.3.3"
+    gopd "^1.2.0"
+    has-proto "^1.2.0"
+    is-typed-array "^1.1.15"
+    reflect.getprototypeof "^1.0.9"
+
+typed-array-length@^1.0.7:
+  version "1.0.7"
+  resolved "https://registry.npmjs.org/typed-array-length/-/typed-array-length-1.0.7.tgz"
+  integrity sha512-3KS2b+kL7fsuk/eJZ7EQdnEmQoaho/r6KUef7hxvltNA5DR8NAUM+8wJMbJyZ4G9/7i3v5zPBIMN5aybAh2/Jg==
+  dependencies:
+    call-bind "^1.0.7"
+    for-each "^0.3.3"
+    gopd "^1.0.1"
+    is-typed-array "^1.1.13"
+    possible-typed-array-names "^1.0.0"
+    reflect.getprototypeof "^1.0.6"
+
+typescript-eslint@^8.16.0:
+  version "8.57.2"
+  resolved "https://registry.npmjs.org/typescript-eslint/-/typescript-eslint-8.57.2.tgz"
+  integrity sha512-VEPQ0iPgWO/sBaZOU1xo4nuNdODVOajPnTIbog2GKYr31nIlZ0fWPoCQgGfF3ETyBl1vn63F/p50Um9Z4J8O8A==
+  dependencies:
+    "@typescript-eslint/eslint-plugin" "8.57.2"
+    "@typescript-eslint/parser" "8.57.2"
+    "@typescript-eslint/typescript-estree" "8.57.2"
+    "@typescript-eslint/utils" "8.57.2"
+
+typescript@^4.9.5:
+  version "4.9.5"
+  resolved "https://registry.npmjs.org/typescript/-/typescript-4.9.5.tgz"
+  integrity sha512-1FXk9E2Hm+QzZQ7z+McJiHL4NW1F2EzMu9Nq9i3zAaGqibafqYwCVU6WyWAuyQRRzOlxou8xZSyXLEN8oKj24g==
+
+uc.micro@^2.0.0, uc.micro@^2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/uc.micro/-/uc.micro-2.1.0.tgz"
+  integrity sha512-ARDJmphmdvUk6Glw7y9DQ2bFkKBHwQHLi2lsaH6PPmz/Ka9sFOBsBluozhDltWmnv9u/cF6Rt87znRTPV+yp/A==
+
+unbox-primitive@^1.1.0:
+  version "1.1.0"
+  resolved "https://registry.npmjs.org/unbox-primitive/-/unbox-primitive-1.1.0.tgz"
+  integrity sha512-nWJ91DjeOkej/TA8pXQ3myruKpKEYgqvpw9lz4OPHj/NWFNluYrjbz9j01CJ8yKQd2g4jFoOkINCTW2I5LEEyw==
+  dependencies:
+    call-bound "^1.0.3"
+    has-bigints "^1.0.2"
+    has-symbols "^1.1.0"
+    which-boxed-primitive "^1.1.1"
+
+undici-types@~6.21.0:
+  version "6.21.0"
+  resolved "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz"
+  integrity sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==
+
+undici@^7.24.5:
+  version "7.24.5"
+  resolved "https://registry.npmjs.org/undici/-/undici-7.24.5.tgz"
+  integrity sha512-3IWdCpjgxp15CbJnsi/Y9TCDE7HWVN19j1hmzVhoAkY/+CJx449tVxT5wZc1Gwg8J+P0LWvzlBzxYRnHJ+1i7Q==
+
+unzipper@^0.10.11:
+  version "0.10.14"
+  resolved "https://registry.npmjs.org/unzipper/-/unzipper-0.10.14.tgz"
+  integrity sha512-ti4wZj+0bQTiX2KmKWuwj7lhV+2n//uXEotUmGuQqrbVZSEGFMbI68+c6JCQ8aAmUWYvtHEz2A8K6wXvueR/6g==
+  dependencies:
+    big-integer "^1.6.17"
+    binary "~0.3.0"
+    bluebird "~3.4.1"
+    buffer-indexof-polyfill "~1.0.0"
+    duplexer2 "~0.1.4"
+    fstream "^1.0.12"
+    graceful-fs "^4.2.2"
+    listenercount "~1.0.1"
+    readable-stream "~2.3.6"
+    setimmediate "~1.0.4"
+
+uri-js@^4.2.2:
+  version "4.4.1"
+  resolved "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz"
+  integrity sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==
+  dependencies:
+    punycode "^2.1.0"
+
+use-sync-external-store@^1.0.0, use-sync-external-store@^1.4.0, use-sync-external-store@^1.6.0:
+  version "1.6.0"
+  resolved "https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz"
+  integrity sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==
+
+usehooks-ts@^3.1.1:
+  version "3.1.1"
+  resolved "https://registry.npmjs.org/usehooks-ts/-/usehooks-ts-3.1.1.tgz"
+  integrity sha512-I4diPp9Cq6ieSUH2wu+fDAVQO43xwtulo+fKEidHUwZPnYImbtkTjzIJYcDcJqxgmX31GVqNFURodvcgHcW0pA==
+  dependencies:
+    lodash.debounce "^4.0.8"
+
+util-deprecate@^1.0.1, util-deprecate@~1.0.1:
+  version "1.0.2"
+  resolved "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz"
+  integrity sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==
+
+utrie@^1.0.2:
+  version "1.0.2"
+  resolved "https://registry.npmjs.org/utrie/-/utrie-1.0.2.tgz"
+  integrity sha512-1MLa5ouZiOmQzUbjbu9VmjLzn1QLXBhwpUa7kdLUQK+KQ5KA9I1vk5U4YHe/X2Ch7PYnJfWuWT+VbuxbGwljhw==
+  dependencies:
+    base64-arraybuffer "^1.0.2"
+
+uuid@^8.3.0:
+  version "8.3.2"
+  resolved "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz"
+  integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==
+
+validator@^13.15.20:
+  version "13.15.26"
+  resolved "https://registry.npmjs.org/validator/-/validator-13.15.26.tgz"
+  integrity sha512-spH26xU080ydGggxRyR1Yhcbgx+j3y5jbNXk/8L+iRvdIEQ4uTRH2Sgf2dokud6Q4oAtsbNvJ1Ft+9xmm6IZcA==
+
+vega-canvas@^2.0.0:
+  version "2.0.0"
+  resolved "https://registry.npmjs.org/vega-canvas/-/vega-canvas-2.0.0.tgz"
+  integrity sha512-9x+4TTw/USYST5nx4yN272sy9WcqSRjAR0tkQYZJ4cQIeon7uVsnohvoPQK1JZu7K1QXGUqzj08z0u/UegBVMA==
+
+vega-crossfilter@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-crossfilter/-/vega-crossfilter-5.1.0.tgz"
+  integrity sha512-EmVhfP3p6AM7o/lPan/QAoqjblI19BxWUlvl2TSs0xjQd8KbaYYbS4Ixt3cmEvl0QjRdBMF6CdJJ/cy9DTS4Fw==
+  dependencies:
+    d3-array "^3.2.4"
+    vega-dataflow "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-dataflow@^6.1.0, vega-dataflow@~6.1.0:
+  version "6.1.0"
+  resolved "https://registry.npmjs.org/vega-dataflow/-/vega-dataflow-6.1.0.tgz"
+  integrity sha512-JxumGlODtFbzoQ4c/jQK8Tb/68ih0lrexlCozcMfTAwQ12XhTqCvlafh7MAKKTMBizjOfaQTHm4Jkyb1H5CfyQ==
+  dependencies:
+    vega-format "^2.1.0"
+    vega-loader "^5.1.0"
+    vega-util "^2.1.0"
+
+vega-embed@^6.21.0:
+  version "6.29.0"
+  resolved "https://registry.npmjs.org/vega-embed/-/vega-embed-6.29.0.tgz"
+  integrity sha512-PmlshTLtLFLgWtF/b23T1OwX53AugJ9RZ3qPE2c01VFAbgt3/GSNI/etzA/GzdrkceXFma+FDHNXUppKuM0U6Q==
+  dependencies:
+    fast-json-patch "^3.1.1"
+    json-stringify-pretty-compact "^4.0.0"
+    semver "^7.6.3"
+    tslib "^2.8.1"
+    vega-interpreter "^1.0.5"
+    vega-schema-url-parser "^2.2.0"
+    vega-themes "^2.15.0"
+    vega-tooltip "^0.35.2"
+
+vega-embed@6.5.1:
+  version "6.5.1"
+  resolved "https://registry.npmjs.org/vega-embed/-/vega-embed-6.5.1.tgz"
+  integrity sha512-yz/L1bN3+fLOpgXVb/8sCRv4GlZpD2/ngeKJAFRiHTIRm5zK6W0KuqZZvyGaO7E4s7RuYjW1TWhRIOqh5rS5hA==
+  dependencies:
+    fast-json-patch "^3.0.0-1"
+    json-stringify-pretty-compact "^2.0.0"
+    semver "^7.1.3"
+    vega-schema-url-parser "^1.1.0"
+    vega-themes "^2.8.2"
+    vega-tooltip "^0.22.0"
+
+vega-encode@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-encode/-/vega-encode-5.1.0.tgz"
+  integrity sha512-q26oI7B+MBQYcTQcr5/c1AMsX3FvjZLQOBi7yI0vV+GEn93fElDgvhQiYrgeYSD4Exi/jBPeUXuN6p4bLz16kA==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-interpolate "^3.0.1"
+    vega-dataflow "^6.1.0"
+    vega-scale "^8.1.0"
+    vega-util "^2.1.0"
+
+vega-event-selector@^4.0.0, vega-event-selector@~4.0.0:
+  version "4.0.0"
+  resolved "https://registry.npmjs.org/vega-event-selector/-/vega-event-selector-4.0.0.tgz"
+  integrity sha512-CcWF4m4KL/al1Oa5qSzZ5R776q8lRxCj3IafCHs5xipoEHrkgu1BWa7F/IH5HrDNXeIDnqOpSV1pFsAWRak4gQ==
+
+vega-expression@^6.1.0, vega-expression@~6.1.0:
+  version "6.1.0"
+  resolved "https://registry.npmjs.org/vega-expression/-/vega-expression-6.1.0.tgz"
+  integrity sha512-hHgNx/fQ1Vn1u6vHSamH7lRMsOa/yQeHGGcWVmh8fZafLdwdhCM91kZD9p7+AleNpgwiwzfGogtpATFaMmDFYg==
+  dependencies:
+    "@types/estree" "^1.0.8"
+    vega-util "^2.1.0"
+
+vega-force@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-force/-/vega-force-5.1.0.tgz"
+  integrity sha512-wdnchOSeXpF9Xx8Yp0s6Do9F7YkFeOn/E/nENtsI7NOcyHpICJ5+UkgjUo9QaQ/Yu+dIDU+sP/4NXsUtq6SMaQ==
+  dependencies:
+    d3-force "^3.0.0"
+    vega-dataflow "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-format@^2.1.0, vega-format@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-format/-/vega-format-2.1.0.tgz"
+  integrity sha512-i9Ht33IgqG36+S1gFDpAiKvXCPz+q+1vDhDGKK8YsgMxGOG4PzinKakI66xd7SdV4q97FgpR7odAXqtDN2wKqw==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-format "^3.1.0"
+    d3-time-format "^4.1.0"
+    vega-time "^3.1.0"
+    vega-util "^2.1.0"
+
+vega-functions@^6.1.0, vega-functions@~6.1.0:
+  version "6.1.1"
+  resolved "https://registry.npmjs.org/vega-functions/-/vega-functions-6.1.1.tgz"
+  integrity sha512-Due6jP0y0FfsGMTrHnzUGnEwXPu7VwE+9relfo+LjL/tRPYnnKqwWvzt7n9JkeBuZqjkgYjMzm/WucNn6Hkw5A==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-color "^3.1.0"
+    d3-geo "^3.1.1"
+    vega-dataflow "^6.1.0"
+    vega-expression "^6.1.0"
+    vega-scale "^8.1.0"
+    vega-scenegraph "^5.1.0"
+    vega-selections "^6.1.0"
+    vega-statistics "^2.0.0"
+    vega-time "^3.1.0"
+    vega-util "^2.1.0"
+
+vega-geo@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-geo/-/vega-geo-5.1.0.tgz"
+  integrity sha512-H8aBBHfthc3rzDbz/Th18+Nvp00J73q3uXGAPDQqizioDm/CoXCK8cX4pMePydBY9S6ikBiGJrLKFDa80wI20g==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-color "^3.1.0"
+    d3-geo "^3.1.1"
+    vega-canvas "^2.0.0"
+    vega-dataflow "^6.1.0"
+    vega-projection "^2.1.0"
+    vega-statistics "^2.0.0"
+    vega-util "^2.1.0"
+
+vega-hierarchy@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-hierarchy/-/vega-hierarchy-5.1.0.tgz"
+  integrity sha512-rZlU8QJNETlB6o73lGCPybZtw2fBBsRIRuFE77aCLFHdGsh6wIifhplVarqE9icBqjUHRRUOmcEYfzwVIPr65g==
+  dependencies:
+    d3-hierarchy "^3.1.2"
+    vega-dataflow "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-interpreter@^1.0.5:
+  version "1.2.1"
+  resolved "https://registry.npmjs.org/vega-interpreter/-/vega-interpreter-1.2.1.tgz"
+  integrity sha512-EMHLGxJ+SWfh1K/fHDRlHEZtLA/2ZNAXItYb5e8CxuAIm/Ha/3DHX/8VlvbTGIciUpuwmcKx4tVhJWlKreQ/Yw==
+  dependencies:
+    vega-util "^1.17.4"
+
+vega-label@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-label/-/vega-label-2.1.0.tgz"
+  integrity sha512-/hgf+zoA3FViDBehrQT42Lta3t8In6YwtMnwjYlh72zNn1p3c7E3YUBwqmAqTM1x+tudgzMRGLYig+bX1ewZxQ==
+  dependencies:
+    vega-canvas "^2.0.0"
+    vega-dataflow "^6.1.0"
+    vega-scenegraph "^5.1.0"
+    vega-util "^2.1.0"
+
+vega-lite@6.4.1:
+  version "6.4.1"
+  resolved "https://registry.npmjs.org/vega-lite/-/vega-lite-6.4.1.tgz"
+  integrity sha512-KO3ybHNouRK4A0al/+2fN9UqgTEfxrd/ntGLY933Hg5UOYotDVQdshR3zn7OfXwQ7uj0W96Vfa5R+QxO8am3IQ==
+  dependencies:
+    json-stringify-pretty-compact "~4.0.0"
+    tslib "~2.8.1"
+    vega-event-selector "~4.0.0"
+    vega-expression "~6.1.0"
+    vega-util "~2.1.0"
+    yargs "~18.0.0"
+
+vega-loader@^5.1.0, vega-loader@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-loader/-/vega-loader-5.1.0.tgz"
+  integrity sha512-GaY3BdSPbPNdtrBz8SYUBNmNd8mdPc3mtdZfdkFazQ0RD9m+Toz5oR8fKnTamNSk9fRTJX0Lp3uEqxrAlQVreg==
+  dependencies:
+    d3-dsv "^3.0.1"
+    topojson-client "^3.1.0"
+    vega-format "^2.1.0"
+    vega-util "^2.1.0"
+
+vega-parser@~7.1.0:
+  version "7.1.0"
+  resolved "https://registry.npmjs.org/vega-parser/-/vega-parser-7.1.0.tgz"
+  integrity sha512-g0lrYxtmYVW8G6yXpIS4J3Uxt9OUSkc0bLu5afoYDo4rZmoOOdll3x3ebActp5LHPW+usZIE+p5nukRS2vEc7Q==
+  dependencies:
+    vega-dataflow "^6.1.0"
+    vega-event-selector "^4.0.0"
+    vega-functions "^6.1.0"
+    vega-scale "^8.1.0"
+    vega-util "^2.1.0"
+
+vega-projection@^2.1.0, vega-projection@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-projection/-/vega-projection-2.1.0.tgz"
+  integrity sha512-EjRjVSoMR5ibrU7q8LaOQKP327NcOAM1+eZ+NO4ANvvAutwmbNVTmfA1VpPH+AD0AlBYc39ND/wnRk7SieDiXA==
+  dependencies:
+    d3-geo "^3.1.1"
+    d3-geo-projection "^4.0.0"
+    vega-scale "^8.1.0"
+
+vega-regression@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-regression/-/vega-regression-2.1.0.tgz"
+  integrity sha512-HzC7MuoEwG1rIxRaNTqgcaYF03z/ZxYkQR2D5BN0N45kLnHY1HJXiEcZkcffTsqXdspLjn47yLi44UoCwF5fxQ==
+  dependencies:
+    d3-array "^3.2.4"
+    vega-dataflow "^6.1.0"
+    vega-statistics "^2.0.0"
+    vega-util "^2.1.0"
+
+vega-runtime@^7.1.0, vega-runtime@~7.1.0:
+  version "7.1.0"
+  resolved "https://registry.npmjs.org/vega-runtime/-/vega-runtime-7.1.0.tgz"
+  integrity sha512-mItI+WHimyEcZlZrQ/zYR3LwHVeyHCWwp7MKaBjkU8EwkSxEEGVceyGUY9X2YuJLiOgkLz/6juYDbMv60pfwYA==
+  dependencies:
+    vega-dataflow "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-scale@^8.1.0, vega-scale@~8.1.0:
+  version "8.1.0"
+  resolved "https://registry.npmjs.org/vega-scale/-/vega-scale-8.1.0.tgz"
+  integrity sha512-VEgDuEcOec8+C8+FzLcnAmcXrv2gAJKqQifCdQhkgnsLa978vYUgVfCut/mBSMMHbH8wlUV1D0fKZTjRukA1+A==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-interpolate "^3.0.1"
+    d3-scale "^4.0.2"
+    d3-scale-chromatic "^3.1.0"
+    vega-time "^3.1.0"
+    vega-util "^2.1.0"
+
+vega-scenegraph@^5.1.0, vega-scenegraph@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-scenegraph/-/vega-scenegraph-5.1.0.tgz"
+  integrity sha512-4gA89CFIxkZX+4Nvl8SZF2MBOqnlj9J5zgdPh/HPx+JOwtzSlUqIhxFpFj7GWYfwzr/PyZnguBLPihPw1Og/cA==
+  dependencies:
+    d3-path "^3.1.0"
+    d3-shape "^3.2.0"
+    vega-canvas "^2.0.0"
+    vega-loader "^5.1.0"
+    vega-scale "^8.1.0"
+    vega-util "^2.1.0"
+
+vega-schema-url-parser@^1.1.0:
+  version "1.1.0"
+  resolved "https://registry.npmjs.org/vega-schema-url-parser/-/vega-schema-url-parser-1.1.0.tgz"
+  integrity sha512-Tc85J2ofMZZOsxiqDM9sbvfsa+Vdo3GwNLjEEsPOsCDeYqsUHKAlc1IpbbhPLZ6jusyM9Lk0e1izF64GGklFDg==
+
+vega-schema-url-parser@^2.2.0:
+  version "2.2.0"
+  resolved "https://registry.npmjs.org/vega-schema-url-parser/-/vega-schema-url-parser-2.2.0.tgz"
+  integrity sha512-yAtdBnfYOhECv9YC70H2gEiqfIbVkq09aaE4y/9V/ovEFmH9gPKaEgzIZqgT7PSPQjKhsNkb6jk6XvSoboxOBw==
+
+vega-selections@^6.1.0:
+  version "6.1.2"
+  resolved "https://registry.npmjs.org/vega-selections/-/vega-selections-6.1.2.tgz"
+  integrity sha512-xJ+V4qdd46nk2RBdwIRrQm2iSTMHdlu/omhLz1pqRL3jZDrkqNBXimrisci2kIKpH2WBpA1YVagwuZEKBmF2Qw==
+  dependencies:
+    d3-array "3.2.4"
+    vega-expression "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-statistics@^2.0.0, vega-statistics@~2.0.0:
+  version "2.0.0"
+  resolved "https://registry.npmjs.org/vega-statistics/-/vega-statistics-2.0.0.tgz"
+  integrity sha512-dGPfDXnBlgXbZF3oxtkb8JfeRXd5TYHx25Z/tIoaa9jWua4Vf/AoW2wwh8J1qmMy8J03/29aowkp1yk4DOPazQ==
+  dependencies:
+    d3-array "^3.2.4"
+
+vega-themes@^2.15.0, vega-themes@^2.8.2:
+  version "2.15.0"
+  resolved "https://registry.npmjs.org/vega-themes/-/vega-themes-2.15.0.tgz"
+  integrity sha512-DicRAKG9z+23A+rH/3w3QjJvKnlGhSbbUXGjBvYGseZ1lvj9KQ0BXZ2NS/+MKns59LNpFNHGi9us/wMlci4TOA==
+
+vega-time@^3.1.0, vega-time@~3.1.0:
+  version "3.1.0"
+  resolved "https://registry.npmjs.org/vega-time/-/vega-time-3.1.0.tgz"
+  integrity sha512-G93mWzPwNa6UYQRkr8Ujur9uqxbBDjDT/WpXjbDY0yygdSkRT+zXF+Sb4gjhW0nPaqdiwkn0R6kZcSPMj1bMNA==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-time "^3.1.0"
+    vega-util "^2.1.0"
+
+vega-tooltip@^0.22.0:
+  version "0.22.1"
+  resolved "https://registry.npmjs.org/vega-tooltip/-/vega-tooltip-0.22.1.tgz"
+  integrity sha512-mPmzxwvi6+2ZgbZ/+mNC7XbSu5I6Ckon8zdgUfH9neb+vV7CKlV/FYypMdVN/9iDMFUqGzybYdqNOiSPPIxFEQ==
+  dependencies:
+    vega-util "^1.13.1"
+
+vega-tooltip@^0.35.2:
+  version "0.35.2"
+  resolved "https://registry.npmjs.org/vega-tooltip/-/vega-tooltip-0.35.2.tgz"
+  integrity sha512-kuYcsAAKYn39ye5wKf2fq1BAxVcjoz0alvKp/G+7BWfIb94J0PHmwrJ5+okGefeStZnbXxINZEOKo7INHaj9GA==
+  dependencies:
+    vega-util "^1.17.2"
+  optionalDependencies:
+    "@rollup/rollup-linux-x64-gnu" "^4.24.4"
+
+vega-transforms@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-transforms/-/vega-transforms-5.1.0.tgz"
+  integrity sha512-mj/sO2tSuzzpiXX8JSl4DDlhEmVwM/46MTAzTNQUQzJPMI/n4ChCjr/SdEbfEyzlD4DPm1bjohZGjLc010yuMg==
+  dependencies:
+    d3-array "^3.2.4"
+    vega-dataflow "^6.1.0"
+    vega-statistics "^2.0.0"
+    vega-time "^3.1.0"
+    vega-util "^2.1.0"
+
+vega-typings@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-typings/-/vega-typings-2.1.0.tgz"
+  integrity sha512-zdis4Fg4gv37yEvTTSZEVMNhp8hwyEl7GZ4X4HHddRVRKxWFsbyKvZx/YW5Z9Ox4sjxVA2qHzEbod4Fdx+SEJA==
+  dependencies:
+    "@types/geojson" "7946.0.16"
+    vega-event-selector "^4.0.0"
+    vega-expression "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-util@^1.13.1:
+  version "1.17.4"
+  resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz"
+  integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA==
+
+vega-util@^1.17.2:
+  version "1.17.4"
+  resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz"
+  integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA==
+
+vega-util@^1.17.4:
+  version "1.17.4"
+  resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz"
+  integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA==
+
+vega-util@^2.1.0, vega-util@~2.1.0:
+  version "2.1.0"
+  resolved "https://registry.npmjs.org/vega-util/-/vega-util-2.1.0.tgz"
+  integrity sha512-PGfp0m0QCufDmcxKJCWQy4Ov23FoF8DSXmoJwSezi3itQaa2hbxK0+xwsTMP2vy4PR16Pu25HMzgMwXVW1+33w==
+
+vega-view-transforms@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-view-transforms/-/vega-view-transforms-5.1.0.tgz"
+  integrity sha512-fpigh/xn/32t+An1ShoY3MLeGzNdlbAp2+HvFKzPpmpMTZqJEWkk/J/wHU7Swyc28Ta7W1z3fO+8dZkOYO5TWQ==
+  dependencies:
+    vega-dataflow "^6.1.0"
+    vega-scenegraph "^5.1.0"
+    vega-util "^2.1.0"
+
+vega-view@~6.1.0:
+  version "6.1.0"
+  resolved "https://registry.npmjs.org/vega-view/-/vega-view-6.1.0.tgz"
+  integrity sha512-hmHDm/zC65lb23mb9Tr9Gx0wkxP0TMS31LpMPYxIZpvInxvUn7TYitkOtz1elr63k2YZrgmF7ztdGyQ4iCQ5fQ==
+  dependencies:
+    d3-array "^3.2.4"
+    d3-timer "^3.0.1"
+    vega-dataflow "^6.1.0"
+    vega-format "^2.1.0"
+    vega-functions "^6.1.0"
+    vega-runtime "^7.1.0"
+    vega-scenegraph "^5.1.0"
+    vega-util "^2.1.0"
+
+vega-voronoi@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-voronoi/-/vega-voronoi-5.1.0.tgz"
+  integrity sha512-uKdsoR9x60mz7eYtVG+NhlkdQXeVdMr6jHNAHxs+W+i6kawkUp5S9jp1xf1FmW/uZvtO1eqinHQNwATcDRsiUg==
+  dependencies:
+    d3-delaunay "^6.0.4"
+    vega-dataflow "^6.1.0"
+    vega-util "^2.1.0"
+
+vega-wordcloud@~5.1.0:
+  version "5.1.0"
+  resolved "https://registry.npmjs.org/vega-wordcloud/-/vega-wordcloud-5.1.0.tgz"
+  integrity sha512-sSdNmT8y2D7xXhM2h76dKyaYn3PA4eV49WUUkfYfqHz/vpcu10GSAoFxLhQQTkbZXR+q5ZB63tFUow9W2IFo6g==
+  dependencies:
+    vega-canvas "^2.0.0"
+    vega-dataflow "^6.1.0"
+    vega-scale "^8.1.0"
+    vega-statistics "^2.0.0"
+    vega-util "^2.1.0"
+
+vega@^6.2.0:
+  version "6.2.0"
+  resolved "https://registry.npmjs.org/vega/-/vega-6.2.0.tgz"
+  integrity sha512-BIwalIcEGysJdQDjeVUmMWB3e50jPDNAMfLJscjEvpunU9bSt7X1OYnQxkg3uBwuRRI4nWfFZO9uIW910nLeGw==
+  dependencies:
+    vega-crossfilter "~5.1.0"
+    vega-dataflow "~6.1.0"
+    vega-encode "~5.1.0"
+    vega-event-selector "~4.0.0"
+    vega-expression "~6.1.0"
+    vega-force "~5.1.0"
+    vega-format "~2.1.0"
+    vega-functions "~6.1.0"
+    vega-geo "~5.1.0"
+    vega-hierarchy "~5.1.0"
+    vega-label "~2.1.0"
+    vega-loader "~5.1.0"
+    vega-parser "~7.1.0"
+    vega-projection "~2.1.0"
+    vega-regression "~2.1.0"
+    vega-runtime "~7.1.0"
+    vega-scale "~8.1.0"
+    vega-scenegraph "~5.1.0"
+    vega-statistics "~2.0.0"
+    vega-time "~3.1.0"
+    vega-transforms "~5.1.0"
+    vega-typings "~2.1.0"
+    vega-util "~2.1.0"
+    vega-view "~6.1.0"
+    vega-view-transforms "~5.1.0"
+    vega-voronoi "~5.1.0"
+    vega-wordcloud "~5.1.0"
+
+vite@^5.4.21:
+  version "5.4.21"
+  resolved "https://registry.npmjs.org/vite/-/vite-5.4.21.tgz"
+  integrity sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw==
+  dependencies:
+    esbuild "^0.21.3"
+    postcss "^8.4.43"
+    rollup "^4.20.0"
+  optionalDependencies:
+    fsevents "~2.3.3"
+
+"vite@^6.0.0 || ^7.0.0 || ^8.0.0":
+  version "8.0.2"
+  resolved "https://registry.npmjs.org/vite/-/vite-8.0.2.tgz"
+  integrity sha512-1gFhNi+bHhRE/qKZOJXACm6tX4bA3Isy9KuKF15AgSRuRazNBOJfdDemPBU16/mpMxApDPrWvZ08DcLPEoRnuA==
+  dependencies:
+    lightningcss "^1.32.0"
+    picomatch "^4.0.3"
+    postcss "^8.5.8"
+    rolldown "1.0.0-rc.11"
+    tinyglobby "^0.2.15"
+  optionalDependencies:
+    fsevents "~2.3.3"
+
+vitest@^4.1.0:
+  version "4.1.1"
+  resolved "https://registry.npmjs.org/vitest/-/vitest-4.1.1.tgz"
+  integrity sha512-yF+o4POL41rpAzj5KVILUxm1GCjKnELvaqmU9TLLUbMfDzuN0UpUR9uaDs+mCtjPe+uYPksXDRLQGGPvj1cTmA==
+  dependencies:
+    "@vitest/expect" "4.1.1"
+    "@vitest/mocker" "4.1.1"
+    "@vitest/pretty-format" "4.1.1"
+    "@vitest/runner" "4.1.1"
+    "@vitest/snapshot" "4.1.1"
+    "@vitest/spy" "4.1.1"
+    "@vitest/utils" "4.1.1"
+    es-module-lexer "^2.0.0"
+    expect-type "^1.3.0"
+    magic-string "^0.30.21"
+    obug "^2.1.1"
+    pathe "^2.0.3"
+    picomatch "^4.0.3"
+    std-env "^4.0.0-rc.1"
+    tinybench "^2.9.0"
+    tinyexec "^1.0.2"
+    tinyglobby "^0.2.15"
+    tinyrainbow "^3.0.3"
+    vite "^6.0.0 || ^7.0.0 || ^8.0.0"
+    why-is-node-running "^2.3.0"
+
+vm-browserify@^1.1.2:
+  version "1.1.2"
+  resolved "https://registry.npmjs.org/vm-browserify/-/vm-browserify-1.1.2.tgz"
+  integrity sha512-2ham8XPWTONajOR0ohOKOHXkm3+gaBmGut3SRuu75xLd/RRaY6vqgh8NBYYk7+RW3u5AtzPQZG8F10LHkl0lAQ==
+
+void-elements@3.1.0:
+  version "3.1.0"
+  resolved "https://registry.npmjs.org/void-elements/-/void-elements-3.1.0.tgz"
+  integrity sha512-Dhxzh5HZuiHQhbvTW9AMetFfBHDMYpo23Uo9btPXgdYP+3T5S+p+jgNy7spra+veYhBP2dCSgxR/i2Y02h5/6w==
+
+w3c-keyname@^2.2.0:
+  version "2.2.8"
+  resolved "https://registry.npmjs.org/w3c-keyname/-/w3c-keyname-2.2.8.tgz"
+  integrity sha512-dpojBhNsCNN7T82Tm7k26A6G9ML3NkhDsnw9n/eoxSRlVBB4CEtIQ/KTCLI2Fwf3ataSXRhYFkQi3SlnFwPvPQ==
+
+w3c-xmlserializer@^5.0.0:
+  version "5.0.0"
+  resolved "https://registry.npmjs.org/w3c-xmlserializer/-/w3c-xmlserializer-5.0.0.tgz"
+  integrity sha512-o8qghlI8NZHU1lLPrpi2+Uq7abh4GGPpYANlalzWxyWteJOCsr/P+oPBA49TOLu5FTZO4d3F9MnWJfiMo4BkmA==
+  dependencies:
+    xml-name-validator "^5.0.0"
+
+webidl-conversions@^8.0.1:
+  version "8.0.1"
+  resolved "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-8.0.1.tgz"
+  integrity sha512-BMhLD/Sw+GbJC21C/UgyaZX41nPt8bUTg+jWyDeg7e7YN4xOM05YPSIXceACnXVtqyEw/LMClUQMtMZ+PGGpqQ==
+
+whatwg-mimetype@^5.0.0:
+  version "5.0.0"
+  resolved "https://registry.npmjs.org/whatwg-mimetype/-/whatwg-mimetype-5.0.0.tgz"
+  integrity sha512-sXcNcHOC51uPGF0P/D4NVtrkjSU2fNsm9iog4ZvZJsL3rjoDAzXZhkm2MWt1y+PUdggKAYVoMAIYcs78wJ51Cw==
+
+whatwg-url@^16.0.0, whatwg-url@^16.0.1:
+  version "16.0.1"
+  resolved "https://registry.npmjs.org/whatwg-url/-/whatwg-url-16.0.1.tgz"
+  integrity sha512-1to4zXBxmXHV3IiSSEInrreIlu02vUOvrhxJJH5vcxYTBDAx51cqZiKdyTxlecdKNSjj8EcxGBxNf6Vg+945gw==
+  dependencies:
+    "@exodus/bytes" "^1.11.0"
+    tr46 "^6.0.0"
+    webidl-conversions "^8.0.1"
+
+which-boxed-primitive@^1.1.0, which-boxed-primitive@^1.1.1:
+  version "1.1.1"
+  resolved "https://registry.npmjs.org/which-boxed-primitive/-/which-boxed-primitive-1.1.1.tgz"
+  integrity sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA==
+  dependencies:
+    is-bigint "^1.1.0"
+    is-boolean-object "^1.2.1"
+    is-number-object "^1.1.1"
+    is-string "^1.1.1"
+    is-symbol "^1.1.1"
+
+which-builtin-type@^1.2.1:
+  version "1.2.1"
+  resolved "https://registry.npmjs.org/which-builtin-type/-/which-builtin-type-1.2.1.tgz"
+  integrity sha512-6iBczoX+kDQ7a3+YJBnh3T+KZRxM/iYNPXicqk66/Qfm1b93iu+yOImkg0zHbj5LNOcNv1TEADiZ0xa34B4q6Q==
+  dependencies:
+    call-bound "^1.0.2"
+    function.prototype.name "^1.1.6"
+    has-tostringtag "^1.0.2"
+    is-async-function "^2.0.0"
+    is-date-object "^1.1.0"
+    is-finalizationregistry "^1.1.0"
+    is-generator-function "^1.0.10"
+    is-regex "^1.2.1"
+    is-weakref "^1.0.2"
+    isarray "^2.0.5"
+    which-boxed-primitive "^1.1.0"
+    which-collection "^1.0.2"
+    which-typed-array "^1.1.16"
+
+which-collection@^1.0.2:
+  version "1.0.2"
+  resolved "https://registry.npmjs.org/which-collection/-/which-collection-1.0.2.tgz"
+  integrity sha512-K4jVyjnBdgvc86Y6BkaLZEN933SwYOuBFkdmBu9ZfkcAbdVbpITnDmjvZ/aQjRXQrv5EPkTnD1s39GiiqbngCw==
+  dependencies:
+    is-map "^2.0.3"
+    is-set "^2.0.3"
+    is-weakmap "^2.0.2"
+    is-weakset "^2.0.3"
+
+which-typed-array@^1.1.16, which-typed-array@^1.1.19:
+  version "1.1.20"
+  resolved "https://registry.npmjs.org/which-typed-array/-/which-typed-array-1.1.20.tgz"
+  integrity sha512-LYfpUkmqwl0h9A2HL09Mms427Q1RZWuOHsukfVcKRq9q95iQxdw0ix1JQrqbcDR9PH1QDwf5Qo8OZb5lksZ8Xg==
+  dependencies:
+    available-typed-arrays "^1.0.7"
+    call-bind "^1.0.8"
+    call-bound "^1.0.4"
+    for-each "^0.3.5"
+    get-proto "^1.0.1"
+    gopd "^1.2.0"
+    has-tostringtag "^1.0.2"
+
+which@^2.0.1:
+  version "2.0.2"
+  resolved "https://registry.npmjs.org/which/-/which-2.0.2.tgz"
+  integrity sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==
+  dependencies:
+    isexe "^2.0.0"
+
+why-is-node-running@^2.3.0:
+  version "2.3.0"
+  resolved "https://registry.npmjs.org/why-is-node-running/-/why-is-node-running-2.3.0.tgz"
+  integrity sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w==
+  dependencies:
+    siginfo "^2.0.0"
+    stackback "0.0.2"
+
+word-wrap@^1.2.5:
+  version "1.2.5"
+  resolved "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz"
+  integrity sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==
+
+wrap-ansi@^9.0.0:
+  version "9.0.2"
+  resolved "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-9.0.2.tgz"
+  integrity sha512-42AtmgqjV+X1VpdOfyTGOYRi0/zsoLqtXQckTmqTeybT+BDIbM/Guxo7x3pE2vtpr1ok6xRqM9OpBe+Jyoqyww==
+  dependencies:
+    ansi-styles "^6.2.1"
+    string-width "^7.0.0"
+    strip-ansi "^7.1.0"
+
+wrappy@1:
+  version "1.0.2"
+  resolved "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz"
+  integrity sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==
+
+xml-name-validator@^5.0.0:
+  version "5.0.0"
+  resolved "https://registry.npmjs.org/xml-name-validator/-/xml-name-validator-5.0.0.tgz"
+  integrity sha512-EvGK8EJ3DhaHfbRlETOWAS5pO9MZITeauHKJyb8wyajUfQUenkIg2MvLDTZ4T/TgIcm3HU0TFBgWWboAZ30UHg==
+
+xmlchars@^2.2.0:
+  version "2.2.0"
+  resolved "https://registry.npmjs.org/xmlchars/-/xmlchars-2.2.0.tgz"
+  integrity sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==
+
+y18n@^5.0.5:
+  version "5.0.8"
+  resolved "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz"
+  integrity sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==
+
+yaml@^1.10.0:
+  version "1.10.3"
+  resolved "https://registry.npmjs.org/yaml/-/yaml-1.10.3.tgz"
+  integrity sha512-vIYeF1u3CjlhAFekPPAk2h/Kv4T3mAkMox5OymRiJQB0spDP10LHvt+K7G9Ny6NuuMAb25/6n1qyUjAcGNf/AA==
+
+yargs-parser@^22.0.0:
+  version "22.0.0"
+  resolved "https://registry.npmjs.org/yargs-parser/-/yargs-parser-22.0.0.tgz"
+  integrity sha512-rwu/ClNdSMpkSrUb+d6BRsSkLUq1fmfsY6TOpYzTwvwkg1/NRG85KBy3kq++A8LKQwX6lsu+aWad+2khvuXrqw==
+
+yargs@~18.0.0:
+  version "18.0.0"
+  resolved "https://registry.npmjs.org/yargs/-/yargs-18.0.0.tgz"
+  integrity sha512-4UEqdc2RYGHZc7Doyqkrqiln3p9X2DZVxaGbwhn2pi7MrRagKaOcIKe8L3OxYcbhXLgLFUS3zAYuQjKBQgmuNg==
+  dependencies:
+    cliui "^9.0.1"
+    escalade "^3.1.1"
+    get-caller-file "^2.0.5"
+    string-width "^7.2.0"
+    y18n "^5.0.5"
+    yargs-parser "^22.0.0"
+
+yocto-queue@^0.1.0:
+  version "0.1.0"
+  resolved "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz"
+  integrity sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==
+
+zip-stream@^4.1.0:
+  version "4.1.1"
+  resolved "https://registry.npmjs.org/zip-stream/-/zip-stream-4.1.1.tgz"
+  integrity sha512-9qv4rlDiopXg4E69k+vMHjNN63YFMe9sZMrdlvKnCjlCRWeCBswPPMPUfx+ipsAWq1LXHe70RcbaHdJJpS6hyQ==
+  dependencies:
+    archiver-utils "^3.0.4"
+    compress-commons "^4.1.2"
+    readable-stream "^3.6.0"
+
+zrender@6.0.0:
+  version "6.0.0"
+  resolved "https://registry.npmjs.org/zrender/-/zrender-6.0.0.tgz"
+  integrity sha512-41dFXEEXuJpNecuUQq6JlbybmnHaqqpGlbH1yxnA5V9MMP4SbohSVZsJIwz+zdjQXSSlR1Vc34EgH1zxyTDvhg==
+  dependencies:
+    tslib "2.3.0"

From a184067134693457c8eb39071d0b405bd932ca05 Mon Sep 17 00:00:00 2001
From: Chenglong Wang
Date: Wed, 15 Apr 2026 21:26:19 -0700
Subject: [PATCH 4/6] updated data connector design

---
 .../9-generalized-data-source-plugins.md      |   37 +-
 design-docs/9.2-table-group-bundle-loading.md |  244 +++
 design-docs/9.3-promoted-data-source-cards.md |  356 +++++
 package.json                                  |    1 +
 py-src/data_formulator/agent_routes.py        |   80 +-
 py-src/data_formulator/agents/__init__.py     |    1 +
 py-src/data_formulator/agents/agent_simple.py |  156 ++
 py-src/data_formulator/data_connector.py      | 1133 ++++++++-----
 .../data_loader/athena_data_loader.py         |    1 +
 .../data_loader/azure_blob_data_loader.py     |    1 +
 .../data_loader/bigquery_data_loader.py       |    2 +
 .../data_loader/external_data_loader.py       |  257 ++-
 .../data_loader/kusto_data_loader.py          |    1 +
 .../data_loader/mongodb_data_loader.py        |    1 +
 .../data_loader/mssql_data_loader.py          |    3 +-
 .../data_loader/mysql_data_loader.py          |   33 +-
 .../data_loader/postgresql_data_loader.py     |   35 +-
 .../data_loader/s3_data_loader.py             |    1 +
 .../data_loader/superset_data_loader.py       |  409 ++++-
 py-src/data_formulator/datalake/__init__.py   |    2 +
 py-src/data_formulator/datalake/workspace.py  |   37 +-
 py-src/data_formulator/workspace_factory.py   |    5 +-
 src/app/App.tsx                               |   17 +-
 src/app/dfSlice.tsx                           |    9 +-
 src/app/tableThunks.ts                        |   11 +-
 src/app/useDataRefresh.tsx                    |    6 +-
 src/app/utils.tsx                             |   39 +-
 src/i18n/locales/en/common.json               |    6 +-
 src/i18n/locales/en/upload.json               |    9 +
 src/i18n/locales/zh/common.json               |    6 +-
 src/i18n/locales/zh/upload.json               |    9 +
src/views/DBTableManager.tsx | 1404 +++++++++++------ src/views/DataFormulator.tsx | 14 +- src/views/DataThread.tsx | 12 +- src/views/SimpleChartRecBox.tsx | 7 + src/views/UnifiedDataUploadDialog.tsx | 394 ++++- .../unit/test_all_loader_verification.py | 34 +- .../unit/test_data_connector_config.py | 479 +++--- .../unit/test_data_connector_framework.py | 190 ++- .../backend/unit/test_data_connector_vault.py | 77 +- tests/superset/README.md | 28 + tests/superset/sample_data.py | 168 +- tests/superset/superset_config.py | 7 + uv.lock | 236 ++- yarn.lock | 753 +++++++-- 45 files changed, 5050 insertions(+), 1661 deletions(-) create mode 100644 design-docs/9.2-table-group-bundle-loading.md create mode 100644 design-docs/9.3-promoted-data-source-cards.md create mode 100644 py-src/data_formulator/agents/agent_simple.py diff --git a/design-docs/9-generalized-data-source-plugins.md b/design-docs/9-generalized-data-source-plugins.md index 00de048c..861e08e0 100644 --- a/design-docs/9-generalized-data-source-plugins.md +++ b/design-docs/9-generalized-data-source-plugins.md @@ -1,6 +1,6 @@ # Generalized Data Source Plugins — Unifying DataLoader + Plugin into a Lifecycle-Managed Connection -## Status: Phase 3 complete (legacy data-loader endpoints removed) +## Status: Phase 3 complete (legacy data-loader endpoints + plugin system removed) ## 1. 
Problem @@ -1539,8 +1539,9 @@ For sources that can't filter server-side (e.g., some REST APIs), the framework - 16 verification tests confirm catalog_hierarchy, effective_hierarchy, scope pinning, auth_mode, list_params, blueprint generation for all 10 loaders - Also found and fixed operator-precedence bug in `_build_source_specs` YAML ID assignment -### Phase 3: Cleanup + Unified Panel ✅ (partial) +### Phase 3: Cleanup + Unified Panel ✅ +#### 3a: Legacy route removal ✅ - ✅ Removed 8 legacy `/api/tables/data-loader/*` backend routes from `tables_routes.py` - ✅ Removed 9 `DATA_LOADER_*` URL constants from frontend `utils.tsx` - ✅ `DBTableManager` now uses only `serverConfig.SOURCES` (DataConnector) for data source discovery @@ -1550,9 +1551,39 @@ For sources that can't filter server-side (e.g., some REST APIs), the framework - ✅ Added `connectorId` to `DataSourceConfig` so tables remember their source - ✅ Added `DISABLED_SOURCES` to app-config for greyed-out UI entries - ✅ Enhanced `data/preview` route to support full `import_options` (sort, limit) -- [ ] Remove `DataSourcePlugin` base class, `plugins/` directory, and per-plugin `__init__.py` files + +#### 3b: Connection model (doc 9.1) ✅ +- ✅ Vault-based credential persistence wired into `DataConnector` (`_vault_store`, `_vault_retrieve`, `_vault_delete`) +- ✅ Auto-reconnect from vault on server restart (lazy, on first `/get-status` or catalog/data call) +- ✅ Disconnect preserves vault credentials (fast reconnect), Delete clears them +- ✅ Multi-user isolation via `(user_identity, connector_id)` composite key +- ✅ Centralized `credentials.db` at `DATA_FORMULATOR_HOME/` with Fernet encryption + +#### 3c: Promoted data source cards (doc 9.3) ✅ +- ✅ Single shared `connectors_bp` blueprint — all action routes take `connector_id` in JSON body +- ✅ `GET /api/data-loaders`, `GET /api/connectors`, `POST /api/connectors`, `DELETE /api/connectors/{id}` +- ✅ Action routes: `/connect`, `/disconnect`, `/get-status`, 
`/get-catalog`, `/get-catalog-tree`, `/preview-data`, `/import-data`, `/import-group`, `/refresh-data` +- ✅ Connected sources promoted as top-level cards on Load Data menu +- ✅ "Add Connection" card with type picker + param form +- ✅ Removed legacy "Database" tab from UI + +#### 3d: Remove legacy plugin system ✅ +- ✅ Relocated `SupersetClient` + `SupersetAuthBridge` from `plugins/superset/` to `data_loader/superset/` (used by `SupersetLoader`) +- ✅ Deleted `py-src/data_formulator/plugins/` directory (base classes, discovery engine, Superset plugin, all routes) +- ✅ Deleted `src/plugins/` directory (frontend plugin host, registry, Superset UI components) +- ✅ Removed plugin registration from `app.py` (`discover_and_register`, `ENABLED_PLUGINS`) +- ✅ Removed frontend plugin imports (`getEnabledPlugins`, `PluginHost`, `registerPluginTranslations`) +- ✅ Deleted legacy plugin tests - [ ] Integrate with unified data source panel ([doc #8](8-unified-data-source-panel.md)) +### Sub-doc Summary (9.1–9.3) + +| Doc | Title | Status | Key Deliverables | +|-----|-------|--------|------------------| +| [9.1](9.1-data-source-connection-model.md) | Connection Model | Complete | Vault credential persistence, auto-reconnect, multi-user isolation, centralized `credentials.db` | +| [9.2](9.2-table-group-bundle-loading.md) | TableGroup Bundle Loading | Draft (design only) | `table_group` node type for BI dashboards, source filters, group load API — not yet implemented | +| [9.3](9.3-promoted-data-source-cards.md) | Promoted Data Source Cards | Complete | Single shared blueprint API, promoted cards UI, "Add Connection" flow, legacy "Database" tab removed | + ### Phase 4: Advanced Features 13. 
Scheduled refresh (periodic re-fetch) diff --git a/design-docs/9.2-table-group-bundle-loading.md b/design-docs/9.2-table-group-bundle-loading.md new file mode 100644 index 00000000..99c2b114 --- /dev/null +++ b/design-docs/9.2-table-group-bundle-loading.md @@ -0,0 +1,244 @@ +# TableGroup — Bundle Loading for BI Dashboards + +## Status: Draft + +Parent: [9-generalized-data-source-plugins.md](9-generalized-data-source-plugins.md) + +## 1. Problem + +BI dashboards (Superset, Power BI, Metabase, Tableau) organize multiple datasets under a single dashboard with **shared filter context**. Today, Data Formulator treats dashboards as simple folders: the user must manually browse into each dataset and load them one at a time. This loses the relationship between datasets and discards admin-defined filters. + +## 2. Concept: `table_group` + +A **table_group** is a loadable bundle — a catalog node that contains multiple tables and optional shared filters defined by the BI tool's admin. + +| Platform | Group Unit | Tables | Shared Filters | +|----------|-----------|---------------|----------------| +| Superset | Dashboard | Datasets | Native filters | +| Power BI | Report | Dataset tables | Slicers / parameters | +| Metabase | Dashboard | Questions (SQL) | Dashboard parameters | +| Tableau | Workbook | Worksheets / data sources | Workbook filters | +| Database | *(none)* | Tables are standalone | N/A | + +For databases, everything stays as-is — no grouping. The `table_group` concept only appears when the source provides it. + +## 3. 
Node Type Extension

Current: `node_type: 'namespace' | 'table'`

Extended: `node_type: 'namespace' | 'table' | 'table_group'`

### 3.1 Backend — CatalogNode

```python
CatalogNode(
    name="Sales Dashboard",
    node_type="table_group",
    path=["workspace", "Sales Dashboard"],
    metadata={
        "tables": [
            {"name": "orders", "dataset_id": 42, "row_count": 100000,
             "columns": ["order_id", "region", "order_date", "amount", ...]},
            {"name": "customers", "dataset_id": 99, "row_count": 5000,
             "columns": ["customer_id", "name", "region", "segment", ...]},
            {"name": "products", "dataset_id": 101, "row_count": 200,
             "columns": ["product_id", "category", "price", ...]},
        ],
        "source_filters": [
            {
                "name": "Region",
                "column": "region",
                "input_type": "select",    # select | numeric | time | text
                "column_type": "STRING",   # STRING | NUMERIC | TEMPORAL
                "multi": True,
                "required": False,
                "default_value": ["APAC"],
                "applies_to": [42, 99],    # dataset IDs this filter targets
            },
            {
                "name": "Year",
                "column": "order_year",
                "input_type": "select",
                "column_type": "NUMERIC",
                "multi": False,
                "required": False,
                "default_value": None,
                "applies_to": [42],
            },
        ],
    },
    children=[]  # no children in tree; tables listed in metadata
)
```

A `table_group` node is a **leaf** in the catalog tree — not expandable. Member tables are shown in the right panel's group load UI.
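As a sketch of how a BI loader's `ls()` might emit these nodes, the snippet below turns a dashboard listing into leaf `table_group` nodes. The `dashboards` input shape (`title`, `tables`, `filters` keys) is an illustrative stand-in for whatever the source API actually returns, and this simplified `CatalogNode` only mirrors the structure shown above:

```python
# Hypothetical sketch: one leaf table_group node per dashboard.
# The dashboards payload shape is assumed, not a real Superset response.
from dataclasses import dataclass, field

@dataclass
class CatalogNode:
    name: str
    node_type: str              # 'namespace' | 'table' | 'table_group'
    path: list
    metadata: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

def dashboards_to_nodes(workspace, dashboards):
    """Emit a leaf node per dashboard; member tables live in metadata, not children."""
    return [
        CatalogNode(
            name=d["title"],
            node_type="table_group",
            path=[workspace, d["title"]],
            metadata={
                "tables": d["tables"],
                "source_filters": d.get("filters", []),
            },
        )
        for d in dashboards
    ]
```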

### 3.2 Frontend — CatalogTreeNode

```typescript
interface CatalogTreeNode {
  name: string;
  node_type: 'namespace' | 'table' | 'table_group';
  path: string[];
  metadata: Record<string, unknown> | null;
  children?: CatalogTreeNode[];
}
```

### 3.3 Source Filter Definition

```typescript
interface SourceFilter {
  name: string;              // Display label
  column: string;            // Physical column name
  input_type: 'select' | 'numeric' | 'time' | 'text';
  column_type: 'STRING' | 'NUMERIC' | 'TEMPORAL' | 'BOOLEAN';
  multi: boolean;
  required: boolean;
  default_value?: unknown;
  applies_to?: number[];     // dataset IDs; omit = applies to all
  options?: string[];        // Pre-fetched for small cardinality; omit = lazy-load
}
```

## 4. Tree Display

`table_group` nodes use a **distinct icon** — a dashboard/grid icon rather than a folder icon — to visually distinguish them from namespace folders.

```
📁 My Workspace                     ← namespace (folder icon)
  📊 Sales Dashboard (3 tables)     ← table_group (dashboard icon, leaf)
  📊 Marketing Dashboard (1 table)  ← table_group (leaf)
  📁 SQL Lab                        ← namespace
    📋 ad_hoc_query_1               ← table
```

`table_group` nodes are **not expandable** — tables are listed in the right panel's group load UI instead.

## 5. Load Flow

### 5.1 Load All (Group-Level)

When the user selects a `table_group` node and clicks **Load Dashboard**:

1. All member datasets are fetched in parallel (each respecting the per-table row limit and sort settings).
2. Shared `source_filters` are applied as WHERE clauses to each dataset whose ID appears in the filter's `applies_to` list.
3. All tables appear in the Data Formulator workspace, tagged with their group origin.
4. If the user wants to remove some tables afterward, they can do so from the workspace.

### 5.2 Re-Filter

If the user changes filter values and loads again, it's a fresh load. No incremental update — just re-query with new filter values.

## 6.
Right Panel — Group Load UI + +When a `table_group` is selected: + +``` +┌─────────────────────────────────────────┐ +│ 📊 Sales Dashboard │ +│ │ +│ ── Tables ──────────────────────────── │ +│ 📋 orders 100,000 rows │ +│ ▸ 12 columns │ +│ 📋 customers 5,000 rows │ +│ ▸ 8 columns │ +│ 📋 products 200 rows │ +│ ▸ 5 columns │ +│ │ +│ (click ▸ to expand column list) │ +│ │ +│ 📋 orders 100,000 rows │ +│ ▾ 12 columns │ +│ order_id, region, order_date, │ +│ amount, customer_id, product_id, │ +│ status, ship_date, ... │ +│ │ +│ ── Source Filters ──────────────────── │ +│ Region [APAC ▾] multi │ +│ Year [ ▾] │ +│ │ +│ ── Load Settings ───────────────────── │ +│ Row limit per table [All ▾] │ +│ │ +│ [Load Dashboard] │ +└─────────────────────────────────────────┘ +``` + +When a child `table` under the group is selected, the standard per-table load panel appears (row limit, sort, etc.). + +## 7. Backend Data Flow + +### 7.1 `ls()` — Attach Filters to Group Metadata + +During `ls()` for a dashboard-level path, the loader: + +1. Fetches the dashboard detail (contains `native_filter_configuration` or equivalent). +2. Parses filter definitions into the `source_filters` format. +3. Attaches them to the `table_group` node's `metadata`. +4. Filter option values are included in `metadata.source_filters[].options` (admin-defined filters are typically low cardinality). + +### 7.2 `fetch_data_as_arrow()` — Apply Source Filters + +`import_options` gains a `source_filters` key: + +```python +import_options = { + "size": 50000, + "sort_by": "order_date", + "sort_order": "DESC", + "source_filters": [ + {"column": "region", "operator": "IN", "value": ["APAC", "EMEA"]}, + {"column": "order_year", "operator": "EQ", "value": 2025}, + ] +} +``` + +`fetch_data_as_arrow()` builds WHERE clauses from `source_filters` and appends them to the base SQL. 
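As a sketch, the `source_filters` → WHERE translation could look like the hypothetical `build_where_clause` helper below. String interpolation is shown only for brevity — a real loader should escape or parameterize values rather than splice them into SQL:

```python
# Hypothetical helper: render import_options["source_filters"] as the body of a
# SQL WHERE clause. Illustrative only — not injection-safe; real code should
# use parameterized queries.
OPERATORS = {"EQ": "=", "NEQ": "!=", "GT": ">", "GTE": ">=", "LT": "<", "LTE": "<="}

def _literal(value):
    """Render a Python value as a SQL literal (strings single-quoted)."""
    return f"'{value}'" if isinstance(value, str) else str(value)

def build_where_clause(source_filters):
    clauses = []
    for f in source_filters:
        if f["operator"] == "IN":
            values = ", ".join(_literal(v) for v in f["value"])
            clauses.append(f'"{f["column"]}" IN ({values})')
        else:
            clauses.append(f'"{f["column"]}" {OPERATORS[f["operator"]]} {_literal(f["value"])}')
    return " AND ".join(clauses)
```

With the example `import_options` above, this yields `"region" IN ('APAC', 'EMEA') AND "order_year" = 2025`, ready to append to the base SQL.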
+ +### 7.3 Group Load API + +Load all tables with shared filters: + +``` +POST /api/connectors/{loader}/load-group +{ + "group_path": ["workspace", "Sales Dashboard"], + "row_limit": -1, + "source_filters": [ + {"column": "region", "operator": "IN", "value": ["APAC"]}, + {"column": "order_year", "operator": "EQ", "value": 2025} + ] +} +``` + +Response: streams Arrow IPC batches for each table, or returns them sequentially. + +## 8. Frontend State — Tables in Workspace + +Loaded tables from a group are stored as separate `DFTable` entries — no group tracking needed. Each table is independently usable in Data Formulator's analysis pipeline. The table name carries the dashboard context: + +```typescript +{ + id: "conn_superset_orders_1713200000", + tableName: "Sales Dashboard / orders", + ... +} +``` + +If the user wants to re-load with different filters, they go back to the `table_group` node in the catalog tree and load again. + +## 9. Implementation Phases + +### Phase A: Backend — `table_group` in CatalogNode + filter extraction +- Add `table_group` to `CatalogNode.node_type` +- SupersetLoader: extract native filters during `ls()`, attach to `metadata.source_filters` +- SupersetLoader: honor `source_filters` in `fetch_data_as_arrow()` + +### Phase B: Frontend — Tree + Group Load Panel +- Extend `CatalogTreeNode` type with `table_group` +- Dashboard icon for `table_group` nodes +- Group load panel: shows tables, source filter controls, "Load Dashboard" button +- Wire load action to call backend for each table with filters + + diff --git a/design-docs/9.3-promoted-data-source-cards.md b/design-docs/9.3-promoted-data-source-cards.md new file mode 100644 index 00000000..bf1c7f9f --- /dev/null +++ b/design-docs/9.3-promoted-data-source-cards.md @@ -0,0 +1,356 @@ +# Promoted Data Source Cards + +## Status: Complete (Phase A + B done, Phase C merged into doc 9 Phase 3 cleanup) + +Parent: [9-generalized-data-source-plugins.md](9-generalized-data-source-plugins.md) + +### 
Implementation Summary + +**Phase A — Backend API redesign (complete):** +- [x] Single shared `connectors_bp` blueprint — all action routes accept `connector_id` in JSON body +- [x] `GET /api/data-loaders` — lists available loader types with param definitions +- [x] `GET /api/connectors` — lists registered instances with connection status +- [x] `POST /api/connectors` — creates user connector instance, auto-connects +- [x] `DELETE /api/connectors/{id}` — tears down instance, clears vault +- [x] Action routes: `/connect`, `/disconnect`, `/get-status`, `/get-catalog`, `/get-catalog-tree`, `/preview-data`, `/import-data`, `/import-group`, `/refresh-data` +- [x] No per-instance Flask blueprints — eliminated `create_blueprint()` and dynamic registration +- [x] Side-effect-free `/get-status`; auto-reconnect moved to `/connect` +- [x] Admin-provisioned connectors from `connectors.yaml` + `DF_SOURCES__*` env vars + +**Phase B — Frontend promoted cards (complete):** +- [x] Connected sources promoted as top-level cards on Load Data menu +- [x] "Add Connection" card with left/right layout: pick type → fill params → Add & Connect +- [x] Each card click → `DataLoaderForm` (browse-only when connected) +- [x] Removed legacy "Database" tab from UI +- [x] `DBTableManager` uses only `serverConfig.SOURCES` (DataConnector) for source discovery + +**Phase C — Cleanup (deferred to doc 9 Phase 3):** +- [ ] Unify Superset plugin into `/api/connectors/` flow (done via SupersetLoader) +- [ ] Disconnect / Delete actions on each card + +> **Note:** `dataLoaderConnectParams` stays in Redux — it manages transient form field state (partially filled connection forms). Registered connection metadata lives server-side via the connectors API. + +## 1. Problem + +Today, all external data sources (MySQL, PostgreSQL, Superset, …) are crammed behind a single "Database" card on the Load Data menu. 
The user clicks "Database" → picks a source from a list → fills connection params → connects → browses tables. This is: + +- **Deep**: 4 levels of nesting before the user sees any data. +- **Mixed concerns**: The connection form and the data browser share the same panel, wasting space on param fields the user doesn't need after connecting. +- **Inconsistent**: Superset (plugin-based SSO) already gets its own top-level card, while generic connectors are hidden inside the "Database" section. + +## 2. Proposal + +### 2.1 Connected sources become top-level cards + +Once a data source is registered / connected, it appears as its own card on the Load Data menu page — at the same level as "Upload File", "Load from URL", "Sample Datasets", etc. + +**Before:** +``` +Local data + [Sample Datasets] [Upload File] + [Paste Data] [Extract Unstructured] + +Connect to a data source + [Load from URL] [Database] ← everything hidden in here + [Apache Superset] +``` + +**After:** +``` +Local data + [Sample Datasets] [Upload File] + [Paste Data] [Extract Unstructured] + +Data sources + [Load from URL] + [MySQL · mydb] ← promoted, one card per connection + [PostgreSQL · analytics] + [Superset · prod] + [+ Add Connection] ← register a new source +``` + +### 2.2 "Add Connection" card + +A card on the menu page (styled similarly to "add new session" on the front page) that opens the connection registration flow: + +1. User picks a source type (MySQL, PostgreSQL, Superset, …). +2. User fills connection params (host, port, credentials, etc.). +3. On success, a new card appears on the menu page. + +This replaces the current "Database" tab's multi-step flow. + +### 2.3 DataConnectorPane — browse-only + +Clicking a connected source card opens a **DataConnectorPane**: just tree + preview + load controls. No connection params, no source picker — those were handled at registration time. 

```
┌─────────────────────────────────────────────────────┐
│ MySQL · mydb                     [⚙] [Disconnect]   │
├──────────┬──────────────────────────────────────────┤
│ 🔍 filter │ orders                                   │
│ ──────── │ 1,234 rows × 8 columns                   │
│ 📁 public │ ┌──────────────────────────────────┐     │
│  📄 users │ │ id │ name │ email │ created_at   │     │
│  📄 orders│ │ 1  │ ...  │ ...   │ ...          │     │
│  📄 items │ └──────────────────────────────────┘     │
│ 📁 staging│                                          │
│          │ Row limit: [2,000,000 ▾]  [Load Table]   │
└──────────┴──────────────────────────────────────────┘
```

### 2.4 Disconnect vs Delete

| Action | Effect |
|------------|---------------------------------------------------------------|
| Disconnect | Drops the active session / token. Card stays on the menu (grayed or with "reconnect" badge). Clicking it triggers re-auth. |
| Delete | Removes the card entirely. Clears saved credentials (vault entries, tokens, cookies). |

Disconnect is the default quick action (e.g., session expired, user switches accounts). Delete is a deliberate destructive action (confirmation required).

## 3. Card Data Model

Each registered connection is persisted as a **DataSourceEntry**:

```typescript
interface DataSourceEntry {
  /** Unique ID for this connection instance */
  id: string;
  /** Connector type key (e.g. "mysql", "superset") */
  source_type: string;
  /** User-facing label, e.g. "MySQL · mydb" */
  display_name: string;
  /** Connection parameters (host, port, database, etc.) */
  params: Record<string, unknown>;
  /** Whether this connection is currently authenticated / active */
  connected: boolean;
  /** Timestamp of last successful connection */
  last_connected?: number;
}
```

This replaces the current flat `dataLoaderConnectParams` map (which stores params by loader type, limiting support to one connection per type).

## 4. Key Design Decisions

### 4.1 Multiple connections of the same type

A user may have two MySQL connections (e.g. 
"MySQL · prod" and "MySQL · staging"). Each gets its own card. The `DataSourceEntry.id` distinguishes them, not the `source_type`. + +### 4.2 Where entries are stored + +- **Local mode**: In Redux state, persisted to localStorage (same as current `dataLoaderConnectParams`). +- **Azure/ephemeral mode**: In workspace session on the server side, with credentials in vault. + +### 4.3 Legacy "Database" tab + +Removed. All its functionality is absorbed by: +- "Add Connection" card → connection registration +- Per-source cards → browsing / loading + +### 4.4 Re-auth flow + +When a connection's `connected` is false (session expired, token revoked): +- The card appears with a visual indicator (dimmed icon, "reconnect" label). +- Clicking it opens a lightweight re-auth prompt (just the credential fields, not the full registration form), since host/port/database are already known. + +## 5. API Redesign + +### 5.1 Current Problems + +| # | Issue | Detail | +|---|-------|--------| +| 1 | **Ghost endpoints** | `register_data_connectors()` pre-creates Flask blueprints for every discovered loader at startup. `/api/connectors/mysql/...` exists even if no MySQL connection was ever created. | +| 2 | ~~POST for reads~~ | Kept POST — params are JSON bodies (paths, filters, options). Common industry pattern (Elasticsearch, GraphQL). Not worth the migration cost for marginal REST-purity benefit. | +| 3 | **snake_case URLs** | `/catalog/list_tables` (snake) vs `/data/load-group` (kebab) vs `/auth/token-connect` (kebab). Inconsistent. | +| 4 | **Status has side effects** | `/auth/status` calls `_try_auto_reconnect()` — creates loaders, hits vault. Should be side-effect-free. | +| 5 | **Dual namespaces** | Connectors: `/api/connectors/{id}/`. Plugins: `/api/plugins/{id}/`. Same concept, different URL trees. | +| 6 | **Single-instance** | `DATA_CONNECTORS` is keyed by loader type. Can't have two MySQL connections. 
| +| 7 | **Auth ↔ catalog coupling** | `/auth/connect` response includes `hierarchy`, `effective_hierarchy`, `pinned_scope` — catalog data bundled into auth. | +| 8 | **Inconsistent param names** | `filter` vs `table_filter` vs `source_table` vs `table_name` vs `size` vs `row_limit`. | + +### 5.2 New API Surface + +Principle: **one shared blueprint, no per-instance routes**. All action routes accept `connector_id` in the JSON body. This avoids Flask's limitation that blueprints can't be registered after the first request, and eliminates ghost endpoints entirely. + +#### Discovery + CRUD (always available) + +``` +GET /api/data-loaders → list available loader types + param definitions +GET /api/connectors → list registered connector instances + status +POST /api/connectors → create instance (type + display_name + params → auto-connect) +DELETE /api/connectors/{id} → delete instance, clear vault credentials +``` + +#### Action routes (shared — connector_id in body) + +All action routes are `POST` and accept `{"connector_id": "mysql:prod", ...}` in the JSON body. +The handler resolves the `DataConnector` from `DATA_CONNECTORS[connector_id]`. + +``` +# Connection +POST /api/connectors/connect → {connector_id, params?, mode?, persist?} +POST /api/connectors/disconnect → {connector_id} +POST /api/connectors/get-status → {connector_id} (no side effects) + +# Catalog +POST /api/connectors/get-catalog → {connector_id, path?, filter?} +POST /api/connectors/get-catalog-tree → {connector_id, filter?} + +# Data +POST /api/connectors/preview-data → {connector_id, source_table, limit?} +POST /api/connectors/import-data → {connector_id, source_table, table_name?, import_options?} +POST /api/connectors/import-group → {connector_id, tables, row_limit?, source_filters?, group_name?} +POST /api/connectors/refresh-data → {connector_id, table_name} +``` + +#### Why not `/api/connectors/{id}/action`? 
+ +Flask's `register_blueprint()` cannot be called after the app handles its first request. +Per-instance blueprints require dynamic registration at runtime (when user creates a new connection). +Putting `connector_id` in the body instead of the URL means all routes live on a single blueprint registered once at startup. +This is the same pattern used by GraphQL, Elasticsearch, and other APIs that dispatch to resources via payload rather than URL path. + +#### Implementation + +Three layers of storage, merged at runtime: + +``` +┌───────────────────────────────────────────────────────────┐ +│ Admin config (global, read-only for users) │ +│ DATA_FORMULATOR_HOME/connectors.yaml │ +│ + DF_SOURCES__* env vars │ +│ Shared across all users │ +├───────────────────────────────────────────────────────────┤ +│ User config (per-user, cross-workspace) │ +│ DATA_FORMULATOR_HOME/users//connectors.yaml │ +│ CRUD by the user │ +├───────────────────────────────────────────────────────────┤ +│ Credentials (per-identity + connector_id) │ +│ DATA_FORMULATOR_HOME/credentials.db (encrypted) │ +│ Vault — both admin & user connectors │ +├───────────────────────────────────────────────────────────┤ +│ In-memory (transient, union of admin + user) │ +│ connector_id → DataConnector w/ live loader │ +└───────────────────────────────────────────────────────────┘ +``` + +Same file name (`connectors.yaml`), same format, two scopes. `data-sources.yml` is retired. + +**Admin config** (`DATA_FORMULATOR_HOME/connectors.yaml`, `DF_SOURCES__*` env vars): +- Global connectors shared across all users. Read-only for users. +- Loaded at startup and merged into memory. Cannot be deleted by users. + +**User config** (`DATA_FORMULATOR_HOME/users//connectors.yaml`): +- Per-user connector definitions: `{connector_id, loader_type, display_name, icon, default_params}`. +- Same `users//` directory tree used by workspace storage. +- Created via `POST /api/connectors`, deleted via `DELETE /api/connectors/{id}`. 
+- Survives server restarts, available across all workspaces for that user. + +**Vault** (per-identity + connector_id): +- Encrypted credentials (passwords, tokens) for both admin and user connectors. +- Written on `connect` (with `persist: true`), cleared on `delete`. +- `disconnect` keeps vault credentials by default (so "reconnect" is fast). + +**In-memory cache** (`DATA_CONNECTORS` dict): +- Union of admin + user connectors, lazily hydrated. +- Holds live `ExternalDataLoader` instances per identity. +- Transient — rebuilt from user config + vault on server restart / first request. + +**`GET /api/connectors`** returns the merged list with a `source` field: +```json +[ + {"id": "mysql:prod", "source": "admin", "deletable": false, ...}, + {"id": "mysql:my-local", "source": "user", "deletable": true, ...} +] +``` + +A single `connectors_bp` Flask Blueprint handles everything: +- Helper `_resolve_connector(data)` extracts `connector_id` from the JSON body and looks up `DATA_CONNECTORS[connector_id]`, returning 404 if not found (after attempting lazy load from user config). +- `DataConnector` is a plain Python object with methods like `_connect()`, `_disconnect()`, `_require_loader()`, etc. +- At startup, `register_data_connectors(app)` registers `connectors_bp` once and populates `DATA_CONNECTORS` from admin config + user config. +- At runtime, `POST /api/connectors` (create) writes to user config + adds to memory. +- `DELETE /api/connectors/{id}` removes from user config + vault + memory (blocked for admin connectors). 
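The resolve-from-body pattern above can be sketched framework-neutrally as follows. Names (`DATA_CONNECTORS`, `_resolve_connector`, `get-status`) follow this doc, but the Flask request/response plumbing is omitted and the `DataConnector` internals are assumed:

```python
# Framework-neutral sketch of the single-shared-blueprint dispatch: every
# action handler receives the parsed JSON body and resolves its own connector.
# DataConnector internals (status(), etc.) are assumed for illustration.
DATA_CONNECTORS = {}  # connector_id -> live DataConnector instance

def _resolve_connector(body):
    """Return (connector, error); error is a (payload, http_status) pair on miss."""
    connector_id = body.get("connector_id")
    connector = DATA_CONNECTORS.get(connector_id)
    if connector is None:
        # In the real route this becomes a 404 after attempting lazy load
        # from the user's connectors.yaml.
        return None, ({"error": f"unknown connector_id: {connector_id}"}, 404)
    return connector, None

def get_status(body):
    """Handler shape for POST /api/connectors/get-status — side-effect-free."""
    connector, error = _resolve_connector(body)
    if error:
        return error
    return {"connector_id": body["connector_id"], "status": connector.status()}, 200
```

Because `connector_id` travels in the payload, all such handlers attach to the one `connectors_bp` registered at startup — no blueprint is ever created per instance.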
+ +**Disconnect vs Delete:** + +| Action | In-memory loader | User config | Vault creds | Card visible | +|--------|-----------------|-------------|-------------|-------------| +| Disconnect | Cleared | Kept | Kept (reconnect fast) | Yes, shows "reconnect" | +| Delete | Cleared | Removed | Cleared | No | + +#### What changed + +| Before | After | Why | +|--------|-------|-----| +| `/api/connectors/{id}/connect` | `POST /api/connectors/connect` `{connector_id}` | No per-instance routes; connector_id in body | +| `/api/connectors/{id}/disconnect` | `POST /api/connectors/disconnect` `{connector_id}` | Same | +| `/api/connectors/{id}/status` | `POST /api/connectors/get-status` `{connector_id}` | Same; verb-based name | +| `/api/connectors/{id}/catalog` | `POST /api/connectors/get-catalog` `{connector_id}` | Same; verb-based name | +| `/api/connectors/{id}/catalog/tree` | `POST /api/connectors/get-catalog-tree` `{connector_id}` | Same; flat path | +| `/api/connectors/{id}/preview` | `POST /api/connectors/preview-data` `{connector_id}` | Same; verb-based name | +| `/api/connectors/{id}/import` | `POST /api/connectors/import-data` `{connector_id}` | Same; verb-based name | +| `/api/connectors/{id}/import-group` | `POST /api/connectors/import-group` `{connector_id}` | Same | +| `/api/connectors/{id}/refresh` | `POST /api/connectors/refresh-data` `{connector_id}` | Same; verb-based name | +| Per-instance Flask Blueprint | None | Eliminated — single shared blueprint, no dynamic registration | + +#### Dropped + +| Endpoint | Reason | +|----------|--------| +| `create_blueprint()` | No per-instance blueprints. `DataConnector` is a plain object. | +| `/catalog/metadata` | Merged into `/catalog` (done in Phase A) | +| `/catalog/list_tables` | Frontend never uses it. 
`get-catalog-tree` covers the use case | +| `/auth/token-connect` | Absorbed into `/connect` with a `mode` field (done in Phase A) | + +### 5.3 Instance ID scheme + +Each connector instance gets a stable ID: `{loader_type}:{user_label}`, e.g. `mysql:prod`, `superset:analytics`. + +- The `loader_type` portion maps to a `DATA_LOADERS` key for instantiation. +- The `user_label` is a slug provided at creation (defaulting to the database name or host). +- URL-safe: only lowercase alphanumeric + hyphen + colon. + +For admin-provisioned connectors (YAML/env config), instances are pre-created at startup with their configured IDs — they behave identically to user-created ones. + +### 5.4 Migration path + +The old per-instance blueprint routes (`/api/connectors/{id}/connect`, etc.) have never shipped. +No backward compatibility needed — we replace them with shared routes that take `connector_id` in the body. +`create_blueprint()` and `_register_*_routes()` methods are deleted. +`register_data_connectors()` is simplified: register `connectors_bp` once, populate `DATA_CONNECTORS` dict. + +## 6. Scope & Phases + +### Phase A — Single-blueprint backend + multi-instance support + +Existing infrastructure that stays as-is: +- `ExternalDataLoader` base class and all loader implementations +- Vault-based credential storage +- `_loaders` in-memory cache pattern + +New work: +- [x] `GET /api/data-loaders` — returns loader types from `DATA_LOADERS` registry with param definitions. +- [x] `GET /api/connectors` — lists registered instances with connection status. +- [x] `POST /api/connectors` — creates a `DataConnector` in `DATA_CONNECTORS` dict (no blueprint registration). Auto-connects if params provided. +- [x] `DELETE /api/connectors/{id}` — tears down instance, clears vault. +- [ ] Move all per-instance route handlers to shared routes on `connectors_bp` (`/api/connectors/connect`, `/get-status`, `/get-catalog`, `/preview-data`, `/import-data`, `/refresh-data`, etc.) 
that accept `connector_id` in JSON body. +- [ ] Delete `create_blueprint()`, `_register_connection_routes()`, `_register_catalog_routes()`, `_register_data_routes()`. +- [x] Make `/status` side-effect-free (move auto-reconnect logic to `/connect`). +- [x] Merge `/auth/token-connect` into `/connect` with `mode` field. +- [ ] Simplify `register_data_connectors()` — register `connectors_bp` once, populate `DATA_CONNECTORS` from config (no per-instance blueprint). +- [x] Admin-provisioned connectors (YAML/env) auto-create instances at startup. + +### Phase B — Frontend: menu page cards + generic URLs + +- [ ] Replace `getConnectorUrls(id)` with static `CONNECTOR_ACTION_URLS` constants (no ID in URL path). +- [ ] Update all frontend call sites to send `connector_id` in POST body instead of URL path. +- [x] Render connected connectors as promoted cards on the Load Data menu page. +- [x] "Add Connection" card → left/right layout: pick type, fill params + display name, "Add & Connect" → `POST /api/connectors`. +- [x] Each card click → opens `DataLoaderForm` (browse-only when connected). +- [ ] Disconnect / Delete actions on each card. + +### Phase C — Cleanup + +- [ ] Unify Superset plugin (`/api/plugins/superset/`) into the `/api/connectors/` flow. +- [x] Remove legacy "Database" tab. 
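The §5.3 ID scheme (`{loader_type}:{user_label}`) reduces to a small slugging step. A sketch, assuming the same regex-based normalization as the `create_connector` route (the helper name is ours):

```python
import re


def make_instance_id(loader_type: str, display_name: str) -> str:
    """Build a stable, URL-safe connector ID of the form {loader_type}:{slug}."""
    # Lowercase, replace anything outside [a-z0-9-] with a hyphen,
    # collapse hyphen runs, and trim leading/trailing hyphens.
    slug = re.sub(r"[^a-z0-9\-]", "-", display_name.lower()).strip("-")
    slug = re.sub(r"-+", "-", slug)
    # Fall back to the bare loader type when the label yields an empty slug.
    return f"{loader_type}:{slug}" if slug else loader_type
```

Collision handling (appending `-2`, `-3`, … when the ID already exists in `DATA_CONNECTORS`) layers on top of this, as Phase A's `POST /api/connectors` does.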
diff --git a/package.json b/package.json index 61f1d7a0..e44d47ad 100644 --- a/package.json +++ b/package.json @@ -11,6 +11,7 @@ "@mui/icons-material": "^7.1.1", "@mui/lab": "^7.0.1-beta.18", "@mui/material": "^7.1.1", + "@mui/x-tree-view": "^9.0.1", "@reduxjs/toolkit": "^1.8.6", "@tiptap/extension-image": "^3.22.2", "@tiptap/pm": "^3.22.2", diff --git a/py-src/data_formulator/agent_routes.py b/py-src/data_formulator/agent_routes.py index 6a675e39..9b1342d5 100644 --- a/py-src/data_formulator/agent_routes.py +++ b/py-src/data_formulator/agent_routes.py @@ -22,6 +22,7 @@ from data_formulator.agents.agent_data_rec import DataRecAgent from data_formulator.agents.agent_sort_data import SortDataAgent +from data_formulator.agents.agent_simple import SimpleAgents from data_formulator.security.auth import get_identity_id from data_formulator.security.code_signing import sign_result, verify_code, MAX_CODE_SIZE from data_formulator.datalake.workspace import Workspace @@ -1004,42 +1005,57 @@ def workspace_summary(): try: client = get_client(content['model']) - ctx = content.get('context', {}) - table_names = ctx.get('tables', []) - user_query = ctx.get('userQuery', '') - - prompt_parts = [] - if table_names: - prompt_parts.append(f"Data tables: {', '.join(table_names)}") - if user_query: - prompt_parts.append(f"User's first request: {user_query}") - - context_str = '. '.join(prompt_parts) if prompt_parts else 'A data analysis session' - - messages = [ - { - "role": "system", - "content": ( - "You are a helpful assistant. Generate a very short name (3-5 words) " - "for a data analysis workspace based on the context below. " - "Return ONLY the name, no quotes, no explanation." - ), - }, - { - "role": "user", - "content": context_str, - }, - ] - - response = client.get_completion(messages) - summary = response.choices[0].message.content.strip().strip('"\'') - # Truncate if too long - if len(summary) > 60: - summary = summary[:57] + "..." 
+        agent = SimpleAgents(client=client)
+        summary = agent.workspace_summary(
+            table_names=content.get('context', {}).get('tables', []),
+            user_query=content.get('context', {}).get('userQuery', ''),
+        )
         return jsonify(status="ok", summary=summary)
     except Exception as e:
         logger.warning(f"Failed to generate workspace summary: {e}")
         return jsonify(status="error", summary=""), 500
+
+
+# ---------------------------------------------------------------------------
+# NL → structured filter conditions
+# ---------------------------------------------------------------------------
+
+@agent_bp.route('/nl-to-filter', methods=['POST'])
+def nl_to_filter():
+    """Translate a natural language filter instruction to structured conditions.
+
+    Request body:
+        model: model config object (same as other agent routes)
+        columns: [{name, type}, ...] — the table's column schema
+        instruction: str — the user's NL filter description
+
+    Response:
+        {status, conditions, sort_columns?, sort_order?, limit?}
+    """
+    try:
+        content = request.get_json() or {}
+        instruction = (content.get("instruction") or "").strip()
+        columns = content.get("columns") or []
+        model_config = content.get("model")
+
+        if not instruction:
+            return jsonify(status="ok", conditions=[], sort_columns=[], sort_order=None, limit=None)
+
+        if not model_config:
+            return jsonify(status="error", message="No model configured"), 400
+
+        client = get_client(model_config)
+        agent = SimpleAgents(client=client)
+        result = agent.nl_to_filter(columns=columns, instruction=instruction)
+
+        return jsonify(status="ok", **result)
+
+    except json.JSONDecodeError:
+        return jsonify(status="error", message="Failed to parse LLM response as JSON"), 422
+    except Exception as e:
+        logger.warning(f"NL-to-filter failed: {e}")
+        safe_msg = classify_llm_error(e)
+        return jsonify(status="error", message=safe_msg), 500
diff --git a/py-src/data_formulator/agents/__init__.py b/py-src/data_formulator/agents/__init__.py
index f1e3a43d..64214950 100644
--- a/py-src/data_formulator/agents/__init__.py
+++
b/py-src/data_formulator/agents/__init__.py @@ -6,6 +6,7 @@ from data_formulator.agents.agent_data_load import DataLoadAgent from data_formulator.agents.agent_sort_data import SortDataAgent +from data_formulator.agents.agent_simple import SimpleAgents from data_formulator.agents.agent_interactive_explore import InteractiveExploreAgent from data_formulator.agents.agent_chart_insight import ChartInsightAgent diff --git a/py-src/data_formulator/agents/agent_simple.py b/py-src/data_formulator/agents/agent_simple.py new file mode 100644 index 00000000..1b6c2b6a --- /dev/null +++ b/py-src/data_formulator/agents/agent_simple.py @@ -0,0 +1,156 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. + +"""Lightweight single-turn agents that wrap a system prompt + one LLM call. + +Each method takes a ``Client`` instance plus task-specific parameters and +returns a plain dict result (no streaming, no workspace access). +""" + +import json +import logging + +from data_formulator.agents.agent_utils import extract_json_objects + +logger = logging.getLogger(__name__) + + +# --------------------------------------------------------------------------- +# System prompts +# --------------------------------------------------------------------------- + +_NL_FILTER_SYSTEM_PROMPT = """\ +You are a data loading assistant. The user wants to load a subset of a database table \ +based on a natural language description. Your job is to translate their request into a \ +structured JSON query specification (Selection, Projection-free, Join-free — SPJ without projection). 
+ +You will be given: +- A table's column schema (name + type) +- A user's natural language description of what data they want + +Return a JSON object with: +{ + "conditions": [ + {"column": "", "operator": "", "value": } + ], + "sort_columns": [""], // optional — include if the user mentions ordering + "sort_order": "asc" | "desc", // optional, default "asc" + "limit": // optional — include if the user mentions a row limit +} + +All columns will be selected (no projection). Focus on filtering (WHERE), sorting (ORDER BY), and limiting (LIMIT). + +Valid operators: =, !=, >, <, >=, <=, LIKE, NOT LIKE, IN, NOT IN, BETWEEN, IS NULL, IS NOT NULL +- For LIKE: use SQL wildcards (e.g. "value": "%pattern%") +- For IN / NOT IN: "value" is an array +- For BETWEEN: "value" is [lo, hi] +- For IS NULL / IS NOT NULL: omit "value" + +Rules: +- Only use column names from the provided schema. +- Infer reasonable filter values from context (e.g. "recent" → sort by date desc + limit, \ +"last year" → date >= '2025-01-01'). +- If the user mentions sorting or limiting, include sort_columns/sort_order/limit. +- If the instruction is empty or unclear, return {"conditions": []}. +- Return ONLY the JSON object, no markdown fences or explanation.""" + +_WORKSPACE_SUMMARY_SYSTEM_PROMPT = ( + "You are a helpful assistant. Generate a very short name (3-5 words) " + "for a data analysis workspace based on the context below. " + "Return ONLY the name, no quotes, no explanation." +) + + +# --------------------------------------------------------------------------- +# Class +# --------------------------------------------------------------------------- + +class SimpleAgents: + """Collection of lightweight single-turn LLM agents.""" + + def __init__(self, client): + self.client = client + + # -- NL → structured filter conditions ---------------------------------- + + def nl_to_filter(self, columns: list[dict], instruction: str) -> dict: + """Translate *instruction* into structured filter conditions. 
+ + Parameters + ---------- + columns : list[dict] + Column schema, each entry ``{"name": ..., "type": ...}``. + instruction : str + Natural-language filter description from the user. + + Returns + ------- + dict with keys ``conditions``, ``sort_columns``, ``sort_order``, ``limit``. + """ + col_desc = "\n".join( + f" - {c['name']} ({c.get('type', 'unknown')})" for c in columns + ) + user_msg = f"Table columns:\n{col_desc}\n\nFilter instruction: {instruction}" + + messages = [ + {"role": "system", "content": _NL_FILTER_SYSTEM_PROMPT}, + {"role": "user", "content": user_msg}, + ] + + logger.info("[SimpleAgents.nl_to_filter] run start") + response = self.client.get_completion(messages=messages) + raw = response.choices[0].message.content.strip() + + # Strip markdown code fences if present + if raw.startswith("```"): + raw = raw.split("\n", 1)[1] if "\n" in raw else raw[3:] + if raw.endswith("```"): + raw = raw[:-3] + raw = raw.strip() + + result = json.loads(raw) + + # Validate: only allow known column names + known_cols = {c["name"] for c in columns} + valid_conditions = [ + cond for cond in (result.get("conditions") or []) + if cond.get("column") in known_cols + ] + + out = { + "conditions": valid_conditions, + "sort_columns": result.get("sort_columns"), + "sort_order": result.get("sort_order"), + "limit": result.get("limit"), + } + logger.info(f"[SimpleAgents.nl_to_filter] done | {len(valid_conditions)} conditions") + return out + + # -- Workspace summary / auto-name -------------------------------------- + + def workspace_summary(self, table_names: list[str], user_query: str = "") -> str: + """Generate a short 3-5 word name for a workspace. + + Returns the summary string (already truncated to 60 chars). + """ + prompt_parts = [] + if table_names: + prompt_parts.append(f"Data tables: {', '.join(table_names)}") + if user_query: + prompt_parts.append(f"User's first request: {user_query}") + + context_str = ". 
".join(prompt_parts) if prompt_parts else "A data analysis session" + + messages = [ + {"role": "system", "content": _WORKSPACE_SUMMARY_SYSTEM_PROMPT}, + {"role": "user", "content": context_str}, + ] + + logger.info("[SimpleAgents.workspace_summary] run start") + response = self.client.get_completion(messages=messages) + summary = response.choices[0].message.content.strip().strip("\"'") + if len(summary) > 60: + summary = summary[:57] + "..." + + logger.info(f"[SimpleAgents.workspace_summary] done | \"{summary}\"") + return summary diff --git a/py-src/data_formulator/data_connector.py b/py-src/data_formulator/data_connector.py index 589de6b7..6ab82ae6 100644 --- a/py-src/data_formulator/data_connector.py +++ b/py-src/data_formulator/data_connector.py @@ -30,7 +30,6 @@ CatalogNode, ExternalDataLoader, ) -from data_formulator.plugins.base import DataSourcePlugin logger = logging.getLogger(__name__) @@ -75,15 +74,15 @@ def _hierarchy_dicts(levels: list[dict[str, str]]) -> list[dict[str, str]]: # DataConnector # --------------------------------------------------------------------------- -class DataConnector(DataSourcePlugin): - """A DataSourcePlugin auto-generated from an ExternalDataLoader. +class DataConnector: + """Generic lifecycle wrapper for an ExternalDataLoader. - Provides: - - **Auth routes**: connect / disconnect / status - - **Catalog routes**: ls / metadata - - **Data routes**: import / refresh / preview + Provides connect / disconnect / status, catalog browsing, and + data import / preview / refresh — all driven by the underlying + loader's existing methods. - All driven by the underlying loader's existing methods. + Routes live on the shared ``connectors_bp`` blueprint; this class + is a plain Python object (no per-instance blueprint). 
""" def __init__( @@ -125,11 +124,6 @@ def from_loader( # -- DataSourcePlugin interface ---------------------------------------- - @staticmethod - def manifest() -> dict[str, Any]: - # Static stub; per-instance config is in _manifest(). - return {} - def _manifest(self) -> dict[str, Any]: return { "id": self._source_id, @@ -197,21 +191,6 @@ def _resolve_delegated_login(self) -> dict[str, Any] | None: # Only send safe fields to the frontend return {"login_url": login_url, "label": raw.get("label", "")} - def create_blueprint(self) -> Blueprint: - bp = Blueprint( - f"connector_{self._source_id}", - __name__, - url_prefix=f"/api/connectors/{self._source_id}", - ) - self._register_auth_routes(bp) - self._register_catalog_routes(bp) - self._register_data_routes(bp) - - return bp - - def on_enable(self, app: Flask) -> None: - logger.info("DataConnector '%s' enabled", self._source_id) - # -- Identity + Loader Management -------------------------------------- @staticmethod @@ -299,7 +278,8 @@ def _persist_credentials(self, user_params: dict[str, Any]) -> bool: identity = self._get_identity() return self._vault_store(identity, user_params) - def _disconnect(self) -> None: + def _delete_credentials(self) -> None: + """Delete: clear in-memory loader AND vault credentials.""" identity = self._get_identity() self._loaders.pop(identity, None) self._vault_delete(identity) @@ -342,316 +322,655 @@ def _require_loader(self) -> ExternalDataLoader: return loader raise ValueError("Not connected. 
Please connect first.") - # -- Auth Routes ------------------------------------------------------- - def _register_auth_routes(self, bp: Blueprint) -> None: - source = self +# --------------------------------------------------------------------------- +# Shared action routes — connector_id in JSON body +# --------------------------------------------------------------------------- - @bp.route("/auth/connect", methods=["POST"]) - def auth_connect(): - try: - data = request.get_json() or {} - user_params = data.get("params", {}) - persist = data.get("persist", True) - loader = source._connect(user_params) +def _resolve_connector(data: dict[str, Any]) -> DataConnector: + """Look up a DataConnector from the request body's ``connector_id``. - if not loader.test_connection(): - source._disconnect() - return jsonify({"status": "error", "message": "Connection test failed"}), 400 + Returns the connector or raises ``KeyError`` (→ 404). + """ + connector_id = data.get("connector_id") + if not connector_id: + raise KeyError("connector_id is required") + connector = DATA_CONNECTORS.get(connector_id) + if connector is None: + raise KeyError(f"Unknown connector: {connector_id}") + return connector - # Only persist to vault after connection is verified - persisted = False - if persist: - persisted = source._persist_credentials(user_params) - else: - # User opted out — clear any previously stored credentials - identity = source._get_identity() - source._vault_delete(identity) - - safe = loader.get_safe_params() - return jsonify({ - "status": "connected", - "persisted": persisted, - "params": safe, - "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), - "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), - "pinned_scope": loader.pinned_scope(), - }) - except Exception as e: - source._disconnect() - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - @bp.route("/auth/disconnect", 
methods=["POST"]) - def auth_disconnect(): - source._disconnect() - return jsonify({"status": "disconnected"}) - - @bp.route("/auth/token-connect", methods=["POST"]) - def auth_token_connect(): - """Accept tokens from a delegated (popup) login flow and create a connection. - - Expected JSON body:: - - { - "access_token": "eyJ...", - "refresh_token": "eyJ...", // optional - "user": {...}, // optional user info - "params": {"url": "..."}, // extra params (e.g. Superset base URL) - "persist": true - } - """ - try: - data = request.get_json() or {} - access_token = data.get("access_token") - if not access_token: - return jsonify({"status": "error", "message": "Missing access_token"}), 400 - extra_params = data.get("params", {}) - persist = data.get("persist", True) +# --------------------------------------------------------------------------- +# Global connector management routes +# --------------------------------------------------------------------------- - # Build loader params: merge default + extra + tokens - user_params = { - **extra_params, - "access_token": access_token, - "refresh_token": data.get("refresh_token", ""), - } +connectors_bp = Blueprint("connectors_global", __name__) - loader = source._connect(user_params) - if not loader.test_connection(): - source._disconnect() - return jsonify({"status": "error", "message": "Token connection test failed"}), 400 +@connectors_bp.route("/api/data-loaders", methods=["GET"]) +def list_data_loaders(): + """Return available loader types + their param definitions. 
- persisted = False - if persist: - persisted = source._persist_credentials(user_params) - - safe = loader.get_safe_params() - return jsonify({ - "status": "connected", - "persisted": persisted, - "params": safe, - "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), - "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), - "pinned_scope": loader.pinned_scope(), - "user": data.get("user", {}), - }) - except Exception as e: - source._disconnect() - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code + This is the discovery endpoint — tells the frontend what kinds of + connectors can be created. + """ + from data_formulator.data_loader import DATA_LOADERS, DISABLED_LOADERS - @bp.route("/auth/status", methods=["GET"]) - def auth_status(): - identity = source._get_identity() - loader = source._get_loader(identity) - # Try auto-reconnect from vault if no in-memory loader - if loader is None: - loader = source._try_auto_reconnect(identity) - if loader is None: - has_stored = source.has_stored_credentials(identity) - return jsonify({ - "connected": False, - "has_stored_credentials": has_stored, - "params_form": source.get_frontend_config()["params_form"], - }) + loaders = [] + for key, loader_class in DATA_LOADERS.items(): + params = loader_class.list_params() + # Append common table_filter param (same as DataConnector.get_frontend_config) + params.append(DataConnector._TABLE_FILTER_PARAM) + loaders.append({ + "type": key, + "name": key.replace("_", " ").title(), + "params": params, + "hierarchy": _hierarchy_dicts(loader_class.catalog_hierarchy()), + "auth_mode": loader_class.auth_mode(), + "auth_instructions": loader_class.auth_instructions(), + "delegated_login": loader_class.delegated_login_config(), + }) + + disabled = { + name: {"install_hint": hint} + for name, hint in DISABLED_LOADERS.items() + } + + return jsonify({"loaders": loaders, "disabled": disabled}) + + 
+@connectors_bp.route("/api/connectors", methods=["GET"]) +def list_connectors(): + """List all registered connector instances (admin + user) with connection status.""" + from data_formulator.security.auth import get_identity_id + + try: + identity = get_identity_id() + except Exception: + identity = None + + # Ensure user connectors are loaded + if identity: + load_connectors(identity) + + result = [] + for sid, connector in DATA_CONNECTORS.items(): + connected = False + if identity: + connected = ( + connector._get_loader(identity) is not None + or connector.has_stored_credentials(identity) + ) + is_admin = sid in _ADMIN_CONNECTOR_IDS + cfg = connector.get_frontend_config() + result.append({ + "id": sid, + "source": "admin" if is_admin else "user", + "deletable": not is_admin, + "source_type": connector._loader_class.__name__, + "display_name": connector._display_name, + "icon": connector._icon, + "connected": connected, + "params_form": cfg["params_form"], + "pinned_params": cfg["pinned_params"], + "hierarchy": cfg["hierarchy"], + "effective_hierarchy": cfg["effective_hierarchy"], + "auth_mode": cfg["auth_mode"], + "delegated_login": cfg.get("delegated_login"), + }) + + return jsonify({"connectors": result}) + + +@connectors_bp.route("/api/connectors", methods=["POST"]) +def create_connector(): + """Create a new user connector instance from a loader type. + + Request body:: + + { + "loader_type": "mysql", + "display_name": "MySQL · prod", + "params": {"host": "...", "port": "3306", ...}, + "icon": "mysql", + "persist": true + } + + Persists to ``DATA_FORMULATOR_HOME/users//connectors.yaml``. 
+ """ + from data_formulator.data_loader import DATA_LOADERS + from data_formulator.security.auth import get_identity_id + + data = request.get_json() or {} + loader_type = data.get("loader_type") + if not loader_type: + return jsonify({"status": "error", "message": "loader_type is required"}), 400 + + loader_class = DATA_LOADERS.get(loader_type) + if not loader_class: + return jsonify({"status": "error", "message": f"Unknown loader type: {loader_type}"}), 400 + + display_name = data.get("display_name", loader_type.replace("_", " ").title()) + icon = data.get("icon", loader_type) + default_params = data.get("params", {}) + + # Generate instance ID: loader_type:slug + import re + slug = re.sub(r'[^a-z0-9\-]', '-', display_name.lower()).strip('-') + slug = re.sub(r'-+', '-', slug) + instance_id = f"{loader_type}:{slug}" if slug else loader_type + + # Avoid collision with existing instances + if instance_id in DATA_CONNECTORS: + for i in range(2, 100): + candidate = f"{instance_id}-{i}" + if candidate not in DATA_CONNECTORS: + instance_id = candidate + display_name = f"{display_name} ({i})" + break + else: + return jsonify({"status": "error", "message": "Too many connectors with this name"}), 400 + + connector = DataConnector.from_loader( + loader_class, + source_id=instance_id, + display_name=display_name, + default_params=default_params, + icon=icon, + ) + DATA_CONNECTORS[instance_id] = connector + + # Persist to user connectors.yaml + try: + identity = get_identity_id() + _persist_user_connector(identity, SourceSpec( + source_id=instance_id, + loader_type=loader_type, + display_name=display_name, + default_params=default_params, + icon=icon, + source="user", + )) + except Exception as e: + logger.warning("Failed to persist connector '%s' to user config: %s", instance_id, e) + + # Auto-connect if params were provided + result_data: dict[str, Any] = { + "status": "created", + "id": instance_id, + "display_name": display_name, + "source": "user", + "deletable": True, 
+ } + + if default_params: + try: + user_params = data.get("connect_params", default_params) + persist = data.get("persist", True) + loader = connector._connect(user_params) + if loader.test_connection(): + if persist: + connector._persist_credentials(user_params) + result_data["connected"] = True + else: + identity_c = connector._get_identity() + connector._loaders.pop(identity_c, None) + result_data["connected"] = False + result_data["connect_error"] = "Connection test failed" + except Exception as e: try: - alive = loader.test_connection() + identity_c = connector._get_identity() + connector._loaders.pop(identity_c, None) except Exception: - alive = False - if not alive: - source._disconnect() - return jsonify({ - "connected": False, - "has_stored_credentials": False, - "params_form": source.get_frontend_config()["params_form"], - }) - return jsonify({ - "connected": True, - "persisted": source._get_vault() is not None, - "params": loader.get_safe_params(), - "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), - "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), - "pinned_scope": loader.pinned_scope(), - }) + pass + result_data["connected"] = False + result_data["connect_error"] = str(e) - # -- Catalog Routes ---------------------------------------------------- + logger.info("Created user connector '%s' (type=%s)", instance_id, loader_type) + return jsonify(result_data), 201 - def _register_catalog_routes(self, bp: Blueprint) -> None: - source = self - @bp.route("/catalog/ls", methods=["POST"]) - def catalog_ls(): - try: - loader = source._require_loader() - data = request.get_json() or {} - path = data.get("path", []) - name_filter = data.get("filter") - - nodes = loader.ls(path=path, filter=name_filter) - return jsonify({ - "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), - "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), - "path": path, - "nodes": [_node_to_dict(n) for n in nodes], - }) - except 
Exception as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code +def _persist_user_connector(identity: str, spec: "SourceSpec") -> None: + """Add a connector spec to the user's connectors.yaml (append, no duplicates).""" + existing = _load_user_specs(identity) + # Remove any existing entry with same ID + existing = [s for s in existing if s.source_id != spec.source_id] + existing.append(spec) + _save_user_connectors(identity, existing) - @bp.route("/catalog/metadata", methods=["POST"]) - def catalog_metadata(): - try: - loader = source._require_loader() - data = request.get_json() or {} - path = data.get("path", []) - metadata = loader.get_metadata(path) - return jsonify({"path": path, "metadata": metadata}) - except Exception as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code +def _remove_user_connector(identity: str, connector_id: str) -> None: + """Remove a connector spec from the user's connectors.yaml.""" + existing = _load_user_specs(identity) + existing = [s for s in existing if s.source_id != connector_id] + _save_user_connectors(identity, existing) - @bp.route("/catalog/list_tables", methods=["POST"]) - def catalog_list_tables(): - """Flat/eager listing of all tables in pinned scope.""" - try: - loader = source._require_loader() - data = request.get_json() or {} - table_filter = data.get("filter") - tables = loader.list_tables(table_filter=table_filter) - return jsonify({"tables": tables}) - except Exception as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code +@connectors_bp.route("/api/connectors/", methods=["DELETE"]) +def delete_connector(connector_id: str): + """Delete a **user** connector instance, clear vault credentials, and remove from config. + + Admin connectors cannot be deleted (returns 403). 
+ """ + from data_formulator.security.auth import get_identity_id + + if connector_id in _ADMIN_CONNECTOR_IDS: + return jsonify({"status": "error", "message": "Admin connectors cannot be deleted"}), 403 + + connector = DATA_CONNECTORS.get(connector_id) + if not connector: + return jsonify({"status": "error", "message": f"Unknown connector: {connector_id}"}), 404 - # -- Data Routes ------------------------------------------------------- + # Full cleanup: in-memory loader + vault credentials + try: + connector._delete_credentials() + except Exception: + pass - def _register_data_routes(self, bp: Blueprint) -> None: - source = self + DATA_CONNECTORS.pop(connector_id, None) - @bp.route("/data/import", methods=["POST"]) - def data_import(): + # Remove from user connectors.yaml + try: + identity = get_identity_id() + _remove_user_connector(identity, connector_id) + except Exception as e: + logger.warning("Failed to remove connector '%s' from user config: %s", connector_id, e) + + logger.info("Deleted user connector '%s'", connector_id) + return jsonify({"status": "deleted", "id": connector_id}) + + +# --------------------------------------------------------------------------- +# Action routes (shared — connector_id in JSON body) +# --------------------------------------------------------------------------- + +@connectors_bp.route("/api/connectors/connect", methods=["POST"]) +def connector_connect(): + """(Re)connect / authenticate a connector instance. 
+ + Accepts ``connector_id`` plus two modes: + + **Credential mode** (default):: + + {"connector_id": "mysql:prod", "params": {...}, "persist": true} + + **Token mode** (delegated/SSO):: + + {"connector_id": "...", "mode": "token", "access_token": "eyJ...", + "refresh_token": "...", "user": {...}, "params": {...}, "persist": true} + """ + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + mode = data.get("mode", "credentials") + persist = data.get("persist", True) + + if mode == "token": + access_token = data.get("access_token") + if not access_token: + return jsonify({"status": "error", "message": "Missing access_token"}), 400 + extra_params = data.get("params", {}) + user_params = { + **extra_params, + "access_token": access_token, + "refresh_token": data.get("refresh_token", ""), + } + else: + user_params = data.get("params", {}) + + loader = source._connect(user_params) + + if not loader.test_connection(): + identity = source._get_identity() + source._loaders.pop(identity, None) + return jsonify({"status": "error", "message": "Connection test failed"}), 400 + + persisted = False + if persist: + persisted = source._persist_credentials(user_params) + else: + identity = source._get_identity() + source._vault_delete(identity) + + safe = loader.get_safe_params() + result = { + "status": "connected", + "persisted": persisted, + "params": safe, + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "pinned_scope": loader.pinned_scope(), + } + if mode == "token" and data.get("user"): + result["user"] = data["user"] + return jsonify(result) + except Exception as e: + try: + identity = source._get_identity() + source._loaders.pop(identity, None) + except Exception: + pass + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": 
safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/get-status", methods=["POST"]) +def connector_get_status(): + """Check connection status (no side effects — no auto-reconnect).""" + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + identity = source._get_identity() + loader = source._get_loader(identity) + if loader is None: + has_stored = source.has_stored_credentials(identity) + return jsonify({ + "connected": False, + "has_stored_credentials": has_stored, + "params_form": source.get_frontend_config()["params_form"], + }) + try: + alive = loader.test_connection() + except Exception: + alive = False + if not alive: + source._loaders.pop(identity, None) + return jsonify({ + "connected": False, + "has_stored_credentials": source.has_stored_credentials(identity), + "params_form": source.get_frontend_config()["params_form"], + }) + return jsonify({ + "connected": True, + "persisted": source._get_vault() is not None, + "params": loader.get_safe_params(), + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "pinned_scope": loader.pinned_scope(), + }) + + +@connectors_bp.route("/api/connectors/get-catalog", methods=["POST"]) +def connector_get_catalog(): + """Browse a catalog node (merged ls + metadata).""" + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + path = data.get("path", []) + name_filter = data.get("filter") + + nodes = loader.ls(path=path, filter=name_filter) + result: dict[str, Any] = { + "hierarchy": _hierarchy_dicts(loader.catalog_hierarchy()), + "effective_hierarchy": _hierarchy_dicts(loader.effective_hierarchy()), + "path": path, + "nodes": [_node_to_dict(n) for n in nodes], + } + if 
path: try: - loader = source._require_loader() - data = request.get_json() or {} + metadata = loader.get_metadata(path) + result["metadata"] = metadata + except Exception: + pass + return jsonify(result) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/get-catalog-tree", methods=["POST"]) +def connector_get_catalog_tree(): + """Build nested tree from ``list_tables()`` with full metadata.""" + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + name_filter = data.get("filter") + + result = loader.list_tables_tree(table_filter=name_filter) + return jsonify({ + "hierarchy": _hierarchy_dicts(result["hierarchy"]), + "effective_hierarchy": _hierarchy_dicts(result["effective_hierarchy"]), + "tree": result["tree"], + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/import-data", methods=["POST"]) +def connector_import_data(): + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + + source_table = data.get("source_table") + if not source_table: + return jsonify({"status": "error", "message": "source_table is required"}), 400 + + table_name = data.get("table_name") + import_options = data.get("import_options", {}) + + from data_formulator.security.auth import get_identity_id + from data_formulator.workspace_factory import get_workspace + from data_formulator.datalake.parquet_utils import sanitize_table_name + + workspace = get_workspace(get_identity_id()) + + if not table_name: + raw = 
source_table.split(".")[-1] if "." in source_table else source_table + table_name = raw + safe_name = sanitize_table_name(table_name) + + meta = loader.ingest_to_workspace( + workspace=workspace, + table_name=safe_name, + source_table=source_table, + import_options=import_options or None, + ) + return jsonify({ + "status": "success", + "table_name": meta.name, + "row_count": meta.row_count, + "refreshable": True, + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/refresh-data", methods=["POST"]) +def connector_refresh_data(): + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + table_name = data.get("table_name") + if not table_name: + return jsonify({"status": "error", "message": "table_name is required"}), 400 + + from data_formulator.security.auth import get_identity_id + from data_formulator.workspace_factory import get_workspace + + workspace = get_workspace(get_identity_id()) + meta = workspace.get_table_metadata(table_name) + if meta is None or not meta.source_table: + return jsonify({"status": "error", "message": f"No refreshable source for '{table_name}'"}), 400 + + arrow_table = loader.fetch_data_as_arrow( + source_table=meta.source_table, + import_options=meta.import_options, + ) + new_meta, data_changed = workspace.refresh_parquet_from_arrow(table_name, arrow_table) + return jsonify({ + "status": "success", + "table_name": table_name, + "row_count": new_meta.row_count, + "data_changed": data_changed, + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/preview-data", methods=["POST"]) +def connector_preview_data(): + data = 
request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + source_table = data.get("source_table") + if not source_table: + return jsonify({"status": "error", "message": "source_table is required"}), 400 + + import_options = data.get("import_options", {}) + if not import_options: + size = data.get("limit", 10) + import_options = {"size": size} + + arrow_table = loader.fetch_data_as_arrow( + source_table=source_table, + import_options=import_options, + ) + df = arrow_table.to_pandas() + rows = _json.loads(df.to_json(orient="records", date_format="iso")) + columns = [{"name": col, "type": str(df[col].dtype)} for col in df.columns] + + return jsonify({ + "status": "success", + "columns": columns, + "rows": rows, + "row_count": len(rows), + "total_row_count": len(rows), + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code + + +@connectors_bp.route("/api/connectors/import-group", methods=["POST"]) +def connector_import_group(): + """Import all tables from a table_group with shared filters.""" + data = request.get_json() or {} + try: + source = _resolve_connector(data) + except KeyError as e: + return jsonify({"status": "error", "message": str(e)}), 404 + + try: + loader = source._require_loader() + + tables = data.get("tables") + if not tables or not isinstance(tables, list): + return jsonify({"status": "error", "message": "tables list is required"}), 400 + + row_limit = data.get("row_limit", -1) + source_filters = data.get("source_filters", []) + group_name = data.get("group_name", "") + + from data_formulator.security.auth import get_identity_id + from data_formulator.workspace_factory import get_workspace + from data_formulator.datalake.parquet_utils import sanitize_table_name - source_table = data.get("source_table") - if not 
source_table: - return jsonify({"status": "error", "message": "source_table is required"}), 400 + workspace = get_workspace(get_identity_id()) + results = [] - table_name = data.get("table_name") - import_options = data.get("import_options", {}) + for table_entry in tables: + ds_id = table_entry.get("dataset_id") + ds_name = table_entry.get("name", f"dataset_{ds_id}") + if not ds_id: + continue - from data_formulator.security.auth import get_identity_id - from data_formulator.workspace_factory import get_workspace - from data_formulator.datalake.parquet_utils import sanitize_table_name + table_filters = [ + f for f in source_filters + if not f.get("applies_to") or ds_id in f.get("applies_to", []) + ] - workspace = get_workspace(get_identity_id()) + import_options: dict = {} + if row_limit > 0: + import_options["size"] = row_limit + if table_filters: + import_options["source_filters"] = table_filters - if not table_name: - raw = source_table.split(".")[-1] if "." in source_table else source_table - table_name = raw - safe_name = sanitize_table_name(table_name) + source_table = str(ds_id) + table_name = f"{group_name} / {ds_name}" if group_name else ds_name + safe_name = sanitize_table_name(table_name) + try: meta = loader.ingest_to_workspace( workspace=workspace, table_name=safe_name, source_table=source_table, import_options=import_options or None, ) - return jsonify({ + results.append({ "status": "success", + "dataset_id": ds_id, "table_name": meta.name, "row_count": meta.row_count, - "refreshable": True, }) except Exception as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - @bp.route("/data/refresh", methods=["POST"]) - def data_refresh(): - try: - loader = source._require_loader() - data = request.get_json() or {} - table_name = data.get("table_name") - if not table_name: - return jsonify({"status": "error", "message": "table_name is required"}), 400 - - from 
data_formulator.security.auth import get_identity_id - from data_formulator.workspace_factory import get_workspace - - workspace = get_workspace(get_identity_id()) - meta = workspace.get_table_metadata(table_name) - if meta is None or not meta.source_table: - return jsonify({"status": "error", "message": f"No refreshable source for '{table_name}'"}), 400 - - arrow_table = loader.fetch_data_as_arrow( - source_table=meta.source_table, - import_options=meta.import_options, - ) - new_meta, data_changed = workspace.refresh_parquet_from_arrow(table_name, arrow_table) - return jsonify({ - "status": "success", - "table_name": table_name, - "row_count": new_meta.row_count, - "data_changed": data_changed, + logger.warning("import-group: failed to load dataset %s: %s", ds_id, e) + results.append({ + "status": "error", + "dataset_id": ds_id, + "table_name": ds_name, + "message": str(e), }) - except Exception as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code - - @bp.route("/data/preview", methods=["POST"]) - def data_preview(): - try: - loader = source._require_loader() - data = request.get_json() or {} - source_table = data.get("source_table") - if not source_table: - return jsonify({"status": "error", "message": "source_table is required"}), 400 - - import_options = data.get("import_options", {}) - if not import_options: - # Legacy: accept top-level size/row_limit params - size = data.get("size") or data.get("row_limit", 10) - import_options = {"size": size} - - arrow_table = loader.fetch_data_as_arrow( - source_table=source_table, - import_options=import_options, - ) - df = arrow_table.to_pandas() - rows = _json.loads(df.to_json(orient="records", date_format="iso")) - columns = [{"name": col, "type": str(df[col].dtype)} for col in df.columns] - return jsonify({ - "status": "success", - "columns": columns, - "rows": rows, - "row_count": len(rows), - "total_row_count": len(rows), - }) - except Exception 
as e: - safe_msg, status_code = _sanitize_error(e) - return jsonify({"status": "error", "message": safe_msg}), status_code + return jsonify({ + "status": "success", + "results": results, + }) + except Exception as e: + safe_msg, status_code = _sanitize_error(e) + return jsonify({"status": "error", "message": safe_msg}), status_code # --------------------------------------------------------------------------- -# Configuration loading +# Configuration loading — connectors.yaml (admin + user) # --------------------------------------------------------------------------- @dataclasses.dataclass class SourceSpec: - """A single data source entry from config (YAML, env vars, or auto-discovery).""" + """A single connector entry from config (YAML or env vars).""" source_id: str - loader_type: str # registry key in DATA_LOADERS (e.g. "postgresql") + loader_type: str # registry key in DATA_LOADERS (e.g. "mysql") display_name: str default_params: dict[str, Any] = dataclasses.field(default_factory=dict) icon: str = "" auto_connect: bool = False + source: str = "admin" # "admin" or "user" def _resolve_env_refs(params: dict[str, Any]) -> dict[str, Any]: @@ -667,39 +986,70 @@ def _resolve_env_refs(params: dict[str, Any]) -> dict[str, Any]: return resolved -def _load_yaml_config() -> dict | None: - """Search for ``data-sources.yml`` in standard locations and return parsed content.""" +def _get_df_home() -> "Path": + """Return DATA_FORMULATOR_HOME as a Path.""" + from data_formulator.datalake.workspace import get_data_formulator_home + return get_data_formulator_home() + + +def _load_connectors_yaml(path: "Path") -> list[dict]: + """Load a connectors.yaml file and return the list of connector entries.""" + if not path.is_file(): + return [] + try: + import yaml + with open(path) as f: + data = yaml.safe_load(f) or {} + entries = data.get("connectors", []) + if not isinstance(entries, list): + logger.warning("connectors.yaml at %s: 'connectors' must be a list", path) + return [] + 
logger.info("Loaded %d connector(s) from %s", len(entries), path) + return entries + except Exception as e: + logger.warning("Failed to parse %s: %s", path, e) + return [] + + +def _save_user_connectors(identity: str, specs: list[SourceSpec]) -> None: + """Write user-created connectors to DATA_FORMULATOR_HOME/users//connectors.yaml.""" + from pathlib import Path + from data_formulator.datalake.workspace import get_user_home + user_dir = get_user_home(identity) + user_dir.mkdir(parents=True, exist_ok=True) + path = user_dir / "connectors.yaml" + + entries = [] + for s in specs: + entry: dict[str, Any] = { + "id": s.source_id, + "type": s.loader_type, + "name": s.display_name, + } + if s.default_params: + entry["params"] = s.default_params + if s.icon and s.icon != s.loader_type: + entry["icon"] = s.icon + entries.append(entry) + + try: + import yaml + with open(path, "w") as f: + yaml.safe_dump({"connectors": entries}, f, default_flow_style=False, sort_keys=False) + logger.info("Saved %d user connector(s) to %s", len(entries), path) + except Exception as e: + logger.warning("Failed to write %s: %s", path, e) + + +def _load_admin_specs() -> list[SourceSpec]: + """Load admin connectors from DATA_FORMULATOR_HOME/connectors.yaml + env vars.""" import os from pathlib import Path - search_paths = [ - Path.cwd() / "data-sources.yml", - Path.home() / ".data-formulator" / "data-sources.yml", - Path("/etc/data-formulator/data-sources.yml"), - ] - # Also check DATA_FORMULATOR_HOME - df_home = os.environ.get("DATA_FORMULATOR_HOME") - if df_home: - search_paths.insert(0, Path(df_home) / "data-sources.yml") - - for p in search_paths: - if p.is_file(): - try: - import yaml - with open(p) as f: - data = yaml.safe_load(f) - logger.info("Loaded data source config from %s", p) - return data - except Exception as e: - logger.warning("Failed to parse %s: %s", p, e) - return None - + specs: list[SourceSpec] = [] -def _parse_env_sources() -> list[SourceSpec]: - """Parse 
``DF_SOURCES____=`` environment variables.""" - import os + # 1. Env vars (DF_SOURCES____=) — highest priority prefix = "DF_SOURCES__" - # Collect: {instance_id: {key: value}} raw: dict[str, dict[str, str]] = {} for env_key, env_val in os.environ.items(): if not env_key.startswith(prefix): @@ -711,7 +1061,6 @@ def _parse_env_sources() -> list[SourceSpec]: instance_id, field = parts[0], parts[1].lower() raw.setdefault(instance_id, {})[field] = env_val - specs = [] for instance_id, fields in raw.items(): loader_type = fields.pop("type", "") if not loader_type: @@ -719,7 +1068,6 @@ def _parse_env_sources() -> list[SourceSpec]: continue name = fields.pop("name", loader_type.replace("_", " ").title()) icon = fields.pop("icon", "") - # Remaining fields with "params__" prefix → params dict params: dict[str, str] = {} other: dict[str, str] = {} for k, v in fields.items(): @@ -727,7 +1075,6 @@ def _parse_env_sources() -> list[SourceSpec]: params[k[len("params__"):]] = v else: other[k] = v - # Also treat top-level non-reserved keys as params params.update(other) specs.append(SourceSpec( source_id=instance_id, @@ -735,62 +1082,99 @@ def _parse_env_sources() -> list[SourceSpec]: display_name=name, default_params=params, icon=icon, + source="admin", + )) + + env_ids = {s.source_id for s in specs} + + # 2. 
connectors.yaml in DATA_FORMULATOR_HOME + try: + admin_path = _get_df_home() / "connectors.yaml" + except Exception: + admin_path = Path("__nonexistent__") + + for i, entry in enumerate(_load_connectors_yaml(admin_path)): + loader_type = entry.get("type", "") + if not loader_type: + continue + sid = entry.get("id") or (f"{loader_type}_{i}" if i > 0 else loader_type) + if sid in env_ids: + continue # env var overrides + specs.append(SourceSpec( + source_id=sid, + loader_type=loader_type, + display_name=entry.get("name", loader_type.replace("_", " ").title()), + default_params=_resolve_env_refs(entry.get("params", {})), + icon=entry.get("icon", ""), + auto_connect=entry.get("auto_connect", False), + source="admin", + )) + + return specs + + +def _load_user_specs(identity: str) -> list[SourceSpec]: + """Load user connectors from DATA_FORMULATOR_HOME/users//connectors.yaml.""" + try: + from data_formulator.datalake.workspace import get_user_home + user_path = get_user_home(identity) / "connectors.yaml" + except Exception: + return [] + + specs: list[SourceSpec] = [] + for i, entry in enumerate(_load_connectors_yaml(user_path)): + loader_type = entry.get("type", "") + if not loader_type: + continue + sid = entry.get("id") or (f"{loader_type}_{i}" if i > 0 else loader_type) + specs.append(SourceSpec( + source_id=sid, + loader_type=loader_type, + display_name=entry.get("name", loader_type.replace("_", " ").title()), + default_params=_resolve_env_refs(entry.get("params", {})), + icon=entry.get("icon", ""), + source="user", )) return specs -def _build_source_specs() -> tuple[list[SourceSpec], bool]: - """Build the list of source specs from config (env + YAML + auto-discovery). +# Track which connector IDs came from admin config (immutable by users). +_ADMIN_CONNECTOR_IDS: set[str] = set() + +# Track identities whose user connectors have been loaded. 
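The `DF_SOURCES__*` env-var convention parsed above can be sketched standalone — instance and field names below are illustrative, not real configuration:

```python
# Minimal re-implementation of the DF_SOURCES__<INSTANCE>__<FIELD>=<value>
# parsing used by the admin spec loader (illustrative values only).
PREFIX = "DF_SOURCES__"

env = {
    "DF_SOURCES__PROD__TYPE": "mysql",
    "DF_SOURCES__PROD__NAME": "Prod MySQL",
    "DF_SOURCES__PROD__PARAMS__HOST": "db.internal",
}

raw: dict[str, dict[str, str]] = {}
for key, val in env.items():
    if not key.startswith(PREFIX):
        continue
    parts = key[len(PREFIX):].split("__", 1)  # -> [instance_id, remaining field]
    if len(parts) != 2:
        continue
    instance_id, field = parts[0], parts[1].lower()
    raw.setdefault(instance_id, {})[field] = val

# Fields prefixed "params__" become connection params; the rest are metadata.
spec = raw["PROD"]
params = {k[len("params__"):]: v for k, v in spec.items() if k.startswith("params__")}
```

This mirrors the two-stage split in `_load_admin_specs`: one split isolates the instance id, and the lowercased remainder is later partitioned on the `params__` prefix.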
+_LOADED_USER_IDENTITIES: set[str] = set() - Returns ``(specs, auto_discover)`` where ``auto_discover`` indicates - whether unconfigured loaders should also be registered. + +def load_connectors(identity: str | None = None) -> None: + """Ensure DATA_CONNECTORS contains admin + user connectors for *identity*. + + Admin connectors are loaded at startup by :func:`register_data_connectors`. + Calling this with an identity lazily adds the user's connectors on first + request. Subsequent calls for the same identity are no-ops. """ - import os from data_formulator.data_loader import DATA_LOADERS - # 1. Env vars (highest priority) - env_specs = _parse_env_sources() - - # 2. YAML config - yaml_config = _load_yaml_config() - yaml_specs: list[SourceSpec] = [] - auto_discover = True - if yaml_config: - auto_discover = yaml_config.get("auto_discover", True) - for i, entry in enumerate(yaml_config.get("sources", [])): - loader_type = entry.get("type", "") - if not loader_type: - continue - sid = entry.get("id") or (f"{loader_type}_{i}" if i > 0 else loader_type) - yaml_specs.append(SourceSpec( - source_id=sid, - loader_type=loader_type, - display_name=entry.get("name", loader_type.replace("_", " ").title()), - default_params=_resolve_env_refs(entry.get("params", {})), - icon=entry.get("icon", ""), - auto_connect=entry.get("auto_connect", False), - )) - - # Also respect DF_AUTO_DISCOVER_SOURCES env var - if os.environ.get("DF_AUTO_DISCOVER_SOURCES", "").lower() == "false": - auto_discover = False - - # Merge: env specs override yaml specs with same source_id - env_ids = {s.source_id for s in env_specs} - merged = list(env_specs) + [s for s in yaml_specs if s.source_id not in env_ids] - - # 3. 
Auto-discovery: add any installed loader not already configured - if auto_discover: - configured_types = {s.loader_type for s in merged} - for key in DATA_LOADERS: - if key not in configured_types: - merged.append(SourceSpec( - source_id=key, - loader_type=key, - display_name=key.replace("_", " ").title(), - )) - - return merged, auto_discover + if not identity or identity in _LOADED_USER_IDENTITIES: + return + + _LOADED_USER_IDENTITIES.add(identity) + + user_specs = _load_user_specs(identity) + for spec in user_specs: + if spec.source_id in DATA_CONNECTORS: + continue # admin connector takes precedence + loader_class = DATA_LOADERS.get(spec.loader_type) + if not loader_class: + continue + source = DataConnector.from_loader( + loader_class, + source_id=spec.source_id, + display_name=spec.display_name, + default_params=spec.default_params, + icon=spec.icon or spec.loader_type, + ) + DATA_CONNECTORS[spec.source_id] = source + logger.info("Loaded user connector '%s' (type=%s)", spec.source_id, spec.loader_type) # --------------------------------------------------------------------------- @@ -798,15 +1182,24 @@ def _build_source_specs() -> tuple[list[SourceSpec], bool]: # --------------------------------------------------------------------------- def register_data_connectors(app: Flask) -> None: - """Register DataConnector instances from config + auto-discovery. + """Register the global connectors blueprint + admin-provisioned connectors. Called from ``app.py`` during startup. + + - Registers ``connectors_bp`` with all shared routes. + - Loads admin connectors from ``DATA_FORMULATOR_HOME/connectors.yaml`` + and ``DF_SOURCES__*`` env vars. + - User connectors are loaded lazily on first request (need identity). """ from data_formulator.data_loader import DATA_LOADERS, DISABLED_LOADERS - specs, _auto_discover = _build_source_specs() + # 1. Register the global management blueprint + app.register_blueprint(connectors_bp) + + # 2. 
Load admin connectors + admin_specs = _load_admin_specs() - for spec in specs: + for spec in admin_specs: loader_class = DATA_LOADERS.get(spec.loader_type) if not loader_class: if spec.loader_type in DISABLED_LOADERS: @@ -825,12 +1218,10 @@ def register_data_connectors(app: Flask) -> None: default_params=spec.default_params, icon=spec.icon or spec.loader_type, ) - bp = source.create_blueprint() - app.register_blueprint(bp) - source.on_enable(app) DATA_CONNECTORS[spec.source_id] = source + _ADMIN_CONNECTOR_IDS.add(spec.source_id) logger.info( - "Registered DataConnector '%s' (type=%s%s)", + "Registered admin connector '%s' (type=%s%s)", spec.source_id, spec.loader_type, f", pinned={list(spec.default_params.keys())}" if spec.default_params else "", diff --git a/py-src/data_formulator/data_loader/athena_data_loader.py b/py-src/data_formulator/data_loader/athena_data_loader.py index 0a2fc126..9764ad75 100644 --- a/py-src/data_formulator/data_loader/athena_data_loader.py +++ b/py-src/data_formulator/data_loader/athena_data_loader.py @@ -416,6 +416,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": full_table_name, + "path": [db_name, table_name], "metadata": { "row_count": 0, # Athena doesn't provide row counts directly "columns": columns, diff --git a/py-src/data_formulator/data_loader/azure_blob_data_loader.py b/py-src/data_formulator/data_loader/azure_blob_data_loader.py index 56f38c61..47a12685 100644 --- a/py-src/data_formulator/data_loader/azure_blob_data_loader.py +++ b/py-src/data_formulator/data_loader/azure_blob_data_loader.py @@ -211,6 +211,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": azure_url, + "path": [azure_url], "metadata": table_metadata }) except Exception as e: diff --git a/py-src/data_formulator/data_loader/bigquery_data_loader.py b/py-src/data_formulator/data_loader/bigquery_data_loader.py index 9aa59d92..c4a7f31c 100644 --- 
a/py-src/data_formulator/data_loader/bigquery_data_loader.py +++ b/py-src/data_formulator/data_loader/bigquery_data_loader.py @@ -99,6 +99,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": full_table_name, + "path": [dataset_id, table.table_id], "metadata": { "row_count": table_ref.num_rows or 0, "columns": columns, @@ -110,6 +111,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: # Add table without detailed schema results.append({ "name": full_table_name, + "path": [dataset_id, table.table_id], "metadata": { "row_count": 0, "columns": [], diff --git a/py-src/data_formulator/data_loader/external_data_loader.py b/py-src/data_formulator/data_loader/external_data_loader.py index c98b6ab8..c0379871 100644 --- a/py-src/data_formulator/data_loader/external_data_loader.py +++ b/py-src/data_formulator/data_loader/external_data_loader.py @@ -16,6 +16,148 @@ # Sensitive parameter names that should be excluded from stored metadata SENSITIVE_PARAMS = {'password', 'api_key', 'secret', 'token', 'access_token', 'refresh_token', 'access_key', 'secret_key'} +# Valid operators for filter conditions (prevents SQL injection via operator field) +_VALID_OPERATORS = frozenset({ + '=', '!=', '<>', '>', '<', '>=', '<=', + 'LIKE', 'NOT LIKE', 'IN', 'NOT IN', + 'BETWEEN', 'IS NULL', 'IS NOT NULL', +}) + +# Identifier-name validation: reject characters that could indicate SQL injection +# even after quote-doubling (semicolons, comment markers, null bytes). +import re +_DANGEROUS_IDENT_RE = re.compile(r'[;\x00]|--|/\*') + + +def _esc_id(name: str, quote_char: str) -> str: + """Quote a SQL identifier, escaping embedded quote characters. + + E.g. ``_esc_id('col`name', '`')`` → `` `col``name` `` + Rejects names with semicolons, null bytes, or SQL comment sequences. 
+    """
+    if not name or _DANGEROUS_IDENT_RE.search(name):
+        raise ValueError(f"Invalid identifier: {name!r}")
+    escaped = name.replace(quote_char, quote_char * 2)
+    return f"{quote_char}{escaped}{quote_char}"
+
+
+def _esc_str(value: str) -> str:
+    """Escape a string literal for SQL single-quote interpolation.
+
+    Doubles single-quotes and strips null bytes.
+    """
+    return value.replace('\x00', '').replace("'", "''")
+
+
+def build_where_clause(
+    conditions: list[dict[str, Any]],
+    quote_char: str = '`',
+) -> tuple[str, list[Any]]:
+    """Build a WHERE clause from structured filter conditions.
+
+    Each condition is a dict with:
+
+    - column (str): column name
+    - operator (str): one of _VALID_OPERATORS
+    - value: single value, list (IN/NOT IN), or [lo, hi] (BETWEEN)
+
+    Returns (clause_str, params) where clause_str is like
+    "WHERE `col1` > ? AND `col2` IN (?, ?)" and params is the flat list of
+    bind values.  Returns ("", []) if conditions is empty.
+
+    The caller is responsible for using parameterized execution with the
+    returned params list.  For loaders that use string interpolation (e.g.
+    ADBC), use :func:`build_where_clause_inline` instead.
+    """
+    if not conditions:
+        return "", []
+
+    parts: list[str] = []
+    params: list[Any] = []
+    for cond in conditions:
+        col = cond.get("column", "")
+        op = (cond.get("operator") or "").upper().strip()
+        val = cond.get("value")
+
+        if not col or op not in _VALID_OPERATORS:
+            continue
+
+        try:
+            qcol = _esc_id(col, quote_char)
+        except ValueError:
+            continue
+
+        if op in ("IS NULL", "IS NOT NULL"):
+            parts.append(f"{qcol} {op}")
+        elif op in ("IN", "NOT IN"):
+            vals = val if isinstance(val, (list, tuple)) else [val]
+            placeholders = ", ".join("?" for _ in vals)
+            parts.append(f"{qcol} {op} ({placeholders})")
+            params.extend(vals)
+        elif op == "BETWEEN":
+            if isinstance(val, (list, tuple)) and len(val) == 2:
+                parts.append(f"{qcol} BETWEEN ? AND ?")
+                params.extend(val)
+        else:
+            parts.append(f"{qcol} {op} ?")
+            params.append(val)
+
+    if not parts:
+        return "", []
+    return "WHERE " + " AND ".join(parts), params
+
+
+def build_where_clause_inline(
+    conditions: list[dict[str, Any]],
+    quote_char: str = '`',
+) -> str:
+    """Build a WHERE clause with values inlined (for ADBC drivers that don't
+    support parameterized queries).
+
+    Values are escaped: strings are single-quoted with internal quotes
+    doubled; numbers are passed as-is; None becomes NULL.
+    """
+    if not conditions:
+        return ""
+
+    def _lit(v: Any) -> str:
+        if v is None:
+            return "NULL"
+        if isinstance(v, bool):
+            return "TRUE" if v else "FALSE"
+        if isinstance(v, (int, float)):
+            return str(v)
+        s = str(v).replace('\x00', '').replace("'", "''")
+        return f"'{s}'"
+
+    parts: list[str] = []
+    for cond in conditions:
+        col = cond.get("column", "")
+        op = (cond.get("operator") or "").upper().strip()
+        val = cond.get("value")
+
+        if not col or op not in _VALID_OPERATORS:
+            continue
+
+        try:
+            qcol = _esc_id(col, quote_char)
+        except ValueError:
+            continue
+
+        if op in ("IS NULL", "IS NOT NULL"):
+            parts.append(f"{qcol} {op}")
+        elif op in ("IN", "NOT IN"):
+            vals = val if isinstance(val, (list, tuple)) else [val]
+            parts.append(f"{qcol} {op} ({', '.join(_lit(v) for v in vals)})")
+        elif op == "BETWEEN":
+            if isinstance(val, (list, tuple)) and len(val) == 2:
+                parts.append(f"{qcol} BETWEEN {_lit(val[0])} AND {_lit(val[1])}")
+        else:
+            parts.append(f"{qcol} {op} {_lit(val)}")
+
+    if not parts:
+        return ""
+    return "WHERE " + " AND ".join(parts)
+
 
 def sanitize_table_name(name_as: str) -> str:
     """Backward-compatible alias; see :func:`sanitize_external_loader_table_name`."""
@@ -30,11 +172,14 @@ def sanitize_table_name(name_as: str) -> str:
 class CatalogNode:
     """A node in the data source's catalog tree.
 
-    Only two kinds of node:
+    Three kinds of node:
 
     * ``"namespace"`` — expandable container (database, schema, bucket, …).
      The hierarchy's ``label`` tells the UI what to call it.
    * ``"table"`` — importable leaf (table, file, dataset, …).
+    * ``"table_group"`` — a loadable bundle of related tables with optional
+      shared filters (e.g. a BI dashboard).  Rendered as a non-expandable
+      leaf in the tree; member tables are listed in ``metadata["tables"]``.
 
     The *level name* (e.g. "Database", "Schema") comes from
    :meth:`ExternalDataLoader.catalog_hierarchy`, not from the node itself.
@@ -242,8 +387,16 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]:
         that haven't implemented hierarchical browsing yet.
 
         Returns:
-            List of dicts with: name (table/file identifier),
-            metadata (row_count, columns, sample_rows).
+            List of dicts, each with:
+
+            * ``name`` — the table identifier used for import
+              (e.g. ``"public.users"``).
+            * ``metadata`` — dict with ``row_count``, ``columns``,
+              ``sample_rows``.
+            * ``path`` *(optional)* — explicit hierarchy path as a list
+              of segments (e.g. ``["public", "users"]``).  When present,
+              :meth:`list_tables_tree` uses it directly to build the
+              tree instead of splitting ``name`` on dots.
         """
         pass
 
@@ -392,6 +545,104 @@ def get_metadata(self, path: list[str]) -> dict[str, Any]:
                 return n.metadata or {}
         return {}
 
+    def list_tables_tree(self, table_filter: str | None = None) -> dict:
+        """Build a nested tree from :meth:`list_tables` results.
+
+        Returns ``{"hierarchy": [...], "effective_hierarchy": [...],
+        "tree": [...]}``.  Each table entry keeps the full metadata
+        (columns, sample_rows, row_count) from ``list_tables()`` plus
+        ``_source_name`` (the original name used for import).
+
+        If a table entry includes an explicit ``path`` list, it is used
+        directly to place the table in the tree.  Otherwise the ``name``
+        is split on ``"."`` as a fallback.
+        """
+        eff = self.effective_hierarchy()
+        num_ns = len(eff) - 1  # namespace levels before the leaf
+
+        tables = self.list_tables(table_filter=table_filter)
+
+        # Normalise each entry into a (path_segments, original_name, metadata) tuple.
+        # If the path has more segments than the effective hierarchy depth,
+        # strip leading segments (they correspond to pinned levels the
+        # loader included).  If it matches or is shorter, use as-is.
+        eff_depth = len(eff)  # expected number of segments (namespace levels + leaf)
+
+        entries: list[tuple[list[str], str, dict | None]] = []
+        for t in tables:
+            orig_name: str = t["name"]
+            meta = t.get("metadata")
+            if "path" in t and isinstance(t["path"], list) and t["path"]:
+                segments = list(t["path"])
+                # Strip leading segments if path is longer than effective hierarchy
+                if len(segments) > eff_depth:
+                    segments = segments[len(segments) - eff_depth:]
+            else:
+                # Fallback: split dotted name to fill num_ns namespace levels + leaf
+                segments = orig_name.split(".", maxsplit=num_ns) if num_ns > 0 else [orig_name]
+            entries.append((segments, orig_name, meta))
+
+        # Build tree by grouping on successive path segments.
+ def _build(items: list[tuple[list[str], str, dict | None]], depth: int, prefix: list[str]) -> list[dict]: + if depth >= num_ns: + # Leaf level — use last segment as the table name + return [ + { + "name": segs[-1] if segs else orig, + "node_type": "table", + "path": prefix + [segs[-1] if segs else orig], + "metadata": { + **(meta or {}), + "_source_name": orig, + }, + } + for segs, orig, meta in items + ] + + # Group by first path segment + from collections import OrderedDict + groups: OrderedDict[str, list[tuple[list[str], str, dict | None]]] = OrderedDict() + ungrouped: list[tuple[list[str], str, dict | None]] = [] + + for segs, orig, meta in items: + if len(segs) > 1: + ns = segs[0] + rest = segs[1:] + groups.setdefault(ns, []).append((rest, orig, meta)) + else: + ungrouped.append((segs, orig, meta)) + + nodes: list[dict] = [] + for ns, children in groups.items(): + ns_path = prefix + [ns] + nodes.append({ + "name": ns, + "node_type": "namespace", + "path": ns_path, + "metadata": None, + "children": _build(children, depth + 1, ns_path), + }) + for segs, orig, meta in ungrouped: + leaf_name = segs[0] if segs else orig + nodes.append({ + "name": leaf_name, + "node_type": "table", + "path": prefix + [leaf_name], + "metadata": { + **(meta or {}), + "_source_name": orig, + }, + }) + return nodes + + tree = _build(entries, 0, []) + + return { + "hierarchy": self.catalog_hierarchy(), + "effective_hierarchy": eff, + "tree": tree, + } + def test_connection(self) -> bool: """Validate the connection is alive. 
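The inline WHERE-clause builder above depends on careful literal escaping: `None` becomes `NULL`, booleans must be checked before the numeric branch (since `bool` is a subclass of `int`), and single quotes in strings are doubled. A standalone sketch of that logic — the names `sql_literal` and `where_in` are illustrative here, not identifiers from this patch:

```python
from typing import Any

def sql_literal(v: Any) -> str:
    """Render a Python value as an inline SQL literal (simplified sketch)."""
    if v is None:
        return "NULL"
    if isinstance(v, bool):  # must precede the int/float check: bool subclasses int
        return "TRUE" if v else "FALSE"
    if isinstance(v, (int, float)):
        return str(v)
    # strip NUL bytes, double internal single quotes
    s = str(v).replace("\x00", "").replace("'", "''")
    return f"'{s}'"

def where_in(column: str, values: list[Any], quote: str = "`") -> str:
    """Build an IN clause fragment with inlined, escaped literals."""
    qcol = f"{quote}{column}{quote}"
    return f"{qcol} IN ({', '.join(sql_literal(v) for v in values)})"

# e.g. where_in("region", ["O'Brien", 3, None])
#      -> `region` IN ('O''Brien', 3, NULL)
```

Inlining literals like this is only a fallback for drivers without parameter binding; the parameterized `build_where_clause` path remains preferable where supported.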
diff --git a/py-src/data_formulator/data_loader/kusto_data_loader.py b/py-src/data_formulator/data_loader/kusto_data_loader.py index 4babff89..f8b3b4fc 100644 --- a/py-src/data_formulator/data_loader/kusto_data_loader.py +++ b/py-src/data_formulator/data_loader/kusto_data_loader.py @@ -211,6 +211,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: tables.append({ "type": "table", "name": table_name, + "path": [table_name], "metadata": table_metadata }) diff --git a/py-src/data_formulator/data_loader/mongodb_data_loader.py b/py-src/data_formulator/data_loader/mongodb_data_loader.py index cf61c070..96735c70 100644 --- a/py-src/data_formulator/data_loader/mongodb_data_loader.py +++ b/py-src/data_formulator/data_loader/mongodb_data_loader.py @@ -271,6 +271,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": full_table_name, + "path": [collection_name], "metadata": table_metadata }) except Exception as e: diff --git a/py-src/data_formulator/data_loader/mssql_data_loader.py b/py-src/data_formulator/data_loader/mssql_data_loader.py index 4587cebd..e5500271 100644 --- a/py-src/data_formulator/data_loader/mssql_data_loader.py +++ b/py-src/data_formulator/data_loader/mssql_data_loader.py @@ -391,7 +391,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: "table_type": table_type, } - results.append({"name": full_table_name, "metadata": table_metadata}) + results.append({"name": full_table_name, "path": [schema, table_name], "metadata": table_metadata}) except Exception as e: log.warning(f"Failed to get metadata for table {full_table_name}: {e}") @@ -399,6 +399,7 @@ def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append( { "name": full_table_name, + "path": [schema, table_name], "metadata": { "row_count": 0, "columns": [], diff --git a/py-src/data_formulator/data_loader/mysql_data_loader.py 
b/py-src/data_formulator/data_loader/mysql_data_loader.py index dfde9026..ae0171da 100644 --- a/py-src/data_formulator/data_loader/mysql_data_loader.py +++ b/py-src/data_formulator/data_loader/mysql_data_loader.py @@ -5,7 +5,7 @@ import pyarrow as pa import pymysql -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, build_where_clause_inline, _esc_id, _esc_str logger = logging.getLogger(__name__) @@ -83,7 +83,7 @@ def __init__(self, params: dict[str, Any]): _GEOMETRY_TYPES = {'geometry', 'point', 'linestring', 'polygon', 'multipoint', 'multilinestring', 'multipolygon', 'geometrycollection'} - _OTHER_UNSUPPORTED = {'bit'} + _OTHER_UNSUPPORTED = {'bit', 'blob', 'tinyblob', 'mediumblob', 'longblob', 'binary', 'varbinary'} _UNSUPPORTED_TYPES = _GEOMETRY_TYPES | _OTHER_UNSUPPORTED def _read_sql(self, query: str) -> pa.Table: @@ -110,7 +110,7 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: columns_query = f""" SELECT COLUMN_NAME, DATA_TYPE FROM information_schema.columns - WHERE TABLE_SCHEMA = '{schema}' AND TABLE_NAME = '{table_name}' + WHERE TABLE_SCHEMA = '{_esc_str(schema)}' AND TABLE_NAME = '{_esc_str(table_name)}' ORDER BY ORDINAL_POSITION """ cols_arrow = self._read_sql(columns_query) @@ -143,6 +143,7 @@ def fetch_data_as_arrow( size = opts.get("size", 1000000) sort_columns = opts.get("sort_columns") sort_order = opts.get("sort_order", "asc") + conditions = opts.get("conditions", []) if not source_table: raise ValueError("source_table must be provided") @@ -156,14 +157,19 @@ def fetch_data_as_arrow( col_list = self._safe_select_list(self.database, source_table.strip('`')) base_query = f"SELECT {col_list} FROM `{source_table}`" + # Add WHERE clause from filter conditions + where_clause = build_where_clause_inline(conditions, quote_char='`') + if where_clause: + base_query = f"{base_query} {where_clause}" + # 
Add ORDER BY if sort columns specified order_by_clause = "" if sort_columns and len(sort_columns) > 0: order_direction = "DESC" if sort_order == 'desc' else "ASC" - sanitized_cols = [f'`{col}` {order_direction}' for col in sort_columns] + sanitized_cols = [f'{_esc_id(col, "`")} {order_direction}' for col in sort_columns] order_by_clause = f" ORDER BY {', '.join(sanitized_cols)}" - query = f"{base_query}{order_by_clause} LIMIT {size}" + query = f"{base_query}{order_by_clause} LIMIT {int(size)}" logger.info(f"Executing MySQL query: {query[:200]}...") @@ -182,7 +188,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: try: # If database is pinned, list only that database; otherwise all user-accessible DBs if self.database: - db_filter = f"TABLE_SCHEMA = '{self.database}'" + db_filter = f"TABLE_SCHEMA = '{_esc_str(self.database)}'" else: db_filter = "TABLE_SCHEMA NOT IN ('information_schema', 'mysql', 'performance_schema', 'sys')" tables_query = f""" @@ -213,7 +219,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: columns_query = f""" SELECT COLUMN_NAME, DATA_TYPE FROM information_schema.columns - WHERE TABLE_SCHEMA = '{schema}' AND TABLE_NAME = '{table_name}' + WHERE TABLE_SCHEMA = '{_esc_str(schema)}' AND TABLE_NAME = '{_esc_str(table_name)}' ORDER BY ORDINAL_POSITION """ columns_arrow = self._read_sql(columns_query) @@ -228,7 +234,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: # Get sample data sample_rows = [] - sample_query = f"SELECT {col_list} FROM `{schema}`.`{table_name}` LIMIT 10" + sample_query = f"SELECT {col_list} FROM {_esc_id(schema, '`')}.{_esc_id(table_name, '`')} LIMIT 10" try: sample_arrow = self._read_sql(sample_query) sample_df = sample_arrow.to_pandas() @@ -237,7 +243,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: logger.warning(f"Could not sample {full_table_name}: {sample_err}") # Get row count - 
count_query = f"SELECT COUNT(*) as cnt FROM `{schema}`.`{table_name}`" + count_query = f"SELECT COUNT(*) as cnt FROM {_esc_id(schema, '`')}.{_esc_id(table_name, '`')}" count_arrow = self._read_sql(count_query) row_count = int(count_arrow.to_pandas()['cnt'].iloc[0]) @@ -249,6 +255,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": full_table_name, + "path": [schema, table_name], "metadata": table_metadata }) except Exception as e: @@ -305,7 +312,7 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C query = f""" SELECT TABLE_NAME FROM information_schema.tables - WHERE TABLE_SCHEMA = '{db}' AND TABLE_TYPE = 'BASE TABLE' + WHERE TABLE_SCHEMA = '{_esc_str(db)}' AND TABLE_TYPE = 'BASE TABLE' ORDER BY TABLE_NAME """ rows = self._read_sql(query).to_pandas() @@ -338,7 +345,7 @@ def get_metadata(self, path: list[str]) -> dict[str, Any]: cols_query = f""" SELECT COLUMN_NAME, DATA_TYPE FROM information_schema.columns - WHERE TABLE_SCHEMA = '{db}' AND TABLE_NAME = '{table_name}' + WHERE TABLE_SCHEMA = '{_esc_str(db)}' AND TABLE_NAME = '{_esc_str(table_name)}' ORDER BY ORDINAL_POSITION """ cols_df = self._read_sql(cols_query).to_pandas() @@ -347,12 +354,12 @@ def get_metadata(self, path: list[str]) -> dict[str, Any]: for _, r in cols_df.iterrows() ] count_df = self._read_sql( - f"SELECT COUNT(*) AS cnt FROM `{db}`.`{table_name}`" + f"SELECT COUNT(*) AS cnt FROM {_esc_id(db, '`')}.{_esc_id(table_name, '`')}" ).to_pandas() row_count = int(count_df["cnt"].iloc[0]) col_list = self._safe_select_list(db, table_name) sample_df = self._read_sql( - f"SELECT {col_list} FROM `{db}`.`{table_name}` LIMIT 5" + f"SELECT {col_list} FROM {_esc_id(db, '`')}.{_esc_id(table_name, '`')} LIMIT 5" ).to_pandas() sample_rows = json.loads(sample_df.to_json(orient="records", date_format="iso")) return { diff --git a/py-src/data_formulator/data_loader/postgresql_data_loader.py 
b/py-src/data_formulator/data_loader/postgresql_data_loader.py index 05f49691..26db2d23 100644 --- a/py-src/data_formulator/data_loader/postgresql_data_loader.py +++ b/py-src/data_formulator/data_loader/postgresql_data_loader.py @@ -5,7 +5,7 @@ import pyarrow as pa import psycopg2 -from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode +from data_formulator.data_loader.external_data_loader import ExternalDataLoader, CatalogNode, build_where_clause_inline, _esc_id, _esc_str logger = logging.getLogger(__name__) @@ -106,7 +106,7 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: columns_query = f""" SELECT column_name, udt_name FROM information_schema.columns - WHERE table_schema = '{schema}' AND table_name = '{table_name}' + WHERE table_schema = '{_esc_str(schema)}' AND table_name = '{_esc_str(table_name)}' ORDER BY ordinal_position """ cols_arrow = self._read_sql(columns_query) @@ -118,11 +118,11 @@ def _safe_select_list(self, schema: str, table_name: str) -> str: for _, r in cols_df.iterrows(): col, dtype = r['column_name'], r['udt_name'].lower() if dtype in self._SPATIAL_TYPES: - parts.append(f'ST_AsText("{col}") AS "{col}"') + parts.append(f'ST_AsText({_esc_id(col, chr(34))}) AS {_esc_id(col, chr(34))}') elif dtype in self._OTHER_UNSUPPORTED: - parts.append(f'"{col}"::text AS "{col}"') + parts.append(f'{_esc_id(col, chr(34))}::text AS {_esc_id(col, chr(34))}') else: - parts.append(f'"{col}"') + parts.append(_esc_id(col, chr(34))) return ', '.join(parts) except Exception: return "*" @@ -139,6 +139,7 @@ def fetch_data_as_arrow( size = opts.get("size", 1000000) sort_columns = opts.get("sort_columns") sort_order = opts.get("sort_order", "asc") + conditions = opts.get("conditions", []) if not source_table: raise ValueError("source_table must be provided") @@ -154,16 +155,21 @@ def fetch_data_as_arrow( else: col_list = self._safe_select_list('public', table_ref.strip('"')) base_query = f"SELECT {col_list} FROM 
{table_ref}" + + # Add WHERE clause from filter conditions + where_clause = build_where_clause_inline(conditions, quote_char='"') + if where_clause: + base_query = f"{base_query} {where_clause}" # Add ORDER BY if sort columns specified order_by_clause = "" if sort_columns and len(sort_columns) > 0: order_direction = "DESC" if sort_order == 'desc' else "ASC" - sanitized_cols = [f'"{col}" {order_direction}' for col in sort_columns] + sanitized_cols = [f'{_esc_id(col, chr(34))} {order_direction}' for col in sort_columns] order_by_clause = f" ORDER BY {', '.join(sanitized_cols)}" # Build full query with limit - query = f"{base_query}{order_by_clause} LIMIT {size}" + query = f"{base_query}{order_by_clause} LIMIT {int(size)}" logger.info(f"Executing PostgreSQL query: {query[:200]}...") @@ -213,7 +219,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: columns_query = f""" SELECT column_name, data_type FROM information_schema.columns - WHERE table_schema = '{schema}' AND table_name = '{table_name}' + WHERE table_schema = '{_esc_str(schema)}' AND table_name = '{_esc_str(table_name)}' ORDER BY ordinal_position """ columns_arrow = self._read_sql(columns_query) @@ -228,7 +234,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: # Get sample data sample_rows = [] - sample_query = f'SELECT {col_list} FROM "{schema}"."{table_name}" LIMIT 10' + sample_query = f'SELECT {col_list} FROM {_esc_id(schema, chr(34))}.{_esc_id(table_name, chr(34))} LIMIT 10' try: sample_arrow = self._read_sql(sample_query) sample_df = sample_arrow.to_pandas() @@ -237,7 +243,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: logger.warning(f"Could not sample {full_table_name}: {sample_err}") # Get row count - count_query = f'SELECT COUNT(*) as cnt FROM "{schema}"."{table_name}"' + count_query = f'SELECT COUNT(*) as cnt FROM {_esc_id(schema, chr(34))}.{_esc_id(table_name, chr(34))}' count_arrow = 
self._read_sql(count_query) row_count = count_arrow.to_pandas()['cnt'].iloc[0] @@ -249,6 +255,7 @@ def _list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": full_table_name, + "path": [schema, table_name], "metadata": table_metadata }) @@ -369,7 +376,7 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C query = f""" SELECT table_name FROM information_schema.tables - WHERE table_schema = '{schema}' + WHERE table_schema = '{_esc_str(schema)}' AND table_type = 'BASE TABLE' AND table_name NOT LIKE '%%/%%' ORDER BY table_name @@ -411,7 +418,7 @@ def get_metadata(self, path: list[str]) -> dict[str, Any]: cols_query = f""" SELECT column_name, data_type FROM information_schema.columns - WHERE table_schema = '{schema}' AND table_name = '{table_name}' + WHERE table_schema = '{_esc_str(schema)}' AND table_name = '{_esc_str(table_name)}' ORDER BY ordinal_position """ cols_df = self._read_sql_on(cols_query, db).to_pandas() @@ -420,12 +427,12 @@ def get_metadata(self, path: list[str]) -> dict[str, Any]: for _, r in cols_df.iterrows() ] count_df = self._read_sql_on( - f'SELECT COUNT(*) AS cnt FROM "{schema}"."{table_name}"', db + f'SELECT COUNT(*) AS cnt FROM {_esc_id(schema, chr(34))}.{_esc_id(table_name, chr(34))}', db ).to_pandas() row_count = int(count_df["cnt"].iloc[0]) col_list = self._safe_select_list(schema, table_name) sample_df = self._read_sql_on( - f'SELECT {col_list} FROM "{schema}"."{table_name}" LIMIT 5', db + f'SELECT {col_list} FROM {_esc_id(schema, chr(34))}.{_esc_id(table_name, chr(34))} LIMIT 5', db ).to_pandas() sample_rows = json.loads(sample_df.to_json(orient="records")) return { diff --git a/py-src/data_formulator/data_loader/s3_data_loader.py b/py-src/data_formulator/data_loader/s3_data_loader.py index d8c3be2b..5f58f6db 100644 --- a/py-src/data_formulator/data_loader/s3_data_loader.py +++ b/py-src/data_formulator/data_loader/s3_data_loader.py @@ -156,6 +156,7 @@ def 
list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: results.append({ "name": s3_url, + "path": [s3_url], "metadata": table_metadata }) except Exception as e: diff --git a/py-src/data_formulator/data_loader/superset_data_loader.py b/py-src/data_formulator/data_loader/superset_data_loader.py index ffe30f87..00c044fb 100644 --- a/py-src/data_formulator/data_loader/superset_data_loader.py +++ b/py-src/data_formulator/data_loader/superset_data_loader.py @@ -4,7 +4,7 @@ """SupersetLoader — ExternalDataLoader implementation for Apache Superset. Treats Superset as a hierarchical data source: - dashboard (namespace) → dataset (table) + dashboard (table_group) → dataset (table) Authentication is JWT-based (``auth_mode() = "token"``). Data is fetched via Superset's SQL Lab API, reusing the existing ``SupersetClient`` and @@ -218,56 +218,98 @@ def test_connection(self) -> bool: except Exception: return False - # -- list_tables (flat/eager) ------------------------------------------ + # -- list_tables (eager, with dashboard hierarchy) --------------------- def list_tables(self, table_filter: str | None = None) -> list[dict[str, Any]]: - """List all datasets the user can access (flat). + """List datasets grouped under dashboards **and** under "All Datasets". - Fetches detail per dataset to populate columns — may be slow for - large Superset instances. + Each dataset appears once under every dashboard it belongs to, plus + once under the synthetic "All Datasets" folder. Metadata (columns, + sample rows) is fetched once per unique dataset and shared across + duplicate entries. """ token = self._ensure_token() + + # 1. 
Fetch all datasets and build a detail cache keyed by dataset id all_datasets = self._fetch_all_datasets(token) - results = [] - for ds in all_datasets: - name = ds.get("table_name") or "" - if table_filter and table_filter.lower() not in name.lower(): - continue + ds_by_id: dict[int, dict] = {ds["id"]: ds for ds in all_datasets} + detail_cache: dict[int, dict] = {} # dataset_id → metadata dict + + def _get_metadata(ds: dict) -> dict: + ds_id = ds["id"] + if ds_id in detail_cache: + return detail_cache[ds_id] - # The list endpoint doesn't include columns or row_count — - # fetch detail for each dataset. columns: list[dict] = [] row_count = ds.get("row_count") sample_rows: list[dict] = [] try: - detail = self._client.get_dataset_detail(token, ds["id"]) + detail = self._client.get_dataset_detail(token, ds_id) columns = [ - {"name": c.get("column_name") or c.get("name") or "", "type": c.get("type") or ""} + {"name": c.get("column_name") or c.get("name") or "", + "type": c.get("type") or ""} for c in (detail.get("columns") or []) ] row_count = detail.get("row_count") or row_count - # Fetch sample rows via SQL Lab db_id, schema, base_sql = _build_dataset_sql(detail) sql_session = self._client.create_sql_session(token) result = self._client.execute_sql_with_session( - sql_session, db_id, f"SELECT * FROM ({base_sql}) AS _src LIMIT 10", schema, 10, + sql_session, db_id, + f"SELECT * FROM ({base_sql}) AS _src LIMIT 10", + schema, 10, ) sample_rows = result.get("data", []) or [] except Exception: - logger.debug("Failed to fetch detail for dataset %s", ds.get("id")) - - results.append({ - "name": f"{ds.get('id')}:{name}", - "metadata": { - "dataset_id": ds["id"], - "row_count": row_count, - "columns": columns, - "sample_rows": sample_rows, - "schema": ds.get("schema", ""), - "database": (ds.get("database") or {}).get("database_name", ""), - }, - }) + logger.debug("Failed to fetch detail for dataset %s", ds_id) + + meta = { + "dataset_id": ds_id, + "row_count": row_count, + 
"columns": columns, + "sample_rows": sample_rows, + "schema": ds.get("schema", ""), + "database": (ds.get("database") or {}).get("database_name", ""), + } + detail_cache[ds_id] = meta + return meta + + def _make_entry(ds: dict, folder: str, ds_name: str) -> dict: + return { + "name": f"{ds['id']}:{ds_name}", + "path": [folder, ds_name], + "metadata": dict(_get_metadata(ds)), # shallow copy + } + + results: list[dict[str, Any]] = [] + + # 2. Walk dashboards → datasets + raw = self._client.list_dashboards(token, page=0, page_size=500) + dashboards = raw.get("result", []) + for dash in dashboards: + dash_title = dash.get("dashboard_title", f"Dashboard {dash['id']}") + try: + ds_raw = self._client.get_dashboard_datasets(token, dash["id"]) + dash_datasets = ds_raw.get("result", []) + except Exception: + logger.debug("Failed to fetch datasets for dashboard %s", dash.get("id")) + continue + + for ds in dash_datasets: + ds_name = ds.get("table_name") or ds.get("name") or f"dataset_{ds.get('id', '?')}" + if table_filter and table_filter.lower() not in ds_name.lower(): + continue + # Ensure we have full dataset info from the all-datasets list + full_ds = ds_by_id.get(ds["id"], ds) + results.append(_make_entry(full_ds, dash_title, ds_name)) + + # 3. 
All datasets under "All Datasets" + for ds in all_datasets: + ds_name = ds.get("table_name") or "" + if table_filter and table_filter.lower() not in ds_name.lower(): + continue + results.append(_make_entry(ds, "All Datasets", ds_name)) + return results # -- ls (lazy/hierarchical) -------------------------------------------- @@ -277,7 +319,7 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C token = self._ensure_token() if len(path) == 0: - # Root: list dashboards + "All Datasets" + # Root: list dashboards as table_group nodes + "All Datasets" namespace raw = self._client.list_dashboards(token, page=0, page_size=500) dashboards = raw.get("result", []) nodes = [] @@ -285,13 +327,20 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C title = d.get("dashboard_title", f"Dashboard {d['id']}") if filter and filter.lower() not in title.lower(): continue + dash_id = d["id"] + # Build table_group node with tables + source_filters + tables, source_filters = self._build_dashboard_group_metadata(token, dash_id) nodes.append(CatalogNode( name=title, - node_type="namespace", - path=[str(d["id"])], - metadata={"dashboard_id": d["id"]}, + node_type="table_group", + path=[str(dash_id)], + metadata={ + "dashboard_id": dash_id, + "tables": tables, + "source_filters": source_filters, + }, )) - # Add synthetic "All Datasets" entry + # Add synthetic "All Datasets" entry (namespace, not table_group) if not filter or "all datasets" in (filter or "").lower(): nodes.append(CatalogNode( name="All Datasets", @@ -301,11 +350,13 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C return nodes if len(path) == 1: - # Expand a dashboard or "All Datasets" + # Expand "All Datasets" namespace (dashboards are table_group leaves, no children) parent_id = path[0] if parent_id == "__all__": datasets = self._fetch_all_datasets(token) else: + # Should not normally be called for dashboards (they're table_group 
leaves), + # but support it for backwards compatibility try: raw = self._client.get_dashboard_datasets(token, int(parent_id)) datasets = raw.get("result", []) @@ -332,6 +383,222 @@ def ls(self, path: list[str] | None = None, filter: str | None = None) -> list[C return [] + def _build_dashboard_group_metadata( + self, token: str, dashboard_id: int, + ) -> tuple[list[dict], list[dict]]: + """Build tables list and source_filters for a dashboard table_group node. + + Returns (tables, source_filters). + """ + # Fetch datasets under this dashboard + try: + ds_raw = self._client.get_dashboard_datasets(token, dashboard_id) + datasets = ds_raw.get("result", []) + except Exception: + logger.debug("Failed to fetch datasets for dashboard %s", dashboard_id) + return [], [] + + tables = [] + for ds in datasets: + ds_id = ds["id"] + name = ds.get("table_name") or ds.get("name") or f"dataset_{ds_id}" + # Fetch columns for this dataset + columns: list[str] = [] + try: + detail = self._client.get_dataset_detail(token, ds_id) + columns = [ + c.get("column_name") or c.get("name") or "" + for c in (detail.get("columns") or []) + if c.get("column_name") or c.get("name") + ] + except Exception: + logger.debug("Failed to fetch detail for dataset %s", ds_id) + tables.append({ + "name": name, + "dataset_id": ds_id, + "row_count": ds.get("row_count"), + "columns": columns, + }) + + # Extract native filters from dashboard metadata + source_filters = self._extract_dashboard_filters(token, dashboard_id, datasets) + + return tables, source_filters + + def _extract_dashboard_filters( + self, token: str, dashboard_id: int, datasets: list[dict], + ) -> list[dict]: + """Extract native filter definitions from a dashboard's json_metadata. + + Returns a list of source_filter dicts in the generic format defined + in design doc 9.2. 
+ """ + try: + detail = self._client.get_dashboard_detail(token, dashboard_id) + except Exception: + logger.debug("Failed to fetch dashboard detail %s for filters", dashboard_id) + return [] + + json_metadata = detail.get("json_metadata") + if isinstance(json_metadata, str): + try: + json_metadata = json.loads(json_metadata) + except Exception: + json_metadata = {} + if not isinstance(json_metadata, dict): + json_metadata = {} + + raw_filters = ( + json_metadata.get("native_filter_configuration") + or json_metadata.get("filter_configuration") + or [] + ) + if isinstance(raw_filters, str): + try: + raw_filters = json.loads(raw_filters) + except Exception: + return [] + + dataset_ids = {ds["id"] for ds in datasets} + filter_defs: list[dict] = [] + + for raw_filter in raw_filters: + if not isinstance(raw_filter, dict): + continue + + filter_name = raw_filter.get("name") or "Unnamed filter" + filter_type = str(raw_filter.get("filterType") or raw_filter.get("type") or "") + control_values = raw_filter.get("controlValues") or {} + multi = bool( + control_values.get("multiSelect") + or control_values.get("enableMultiple") + or control_values.get("multi_select") + ) + required = bool(raw_filter.get("required")) + + # Extract default value + dm = raw_filter.get("defaultDataMask") or {} + fs = dm.get("filterState") or {} + default_value = fs.get("value") + + targets = raw_filter.get("targets") or [] + applies_to: list[int] = [] + column_name = "" + + for target in targets: + if not isinstance(target, dict): + continue + target_ds_id = target.get("datasetId") or target.get("dataset_id") + if not target_ds_id: + continue + target_ds_id = int(target_ds_id) + if target_ds_id in dataset_ids: + applies_to.append(target_ds_id) + if not column_name: + col_obj = target.get("column") or {} + column_name = ( + col_obj.get("name") + or target.get("column_name") + or target.get("columnName") + or "" + ) + + if not column_name or not applies_to: + continue + + # Infer column_type and 
input_type + column_type = self._infer_column_type(filter_type) + input_type = self._infer_input_type(filter_type, column_type) + + filter_defs.append({ + "name": filter_name, + "column": column_name, + "input_type": input_type, + "column_type": column_type, + "multi": multi, + "required": required, + "default_value": default_value, + "applies_to": applies_to, + }) + + return filter_defs + + @staticmethod + def _infer_column_type(filter_type: str) -> str: + """Infer column type from Superset filter type string.""" + ft = (filter_type or "").lower() + if any(tok in ft for tok in ("time", "date", "temporal")): + return "TEMPORAL" + if any(tok in ft for tok in ("number", "range", "numeric")): + return "NUMERIC" + return "STRING" + + @staticmethod + def _infer_input_type(filter_type: str, column_type: str) -> str: + """Map Superset filter type to generic input_type.""" + ft = (filter_type or "").lower() + if "time" in ft or column_type == "TEMPORAL": + return "time" + if "number" in ft or "range" in ft or column_type == "NUMERIC": + return "numeric" + if "select" in ft: + return "select" + return "select" # default to select for unknown types + + @staticmethod + def _build_source_filter_clauses(source_filters: list[dict] | None) -> list[str]: + """Convert source_filters from import_options into SQL WHERE clause fragments. + + Each filter has: column, operator, value. + Uses safe quoting — column names are double-quoted, string values are escaped. 
+ """ + if not source_filters: + return [] + + # Valid operators (prevents SQL injection via operator field) + valid_ops = frozenset({ + "EQ", "NEQ", "GT", "GTE", "LT", "LTE", + "IN", "NOT_IN", "LIKE", "ILIKE", + "IS_NULL", "IS_NOT_NULL", + "BETWEEN", + }) + + clauses: list[str] = [] + for sf in source_filters: + if not isinstance(sf, dict): + continue + col = sf.get("column") + op = (sf.get("operator") or "").upper() + value = sf.get("value") + + if not col or op not in valid_ops: + continue + + qcol = _quote_identifier(col) + + if op == "IS_NULL": + clauses.append(f"{qcol} IS NULL") + elif op == "IS_NOT_NULL": + clauses.append(f"{qcol} IS NOT NULL") + elif op in ("IN", "NOT_IN"): + if not isinstance(value, list) or len(value) == 0: + continue + literals = ", ".join(_sql_literal(v) for v in value) + sql_op = "IN" if op == "IN" else "NOT IN" + clauses.append(f"{qcol} {sql_op} ({literals})") + elif op == "BETWEEN": + if not isinstance(value, list) or len(value) != 2: + continue + clauses.append(f"{qcol} BETWEEN {_sql_literal(value[0])} AND {_sql_literal(value[1])}") + else: + sql_ops = { + "EQ": "=", "NEQ": "!=", "GT": ">", "GTE": ">=", + "LT": "<", "LTE": "<=", "LIKE": "LIKE", "ILIKE": "ILIKE", + } + clauses.append(f"{qcol} {sql_ops[op]} {_sql_literal(value)}") + + return clauses + # -- get_metadata ------------------------------------------------------ def get_metadata(self, path: list[str]) -> dict[str, Any]: @@ -388,8 +655,14 @@ def fetch_data_as_arrow( detail = self._client.get_dataset_detail(token, dataset_id) db_id, schema, base_sql = _build_dataset_sql(detail) + # Build WHERE clauses from source_filters + where_clauses = self._build_source_filter_clauses(opts.get("source_filters")) + # Build SQL - full_sql = f"SELECT * FROM ({base_sql}) AS _src LIMIT {size}" + if where_clauses: + full_sql = f"SELECT * FROM ({base_sql}) AS _src WHERE {' AND '.join(where_clauses)} LIMIT {size}" + else: + full_sql = f"SELECT * FROM ({base_sql}) AS _src LIMIT {size}" # Execute 
via SQL Lab sql_session = self._client.create_sql_session(token) @@ -406,6 +679,70 @@ def fetch_data_as_arrow( col_data = {col: [row.get(col) for row in rows] for col in columns} return pa.table(col_data) + # -- list_tables_tree (override) ---------------------------------------- + + def list_tables_tree(self, table_filter: str | None = None) -> dict: + """Build nested tree using ls() instead of list_tables(). + + Dashboards become ``table_group`` leaf nodes (with tables and + source_filters in metadata). "All Datasets" remains a namespace + with child table nodes that include full metadata (columns, sample_rows). + """ + root_nodes = self.ls(path=[], filter=table_filter) + tree: list[dict] = [] + + # For "All Datasets", use the eager list_tables() which fetches + # columns and sample_rows per dataset (needed for table preview). + all_datasets_meta: dict[str, dict] | None = None + + for node in root_nodes: + d = { + "name": node.name, + "node_type": node.node_type, + "path": node.path, + "metadata": node.metadata, + } + if node.node_type == "namespace": + # Lazily fetch full metadata for All Datasets namespace + if all_datasets_meta is None: + try: + full_tables = self.list_tables(table_filter=table_filter) + all_datasets_meta = {} + for t in full_tables: + # Key by dataset name for lookup + name = t["name"].split(":", 1)[-1] if ":" in t["name"] else t["name"] + # Only keep entries under "All Datasets" + if t.get("path") and t["path"][0] == "All Datasets": + all_datasets_meta[name] = t.get("metadata") or {} + except Exception: + all_datasets_meta = {} + + # Expand namespace children with enriched metadata + child_nodes = self.ls(path=node.path, filter=table_filter) + d["children"] = [] + for cn in child_nodes: + enriched_meta = {**(cn.metadata or {})} + # Merge full metadata (columns, sample_rows) if available + full_meta = all_datasets_meta.get(cn.name, {}) + if full_meta: + enriched_meta.update(full_meta) + d["children"].append({ + "name": cn.name, + "node_type": 
cn.node_type,
+                        "path": cn.path,
+                        "metadata": enriched_meta,
+                    })
+            else:
+                # table_group: no children in tree
+                d["children"] = []
+            tree.append(d)
+
+        return {
+            "hierarchy": self.catalog_hierarchy(),
+            "effective_hierarchy": self.effective_hierarchy(),
+            "tree": tree,
+        }
+
     # -- helpers -----------------------------------------------------------
 
     def _fetch_all_datasets(self, token: str) -> list[dict]:
diff --git a/py-src/data_formulator/datalake/__init__.py b/py-src/data_formulator/datalake/__init__.py
index ff90a275..f92b3acb 100644
--- a/py-src/data_formulator/datalake/__init__.py
+++ b/py-src/data_formulator/datalake/__init__.py
@@ -39,6 +39,7 @@
     WorkspaceWithTempData,
     get_data_formulator_home,
     get_default_workspace_root,
+    get_user_home,
 )
 from data_formulator.datalake.workspace_manager import WorkspaceManager
 from data_formulator.datalake.azure_blob_workspace import AzureBlobWorkspace
@@ -92,6 +93,7 @@
     "GlobalCacheManager",
     "get_data_formulator_home",
     "get_default_workspace_root",
+    "get_user_home",
     "WorkspaceManager",
     # Metadata
     "TableMetadata",
diff --git a/py-src/data_formulator/datalake/workspace.py b/py-src/data_formulator/datalake/workspace.py
index df6ce242..b00a1927 100644
--- a/py-src/data_formulator/datalake/workspace.py
+++ b/py-src/data_formulator/datalake/workspace.py
@@ -84,6 +84,30 @@ def get_default_workspace_root() -> Path:
     return get_data_formulator_home() / "workspaces"
 
 
+def get_user_home(identity_id: str) -> Path:
+    """Return the per-user home directory: DATA_FORMULATOR_HOME/users/<identity_id>/.
+
+    Shared helper used by workspace_factory, data_connector, and any
+    code that needs per-user storage paths.
+    """
+    safe_id = _sanitize_identity_id(identity_id)
+    return get_data_formulator_home() / "users" / safe_id
+
+
+def _sanitize_identity_id(identity_id: str) -> str:
+    """Sanitize identity_id for use as a directory name.
+
+    Uses ``secure_filename`` to produce a safe single-component name.
+    Raises ``ValueError`` if the result is empty or too long.
+ """ + if len(identity_id) > 256: + raise ValueError("identity_id too long") + result = secure_filename(identity_id) + if not result: + raise ValueError("identity_id sanitized to empty string") + return result + + def cleanup_stale_temp_files(workspace_path: Path, max_age_hours: int = 24) -> int: """ Remove stale temporary files from workspace directory. @@ -204,18 +228,11 @@ def __init__(self, identity_id: str, root_dir: Optional[str | Path] = None, *, w @staticmethod def _sanitize_identity_id(identity_id: str) -> str: - """ - Sanitize identity_id for use as a directory name. + """Sanitize identity_id for use as a directory name. - Uses ``secure_filename`` to produce a safe single-component name. - Raises ``ValueError`` if the result is empty or too long. + Delegates to module-level :func:`_sanitize_identity_id`. """ - if len(identity_id) > 256: - raise ValueError("identity_id too long") - result = secure_filename(identity_id) - if not result: - raise ValueError("identity_id sanitized to empty string") - return result + return _sanitize_identity_id(identity_id) def _init_metadata(self) -> None: """Initialize a new workspace with empty metadata.""" diff --git a/py-src/data_formulator/workspace_factory.py b/py-src/data_formulator/workspace_factory.py index fe3c0454..7a2eb1b1 100644 --- a/py-src/data_formulator/workspace_factory.py +++ b/py-src/data_formulator/workspace_factory.py @@ -21,7 +21,7 @@ import logging from pathlib import Path -from data_formulator.datalake.workspace import Workspace, get_data_formulator_home +from data_formulator.datalake.workspace import Workspace, get_data_formulator_home, get_user_home from data_formulator.datalake.workspace_manager import WorkspaceManager logger = logging.getLogger(__name__) @@ -77,8 +77,7 @@ def _build_azure_container_client(cfg: dict): def _get_user_workspaces_root(identity_id: str) -> Path: """Return the workspaces root for a user: /users//workspaces/.""" - safe_id = Workspace._sanitize_identity_id(identity_id) - 
return get_data_formulator_home() / "users" / safe_id / "workspaces" + return get_user_home(identity_id) / "workspaces" def _get_backend() -> str: diff --git a/src/app/App.tsx b/src/app/App.tsx index 50f69a35..b329a0d7 100644 --- a/src/app/App.tsx +++ b/src/app/App.tsx @@ -10,6 +10,8 @@ import { dfActions, dfSelectors, fetchGlobalModelList, + DEFAULT_ROW_LIMIT, + DEFAULT_ROW_LIMIT_EPHEMERAL, } from './dfSlice' import { getBrowserId } from './identity'; import { getAuthInfo, getOidcUser, getUserManager } from './oidcConfig'; @@ -470,13 +472,16 @@ const ConfigDialog: React.FC = () => { const dispatch = useDispatch(); const { t } = useTranslation(); const config = useSelector((state: DataFormulatorState) => state.config); + const isEphemeral = useSelector((state: DataFormulatorState) => state.serverConfig?.WORKSPACE_BACKEND === 'ephemeral'); + const rowLimitDefault = isEphemeral ? DEFAULT_ROW_LIMIT_EPHEMERAL : DEFAULT_ROW_LIMIT; + const rowLimitMax = DEFAULT_ROW_LIMIT; const [formulateTimeoutSeconds, setFormulateTimeoutSeconds] = useState(config.formulateTimeoutSeconds ?? 60); const [defaultChartWidth, setDefaultChartWidth] = useState(config.defaultChartWidth ?? 300); const [defaultChartHeight, setDefaultChartHeight] = useState(config.defaultChartHeight ?? 300); const [maxStretchFactor, setMaxStretchFactor] = useState(config.maxStretchFactor ?? 2.0); - const [frontendRowLimit, setFrontendRowLimit] = useState(config.frontendRowLimit ?? 50000); + const [frontendRowLimit, setFrontendRowLimit] = useState(config.frontendRowLimit ?? rowLimitDefault); const [paletteKey, setPaletteKey] = useState( (config.paletteKey && palettes[config.paletteKey]) ? config.paletteKey : defaultPaletteKey ); @@ -605,12 +610,12 @@ const ConfigDialog: React.FC = () => { input: { inputProps: { min: 100, - max: 1000000 + max: rowLimitMax } } }} - error={frontendRowLimit < 100 || frontendRowLimit > 1000000} - helperText={frontendRowLimit < 100 || frontendRowLimit > 1000000 ? 
+ error={frontendRowLimit < 100 || frontendRowLimit > rowLimitMax} + helperText={frontendRowLimit < 100 || frontendRowLimit > rowLimitMax ? t('config.localRowLimitRangeError') : ""} /> @@ -682,7 +687,7 @@ const ConfigDialog: React.FC = () => { setDefaultChartWidth(300); setDefaultChartHeight(300); setMaxStretchFactor(2.0); - setFrontendRowLimit(50000); + setFrontendRowLimit(rowLimitDefault); setPaletteKey(defaultPaletteKey); }}>{t('session.resetToDefault')} @@ -692,7 +697,7 @@ const ConfigDialog: React.FC = () => { || isNaN(defaultChartWidth) || defaultChartWidth <= 0 || defaultChartWidth > 1000 || isNaN(defaultChartHeight) || defaultChartHeight <= 0 || defaultChartHeight > 1000 || isNaN(maxStretchFactor) || maxStretchFactor < 1 || maxStretchFactor > 5 - || isNaN(frontendRowLimit) || frontendRowLimit < 100 || frontendRowLimit > 1000000} + || isNaN(frontendRowLimit) || frontendRowLimit < 100 || frontendRowLimit > rowLimitMax} onClick={() => { dispatch(dfActions.setConfig({formulateTimeoutSeconds, defaultChartWidth, defaultChartHeight, maxStretchFactor, frontendRowLimit, paletteKey})); setOpen(false); diff --git a/src/app/dfSlice.tsx b/src/app/dfSlice.tsx index 430e1d0a..c4a9a1df 100644 --- a/src/app/dfSlice.tsx +++ b/src/app/dfSlice.tsx @@ -93,6 +93,9 @@ export type FocusedId = | { type: 'report'; reportId: string } | undefined; +export const DEFAULT_ROW_LIMIT = 2_000_000; +export const DEFAULT_ROW_LIMIT_EPHEMERAL = 20_000; + export interface ClientConfig { formulateTimeoutSeconds: number; defaultChartWidth: number; @@ -225,7 +228,7 @@ const initialState: DataFormulatorState = { defaultChartWidth: 400, defaultChartHeight: 300, maxStretchFactor: 2.0, - frontendRowLimit: 50000, + frontendRowLimit: DEFAULT_ROW_LIMIT, paletteKey: 'fluent', }, @@ -603,6 +606,10 @@ export const dataFormulatorSlice = createSlice({ }, setServerConfig: (state, action: PayloadAction) => { state.serverConfig = action.payload; + // Auto-adjust frontendRowLimit for ephemeral mode if still at 
default + if (action.payload.WORKSPACE_BACKEND === 'ephemeral' && state.config.frontendRowLimit === DEFAULT_ROW_LIMIT) { + state.config.frontendRowLimit = DEFAULT_ROW_LIMIT_EPHEMERAL; + } }, setConfig: (state, action: PayloadAction) => { state.config = action.payload; diff --git a/src/app/tableThunks.ts b/src/app/tableThunks.ts index 266b1574..a080d730 100644 --- a/src/app/tableThunks.ts +++ b/src/app/tableThunks.ts @@ -16,7 +16,7 @@ import { createAsyncThunk } from '@reduxjs/toolkit'; import { DataSourceConfig, DictTable } from '../components/ComponentType'; import { Type } from '../data/types'; import { inferTypeFromValueArray } from '../data/utils'; -import { fetchWithIdentity, getUrls, getConnectorUrls, computeContentHash } from './utils'; +import { fetchWithIdentity, getUrls, CONNECTOR_ACTION_URLS, computeContentHash } from './utils'; import { DataFormulatorState, dfActions, fetchFieldSemanticType } from './dfSlice'; import { tableDataDB } from './workspaceDB'; @@ -47,6 +47,7 @@ export interface LoadTablePayload { rowLimit?: number; sortColumns?: string[]; sortOrder?: 'asc' | 'desc'; + conditions?: { column: string; operator: string; value?: any }[]; }; } @@ -75,7 +76,7 @@ export const loadTable = createAsyncThunk< async (payload, { dispatch, getState }) => { const { table, file, replaceSource, sourceTableName, connectorId, importOptions } = payload; const state = getState(); - const frontendRowLimit = state.config?.frontendRowLimit ?? 50000; + const frontendRowLimit = state.config?.frontendRowLimit ?? 
2_000_000; const existingTables = state.tables; // Storage determined by backend config @@ -131,8 +132,9 @@ export const loadTable = createAsyncThunk< if (sourceType === 'database' && sourceTableName && connectorId) { // Database source: ingest to workspace via data connector try { - const ingestUrl = getConnectorUrls(connectorId).DATA_IMPORT; + const ingestUrl = CONNECTOR_ACTION_URLS.IMPORT_DATA; const ingestBody = { + connector_id: connectorId, source_table: sourceTableName, table_name: sourceTableName, import_options: importOptions || {}, @@ -230,10 +232,11 @@ export const loadTable = createAsyncThunk< if (sourceType === 'database' && connectorId && sourceTableName) { // Database source: fetch data via data connector preview (no workspace save) try { - const response = await fetchWithIdentity(getConnectorUrls(connectorId).DATA_PREVIEW, { + const response = await fetchWithIdentity(CONNECTOR_ACTION_URLS.PREVIEW_DATA, { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ + connector_id: connectorId, source_table: sourceTableName, import_options: { size: frontendRowLimit, diff --git a/src/app/useDataRefresh.tsx b/src/app/useDataRefresh.tsx index 41ee25ef..80e28e62 100644 --- a/src/app/useDataRefresh.tsx +++ b/src/app/useDataRefresh.tsx @@ -7,7 +7,7 @@ import { DataFormulatorState, dfActions, selectRefreshConfigs } from './dfSlice' import { AppDispatch } from './store'; import { DictTable } from '../components/ComponentType'; import { createTableFromText } from '../data/utils'; -import { fetchWithIdentity, getUrls, getConnectorUrls, computeContentHash } from './utils'; +import { fetchWithIdentity, getUrls, CONNECTOR_ACTION_URLS, computeContentHash } from './utils'; /** Gzip-compress a string into a Blob using the browser's CompressionStream API. 
 */
async function compressBlob(data: string): Promise<Blob> {
@@ -123,10 +123,10 @@
 
         try {
             console.log(`[DataRefresh] Requesting connector '${connectorId}' to refresh "${tableName}"...`);
-            const refreshResponse = await fetchWithIdentity(getConnectorUrls(connectorId).DATA_REFRESH, {
+            const refreshResponse = await fetchWithIdentity(CONNECTOR_ACTION_URLS.REFRESH_DATA, {
                 method: 'POST',
                 headers: { 'Content-Type': 'application/json' },
-                body: JSON.stringify({ table_name: tableName })
+                body: JSON.stringify({ connector_id: connectorId, table_name: tableName })
             });
 
             const refreshData = await refreshResponse.json();
diff --git a/src/app/utils.tsx b/src/app/utils.tsx
index e0374f77..0cbcca73 100644
--- a/src/app/utils.tsx
+++ b/src/app/utils.tsx
@@ -55,6 +55,9 @@ export function getUrls() {
         // Workspace summary (auto-naming)
         WORKSPACE_SUMMARY: `/api/agent/workspace-summary`,
 
+        // NL-to-filter
+        NL_TO_FILTER: `/api/agent/nl-to-filter`,
+
         // Refresh data endpoint
         REFRESH_DERIVED_DATA: `/api/agent/refresh-derived-data`,
 
@@ -72,23 +75,27 @@
 }
 
 /**
- * Build API URLs for a DataConnector by connector ID.
+ * Static API URLs for connector actions.
+ * All action routes accept `connector_id` in the JSON body.
*/ -export function getConnectorUrls(connectorId: string) { - const base = `/api/connectors/${connectorId}`; - return { - AUTH_CONNECT: `${base}/auth/connect`, - AUTH_DISCONNECT: `${base}/auth/disconnect`, - AUTH_STATUS: `${base}/auth/status`, - AUTH_TOKEN_CONNECT: `${base}/auth/token-connect`, - CATALOG_LS: `${base}/catalog/ls`, - CATALOG_METADATA: `${base}/catalog/metadata`, - CATALOG_LIST_TABLES: `${base}/catalog/list_tables`, - DATA_IMPORT: `${base}/data/import`, - DATA_REFRESH: `${base}/data/refresh`, - DATA_PREVIEW: `${base}/data/preview`, - }; -} +export const CONNECTOR_ACTION_URLS = { + CONNECT: '/api/connectors/connect', + GET_STATUS: '/api/connectors/get-status', + GET_CATALOG: '/api/connectors/get-catalog', + GET_CATALOG_TREE: '/api/connectors/get-catalog-tree', + IMPORT_DATA: '/api/connectors/import-data', + REFRESH_DATA: '/api/connectors/refresh-data', + PREVIEW_DATA: '/api/connectors/preview-data', + IMPORT_GROUP: '/api/connectors/import-group', +} as const; + +/** Global connector management URLs. */ +export const CONNECTOR_URLS = { + DATA_LOADERS: '/api/data-loaders', + LIST: '/api/connectors', + CREATE: '/api/connectors', + DELETE: (id: string) => `/api/connectors/${id}`, +} as const; /** * Get the current namespaced identity from the Redux store, or fall back to browser ID. 
diff --git a/src/i18n/locales/en/common.json b/src/i18n/locales/en/common.json index a09448c1..89f63fe1 100644 --- a/src/i18n/locales/en/common.json +++ b/src/i18n/locales/en/common.json @@ -106,7 +106,7 @@ "maxRepairAttemptsHint": "How many attempts LLM will make to repair code if code fails to execute (recommended = 1, higher values might increase the chance of success but it's slow).", "colorTheme": "Color Theme", "localRowLimit": "local-only row limit", - "localRowLimitRangeError": "Value must be between 100 and 1,000,000 rows", + "localRowLimitRangeError": "Value must be between 100 and 2,000,000 rows", "localRowLimitHint": "Maximum number of rows kept when loading data locally (not stored on server).", "maxStretchFactor": "max chart stretch factor", "maxStretchFactorRangeError": "Value must be between 1.0 and 5.0", @@ -318,7 +318,9 @@ "tierAuth": "Sign in", "tierFilter": "Scope", "tierAuthOr": "or", - "tierAuthManual": "Enter credentials manually" + "tierAuthManual": "Enter credentials manually", + "selectTableFromTree": "Select a table from the tree to preview", + "noTablesFound": "No tables found" }, "dataThread": { "title": "Data Threads", diff --git a/src/i18n/locales/en/upload.json b/src/i18n/locales/en/upload.json index 18efce4d..ca15ea38 100644 --- a/src/i18n/locales/en/upload.json +++ b/src/i18n/locales/en/upload.json @@ -40,6 +40,15 @@ "loadLocalData": "Load local data", "localData": "Local data", "orConnectToDataSource": "Or connect to a data source (with optional auto-refresh)", + "addConnection": "Add Connection", + "addConnectionDesc": "Connect to a new database or data service", + "connectorConnected": "Connected", + "connectorDisconnected": "Click to connect", + "pickDataSourceType": "Choose a data source type to create a new connection.", + "nameYourConnection": "Name your {{type}} connection.", + "connectionName": "Connection name", + "createConnection": "Create Connection", + "creating": "Creating...", "extractFromDocuments": "Extract from 
Documents", "addData": "Add Data", "loadDataIn": "Load data in", diff --git a/src/i18n/locales/zh/common.json b/src/i18n/locales/zh/common.json index ffb88758..903633cc 100644 --- a/src/i18n/locales/zh/common.json +++ b/src/i18n/locales/zh/common.json @@ -106,7 +106,7 @@ "maxRepairAttemptsHint": "当代码执行失败时,LLM 可尝试修复的次数(建议为 1,较高值可能提高成功率但会更慢)。", "colorTheme": "颜色主题", "localRowLimit": "本地数据行数上限", - "localRowLimitRangeError": "取值必须在 100 到 1,000,000 行之间", + "localRowLimitRangeError": "取值必须在 100 到 2,000,000 行之间", "localRowLimitHint": "本地加载数据时保留的最大行数(不存储到服务端)。", "maxStretchFactor": "图表最大拉伸系数", "maxStretchFactorRangeError": "取值必须在 1.0 到 5.0 之间", @@ -311,7 +311,9 @@ "loadTableSubset": "加载表子集", "loadTableBtn": "加载表", "rememberCredentials": "记住凭据", - "connectionTimeout": "连接超时。请检查凭据后重试。" + "connectionTimeout": "连接超时。请检查凭据后重试。", + "selectTableFromTree": "从树中选择一个表来预览", + "noTablesFound": "未找到表" }, "dataThread": { "title": "数据线程", diff --git a/src/i18n/locales/zh/upload.json b/src/i18n/locales/zh/upload.json index b6160009..7c998b0b 100644 --- a/src/i18n/locales/zh/upload.json +++ b/src/i18n/locales/zh/upload.json @@ -40,6 +40,15 @@ "loadLocalData": "加载本地数据", "localData": "本地数据", "orConnectToDataSource": "或连接数据源(支持自动刷新)", + "addConnection": "添加连接", + "addConnectionDesc": "连接到新的数据库或数据服务", + "connectorConnected": "已连接", + "connectorDisconnected": "点击连接", + "pickDataSourceType": "选择数据源类型以创建新连接。", + "nameYourConnection": "为您的 {{type}} 连接命名。", + "connectionName": "连接名称", + "createConnection": "创建连接", + "creating": "创建中...", "extractFromDocuments": "从文档提取", "addData": "添加数据", "loadDataIn": "数据加载到", diff --git a/src/views/DBTableManager.tsx b/src/views/DBTableManager.tsx index 8b95c33b..4798e431 100644 --- a/src/views/DBTableManager.tsx +++ b/src/views/DBTableManager.tsx @@ -3,29 +3,20 @@ import React, { useState, useEffect, useCallback, FC, useRef, useMemo } from 're import { useTranslation } from 'react-i18next'; import { Card, - CardContent, Typography, Button, - Grid, Box, 
IconButton, - Paper, TextField, Divider, - SxProps, CircularProgress, - ButtonGroup, ToggleButton, ToggleButtonGroup, MenuItem, - Menu, - Chip, Checkbox, FormControlLabel, styled, useTheme, - Link, - alpha, Tooltip, } from '@mui/material'; @@ -33,61 +24,132 @@ import SearchIcon from '@mui/icons-material/Search'; import Autocomplete from '@mui/material/Autocomplete'; -import { getUrls, getConnectorUrls, fetchWithIdentity } from '../app/utils'; +import { getUrls, CONNECTOR_ACTION_URLS, fetchWithIdentity } from '../app/utils'; import { borderColor } from '../app/tokens'; import { CustomReactTable } from './ReactTable'; -import { DataSourceConfig, DictTable } from '../components/ComponentType'; -import { Type } from '../data/types'; +import { DictTable } from '../components/ComponentType'; import { useDispatch, useSelector } from 'react-redux'; -import { dfActions, dfSelectors } from '../app/dfSlice'; +import { dfActions } from '../app/dfSlice'; import { DataFormulatorState } from '../app/dfSlice'; import { fetchFieldSemanticType } from '../app/dfSlice'; -import { loadTable } from '../app/tableThunks'; +import { loadTable, buildDictTableFromWorkspace } from '../app/tableThunks'; import { AppDispatch } from '../app/store'; import Markdown from 'markdown-to-jsx'; import CheckIcon from '@mui/icons-material/Check'; -import CleaningServicesIcon from '@mui/icons-material/CleaningServices'; -import MoreVertIcon from '@mui/icons-material/MoreVert'; -import UploadFileIcon from '@mui/icons-material/UploadFile'; -import DownloadIcon from '@mui/icons-material/Download'; -import RestartAltIcon from '@mui/icons-material/RestartAlt'; -import CloudUploadIcon from '@mui/icons-material/CloudUpload'; -import ClearIcon from '@mui/icons-material/Clear'; +import FolderOutlinedIcon from '@mui/icons-material/FolderOutlined'; +import DashboardOutlinedIcon from '@mui/icons-material/DashboardOutlined'; +import RefreshIcon from '@mui/icons-material/Refresh'; +import { TableIcon } from '../icons'; 
+import { SimpleTreeView } from '@mui/x-tree-view/SimpleTreeView'; +import { TreeItem, treeItemClasses } from '@mui/x-tree-view/TreeItem'; -export const handleDBDownload = async (identityId: string) => { - try { - const response = await fetchWithIdentity( - getUrls().DOWNLOAD_DB_FILE, - { method: 'GET' } - ); - - // Check if the response is ok - if (!response.ok) { - const errorData = await response.json(); - throw new Error(errorData.error || errorData.message || 'Failed to download database file'); +// ---------- Catalog tree types & helpers ---------- + +/** A node returned by the catalog/tree endpoint */ +interface CatalogTreeNode { + name: string; + node_type: 'namespace' | 'table' | 'table_group'; + path: string[]; + metadata: Record | null; + children?: CatalogTreeNode[]; +} + +/** A source filter definition from the backend (e.g. Superset native filter). */ +interface SourceFilter { + name: string; + column: string; + input_type: 'select' | 'numeric' | 'time' | 'text'; + column_type: string; + multi: boolean; + required: boolean; + default_value?: unknown; + applies_to?: number[]; + options?: string[]; +} + +/** Collect all namespace item IDs for default-expanded state */ +function collectNamespaceIds(nodes: CatalogTreeNode[]): string[] { + const ids: string[] = []; + for (const n of nodes) { + if (n.node_type === 'namespace') { + ids.push(n.path.join('/')); + if (n.children) ids.push(...collectNamespaceIds(n.children)); + } + } + return ids; +} + +/** Find a node by path in the catalog tree */ +function findNodeByPath(nodes: CatalogTreeNode[], itemId: string): CatalogTreeNode | null { + for (const n of nodes) { + if (n.path.join('/') === itemId) return n; + if (n.children) { + const found = findNodeByPath(n.children, itemId); + if (found) return found; } + } + return null; +} - // Get the blob directly from response - const blob = await response.blob(); - const url = URL.createObjectURL(blob); - - // Create a temporary link element - const link = 
document.createElement('a'); - link.href = url; - link.download = `df_${identityId?.slice(0, 4) || 'db'}.db`; - document.body.appendChild(link); - - // Trigger download - link.click(); - - // Clean up - document.body.removeChild(link); - URL.revokeObjectURL(url); - } catch (error) { - throw error; +/** Styled TreeItem — clean, compact, GitHub-flavoured. */ +const StyledTreeItem = styled(TreeItem)(({ theme }) => ({ + [`& .${treeItemClasses.groupTransition}`]: { + marginLeft: 12, + paddingLeft: 8, + borderLeft: `1px solid ${theme.palette.divider}`, + }, + [`& > .${treeItemClasses.content}`]: { + padding: '2px 6px', + borderRadius: 6, + gap: 4, + [`& .${treeItemClasses.iconContainer}`]: { + width: 16, minWidth: 16, + color: theme.palette.text.disabled, + }, + // Hide the empty icon container on leaf items (no expand/collapse arrow) + [`& .${treeItemClasses.iconContainer}:empty`]: { + display: 'none', + }, + [`& .${treeItemClasses.label}`]: { + fontSize: 13, + }, + '&:hover': { backgroundColor: theme.palette.action.hover }, + }, + [`& > .${treeItemClasses.content}.Mui-selected`]: { + backgroundColor: theme.palette.action.selected, + fontWeight: 500, + '&:hover': { backgroundColor: theme.palette.action.selected }, + }, +})) as typeof TreeItem; + +// ---------- End catalog tree ---------- + + +export const handleDBDownload = async (identityId: string) => { + const response = await fetchWithIdentity( + getUrls().DOWNLOAD_DB_FILE, + { method: 'GET' } + ); + + if (!response.ok) { + const errorData = await response.json(); + throw new Error(errorData.error || errorData.message || 'Failed to download database file'); } + + const blob = await response.blob(); + const url = URL.createObjectURL(blob); + + const link = document.createElement('a'); + link.href = url; + link.download = `df_${identityId?.slice(0, 4) || 'db'}.db`; + document.body.appendChild(link); + + link.click(); + + document.body.removeChild(link); + URL.revokeObjectURL(url); }; interface DBTable { @@ -111,18 
+173,6 @@ interface DBTable { } | null; } -interface ColumnStatistics { - column: string; - type: string; - statistics: { - count: number; - unique_count: number; - null_count: number; - min?: number; - max?: number; - avg?: number; - }; -} export const DBManagerPane: React.FC<{ @@ -132,10 +182,8 @@ export const DBManagerPane: React.FC<{ const theme = useTheme(); const dispatch = useDispatch(); - const identity = useSelector((state: DataFormulatorState) => state.identity); const tables = useSelector((state: DataFormulatorState) => state.tables); const serverConfig = useSelector((state: DataFormulatorState) => state.serverConfig); - const dataLoaderConnectParams = useSelector((state: DataFormulatorState) => state.dataLoaderConnectParams); // Disabled data sources (missing deps) from app-config const disabledSources = serverConfig.DISABLED_SOURCES ?? {}; @@ -322,7 +370,7 @@ export const DBManagerPane: React.FC<{ {/* Data source forms (connected + available) */} {allSources.map((source) => ( selectedDataLoader === source.source_id && ( - + { setConnectedIds(prev => new Set([...prev, source.source_id])); }} - onDisconnected={() => { - setConnectedIds(prev => { - const next = new Set(prev); - next.delete(source.source_id); - return next; - }); - }} /> ) @@ -426,6 +467,276 @@ export const DBManagerPane: React.FC<{ } +// --------------------------------------------------------------------------- +// GroupLoadPanel — right panel for table_group nodes (BI dashboards) +// --------------------------------------------------------------------------- + +const GroupLoadPanel: React.FC<{ + groupName: string; + tables: { name: string; dataset_id: number; row_count?: number; columns?: string[] }[]; + sourceFilters: SourceFilter[]; + frontendRowLimit: number; + rowLimitPresets: number[]; + connectorId: string; + loadedKey?: string; + onLoaded: (label: string) => void; + onImport: () => void; + onFinish: (severity: "error" | "success", msg: string, tableIds?: string[]) => void; +}> = 
({ groupName, tables, sourceFilters, frontendRowLimit, rowLimitPresets, connectorId, loadedKey, onLoaded, onImport, onFinish }) => {
+    const { t } = useTranslation();
+    const dispatch = useDispatch<AppDispatch>();
+
+    // Filter values state — keyed by filter name
+    const [filterValues, setFilterValues] = useState<Record<string, any>>(() => {
+        const defaults: Record<string, any> = {};
+        for (const f of sourceFilters) {
+            if (f.default_value != null) defaults[f.name] = f.default_value;
+        }
+        return defaults;
+    });
+
+    // Row limit
+    const [rowLimit, setRowLimit] = useState(-1);
+
+    // Loading state
+    const [isLoading, setIsLoading] = useState(false);
+
+    const totalRows = tables.reduce((sum, t) => sum + (t.row_count ?? 0), 0);
+
+    const handleLoadGroup = async () => {
+        setIsLoading(true);
+        onImport();
+        try {
+            // Build source_filters payload from user-selected values
+            const appliedFilters: { column: string; operator: string; value: any; applies_to?: number[] }[] = [];
+            for (const f of sourceFilters) {
+                const val = filterValues[f.name];
+                if (val == null || val === '' || (Array.isArray(val) && val.length === 0)) continue;
+                if (f.multi && Array.isArray(val)) {
+                    appliedFilters.push({ column: f.column, operator: 'IN', value: val, applies_to: f.applies_to });
+                } else if (f.input_type === 'numeric') {
+                    appliedFilters.push({ column: f.column, operator: 'EQ', value: val, applies_to: f.applies_to });
+                } else {
+                    appliedFilters.push({ column: f.column, operator: 'EQ', value: val, applies_to: f.applies_to });
+                }
+            }
+
+            const resp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.IMPORT_GROUP, {
+                method: 'POST',
+                headers: { 'Content-Type': 'application/json' },
+                body: JSON.stringify({
+                    connector_id: connectorId,
+                    tables: tables.map(t => ({ dataset_id: t.dataset_id, name: t.name })),
+                    row_limit: rowLimit > 0 ?
rowLimit : -1, + source_filters: appliedFilters, + group_name: groupName, + }), + }); + const data = await resp.json(); + + if (data.status === 'success') { + const results: any[] = data.results || []; + const succeeded = results.filter(r => r.status === 'success'); + const failed = results.filter(r => r.status === 'error'); + + // Fetch workspace table list to get full data for loaded tables + const listResp = await fetchWithIdentity(getUrls().LIST_TABLES, { method: 'GET' }); + const listData = await listResp.json(); + if (listData.status === 'success') { + for (const r of succeeded) { + const wsTable = (listData.tables || []).find((t: any) => t.name === r.table_name); + if (wsTable) { + const source = { + type: 'database' as const, + databaseTable: r.table_name, + canRefresh: true, + lastRefreshed: Date.now(), + connectorId, + }; + const tableObj = buildDictTableFromWorkspace(wsTable, source); + dispatch(dfActions.addTableToStore(tableObj)); + dispatch(fetchFieldSemanticType(tableObj)); + } + } + } + + onLoaded('loaded'); + if (failed.length > 0) { + onFinish("error", `Loaded ${succeeded.length} tables, ${failed.length} failed`); + } else { + onFinish("success", `Loaded ${succeeded.length} tables from "${groupName}"`, + succeeded.map(r => r.table_name)); + } + } else { + throw new Error(data.message || 'Failed to load group'); + } + } catch (err: any) { + onFinish("error", err.message || 'Failed to load dashboard'); + } finally { + setIsLoading(false); + } + }; + + return ( + + {/* Header */} + + + {groupName} + + {tables.length} {tables.length === 1 ? 
'table' : 'tables'} + {totalRows > 0 && ` · ~${totalRows.toLocaleString()} rows`} + + + + {/* Scrollable content */} + + {/* Tables list */} + + + Tables + + + {tables.map((tbl, idx) => ( + + + + {tbl.name} + {tbl.row_count != null && ( + + {Number(tbl.row_count).toLocaleString()} rows + + )} + {tbl.columns && ( + + {tbl.columns.length} cols + + )} + + {tbl.columns && tbl.columns.length > 0 && ( + + {tbl.columns.join(', ')} + + )} + + ))} + + + + {/* Source Filters */} + {sourceFilters.length > 0 && ( + + + Filters + + + {sourceFilters.map((f, idx) => ( + + + {f.name} + {f.required && *} + + {f.input_type === 'select' ? ( + setFilterValues(prev => ({ ...prev, [f.name]: newVal }))} + sx={{ flex: 1, '& .MuiInputBase-root': { fontSize: 11, minHeight: 28, py: '0px !important' } }} + renderInput={(params) => } + slotProps={{ popper: { sx: { '& .MuiAutocomplete-option': { fontSize: 11, minHeight: 28 } } } }} + /> + ) : f.input_type === 'numeric' ? ( + setFilterValues(prev => ({ ...prev, [f.name]: e.target.value ? Number(e.target.value) : '' }))} + placeholder={f.column} + sx={{ flex: 1, '& .MuiInputBase-root': { fontSize: 11, height: 28 } }} + /> + ) : ( + setFilterValues(prev => ({ ...prev, [f.name]: e.target.value }))} + placeholder={f.column} + sx={{ flex: 1, '& .MuiInputBase-root': { fontSize: 11, height: 28 } }} + /> + )} + + ))} + + + )} + + + {/* Load controls — pinned at bottom */} + + {loadedKey ? ( + + ) : ( + <> + Rows/table + ({ label: n.toLocaleString(), value: n })), + { label: 'All', value: -1 }, + ]} + value={rowLimit === -1 + ? { label: 'All', value: -1 } + : { label: rowLimit.toLocaleString(), value: rowLimit } + } + onChange={(_e, newVal) => { + if (newVal == null) return; + if (typeof newVal === 'string') { + const v = parseInt(newVal.replace(/,/g, '')); + if (!isNaN(v) && v > 0) setRowLimit(v); + } else { + setRowLimit(newVal.value); + } + }} + getOptionLabel={(opt) => typeof opt === 'string' ? 
opt : opt.label} + isOptionEqualToValue={(opt, val) => opt.value === val.value} + disableClearable + sx={{ width: 100, '& .MuiInputBase-root': { fontSize: 11, height: 28, py: '0px !important' } }} + renderInput={(params) => } + slotProps={{ popper: { sx: { '& .MuiAutocomplete-option': { fontSize: 11, minHeight: 28 } } } }} + /> + + + + )} + + + ); +}; + export const DataLoaderForm: React.FC<{ dataLoaderType: string, paramDefs: {name: string, default?: string, type: string, required: boolean, description?: string, sensitive?: boolean, tier?: 'connection' | 'auth' | 'filter'}[], @@ -437,22 +748,42 @@ export const DataLoaderForm: React.FC<{ onImport: () => void, onFinish: (status: "success" | "error", message: string, importedTables?: string[]) => void, onConnected?: () => void, - onDisconnected?: () => void, -}> = ({dataLoaderType, paramDefs, authInstructions, connectorId, autoConnect, delegatedLogin, authMode, onImport, onFinish, onConnected, onDisconnected}) => { + /** Called when the user clicks Delete. Receives the connectorId. */ + onDelete?: (connectorId: string) => void, + /** Called before the connect step. Returns the effective connectorId to use. + * Used by AddConnectionPanel to create the connector before connecting. */ + onBeforeConnect?: (params: Record) => Promise, +}> = ({dataLoaderType, paramDefs, authInstructions, connectorId, autoConnect, delegatedLogin, authMode, onImport, onFinish, onConnected, onDelete, onBeforeConnect}) => { const { t } = useTranslation(); const dispatch = useDispatch(); const theme = useTheme(); + // Effective connectorId — may be updated by onBeforeConnect (e.g. AddConnectionPanel) + const connectorIdRef = useRef(connectorId); + useEffect(() => { connectorIdRef.current = connectorId; }, [connectorId]); const params = useSelector((state: DataFormulatorState) => state.dataLoaderConnectParams[dataLoaderType] ?? {}); - const frontendRowLimit = useSelector((state: DataFormulatorState) => state.config?.frontendRowLimit ?? 
50000); + const frontendRowLimit = useSelector((state: DataFormulatorState) => state.config?.frontendRowLimit ?? 2_000_000); const workspaceTables = useSelector((state: DataFormulatorState) => state.tables); const [tableMetadata, setTableMetadata] = useState<Record<string, any>>({}); const [selectedPreviewTable, setSelectedPreviewTable] = useState<string | null>(null); - // Import mode for the currently selected table - const [importMode, setImportMode] = useState<'full' | 'subset'>('full'); - const [subsetConfig, setSubsetConfig] = useState<{ rowLimit: number; sortColumns: string[]; sortOrder: 'asc' | 'desc' }>({ rowLimit: 1000, sortColumns: [], sortOrder: 'asc' }); + // Catalog tree state (hierarchical browsing) + const [catalogTree, setCatalogTree] = useState<CatalogTreeNode[]>([]); + const [selectedTreeNode, setSelectedTreeNode] = useState<CatalogTreeNode | null>(null); + const [expandedItems, setExpandedItems] = useState<string[]>([]); + // Import options for the currently selected table + // Standard row-limit presets, capped by the system frontendRowLimit setting + const rowLimitPresets = useMemo( + () => [1000, 5000, 10000, 50000, 100000, 200000, 500000, 1000000].filter(n => n <= frontendRowLimit), + [frontendRowLimit], + ); + const [loadConfig, setLoadConfig] = useState<{ + limit: number; + sortColumn: string; + sortOrder: 'asc' | 'desc'; + }>({ limit: frontendRowLimit, sortColumn: '', sortOrder: 'desc' }); + // Track which tables have been loaded and how (persists across table selections) - const [loadedTables, setLoadedTables] = useState<Record<string, 'full' | 'subset'>>({}); + const [loadedTables, setLoadedTables] = useState<Record<string, string>>({}); // Cross-reference workspace tables with database tables to detect already-loaded ones const workspaceLoadedTables = useMemo(() => { @@ -497,20 +828,62 @@ export const DataLoaderForm: React.FC<{ // Connection timeout in milliseconds (30 seconds) const CONNECTION_TIMEOUT_MS = 30_000; + // Helper: extract flat table metadata from the tree for preview/load logic + const extractTableMetadata = useCallback((tree: CatalogTreeNode[]) => { + const 
result: Record<string, any> = {}; + const walk = (nodes: CatalogTreeNode[]) => { + for (const n of nodes) { + if (n.node_type === 'table') { + // Use the path-based key so duplicate table names under different namespaces stay distinct + const key = n.path.join('/'); + result[key] = { ...n.metadata, _catalogName: n.name, _catalogPath: n.path }; + } else if (n.node_type === 'table_group') { + const key = n.path.join('/'); + result[key] = { ...n.metadata, _catalogName: n.name, _catalogPath: n.path, _isGroup: true }; + } + if (n.children) walk(n.children); + } + }; + walk(tree); + return result; + }, []); + + // Helper: fetch catalog tree and update state + const fetchCatalogTree = useCallback(async (filter?: string) => { + const treeResp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.GET_CATALOG_TREE, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ connector_id: connectorIdRef.current, filter: filter?.trim() || null }), + }); + const treeData = await treeResp.json(); + if (treeData.tree) { + setCatalogTree(treeData.tree); + setExpandedItems(collectNamespaceIds(treeData.tree)); + const flatMeta = extractTableMetadata(treeData.tree); + setTableMetadata(flatMeta); + return treeData; + } else if (treeData.status === 'error') { + throw new Error(treeData.message || 'Failed to load catalog tree'); + } + return treeData; + }, [extractTableMetadata]); + // Helper: connect and list tables via data connector const connectAndListTables = useCallback(async (filter?: string) => { setIsConnecting(true); const controller = new AbortController(); const timeoutId = setTimeout(() => controller.abort(), CONNECTION_TIMEOUT_MS); try { - const sourceId = connectorId!; - const urls = getConnectorUrls(sourceId); // Strip table_filter from params sent to connect (it's for catalog browsing, not connection) const { table_filter: _tf, ...connectParams } = mergedParams as Record<string, any>; - const connectResp = await fetchWithIdentity(urls.AUTH_CONNECT, { + // If 
onBeforeConnect is provided (e.g. AddConnectionPanel), create the connector first + if (onBeforeConnect) { + connectorIdRef.current = await onBeforeConnect(connectParams); + } + const connectResp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.CONNECT, { method: 'POST', headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ params: connectParams, persist: persistCredentials }), + body: JSON.stringify({ connector_id: connectorIdRef.current, params: connectParams, persist: persistCredentials }), signal: controller.signal, }); clearTimeout(timeoutId); @@ -518,22 +891,10 @@ export const DataLoaderForm: React.FC<{ if (connectData.status !== 'connected') { throw new Error(connectData.message || 'Connection failed'); } - // List tables before promoting to "connected" state + // Fetch catalog tree before promoting to "connected" state const tableFilterValue = filter ?? (mergedParams as Record<string, any>).table_filter ?? ''; - const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ filter: tableFilterValue?.trim() || null }), - }); - const listData = await listResp.json(); - if (listData.tables) { - setTableMetadata(Object.fromEntries( - listData.tables.map((t: any) => [t.name, t.metadata]) - )); - } else if (listData.status === 'error') { - throw new Error(listData.message || 'Failed to list tables'); - } - // Only promote to "connected" after tables are loaded + await fetchCatalogTree(tableFilterValue); + // Only promote to "connected" after tree is loaded onConnected?.(); } catch (error: any) { clearTimeout(timeoutId); @@ -545,15 +906,26 @@ export const DataLoaderForm: React.FC<{ } finally { setIsConnecting(false); } - }, [connectorId, mergedParams, persistCredentials, onFinish, onConnected, t]); + }, [mergedParams, persistCredentials, onFinish, onConnected, onBeforeConnect, fetchCatalogTree, t]); // Delegated (popup-based) login flow for token-based 
connectors - const popupRef = useRef<Window | null>(null); const pollTimerRef = useRef<ReturnType<typeof setInterval> | null>(null); - const handleDelegatedLogin = useCallback(() => { - if (!delegatedLogin?.login_url || !connectorId) return; + const handleDelegatedLogin = useCallback(async () => { + if (!delegatedLogin?.login_url) return; setIsConnecting(true); + try { + // If onBeforeConnect is provided (e.g. AddConnectionPanel), create the connector first + if (onBeforeConnect) { + const { table_filter: _tf, ...connectParams } = mergedParams as Record<string, any>; + connectorIdRef.current = await onBeforeConnect(connectParams); + } + if (!connectorIdRef.current) { setIsConnecting(false); return; } + } catch (err: any) { + onFinish('error', err.message || 'Failed to create connector'); + setIsConnecting(false); + return; + } const url = new URL(delegatedLogin.login_url, window.location.origin); url.searchParams.set('df_origin', window.location.origin); @@ -579,7 +951,6 @@ export const DataLoaderForm: React.FC<{ setIsConnecting(false); return; } - popupRef.current = popup; const handler = async (event: MessageEvent) => { if (event.data?.type !== 'df-sso-auth') return; @@ -590,12 +961,13 @@ export const DataLoaderForm: React.FC<{ const { access_token, refresh_token, user } = event.data; if (access_token) { try { - const urls = getConnectorUrls(connectorId); // Send tokens to backend token-connect endpoint - const connectResp = await fetchWithIdentity(urls.AUTH_TOKEN_CONNECT, { + const connectResp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.CONNECT, { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ + connector_id: connectorIdRef.current, + mode: 'token', access_token, refresh_token, user, @@ -607,18 +979,8 @@ export const DataLoaderForm: React.FC<{ if (connectData.status !== 'connected') { throw new Error(connectData.message || 'Token connection failed'); } - // List tables - const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { - method: 'POST', - headers: { 'Content-Type': 'application/json' 
}, - body: JSON.stringify({ filter: null }), - }); - const listData = await listResp.json(); - if (listData.tables) { - setTableMetadata(Object.fromEntries( - listData.tables.map((t: any) => [t.name, t.metadata]) - )); - } + // Fetch catalog tree + await fetchCatalogTree(); onConnected?.(); } catch (err: any) { onFinish("error", err.message || 'Login failed'); @@ -636,37 +998,41 @@ setIsConnecting(false); } }, 1000); - }, [delegatedLogin, connectorId, params, persistCredentials, onFinish, onConnected, t]); + }, [delegatedLogin, mergedParams, persistCredentials, onFinish, onConnected, onBeforeConnect, fetchCatalogTree, t]); // Auto-connect on mount if this source has stored vault credentials. - // Uses auth/status which auto-reconnects from vault, then lists tables. + // Checks status via the connector status endpoint, then reconnects from stored vault credentials and fetches the catalog tree. const autoConnectTriggered = useRef(false); useEffect(() => { - if (autoConnect && connectorId && !autoConnectTriggered.current && Object.keys(tableMetadata).length === 0) { + if (autoConnect && connectorIdRef.current && !autoConnectTriggered.current && Object.keys(tableMetadata).length === 0) { autoConnectTriggered.current = true; (async () => { setIsConnecting(true); try { - const urls = getConnectorUrls(connectorId); - // auth/status triggers auto-reconnect from vault - const statusResp = await fetchWithIdentity(urls.AUTH_STATUS, { method: 'GET' }); + // Check current connection status (no side effects) + const statusResp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.GET_STATUS, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ connector_id: connectorIdRef.current }), + }); const statusData = await statusResp.json(); if (statusData.connected) { - // Already connected / reconnected from vault — list tables - const listResp = await fetchWithIdentity(urls.CATALOG_LIST_TABLES, { + // Already connected — fetch catalog tree + await fetchCatalogTree(); + } else if (statusData.has_stored_credentials) { + // Vault has creds — 
attempt reconnect + const connectResp = await fetchWithIdentity(CONNECTOR_ACTION_URLS.CONNECT, { method: 'POST', headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ filter: null }), + body: JSON.stringify({ connector_id: connectorIdRef.current, params: {}, persist: true }), }); - const listData = await listResp.json(); - if (listData.tables) { - setTableMetadata(Object.fromEntries( - listData.tables.map((t: any) => [t.name, t.metadata]) - )); + const connectData = await connectResp.json(); + if (connectData.status === 'connected') { + await fetchCatalogTree(); } } } catch (err) { - console.warn('Auto-connect failed for', connectorId, err); + console.warn('Auto-connect failed for', connectorIdRef.current, err); } finally { setIsConnecting(false); } @@ -682,12 +1048,14 @@ export const DataLoaderForm: React.FC<{ } }, [tableMetadata]); - // Reset import mode when switching tables + // Reset load config when switching tables useEffect(() => { if (selectedPreviewTable && tableMetadata[selectedPreviewTable]) { - setImportMode('full'); - const metadata = tableMetadata[selectedPreviewTable]; - setSubsetConfig({ rowLimit: Math.min(1000, metadata.row_count || 1000), sortColumns: [], sortOrder: 'asc' }); + const rowCount = tableMetadata[selectedPreviewTable].row_count || 0; + // Default to All unless the table exceeds the system row limit + const defaultLimit = rowCount > frontendRowLimit ? 
frontendRowLimit : -1; + setLoadConfig({ limit: defaultLimit, sortColumn: '', sortOrder: 'desc' }); + } }, [selectedPreviewTable]); @@ -716,271 +1084,132 @@ export const DataLoaderForm: React.FC<{ const tableNames = Object.keys(tableMetadata); - let tableMetadataBox = [ - // Tables as chips + preview below - tableNames.length > 0 && ( - - {/* Table chips */} - - {tableNames.map((tableName) => { - const metadata = tableMetadata[tableName]; - const isSelected = tableName === selectedPreviewTable; - const loaded = effectiveLoadedTables[tableName]; - return ( - setSelectedPreviewTable(tableName)} - icon={loaded ? : undefined} - sx={{ - cursor: 'pointer', - fontSize: 11, - height: 26, - borderRadius: 1, - ...(loaded === 'full' ? { - backgroundColor: alpha(theme.palette.success.main, 0.12), - borderColor: alpha(theme.palette.success.main, 0.5), - color: theme.palette.success.dark, - '& .MuiChip-icon': { color: theme.palette.success.main }, - } : loaded === 'subset' ? { - backgroundColor: alpha('#f9a825', 0.15), - borderColor: alpha('#f9a825', 0.5), - color: '#e65100', - '& .MuiChip-icon': { color: '#f9a825' }, - } : isSelected ? { - backgroundColor: alpha(theme.palette.primary.main, 0.12), - borderColor: alpha(theme.palette.primary.main, 0.5), - color: theme.palette.primary.main, - } : {}), - border: '1px solid', - borderColor: loaded === 'full' - ? alpha(theme.palette.success.main, 0.5) - : loaded === 'subset' - ? alpha('#f9a825', 0.5) - : isSelected - ? alpha(theme.palette.primary.main, 0.5) - : 'rgba(0,0,0,0.15)', - '&:hover': { - backgroundColor: loaded === 'full' - ? alpha(theme.palette.success.main, 0.18) - : loaded === 'subset' - ? 
alpha('#f9a825', 0.22) - : alpha(theme.palette.primary.main, 0.08), - }, - }} - /> - ); - })} - + // Handler for selecting a table node from the catalog tree + const handleTreeTableSelect = useCallback((node: CatalogTreeNode) => { + setSelectedTreeNode(node); + const pathKey = node.path.join('/'); + setSelectedPreviewTable(pathKey); + }, []); - {/* Preview + load controls */} - {previewTable && selectedPreviewTable && ( - - - ({ - id: name, - label: name, - minWidth: 60, - }))} - rowsPerPageNum={-1} - compact={false} - isIncompleteTable={previewTable.rows.length > 12} - maxHeight={240} - /> - - - {tableMetadata[selectedPreviewTable]?.row_count > 0 - ? t('db.rowsCount', { count: tableMetadata[selectedPreviewTable].row_count.toLocaleString() }) - : t('db.sampleRowsCount', { count: previewTable.rows.length }) - } × {previewTable.names.length} {t('db.columns')} - + // The source_table identifier for import: use the original name from list_tables() + // For flat sources this is the table name; for hierarchical sources it's the dotted path (e.g. "schema.table") + const getSourceTableName = useCallback((pathKey: string): string => { + const meta = tableMetadata[pathKey]; + if (meta?._source_name) return meta._source_name; + if (meta?._catalogName) return meta._catalogName; + // Fallback: last segment of the path + return pathKey.split('/').pop() || pathKey; + }, [tableMetadata]); - {/* Load controls */} - - {/* Subset option - hidden when already loaded */} - {!effectiveLoadedTables[selectedPreviewTable] && - setImportMode(e.target.checked ? 'subset' : 'full')} - size="small" - sx={{ p: 0.25 }} - /> - setImportMode(importMode === 'subset' ? 
'full' : 'subset')} - > - {t('db.loadSubset')} - - {importMode === 'subset' && selectedPreviewTable && tableMetadata[selectedPreviewTable] && (() => { - const metadata = tableMetadata[selectedPreviewTable]; - return ( - <> - - {t('db.rowsLabel')} - { - const value = parseInt(e.target.value) || 1; - const maxRows = metadata.row_count || 100000; - setSubsetConfig(prev => ({ ...prev, rowLimit: Math.min(Math.max(1, value), maxRows) })); - }} - slotProps={{ input: { inputProps: { min: 1, max: metadata.row_count || 100000, step: 100 } } }} - sx={{ width: 90, '& .MuiInputBase-root': { fontSize: 11, height: 26 }, '& .MuiInputBase-input': { py: 0.25, px: 0.75 } }} - /> - / {(metadata.row_count || '?').toLocaleString()} - - - {t('app.sort')}: - col.name)} - value={subsetConfig.sortColumns} - onChange={(_, newValue) => setSubsetConfig(prev => ({ ...prev, sortColumns: newValue }))} - renderInput={(params) => ( - - )} - renderTags={(value, getTagProps) => - value.map((option, index) => ( - - )) - } - slotProps={{ paper: { sx: { fontSize: 12, '& .MuiAutocomplete-option': { fontSize: 12, py: 0.5, minHeight: 28 } } } }} - sx={{ flex: 1, minWidth: 0 }} - /> - {subsetConfig.sortColumns.length > 0 && ( - { if (v) setSubsetConfig(prev => ({ ...prev, sortOrder: v })); }} - size="small" - sx={{ height: 24 }} - > - - - - )} - - - ); - })()} - } - {/* Load Table button */} - {effectiveLoadedTables[selectedPreviewTable] ? 
( - - - - - ) : ( - - )} + /** Shared helper: build DictTable + dispatch loadTable */ + const doLoadTable = useCallback((importOptions: Record, label?: string) => { + const pathKey = selectedPreviewTable; + if (!pathKey) return; + const meta = tableMetadata[pathKey]; + if (!meta) return; + + const sourceTableName = getSourceTableName(pathKey); + const sampleRows = meta.sample_rows || []; + const columns = meta.columns || []; + const tableObj: DictTable = { + kind: 'table' as const, + id: sourceTableName.split('.').pop() || sourceTableName, + displayId: sourceTableName, + names: columns.map((c: any) => c.name), + metadata: columns.reduce((acc: Record, col: any) => ({ + ...acc, + [col.name]: { type: 'string' as any, semanticType: '', levels: [] } + }), {}), + rows: sampleRows, + virtual: { tableId: sourceTableName.split('.').pop() || sourceTableName, rowCount: meta.row_count || sampleRows.length }, + anchored: true, + attachedMetadata: '', + source: { + type: 'database' as const, + databaseTable: pathKey, + canRefresh: true, + lastRefreshed: Date.now(), + connectorId: connectorIdRef.current, + }, + }; + + onImport(); + dispatch(loadTable({ + table: tableObj, + connectorId: connectorIdRef.current, + sourceTableName, + importOptions, + })).unwrap() + .then((result) => { + setLoadedTables(prev => ({ ...prev, [pathKey]: label || 'loaded' })); + onFinish("success", `Loaded table "${sourceTableName}"`, [result.table.id]); + }) + .catch((error) => { + console.error('Failed to load data:', error); + onFinish("error", `Failed to load "${sourceTableName}": ${error}`); + }); + }, [selectedPreviewTable, tableMetadata, getSourceTableName, onImport, onFinish, dispatch]); + + + const isConnected = catalogTree.length > 0 || Object.keys(tableMetadata).length > 0; + + /** Recursively render CatalogTreeNode[] as styled TreeItem elements */ + const countBadgeSx = { + fontSize: 11, color: 'text.disabled', bgcolor: 'action.selected', + borderRadius: 10, px: 0.8, lineHeight: '18px', 
flexShrink: 0, + fontVariantNumeric: 'tabular-nums', minWidth: 22, textAlign: 'center', + } as const; + + const renderCatalogTreeItems = (nodes: CatalogTreeNode[], loadedMap: Record, expandedSet: Set): React.ReactNode => + nodes.map((node) => { + const itemId = node.path.join('/'); + const isTable = node.node_type === 'table'; + const isGroup = node.node_type === 'table_group'; + const loaded = isTable ? loadedMap[node.name] || loadedMap[itemId] : undefined; + const groupLoaded = isGroup ? loadedMap[itemId] : undefined; + const childCount = !isTable && !isGroup ? (node.children?.length ?? 0) : 0; + const tableCount = isGroup ? (node.metadata?.tables?.length ?? 0) : 0; + const isExpanded = expandedSet.has(itemId); + + const labelContent = ( + + {isGroup + ? + : isTable + ? + : + } + + {node.name} + + {(loaded || groupLoaded) && } + {isTable && node.metadata?.row_count != null && ( + + {Number(node.metadata.row_count).toLocaleString()} + + )} + {isGroup && tableCount > 0 && ( + + {tableCount} - - )} - - ), - ] + )} + {childCount > 0 && !isExpanded && ( + + {childCount} + + )} + + ); - const isConnected = Object.keys(tableMetadata).length > 0; + return ( + + {!isGroup && node.children && renderCatalogTreeItems(node.children, loadedMap, expandedSet)} + + ); + }); return ( - + {isConnecting && } {isConnected ? 
( - // Connected state: show connection info + table browser - - {/* Header: source name · connection params · disconnect */} - + // Connected state: tree browser (left) + table detail (right) + + {/* Header: source name · connection params · delete */} + {dataLoaderType} @@ -1002,74 +1231,294 @@ export const DataLoaderForm: React.FC<{ ))} - + {onDelete && connectorIdRef.current && ( + + )} - {/* Search bar: filter + refresh in a pill-shaped container */} - - - { - if (e.key === 'Enter') { - const val = (e.target as HTMLInputElement).value; - dispatch(dfActions.updateDataLoaderConnectParam({dataLoaderType, paramName: 'table_filter', paramValue: val})); - connectAndListTables(val); - } - }} - inputRef={filterInputRef} - /> - - + {/* Main content: tree (left) + detail (right) */} + + {/* Left: catalog tree */} + + {/* Inline search */} + + + { + if (e.key === 'Enter') { + const val = (e.target as HTMLInputElement).value; + dispatch(dfActions.updateDataLoaderConnectParam({dataLoaderType, paramName: 'table_filter', paramValue: val})); + connectAndListTables(val); + } + }} + inputRef={filterInputRef} + /> + { + const val = filterInputRef.current?.value ?? params.table_filter ?? ''; + dispatch(dfActions.updateDataLoaderConnectParam({dataLoaderType, paramName: 'table_filter', paramValue: val})); + connectAndListTables(val); + }} + > + + + + + + {catalogTree.length > 0 ? 
( + setExpandedItems(itemIds)} + selectedItems={selectedPreviewTable} + onSelectedItemsChange={(_event, itemId) => { + if (itemId == null) return; + const node = findNodeByPath(catalogTree, itemId); + if (node && (node.node_type === 'table' || node.node_type === 'table_group')) { + handleTreeTableSelect(node); + } + }} + itemChildrenIndentation={0} + sx={{ px: 0.5 }} + > + {renderCatalogTreeItems(catalogTree, effectiveLoadedTables, new Set(expandedItems))} + + ) : ( + + {t('db.noTablesFound')} + + )} + + + + {/* Right: table detail + preview + load controls */} + + {/* Group load panel for table_group nodes */} + {selectedPreviewTable && tableMetadata[selectedPreviewTable]?._isGroup ? (() => { + const metadata = tableMetadata[selectedPreviewTable]; + const groupName = metadata._catalogName || selectedPreviewTable; + const tables: any[] = metadata.tables || []; + const sourceFilters: SourceFilter[] = metadata.source_filters || []; + return ( + setLoadedTables(prev => ({ ...prev, [selectedPreviewTable!]: label }))} + onImport={onImport} + onFinish={onFinish} + /> + ); + })() : previewTable && selectedPreviewTable && tableMetadata[selectedPreviewTable] ? (() => { + const metadata = tableMetadata[selectedPreviewTable]; + const displayName = metadata?._catalogName || selectedPreviewTable.split('/').pop() || selectedPreviewTable; + return ( + + {/* Table header */} + + + {displayName} + + {selectedTreeNode && selectedTreeNode.path.length > 1 && ( + + {selectedTreeNode.path.slice(0, -1).join(' / ')} + + )} + + {/* Summary line */} + + {metadata?.row_count > 0 + ? 
t('db.rowsCount', { count: Number(metadata.row_count).toLocaleString() }) + : t('db.sampleRowsCount', { count: previewTable.rows.length }) + } × {previewTable.names.length} {t('db.columns')} + + {/* Preview table — scrolls when tall, shrink-wraps when short */} + + + ({ + id: name, + label: name, + minWidth: 60, + }))} + rowsPerPageNum={-1} + compact={false} + isIncompleteTable={previewTable.rows.length > 20} + /> + + + + {/* Load & filter panel — pinned below table */} + + {effectiveLoadedTables[selectedPreviewTable] ? ( + /* Already loaded */ + + + + + ) : ( + /* Not yet loaded — show options */ + <> + {/* Row 1: Limit + Sort + Load Button */} + + {metadata?.row_count > 1000 && (<> + Rows + n <= metadata.row_count && n <= frontendRowLimit).map(n => ({ + label: n.toLocaleString(), value: n, + })), + { label: 'All', value: -1 }, + ]} + value={loadConfig.limit === -1 + ? { label: 'All', value: -1 } + : { label: loadConfig.limit.toLocaleString(), value: loadConfig.limit } + } + onChange={(_e, newVal) => { + if (newVal == null) return; + if (typeof newVal === 'string') { + const v = parseInt(newVal.replace(/,/g, '')); + if (!isNaN(v) && v > 0) setLoadConfig(prev => ({ ...prev, limit: v })); + } else { + setLoadConfig(prev => ({ ...prev, limit: newVal.value })); + } + }} + onInputChange={(_e, inputVal, reason) => { + if (reason !== 'input') return; + const v = parseInt(inputVal.replace(/,/g, '')); + if (!isNaN(v) && v > 0) setLoadConfig(prev => ({ ...prev, limit: v })); + }} + getOptionLabel={(opt) => typeof opt === 'string' ? 
opt : opt.label} + isOptionEqualToValue={(opt, val) => opt.value === val.value} + disableClearable + sx={{ width: 110, '& .MuiInputBase-root': { fontSize: 11, height: 28, py: '0px !important' }, '& .MuiInputBase-input': { px: 0.75 } }} + renderInput={(params) => } + slotProps={{ popper: { sx: { '& .MuiAutocomplete-option': { fontSize: 11, minHeight: 28 } } } }} + /> + + Sort + setLoadConfig(prev => ({ ...prev, sortColumn: e.target.value }))} + slotProps={{ select: { displayEmpty: true } }} + sx={{ width: 110, '& .MuiInputBase-root': { fontSize: 11, height: 28 }, '& .MuiSelect-select': { py: 0.25, px: 0.75 } }} + > + none + {(metadata.columns || []).map((col: any) => ( + {col.name} + ))} + + {loadConfig.sortColumn && ( + { if (v) setLoadConfig(prev => ({ ...prev, sortOrder: v })); }} + size="small" sx={{ height: 28 }} + > + ASC + DESC + + )} + )} + + + + + + )} + + + ); + })() : ( + + + {tableNames.length > 0 ? t('db.selectTableFromTree') : t('db.noTablesFound')} + + + )} + - - {tableMetadataBox} ) : ( // Not connected: show connection forms <> - - {dataLoaderType} - + {!onBeforeConnect && ( + + {dataLoaderType} + + )} {(() => { const hasTiers = paramDefs.some(p => p.tier); // Section wrapper: subtle background, rounded, with label @@ -1083,7 +1532,7 @@ export const DataLoaderForm: React.FC<{ '& .MuiInputLabel-root': { fontSize: 11, color: 'text.secondary', fontWeight: 500 }, '& .MuiInputLabel-root.Mui-focused': { color: 'primary.main' }, }; - const shrinkProps = { shrink: true }; + const labelShrinkSlotProps = { inputLabel: { shrink: true } }; // Pick 2 or 3 columns to minimise orphan fields on the last row const balancedCols = (n: number) => { if (n <= 2) return 2; @@ -1095,13 +1544,13 @@ export const DataLoaderForm: React.FC<{ // Legacy: no tier field, render flat grid const cols = balancedCols(paramDefs.length); return ( - + {paramDefs.map((paramDef) => ( { const cols = balancedCols(tierParams.length); return ( - + {tierParams.map((paramDef) => ( p.tier === 
'filter'); const authParams = paramDefs.filter(p => p.tier === 'auth'); const hasDelegated = !!delegatedLogin?.login_url; + const connectLabel = onBeforeConnect + ? t('db.createConnector', { defaultValue: 'Create Connector' }) + : t('db.connect', { suffix: (params.table_filter || '').trim() ? t('db.withFilter') : '' }); return ( <> @@ -1199,13 +1651,13 @@ export const DataLoaderForm: React.FC<{ {/* Right: credential fields + connect */} - + {authParams.map((paramDef) => ( connectAndListTables()}> - {t('db.connect', { suffix: (params.table_filter || '').trim() ? t('db.withFilter') : '' })} + {connectLabel} @@ -1246,7 +1698,7 @@ export const DataLoaderForm: React.FC<{ variant="contained" color="primary" size="small" sx={{ textTransform: "none", minWidth: 80, height: 30, mt: 1.5, fontSize: 12 }} onClick={() => connectAndListTables()}> - {t('db.connect', { suffix: (params.table_filter || '').trim() ? t('db.withFilter') : '' })} + {connectLabel} )} @@ -1294,7 +1746,19 @@ export const DataLoaderForm: React.FC<{ })}> {authInstructions.trim()} - )} + )} + {onDelete && connectorIdRef.current && ( + + + + )} + )} ); diff --git a/src/views/DataFormulator.tsx b/src/views/DataFormulator.tsx index 6cadd391..80ffb1b2 100644 --- a/src/views/DataFormulator.tsx +++ b/src/views/DataFormulator.tsx @@ -40,7 +40,7 @@ import { DataThread } from './DataThread'; import dfLogo from '../assets/df-logo.png'; import exampleImageTable from "../assets/example-image-table.png"; import { ModelSelectionButton } from './ModelSelectionDialog'; -import { UnifiedDataUploadDialog, UploadTabType, DataLoadMenu } from './UnifiedDataUploadDialog'; +import { UnifiedDataUploadDialog, UploadTabType, DataLoadMenu, ConnectorInstance } from './UnifiedDataUploadDialog'; import { ReportView } from './ReportView'; import GitHubIcon from '@mui/icons-material/GitHub'; import YouTubeIcon from '@mui/icons-material/YouTube'; @@ -48,7 +48,7 @@ import { ExampleSession, exampleSessions, ExampleSessionCard } from 
'./ExampleSe import { useDataRefresh, useDerivedTableRefresh } from '../app/useDataRefresh'; import type { DictTable } from '../components/ComponentType'; import { useTranslation } from 'react-i18next'; -import { fetchWithIdentity, getUrls } from '../app/utils'; +import { fetchWithIdentity, getUrls, CONNECTOR_URLS } from '../app/utils'; import { listWorkspaces, loadWorkspace, deleteWorkspace, exportWorkspace, importWorkspace } from '../app/workspaceService'; import { AppDispatch } from '../app/store'; import Card from '@mui/material/Card'; @@ -92,6 +92,15 @@ export const DataFormulatorFC = ({ }) => { } }, [focusedId, tables, dispatch]); + // ── Connector instances (for landing page menu) ───────────── + const [pageConnectors, setPageConnectors] = useState([]); + useEffect(() => { + fetchWithIdentity(CONNECTOR_URLS.LIST, { method: 'GET' }) + .then(r => r.json()) + .then(data => setPageConnectors(data.connectors || [])) + .catch(() => {}); + }, []); + // ── Workspace list (shown on landing page) ──────────────────── const [savedWorkspaces, setSavedWorkspaces] = useState<{id: string, display_name: string, saved_at: string | null}[]>([]); const [confirmDeleteWs, setConfirmDeleteWs] = useState(null); @@ -497,6 +506,7 @@ export const DataFormulatorFC = ({ }) => { onSelectTab={(tab) => openUploadDialog(tab)} serverConfig={serverConfig} variant="page" + connectors={pageConnectors} /> {/* ── Saved workspaces section ──────────────────────────── */} diff --git a/src/views/DataThread.tsx b/src/views/DataThread.tsx index e17ba09b..4f7ed158 100644 --- a/src/views/DataThread.tsx +++ b/src/views/DataThread.tsx @@ -1193,7 +1193,7 @@ let SingleThreadGroupView: FC<{ const clarifyDraft = draftNodes.find(d => d.derive?.status === 'clarifying' && d.derive.trigger.tableId === tableId); const clarifyInteraction = clarifyDraft?.derive?.trigger?.interaction; if (clarifyInteraction && clarifyInteraction.length > 0) { - pushInteractionEntries(clarifyInteraction, tableId, triggerType, 
highlighted, 'agent-clarify-entry', { isClarifying: false }); + pushInteractionEntries(clarifyInteraction, tableId, triggerType, highlighted, 'agent-clarify-entry', { isClarifying: false, tableId }); // Mark the last clarify entry with isClarifying so the gutter shows the hourglass const lastItem = timelineItems[timelineItems.length - 1]; if (lastItem?.interactionEntry?.role === 'clarify') lastItem.isClarifying = true; @@ -1203,6 +1203,7 @@ let SingleThreadGroupView: FC<{ type: 'chart', highlighted, isClarifying: true, + tableId, element: {t('dataThread.waitingForClarification')}, }); } @@ -1547,8 +1548,15 @@ let SingleThreadGroupView: FC<{ ? getEntryGutterIcon(entry, iconColor) : getDefaultGutterIcon(iconColor); + // Clarification items are clickable to focus on the associated table + const clarifyClickHandler = (item.isClarifying && item.tableId) + ? () => dispatch(dfActions.setFocused({ type: 'table', tableId: item.tableId! })) + : undefined; + return ( - + { const draftId = `draft-${actionId}-${Date.now()}`; dispatch(dfActions.createDraftNode({ @@ -455,6 +456,7 @@ export const SimpleChartRecBox: FC = function () { actionId, })); currentDraftId = draftId; + currentDraftParentTableId = parentTableId; currentDraftInteraction = [...initialInteraction]; return draftId; }; @@ -464,6 +466,7 @@ export const SimpleChartRecBox: FC = function () { currentDraftId = pendingClarification.draftId; // Seed local accumulator from the existing draft's interaction (fresh at this point) const existingDraft = draftNodes.find(d => d.id === pendingClarification.draftId); + currentDraftParentTableId = existingDraft?.derive?.trigger?.tableId || null; currentDraftInteraction = [...(existingDraft?.derive?.trigger?.interaction || [])]; // The user reply was already appended above, add to local accumulator too currentDraftInteraction.push({ from: 'user', to: 'data-agent', role: 'prompt', content: prompt, timestamp: Date.now() }); @@ -656,6 +659,10 @@ export const SimpleChartRecBox: FC = 
function () { completedStepCount: result.completed_step_count || 0, lastCreatedTableId, }})); + // Auto-focus the table that owns the clarification so the user sees the question + if (currentDraftParentTableId) { + dispatch(dfActions.setFocused({ type: 'table', tableId: currentDraftParentTableId })); + } } setIsChatFormulating(false); agentAbortRef.current = null; diff --git a/src/views/UnifiedDataUploadDialog.tsx b/src/views/UnifiedDataUploadDialog.tsx index fded3363..f9fa2913 100644 --- a/src/views/UnifiedDataUploadDialog.tsx +++ b/src/views/UnifiedDataUploadDialog.tsx @@ -31,35 +31,33 @@ import ImageSearchIcon from '@mui/icons-material/ImageSearch'; import ExploreIcon from '@mui/icons-material/Explore'; import RestartAltIcon from '@mui/icons-material/RestartAlt'; import ArrowBackIcon from '@mui/icons-material/ArrowBack'; +import AddIcon from '@mui/icons-material/Add'; import Paper from '@mui/material/Paper'; import CircularProgress from '@mui/material/CircularProgress'; import Backdrop from '@mui/material/Backdrop'; import { useDispatch, useSelector } from 'react-redux'; -import { DataFormulatorState, dfActions, fetchFieldSemanticType } from '../app/dfSlice'; +import { DataFormulatorState, dfActions } from '../app/dfSlice'; import { AppDispatch } from '../app/store'; import { loadTable, loadPluginTable } from '../app/tableThunks'; import { DataSourceConfig, DictTable } from '../components/ComponentType'; import { createTableFromFromObjectArray, createTableFromText, loadTextDataWrapper, loadBinaryDataWrapper, readFileText } from '../data/utils'; import { DataLoadingChat } from './DataLoadingChat'; import { DatasetSelectionView, DatasetMetadata } from './TableSelectionView'; -import { getUrls, fetchWithIdentity } from '../app/utils'; -import { DBManagerPane } from './DBTableManager'; +import { getUrls, fetchWithIdentity, CONNECTOR_URLS } from '../app/utils'; +import { DataLoaderForm } from './DBTableManager'; import { MultiTablePreview } from 
'./MultiTablePreview'; import { - ToggleButton, - ToggleButtonGroup, FormControlLabel, Switch, } from '@mui/material'; import FolderOpenIcon from '@mui/icons-material/FolderOpen'; import CloudIcon from '@mui/icons-material/Cloud'; -import OpenInNewIcon from '@mui/icons-material/OpenInNew'; import LanguageIcon from '@mui/icons-material/Language'; import { useTranslation } from 'react-i18next'; import { getEnabledPlugins, PluginHost } from '../plugins'; -export type UploadTabType = 'menu' | 'upload' | 'paste' | 'url' | 'database' | 'extract' | 'explore' | `plugin:${string}`; +export type UploadTabType = 'menu' | 'upload' | 'paste' | 'url' | 'database' | 'extract' | 'explore' | `plugin:${string}` | 'add-connection' | `connector:${string}`; interface TabPanelProps { children?: React.ReactNode; @@ -91,6 +89,7 @@ interface DataSourceCardProps { description: string; onClick: () => void; disabled?: boolean; + dashed?: boolean; } const DataSourceCard: React.FC<DataSourceCardProps> = ({ @@ -99,6 +98,7 @@ const DataSourceCard: React.FC<DataSourceCardProps> = ({ description, onClick, disabled = false, + dashed = false, }) => { const theme = useTheme(); @@ -109,7 +109,7 @@ const DataSourceCard: React.FC<DataSourceCardProps> = ({ sx={{ p: 1.5, cursor: disabled ? 'not-allowed' : 'pointer', - border: `1px solid ${borderColor.divider}`, + border: `1px ${dashed ? 'dashed' : 'solid'} ${borderColor.divider}`, borderRadius: radius.sm, opacity: disabled ?
0.5 : 1, display: 'flex', @@ -176,19 +176,38 @@ const getUniqueTableName = (baseName: string, existingNames: Set<string>): string return uniqueName; }; +/** A registered connector instance from GET /api/connectors */ +export interface ConnectorInstance { + id: string; + source_type: string; + display_name: string; + icon: string; + connected: boolean; + deletable?: boolean; + params_form: Array<{name: string; type: string; required: boolean; default?: string; description?: string; sensitive?: boolean; tier?: 'connection' | 'auth' | 'filter'}>; + pinned_params: Record<string, string>; + hierarchy: Array<{key: string; label: string}>; + effective_hierarchy: Array<{key: string; label: string}>; + auth_mode?: string; + auth_instructions?: string; + delegated_login?: { login_url: string; label?: string } | null; +} + // Reusable Data Load Menu Component export interface DataLoadMenuProps { onSelectTab: (tab: UploadTabType) => void; serverConfig?: { WORKSPACE_BACKEND?: string }; variant?: 'dialog' | 'page'; // 'dialog' uses smaller cards, 'page' uses larger cards hideSampleDatasets?: boolean; + connectors?: ConnectorInstance[]; } export const DataLoadMenu: React.FC<DataLoadMenuProps> = ({ onSelectTab, serverConfig = { WORKSPACE_BACKEND: 'local' }, variant = 'dialog', - hideSampleDatasets = false + hideSampleDatasets = false, + connectors = [], }) => { const theme = useTheme(); const { t } = useTranslation(); @@ -226,7 +245,8 @@ export const DataLoadMenu: React.FC<DataLoadMenuProps> = ({ const enabledPlugins = getEnabledPlugins((serverConfig as any)?.PLUGINS); - const liveDataSources: Array<{ value: UploadTabType; title: string; description: string; icon: React.ReactNode; disabled: boolean }> = [ + // All connectors get cards — connected ones show status, disconnected show type + const liveDataSources: Array<{ value: UploadTabType; title: string; description: string; icon: React.ReactNode; disabled: boolean; dashed?: boolean }> = [ { value: 'url' as UploadTabType, title: t('upload.loadFromUrl'), @@ -234,12 +254,24 @@ export const 
DataLoadMenu: React.FC = ({ icon: , disabled: false }, - { - value: 'database' as UploadTabType, - title: t('upload.database'), - description: t('upload.databaseDesc'), - icon: , - disabled: false + // Per-connector cards — all instances + ...connectors.map((conn) => ({ + value: `connector:${conn.id}` as UploadTabType, + title: conn.display_name, + description: conn.connected + ? t('upload.connectorConnected', { defaultValue: 'Connected' }) + : conn.source_type || t('upload.connectorDisconnected', { defaultValue: 'Not connected' }), + icon: , + disabled: false, + })), + // "Add Connection" card (dashed style) + { + value: 'add-connection' as UploadTabType, + title: t('upload.addConnection', { defaultValue: 'Add Connection' }), + description: t('upload.addConnectionDesc', { defaultValue: 'Connect to a new database or data service' }), + icon: , + disabled: false, + dashed: true, }, ...enabledPlugins.map(({ module, config }) => ({ value: `plugin:${module.id}` as UploadTabType, @@ -338,6 +370,7 @@ export const DataLoadMenu: React.FC = ({ description={source.description} onClick={() => onSelectTab(source.value)} disabled={source.disabled} + dashed={source.dashed} /> ))} @@ -444,6 +477,7 @@ export const DataLoadMenu: React.FC = ({ description={source.description} onClick={() => onSelectTab(source.value)} disabled={source.disabled} + dashed={source.dashed} /> ))} @@ -451,6 +485,216 @@ export const DataLoadMenu: React.FC = ({ ); }; +// --------------------------------------------------------------------------- +// AddConnectionPanel — left sidebar lists loader types, right shows DataLoaderForm +// --------------------------------------------------------------------------- + +interface LoaderType { + type: string; + name: string; + params: Array<{name: string; type: string; required: boolean; default?: string; description?: string; sensitive?: boolean; tier?: 'connection' | 'auth' | 'filter'}>; + hierarchy: Array<{key: string; label: string}>; + auth_mode?: string; + 
auth_instructions?: string; + delegated_login?: { login_url: string; label?: string } | null; +} + +const AddConnectionPanel: React.FC<{ + onCreated: (connector: ConnectorInstance) => void; +}> = ({ onCreated }) => { + const { t } = useTranslation(); + const [loaderTypes, setLoaderTypes] = useState<LoaderType[]>([]); + const [disabledLoaders, setDisabledLoaders] = useState<Record<string, { install_hint: string }>>({}); + const [selectedType, setSelectedType] = useState(''); + const [displayName, setDisplayName] = useState(''); + const dispatch = useDispatch(); + // Track the created connector ID so DataLoaderForm can use it + const createdIdRef = useRef<string | null>(null); + + // Fetch available loader types + useEffect(() => { + fetchWithIdentity(CONNECTOR_URLS.DATA_LOADERS, { method: 'GET' }) + .then(r => r.json()) + .then(data => { + setLoaderTypes(data.loaders || []); + setDisabledLoaders(data.disabled || {}); + if (data.loaders?.length > 0) { + setSelectedType(data.loaders[0].type); + setDisplayName(data.loaders[0].name); + } + }) + .catch(() => {}); + }, []); + + const selectedLoader = loaderTypes.find(l => l.type === selectedType); + + const handleSelectLoader = (loader: LoaderType) => { + setSelectedType(loader.type); + setDisplayName(loader.name); + createdIdRef.current = null; + }; + + // Called by DataLoaderForm before connecting — creates the connector and returns its ID + const handleBeforeConnect = useCallback(async (params: Record<string, string>): Promise<string> => { + // If already created (e.g. 
retry after failed connect), reuse the ID + if (createdIdRef.current) return createdIdRef.current; + + const resp = await fetchWithIdentity(CONNECTOR_URLS.CREATE, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ + loader_type: selectedType, + display_name: displayName.trim() || selectedLoader?.name || selectedType, + icon: selectedType, + params, + persist: true, + }), + }); + const data = await resp.json(); + if (data.status === 'error') { + throw new Error(data.message || 'Failed to create connector'); + } + createdIdRef.current = data.id; + return data.id; + }, [selectedType, displayName, selectedLoader]); + + // After DataLoaderForm successfully connects, fetch full connector info and notify parent + const handleConnected = useCallback(async () => { + const cid = createdIdRef.current; + if (!cid) return; + try { + const listResp = await fetchWithIdentity(CONNECTOR_URLS.LIST, { method: 'GET' }); + const listData = await listResp.json(); + const created = (listData.connectors || []).find((c: ConnectorInstance) => c.id === cid); + if (created) { + onCreated({ ...created, connected: true }); + dispatch(dfActions.addMessages({ + timestamp: Date.now(), component: 'connector', type: 'success', + value: `Connected to "${created.display_name}"`, + })); + } + } catch { + // Connection succeeded even if list fetch fails + } + }, [onCreated, dispatch]); + + // Shared input style + const inputSx = { + '& .MuiInput-underline:before': { borderBottomColor: 'rgba(0,0,0,0.15)' }, + '& .MuiInputBase-root': { fontSize: 12, mt: 1.5 }, + '& .MuiInputBase-input': { fontSize: 12, py: 0.5, px: 0 }, + '& .MuiInputBase-input::placeholder': { fontSize: 11, opacity: 0.45 }, + '& .MuiInputLabel-root': { fontSize: 11, color: 'text.secondary', fontWeight: 500 }, + '& .MuiInputLabel-root.Mui-focused': { color: 'primary.main' }, + }; + + // Left sidebar button style + const sidebarButtonSx = (typeKey: string) => ({ + fontSize: 12, + textTransform: 
'none' as const, + width: '100%', + justifyContent: 'flex-start', + textAlign: 'left' as const, + borderRadius: 0, + py: 1, + px: 2, + color: selectedType === typeKey ? 'primary.main' : 'text.secondary', + borderRight: selectedType === typeKey ? 2 : 0, + borderColor: 'primary.main', + }); + + return ( + + {/* Left sidebar: loader types */} + + + {t('upload.dataSourceTypes', { defaultValue: 'Data Sources' })} + + {loaderTypes.map((loader) => ( + + ))} + {Object.entries(disabledLoaders).map(([name, { install_hint }]) => ( + + + + + + ))} + + + {/* Right panel: display name + DataLoaderForm */} + + {selectedLoader ? ( + + {/* Connection name + DataLoaderForm */} + + setDisplayName(e.target.value)} + style={{ width: 280, marginBottom: 8 }} + /> + {}} + onFinish={(status, message) => { + dispatch(dfActions.addMessages({ + timestamp: Date.now(), component: 'connector', + type: status === 'success' ? 'success' : 'error', + value: message, + })); + }} + onConnected={handleConnected} + onBeforeConnect={handleBeforeConnect} + /> + + + ) : ( + + + {t('upload.selectDataSourceType', { defaultValue: 'Select a data source type' })} + + + )} + + + ); +}; + export interface UnifiedDataUploadDialogProps { open: boolean; onClose: () => void; @@ -470,13 +714,30 @@ export const UnifiedDataUploadDialog: React.FC = ( const existingTables = useSelector((state: DataFormulatorState) => state.tables); const serverConfig = useSelector((state: DataFormulatorState) => state.serverConfig); const dataCleanBlocks = useSelector((state: DataFormulatorState) => state.dataCleanBlocks); - const frontendRowLimit = useSelector((state: DataFormulatorState) => state.config?.frontendRowLimit ?? 50000); + const frontendRowLimit = useSelector((state: DataFormulatorState) => state.config?.frontendRowLimit ?? 2_000_000); const existingNames = new Set(existingTables.map(t => t.id)); const [activeTab, setActiveTab] = useState(initialTab === 'menu' ? 
'menu' : initialTab); const fileInputRef = useRef(null); const urlInputRef = useRef(null); + // Connector instances fetched from GET /api/connectors + const [connectorInstances, setConnectorInstances] = useState([]); + + // Fetch connector list when dialog opens + const refreshConnectors = useCallback(() => { + fetchWithIdentity(CONNECTOR_URLS.LIST, { method: 'GET' }) + .then(r => r.json()) + .then(data => setConnectorInstances(data.connectors || [])) + .catch(() => {}); + }, []); + + useEffect(() => { + if (open) { + refreshConnectors(); + } + }, [open, refreshConnectors]); + // Storage is determined by backend config — no user toggle const isEphemeral = serverConfig.WORKSPACE_BACKEND === 'ephemeral'; const storeOnServer = !isEphemeral; // used to decide file upload behavior @@ -1056,6 +1317,14 @@ export const UnifiedDataUploadDialog: React.FC = ( const found = enabledPluginsForDialog.find(p => p.module.id === pluginId); return found?.config.name || pluginId; } + if (activeTab.startsWith('connector:')) { + const connId = activeTab.slice(10); + const found = connectorInstances.find(c => c.id === connId); + return found?.display_name || connId; + } + if (activeTab === 'add-connection') { + return t('upload.addConnection', { defaultValue: 'Add Connection' }); + } const tabTitles: Record = { 'menu': t('upload.title'), 'explore': t('upload.sampleDatasets'), @@ -1075,9 +1344,9 @@ export const UnifiedDataUploadDialog: React.FC = ( maxWidth={false} sx={{ '& .MuiDialog-paper': { - width: 1100, + width: 1200, maxWidth: '95vw', - height: 600, + height: 700, maxHeight: '90vh', display: 'flex', flexDirection: 'column', @@ -1158,6 +1427,7 @@ export const UnifiedDataUploadDialog: React.FC = ( serverConfig={serverConfig} variant="dialog" hideSampleDatasets={hideSampleDatasets} + connectors={connectorInstances} /> @@ -1175,9 +1445,11 @@ export const UnifiedDataUploadDialog: React.FC = ( }}> = ( value={pasteContent} onChange={handleContentChange} 
placeholder={t('upload.placeholder.paste')} - InputProps={{ - readOnly: isLargeContent && !showFullContent, + slotProps={{ + input: { readOnly: isLargeContent && !showFullContent }, }} sx={{ flex: hasPasteContent ? 1 : 'none', @@ -1595,9 +1867,77 @@ export const UnifiedDataUploadDialog: React.FC = ( - {/* Database Tab */} - - + {/* Per-connector Tabs — one per registered instance */} + {connectorInstances.map((conn) => ( + + + {}} + onFinish={(status, message) => { + dispatch(dfActions.addMessages({ + timestamp: Date.now(), + component: 'connector', + type: status === 'success' ? 'success' : 'error', + value: message, + })); + }} + onConnected={() => { + setConnectorInstances(prev => + prev.map(c => c.id === conn.id ? { ...c, connected: true } : c) + ); + }} + onDelete={conn.deletable ? async (cid) => { + try { + const resp = await fetchWithIdentity(CONNECTOR_URLS.DELETE(cid), { method: 'DELETE' }); + const data = await resp.json(); + if (data.status === 'deleted') { + setConnectorInstances(prev => prev.filter(c => c.id !== cid)); + setActiveTab('menu'); + dispatch(dfActions.addMessages({ + timestamp: Date.now(), component: 'connector', type: 'success', + value: `Deleted connector "${conn.display_name}"`, + })); + } else { + dispatch(dfActions.addMessages({ + timestamp: Date.now(), component: 'connector', type: 'error', + value: data.message || 'Failed to delete connector', + })); + } + } catch (err: any) { + dispatch(dfActions.addMessages({ + timestamp: Date.now(), component: 'connector', type: 'error', + value: err.message || 'Failed to delete connector', + })); + } + } : undefined} + /> + + + ))} + + {/* Add Connection Tab */} + + { + // Update connector list — card will appear on menu + setConnectorInstances(prev => { + const exists = prev.find(c => c.id === newConnector.id); + if (exists) { + return prev.map(c => c.id === newConnector.id ? 
newConnector : c); + } + return [...prev, newConnector]; + }); + // Jump to the new connector's tab + setActiveTab(`connector:${newConnector.id}` as UploadTabType); + }} + /> {/* Plugin Tabs */} diff --git a/tests/backend/unit/test_all_loader_verification.py b/tests/backend/unit/test_all_loader_verification.py index c50fc348..6eab5419 100644 --- a/tests/backend/unit/test_all_loader_verification.py +++ b/tests/backend/unit/test_all_loader_verification.py @@ -205,22 +205,22 @@ def test_all_loaders_can_be_wrapped(self): assert len(cfg["params_form"]) > 0 def test_all_loaders_blueprints_have_all_routes(self): - """Each wrapped loader should have 9 routes.""" + """The shared connectors blueprint should have all expected action routes.""" import flask - from data_formulator.data_connector import DataConnector - expected_suffixes = [ - "/auth/connect", "/auth/disconnect", "/auth/status", - "/catalog/ls", "/catalog/metadata", "/catalog/list_tables", - "/data/import", "/data/refresh", "/data/preview", + from data_formulator.data_connector import connectors_bp + expected_routes = [ + "/api/connectors/connect", + "/api/connectors/get-status", + "/api/connectors/get-catalog", + "/api/connectors/get-catalog-tree", + "/api/connectors/import-data", + "/api/connectors/refresh-data", + "/api/connectors/preview-data", + "/api/connectors/import-group", ] - for key, cls in _get_available_loaders().items(): - app = flask.Flask(__name__) - app.config["TESTING"] = True - source = DataConnector.from_loader(cls, source_id=key) - app.register_blueprint(source.create_blueprint()) - rules = [rule.rule for rule in app.url_map.iter_rules()] - for suffix in expected_suffixes: - expected_rule = f"/api/connectors/{key}{suffix}" - assert expected_rule in rules, ( - f"{key}: missing route {expected_rule}" - ) + app = flask.Flask(__name__) + app.config["TESTING"] = True + app.register_blueprint(connectors_bp) + rules = [rule.rule for rule in app.url_map.iter_rules()] + for route in expected_routes: + 
assert route in rules, f"missing shared route {route}" diff --git a/tests/backend/unit/test_data_connector_config.py b/tests/backend/unit/test_data_connector_config.py index f8fd5420..6e8a266a 100644 --- a/tests/backend/unit/test_data_connector_config.py +++ b/tests/backend/unit/test_data_connector_config.py @@ -4,14 +4,15 @@ """Tests for config-driven data source registration. Covers: -- YAML configuration parsing and source spec generation +- connectors.yaml parsing and source spec generation (admin + user) - Environment variable parsing (DF_SOURCES____=) -- Auto-discovery of installed loaders -- Config priority: env > YAML > auto-discovery +- Config priority: env > connectors.yaml - Multiple instances of the same loader type -- DF_AUTO_DISCOVER_SOURCES=false - ${ENV_REF} expansion in params - register_data_connectors() end-to-end +- _ensure_connectors_loaded() lazy user hydration +- Admin connector immutability +- User connector persistence (save/load/remove) """ from __future__ import annotations @@ -28,10 +29,14 @@ DATA_CONNECTORS, DataConnector, SourceSpec, - _build_source_specs, - _load_yaml_config, - _parse_env_sources, + _ADMIN_CONNECTOR_IDS, + _LOADED_USER_IDENTITIES, + _load_admin_specs, + _load_connectors_yaml, + _load_user_specs, _resolve_env_refs, + _save_user_connectors, + load_connectors, register_data_connectors, ) from data_formulator.data_loader.external_data_loader import ( @@ -78,12 +83,20 @@ def auth_instructions(): @pytest.fixture(autouse=True) def _clean_data_connectors(): - """Reset the global DATA_CONNECTORS dict between tests.""" + """Reset the global DATA_CONNECTORS dict, _ADMIN_CONNECTOR_IDS, and _LOADED_USER_IDENTITIES between tests.""" old = dict(DATA_CONNECTORS) + old_admin = set(_ADMIN_CONNECTOR_IDS) + old_loaded = set(_LOADED_USER_IDENTITIES) DATA_CONNECTORS.clear() + _ADMIN_CONNECTOR_IDS.clear() + _LOADED_USER_IDENTITIES.clear() yield DATA_CONNECTORS.clear() DATA_CONNECTORS.update(old) + _ADMIN_CONNECTOR_IDS.clear() + 
_ADMIN_CONNECTOR_IDS.update(old_admin) + _LOADED_USER_IDENTITIES.clear() + _LOADED_USER_IDENTITIES.update(old_loaded) @pytest.fixture @@ -95,18 +108,79 @@ def app(): # ================================================================== -# Tests: Environment Variable Parsing +# Tests: _resolve_env_refs +# ================================================================== + +class TestResolveEnvRefs: + + def test_resolves_env_var(self, monkeypatch): + monkeypatch.setenv("DB_PASSWORD", "s3cret") + result = _resolve_env_refs({"password": "${DB_PASSWORD}", "host": "db.corp"}) + assert result["password"] == "s3cret" + assert result["host"] == "db.corp" + + def test_missing_env_var_becomes_empty(self, monkeypatch): + monkeypatch.delenv("MISSING_VAR", raising=False) + result = _resolve_env_refs({"val": "${MISSING_VAR}"}) + assert result["val"] == "" + + def test_non_env_ref_passed_through(self): + result = _resolve_env_refs({"host": "db.corp", "port": "3306"}) + assert result == {"host": "db.corp", "port": "3306"} + + +# ================================================================== +# Tests: _load_connectors_yaml +# ================================================================== + +class TestLoadConnectorsYaml: + + def test_load_valid_file(self, tmp_path): + yaml_content = textwrap.dedent("""\ + connectors: + - type: postgresql + name: "My PG" + params: + host: pg.example.com + database: mydb + - type: mysql + name: "My MySQL" + params: + host: mysql.example.com + """) + yaml_file = tmp_path / "connectors.yaml" + yaml_file.write_text(yaml_content) + entries = _load_connectors_yaml(yaml_file) + assert len(entries) == 2 + assert entries[0]["type"] == "postgresql" + assert entries[0]["params"]["host"] == "pg.example.com" + + def test_returns_empty_for_missing_file(self, tmp_path): + entries = _load_connectors_yaml(tmp_path / "nonexistent.yaml") + assert entries == [] + + def test_returns_empty_for_bad_yaml(self, tmp_path): + yaml_file = tmp_path / "connectors.yaml" + 
yaml_file.write_text("connectors: not-a-list") + entries = _load_connectors_yaml(yaml_file) + assert entries == [] + + +# ================================================================== +# Tests: Environment Variable Parsing (via _load_admin_specs) # ================================================================== class TestEnvVarParsing: - def test_parse_env_sources_basic(self, monkeypatch): + def test_parse_env_sources_basic(self, monkeypatch, tmp_path): monkeypatch.setenv("DF_SOURCES__pg_prod__type", "postgresql") monkeypatch.setenv("DF_SOURCES__pg_prod__name", "Production DB") monkeypatch.setenv("DF_SOURCES__pg_prod__params__host", "db.example.com") monkeypatch.setenv("DF_SOURCES__pg_prod__params__database", "prod") - specs = _parse_env_sources() + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() + assert len(specs) == 1 s = specs[0] assert s.source_id == "pg_prod" @@ -114,210 +188,238 @@ def test_parse_env_sources_basic(self, monkeypatch): assert s.display_name == "Production DB" assert s.default_params["host"] == "db.example.com" assert s.default_params["database"] == "prod" + assert s.source == "admin" - def test_parse_env_sources_multiple(self, monkeypatch): + def test_parse_env_sources_multiple(self, monkeypatch, tmp_path): monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") monkeypatch.setenv("DF_SOURCES__pg__params__host", "pg.local") monkeypatch.setenv("DF_SOURCES__mysql__type", "mysql") monkeypatch.setenv("DF_SOURCES__mysql__params__host", "mysql.local") - specs = _parse_env_sources() + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() + assert len(specs) == 2 types = {s.loader_type for s in specs} assert types == {"postgresql", "mysql"} - def test_parse_env_sources_missing_type_skipped(self, monkeypatch): + def test_parse_env_sources_missing_type_skipped(self, monkeypatch, tmp_path): 
monkeypatch.setenv("DF_SOURCES__broken__params__host", "localhost") - # No DF_SOURCES__broken__type set - specs = _parse_env_sources() + + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() + assert len(specs) == 0 - def test_parse_env_sources_default_name(self, monkeypatch): + def test_parse_env_sources_default_name(self, monkeypatch, tmp_path): monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") - specs = _parse_env_sources() + + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() + assert specs[0].display_name == "Postgresql" # ================================================================== -# Tests: YAML Config Loading +# Tests: _load_admin_specs (YAML + env priority) # ================================================================== -class TestYamlConfigLoading: +class TestLoadAdminSpecs: + + def test_load_from_connectors_yaml(self, tmp_path, monkeypatch): + for key in list(os.environ): + if key.startswith("DF_SOURCES__"): + monkeypatch.delenv(key) - def test_load_yaml_from_cwd(self, tmp_path, monkeypatch): yaml_content = textwrap.dedent("""\ - auto_discover: false - sources: + connectors: - type: postgresql name: "My PG" params: host: pg.example.com - database: mydb - - type: mysql - name: "My MySQL" - params: - host: mysql.example.com """) - yaml_file = tmp_path / "data-sources.yml" - yaml_file.write_text(yaml_content) - monkeypatch.chdir(tmp_path) - monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) + (tmp_path / "connectors.yaml").write_text(yaml_content) - config = _load_yaml_config() - assert config is not None - assert config["auto_discover"] is False - assert len(config["sources"]) == 2 - assert config["sources"][0]["type"] == "postgresql" - assert config["sources"][0]["params"]["host"] == "pg.example.com" - - def test_load_yaml_from_df_home(self, tmp_path, monkeypatch): - yaml_content = textwrap.dedent("""\ - 
sources: - - type: bigquery - params: - project: my-gcp-project - """) - yaml_file = tmp_path / "data-sources.yml" - yaml_file.write_text(yaml_content) - monkeypatch.setenv("DATA_FORMULATOR_HOME", str(tmp_path)) - # Make sure cwd doesn't have one too - monkeypatch.chdir(Path(__file__).parent) + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() - config = _load_yaml_config() - assert config is not None - assert config["sources"][0]["type"] == "bigquery" + assert len(specs) == 1 + assert specs[0].loader_type == "postgresql" + assert specs[0].display_name == "My PG" + assert specs[0].source == "admin" - def test_load_yaml_returns_none_if_missing(self, tmp_path, monkeypatch): - monkeypatch.chdir(tmp_path) - monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) - config = _load_yaml_config() - assert config is None + def test_env_overrides_yaml(self, tmp_path, monkeypatch): + """Env var source with same ID overrides YAML source.""" + monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql") + monkeypatch.setenv("DF_SOURCES__pg__name", "Env PG") + yaml_content = textwrap.dedent("""\ + connectors: + - type: postgresql + id: pg + name: "YAML PG" + """) + (tmp_path / "connectors.yaml").write_text(yaml_content) -# ================================================================== -# Tests: _build_source_specs -# ================================================================== + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() -class TestBuildSourceSpecs: + pg_spec = next(s for s in specs if s.source_id == "pg") + assert pg_spec.display_name == "Env PG" - def test_auto_discovery_includes_all_data_loaders(self, monkeypatch): - """Without config, all DATA_LOADERS should appear.""" - monkeypatch.delenv("DATA_FORMULATOR_HOME", raising=False) - monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False) - # Clear env source vars + def 
test_multiple_instances_same_type(self, tmp_path, monkeypatch): for key in list(os.environ): if key.startswith("DF_SOURCES__"): monkeypatch.delenv(key) - mock_loaders = {"stub_a": _StubLoader, "stub_b": _StubLoader} + yaml_content = textwrap.dedent("""\ + connectors: + - type: stub + id: stub_prod + name: Production + params: + host: prod.corp + - type: stub + id: stub_stage + name: Staging + params: + host: stage.corp + """) + (tmp_path / "connectors.yaml").write_text(yaml_content) - with patch("data_formulator.data_connector._load_yaml_config", return_value=None), \ - patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): - specs, auto_discover = _build_source_specs() + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() - assert auto_discover is True + assert len(specs) == 2 ids = {s.source_id for s in specs} - assert "stub_a" in ids - assert "stub_b" in ids + assert ids == {"stub_prod", "stub_stage"} - def test_auto_discovery_disabled_by_env(self, monkeypatch): - monkeypatch.setenv("DF_AUTO_DISCOVER_SOURCES", "false") + def test_env_ref_resolution_in_yaml_params(self, tmp_path, monkeypatch): + monkeypatch.setenv("DB_PASSWORD", "s3cret") for key in list(os.environ): if key.startswith("DF_SOURCES__"): monkeypatch.delenv(key) - mock_loaders = {"stub": _StubLoader} - with patch("data_formulator.data_connector._load_yaml_config", return_value=None), \ - patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders): - specs, auto_discover = _build_source_specs() + yaml_content = textwrap.dedent("""\ + connectors: + - type: stub + params: + host: db.corp + password: "${DB_PASSWORD}" + """) + (tmp_path / "connectors.yaml").write_text(yaml_content) - assert auto_discover is False - # No env specs + no yaml specs + no auto-discovery → empty - assert len(specs) == 0 + with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path): + specs = _load_admin_specs() - def 
test_auto_discovery_disabled_by_yaml(self, monkeypatch):
-        for key in list(os.environ):
-            if key.startswith("DF_SOURCES__"):
-                monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)
+        assert specs[0].default_params["password"] == "s3cret"
+        assert specs[0].default_params["host"] == "db.corp"

-        yaml_config = {
-            "auto_discover": False,
-            "sources": [{"type": "stub", "name": "My Stub"}],
-        }
-        mock_loaders = {"stub": _StubLoader, "other": _StubLoader}
-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
-             patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders):
-            specs, auto_discover = _build_source_specs()
+
+# ==================================================================
+# Tests: User connector persistence
+# ==================================================================

-        assert auto_discover is False
-        assert len(specs) == 1
-        assert specs[0].loader_type == "stub"
+
+class TestUserConnectorPersistence:

-    def test_env_overrides_yaml(self, monkeypatch):
-        """Env var source with same ID overrides YAML source."""
-        monkeypatch.setenv("DF_SOURCES__pg__type", "postgresql")
-        monkeypatch.setenv("DF_SOURCES__pg__name", "Env PG")
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)
-
-        yaml_config = {
-            "auto_discover": False,
-            "sources": [
-                {"type": "postgresql", "id": "pg", "name": "YAML PG"},
-            ],
-        }
-        mock_loaders = {"postgresql": _StubLoader}
-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
-             patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders):
-            specs, _ = _build_source_specs()
+    def test_save_and_load_user_connectors(self, tmp_path):
+        user_dir = tmp_path / "users" / "test-user"

-        # Env wins
-        pg_spec = next(s for s in specs if s.source_id == "pg")
-        assert pg_spec.display_name == "Env PG"
+        specs = [
+            SourceSpec(
+                source_id="mysql:prod",
+                loader_type="mysql",
+                display_name="MySQL Prod",
+                default_params={"host": "mysql.corp"},
+                source="user",
+            ),
+        ]
+
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir):
+            _save_user_connectors("test-user", specs)
+
+        assert (user_dir / "connectors.yaml").is_file()
+
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir):
+            loaded = _load_user_specs("test-user")
+
+        assert len(loaded) == 1
+        assert loaded[0].source_id == "mysql:prod"
+        assert loaded[0].loader_type == "mysql"
+        assert loaded[0].display_name == "MySQL Prod"
+        assert loaded[0].source == "user"
+
+    def test_load_user_specs_returns_empty_if_no_file(self, tmp_path):
+        user_dir = tmp_path / "users" / "new-user"
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir):
+            specs = _load_user_specs("new-user")
+        assert specs == []
+
+
+# ==================================================================
+# Tests: load_connectors
+# ==================================================================
+
+class TestLoadConnectors:
+
+    def test_loads_user_connectors_on_first_call(self, tmp_path):
+        """User connectors should be lazily loaded on first call with identity."""
+        user_dir = tmp_path / "users" / "alice"
+        user_dir.mkdir(parents=True)
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: stub
+                id: user_db
+                name: "Alice DB"
+            """)
+        (user_dir / "connectors.yaml").write_text(yaml_content)

-    def test_multiple_instances_same_type(self, monkeypatch):
-        for key in list(os.environ):
-            if key.startswith("DF_SOURCES__"):
-                monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)
-
-        yaml_config = {
-            "auto_discover": False,
-            "sources": [
-                {"type": "stub", "id": "stub_prod", "name": "Production", "params": {"host": "prod.corp"}},
-                {"type": "stub", "id": "stub_stage", "name": "Staging", "params": {"host": "stage.corp"}},
-            ],
-        }
         mock_loaders = {"stub": _StubLoader}
-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir), \
              patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders):
-            specs, _ = _build_source_specs()
+            load_connectors("alice")

-        assert len(specs) == 2
-        ids = {s.source_id for s in specs}
-        assert ids == {"stub_prod", "stub_stage"}
+        assert "user_db" in DATA_CONNECTORS
+
+    def test_does_not_overwrite_admin_connectors(self, tmp_path):
+        """Admin connector should not be replaced by user connector with same ID."""
+        user_dir = tmp_path / "users" / "alice"
+        user_dir.mkdir(parents=True)
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: stub
+                id: shared_db
+                name: "User version"
+            """)
+        (user_dir / "connectors.yaml").write_text(yaml_content)
+
+        admin_connector = DataConnector.from_loader(
+            _StubLoader, source_id="shared_db", display_name="Admin version",
+        )
+        DATA_CONNECTORS["shared_db"] = admin_connector
+        _ADMIN_CONNECTOR_IDS.add("shared_db")

-    def test_env_ref_resolution_in_yaml_params(self, monkeypatch):
-        monkeypatch.setenv("DB_PASSWORD", "s3cret")
-        for key in list(os.environ):
-            if key.startswith("DF_SOURCES__"):
-                monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)
-
-        yaml_config = {
-            "auto_discover": False,
-            "sources": [
-                {"type": "stub", "params": {"host": "db.corp", "password": "${DB_PASSWORD}"}},
-            ],
-        }
         mock_loaders = {"stub": _StubLoader}
-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir), \
              patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders):
-            specs, _ = _build_source_specs()
+            load_connectors("alice")

-        assert specs[0].default_params["password"] == "s3cret"
-        assert specs[0].default_params["host"] == "db.corp"
+        assert DATA_CONNECTORS["shared_db"]._display_name == "Admin version"
+
+    def test_second_call_is_noop(self, tmp_path):
+        """Second call with same identity should be a no-op."""
+        user_dir = tmp_path / "users" / "alice"
+        user_dir.mkdir(parents=True)
+        (user_dir / "connectors.yaml").write_text("connectors: []")
+
+        mock_loaders = {"stub": _StubLoader}
+        with patch("data_formulator.datalake.workspace.get_user_home", return_value=user_dir), \
+             patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders):
+            load_connectors("alice")
+            assert "alice" in _LOADED_USER_IDENTITIES
+            load_connectors("alice")


 # ==================================================================
@@ -326,79 +428,83 @@ def test_env_ref_resolution_in_yaml_params(self, monkeypatch):
 class TestRegisterConnectedSources:

-    def test_registers_blueprints(self, app, monkeypatch):
+    def test_registers_blueprints(self, app, tmp_path, monkeypatch):
         for key in list(os.environ):
             if key.startswith("DF_SOURCES__"):
                 monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)

         mock_loaders = {"stub": _StubLoader}
         mock_disabled = {}
-        yaml_config = {
-            "auto_discover": False,
-            "sources": [{"type": "stub", "name": "Test Stub"}],
-        }
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: stub
+                name: "Test Stub"
+            """)
+        (tmp_path / "connectors.yaml").write_text(yaml_content)

-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+        with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path), \
              patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders), \
              patch("data_formulator.data_loader.DISABLED_LOADERS", mock_disabled):
             register_data_connectors(app)

         assert "stub" in DATA_CONNECTORS
+        assert "stub" in _ADMIN_CONNECTOR_IDS
         rules = [rule.rule for rule in app.url_map.iter_rules()]
-        assert "/api/connectors/stub/auth/connect" in rules
+        assert "/api/connectors/connect" in rules
+        assert "/api/connectors/get-status" in rules

-    def test_skips_unknown_loader_type(self, app, monkeypatch):
+    def test_skips_unknown_loader_type(self, app, tmp_path, monkeypatch):
         for key in list(os.environ):
             if key.startswith("DF_SOURCES__"):
                 monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)

-        yaml_config = {
-            "auto_discover": False,
-            "sources": [{"type": "nonexistent"}],
-        }
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: nonexistent
+            """)
+        (tmp_path / "connectors.yaml").write_text(yaml_content)

-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+        with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path), \
              patch("data_formulator.data_loader.DATA_LOADERS", {}), \
              patch("data_formulator.data_loader.DISABLED_LOADERS", {}):
             register_data_connectors(app)

         assert len(DATA_CONNECTORS) == 0

-    def test_logs_disabled_loaders(self, app, monkeypatch):
+    def test_logs_disabled_loaders(self, app, tmp_path, monkeypatch):
         for key in list(os.environ):
             if key.startswith("DF_SOURCES__"):
                 monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)

-        yaml_config = {
-            "auto_discover": False,
-            "sources": [{"type": "kusto"}],
-        }
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: kusto
+            """)
+        (tmp_path / "connectors.yaml").write_text(yaml_content)

-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+        with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path), \
              patch("data_formulator.data_loader.DATA_LOADERS", {}), \
              patch("data_formulator.data_loader.DISABLED_LOADERS", {"kusto": "pip install azure-kusto-data"}):
             register_data_connectors(app)

         assert len(DATA_CONNECTORS) == 0

-    def test_frontend_config_in_sources(self, app, monkeypatch):
+    def test_frontend_config_in_sources(self, app, tmp_path, monkeypatch):
         for key in list(os.environ):
             if key.startswith("DF_SOURCES__"):
                 monkeypatch.delenv(key)
-        monkeypatch.delenv("DF_AUTO_DISCOVER_SOURCES", raising=False)

         mock_loaders = {"stub": _StubLoader}

-        yaml_config = {
-            "auto_discover": False,
-            "sources": [
-                {"type": "stub", "name": "My Stub", "params": {"host": "db.corp"}},
-            ],
-        }
-
-        with patch("data_formulator.data_connector._load_yaml_config", return_value=yaml_config), \
+        yaml_content = textwrap.dedent("""\
+            connectors:
+              - type: stub
+                name: "My Stub"
+                params:
+                  host: db.corp
+            """)
+        (tmp_path / "connectors.yaml").write_text(yaml_content)
+
+        with patch("data_formulator.data_connector._get_df_home", return_value=tmp_path), \
              patch("data_formulator.data_loader.DATA_LOADERS", mock_loaders), \
              patch("data_formulator.data_loader.DISABLED_LOADERS", {}):
             register_data_connectors(app)
@@ -407,7 +513,6 @@ def test_frontend_config_in_sources(self, app, monkeypatch):
         cfg = source.get_frontend_config()
         assert cfg["name"] == "My Stub"
         assert cfg["pinned_params"]["host"] == "db.corp"
-        # host should NOT be in form fields
         form_names = {f["name"] for f in cfg["params_form"]}
         assert "host" not in form_names
         assert "database" in form_names
diff --git a/tests/backend/unit/test_data_connector_framework.py b/tests/backend/unit/test_data_connector_framework.py
index f72f4bd6..f43fd4eb 100644
--- a/tests/backend/unit/test_data_connector_framework.py
+++ b/tests/backend/unit/test_data_connector_framework.py
@@ -30,10 +30,11 @@
     DATA_CONNECTORS,
     DataConnector,
     SourceSpec,
-    _build_source_specs,
     _node_to_dict,
+    _resolve_connector,
     _resolve_env_refs,
     _sanitize_error,
+    connectors_bp,
 )
 from data_formulator.data_loader.external_data_loader import (
     CatalogNode,
@@ -164,43 +165,56 @@ def test_connection(self) -> bool:
 # Fixtures
 # ------------------------------------------------------------------

+@pytest.fixture(autouse=True)
+def _clean_data_connectors():
+    """Reset the global DATA_CONNECTORS dict between tests."""
+    old = dict(DATA_CONNECTORS)
+    DATA_CONNECTORS.clear()
+    yield
+    DATA_CONNECTORS.clear()
+    DATA_CONNECTORS.update(old)
+
+
 @pytest.fixture
 def app():
-    """Minimal Flask app with a DataConnector for MockLoader."""
+    """Minimal Flask app with the shared connectors blueprint."""
     _app = flask.Flask(__name__)
     _app.config["TESTING"] = True
     _app.secret_key = "test-secret"
+    _app.register_blueprint(connectors_bp)
     return _app


 @pytest.fixture
 def source():
     """A DataConnector wrapping MockLoader."""
-    return DataConnector.from_loader(
+    s = DataConnector.from_loader(
         MockLoader,
         source_id="mock_db",
         display_name="Mock Database",
         default_params={"host": "localhost"},
         icon="mock",
     )
+    DATA_CONNECTORS["mock_db"] = s
+    return s


 @pytest.fixture
 def source_pinned():
     """A DataConnector with database pre-pinned."""
-    return DataConnector.from_loader(
+    s = DataConnector.from_loader(
         MockLoader,
         source_id="mock_pinned",
         display_name="Mock Pinned",
         default_params={"host": "localhost", "database": "testdb"},
     )
+    DATA_CONNECTORS["mock_pinned"] = s
+    return s


 @pytest.fixture
 def client(app, source):
-    """Flask test client with source blueprint registered."""
-    bp = source.create_blueprint()
-    app.register_blueprint(bp)
+    """Flask test client with shared connectors blueprint."""
     return app.test_client()

@@ -208,7 +222,8 @@ def client(app, source):
 def connected_client(client):
     """Client that is already connected."""
     with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-        resp = client.post("/api/connectors/mock_db/auth/connect", json={
+        resp = client.post("/api/connectors/connect", json={
+            "connector_id": "mock_db",
             "params": {"host": "localhost", "user": "test", "password": "test"},
         })
         assert resp.status_code == 200
@@ -219,25 +234,18 @@
 # Tests: Blueprint & Registration
 # ==================================================================

-class TestBlueprintCreation:
+class TestSharedRouteRegistration:

-    def test_blueprint_has_correct_prefix(self, source):
-        bp = source.create_blueprint()
-        assert bp.url_prefix == "/api/connectors/mock_db"
-
-    def test_blueprint_registers_routes(self, app, source):
-        bp = source.create_blueprint()
-        app.register_blueprint(bp)
+    def test_shared_routes_registered(self, app, source):
         rules = [rule.rule for rule in app.url_map.iter_rules()]
-        assert "/api/connectors/mock_db/auth/connect" in rules
-        assert "/api/connectors/mock_db/auth/disconnect" in rules
-        assert "/api/connectors/mock_db/auth/status" in rules
-        assert "/api/connectors/mock_db/catalog/ls" in rules
-        assert "/api/connectors/mock_db/catalog/metadata" in rules
-        assert "/api/connectors/mock_db/catalog/list_tables" in rules
-        assert "/api/connectors/mock_db/data/import" in rules
-        assert "/api/connectors/mock_db/data/refresh" in rules
-        assert "/api/connectors/mock_db/data/preview" in rules
+        assert "/api/connectors/connect" in rules
+        assert "/api/connectors/get-status" in rules
+        assert "/api/connectors/get-catalog" in rules
+        assert "/api/connectors/get-catalog-tree" in rules
+        assert "/api/connectors/import-data" in rules
+        assert "/api/connectors/import-group" in rules
+        assert "/api/connectors/refresh-data" in rules
+        assert "/api/connectors/preview-data" in rules


 # ==================================================================
@@ -286,7 +294,8 @@ class TestAuthRoutes:

     def test_connect_success(self, client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = client.post("/api/connectors/mock_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "localhost", "user": "test", "password": "test"},
             })
         data = resp.get_json()
@@ -298,7 +307,8 @@ def test_connect_success(self, client):
     def test_connect_merges_default_params(self, client):
         """Default params (host=localhost) merged with user params."""
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = client.post("/api/connectors/mock_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"user": "test", "password": "test"},
             })
         data = resp.get_json()
@@ -307,10 +317,11 @@

     def test_connect_bad_host_returns_error(self, app):
         source = DataConnector.from_loader(MockLoader, source_id="mock_bad")
-        app.register_blueprint(source.create_blueprint())
+        DATA_CONNECTORS["mock_bad"] = source
         c = app.test_client()
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = c.post("/api/connectors/mock_bad/auth/connect", json={
+            resp = c.post("/api/connectors/connect", json={
+                "connector_id": "mock_bad",
                 "params": {"host": "bad-host", "user": "x", "password": "x"},
             })
         assert resp.status_code in (400, 500, 502)
@@ -321,45 +332,47 @@ def test_connect_fails_when_test_connection_fails(self, app):
         source = DataConnector.from_loader(
             FailingTestConnectionLoader, source_id="mock_fail"
         )
-        app.register_blueprint(source.create_blueprint())
+        DATA_CONNECTORS["mock_fail"] = source
         c = app.test_client()
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = c.post("/api/connectors/mock_fail/auth/connect", json={
+            resp = c.post("/api/connectors/connect", json={
+                "connector_id": "mock_fail",
                 "params": {"host": "localhost", "user": "x", "password": "x"},
             })
         assert resp.status_code == 400
         assert resp.get_json()["status"] == "error"

-    def test_disconnect(self, connected_client):
+    def test_delete_connector_clears_status(self, connected_client):
+        """After DELETE, the connector is gone."""
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/auth/disconnect")
+            resp = connected_client.delete("/api/connectors/mock_db")
         assert resp.status_code == 200
-        assert resp.get_json()["status"] == "disconnected"
+        assert resp.get_json()["status"] == "deleted"

     def test_status_connected(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.get("/api/connectors/mock_db/auth/status")
+            resp = connected_client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
         data = resp.get_json()
         assert data["connected"] is True
         assert "hierarchy" in data

     def test_status_not_connected(self, client):
         with patch.object(DataConnector, "_get_identity", return_value="other-user"):
-            resp = client.get("/api/connectors/mock_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
         data = resp.get_json()
         assert data["connected"] is False
         assert "params_form" in data

-    def test_disconnect_then_status(self, connected_client):
-        with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            connected_client.post("/api/connectors/mock_db/auth/disconnect")
-            resp = connected_client.get("/api/connectors/mock_db/auth/status")
+    def test_status_not_connected_after_no_connect(self, client):
+        """Status should show not-connected for a user that never connected."""
+        with patch.object(DataConnector, "_get_identity", return_value="other-user"):
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
         data = resp.get_json()
         assert data["connected"] is False

     def test_safe_params_exclude_password(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.get("/api/connectors/mock_db/auth/status")
+            resp = connected_client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
         data = resp.get_json()
         assert "password" not in data.get("params", {})

@@ -372,7 +385,8 @@ class TestCatalogRoutes:

     def test_ls_root(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={
+            resp = connected_client.post("/api/connectors/get-catalog", json={
+                "connector_id": "mock_db",
                 "path": [],
             })
         data = resp.get_json()
@@ -384,7 +398,7 @@ def test_ls_root(self, connected_client):

     def test_ls_returns_hierarchy(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={"path": []})
+            resp = connected_client.post("/api/connectors/get-catalog", json={"connector_id": "mock_db", "path": []})
         data = resp.get_json()
         assert "hierarchy" in data
         assert "effective_hierarchy" in data
@@ -394,19 +408,21 @@
     def test_ls_drill_down_to_tables(self, connected_client):
         """Expand database → schema → tables."""
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
             # Level 1: databases
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={"path": []})
+            resp = connected_client.post("/api/connectors/get-catalog", json={"connector_id": "mock_db", "path": []})
             db_node = resp.get_json()["nodes"][0]
             assert db_node["name"] == "testdb"

             # Level 2: schemas
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={
+            resp = connected_client.post("/api/connectors/get-catalog", json={
+                "connector_id": "mock_db",
                 "path": db_node["path"],
             })
             schema_node = resp.get_json()["nodes"][0]
             assert schema_node["name"] == "public"

             # Level 3: tables
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={
+            resp = connected_client.post("/api/connectors/get-catalog", json={
+                "connector_id": "mock_db",
                 "path": schema_node["path"],
             })
             tables = resp.get_json()["nodes"]
@@ -418,7 +434,8 @@

     def test_ls_with_filter(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/ls", json={
+            resp = connected_client.post("/api/connectors/get-catalog", json={
+                "connector_id": "mock_db",
                 "path": ["testdb", "public"],
                 "filter": "user",
             })
@@ -428,12 +445,13 @@ def test_ls_with_filter(self, connected_client):

     def test_ls_not_connected_returns_error(self, client):
         with patch.object(DataConnector, "_get_identity", return_value="nobody"):
-            resp = client.post("/api/connectors/mock_db/catalog/ls", json={"path": []})
+            resp = client.post("/api/connectors/get-catalog", json={"connector_id": "mock_db", "path": []})
         assert resp.status_code in (400, 500, 502)

     def test_catalog_metadata(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/metadata", json={
+            resp = connected_client.post("/api/connectors/get-catalog", json={
+                "connector_id": "mock_db",
                 "path": ["testdb", "public", "users"],
             })
         data = resp.get_json()
@@ -442,24 +460,23 @@
         assert data["metadata"]["row_count"] == 5
         assert len(data["metadata"]["columns"]) == 3

-    def test_list_tables_flat(self, connected_client):
+    def test_catalog_tree(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/list_tables", json={})
+            resp = connected_client.post("/api/connectors/get-catalog-tree", json={"connector_id": "mock_db"})
         data = resp.get_json()
         assert resp.status_code == 200
-        assert len(data["tables"]) == 2
-        names = {t["name"] for t in data["tables"]}
-        assert "public.users" in names
-        assert "public.orders" in names
+        assert "tree" in data
+        assert "hierarchy" in data

-    def test_list_tables_with_filter(self, connected_client):
+    def test_catalog_tree_with_filter(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/catalog/list_tables", json={
+            resp = connected_client.post("/api/connectors/get-catalog-tree", json={
+                "connector_id": "mock_db",
                 "filter": "order",
             })
         data = resp.get_json()
-        assert len(data["tables"]) == 1
-        assert "orders" in data["tables"][0]["name"]
+        assert resp.status_code == 200
+        assert "tree" in data


 # ==================================================================
@@ -470,9 +487,10 @@ class TestDataRoutes:

     def test_preview(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/data/preview", json={
+            resp = connected_client.post("/api/connectors/preview-data", json={
+                "connector_id": "mock_db",
                 "source_table": "public.users",
-                "size": 3,
+                "limit": 3,
             })
         data = resp.get_json()
         assert resp.status_code == 200
@@ -484,12 +502,12 @@

     def test_preview_missing_source_table(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/data/preview", json={})
+            resp = connected_client.post("/api/connectors/preview-data", json={"connector_id": "mock_db"})
         assert resp.status_code == 400

     def test_import_requires_source_table(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/data/import", json={})
+            resp = connected_client.post("/api/connectors/import-data", json={"connector_id": "mock_db"})
         assert resp.status_code == 400

     def test_import_success(self, connected_client):
@@ -502,7 +520,8 @@
             patch("data_formulator.security.auth.get_identity_id", return_value="test-user"), \
             patch("data_formulator.workspace_factory.get_workspace") as mock_ws, \
             patch.object(MockLoader, "ingest_to_workspace", return_value=mock_meta):
-            resp = connected_client.post("/api/connectors/mock_db/data/import", json={
+            resp = connected_client.post("/api/connectors/import-data", json={
+                "connector_id": "mock_db",
                 "source_table": "public.users",
                 "table_name": "users",
             })
@@ -515,7 +534,7 @@ def test_import_success(self, connected_client):

     def test_refresh_requires_table_name(self, connected_client):
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = connected_client.post("/api/connectors/mock_db/data/refresh", json={})
+            resp = connected_client.post("/api/connectors/refresh-data", json={"connector_id": "mock_db"})
         assert resp.status_code == 400

@@ -547,7 +566,8 @@ def test_sanitize_error_unknown(self):

     def test_error_does_not_leak_internal_details(self, client):
         """Errors from loader should not expose connection strings or stack traces."""
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = client.post("/api/connectors/mock_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "bad-host", "user": "x", "password": "secret123"},
             })
         data = resp.get_json()
@@ -563,38 +583,38 @@ class TestIdentityIsolation:

     def test_different_identities_have_separate_loaders(self, client):
         """Two users connecting to the same source get separate loader instances."""
         with patch.object(DataConnector, "_get_identity", return_value="user-a"):
-            client.post("/api/connectors/mock_db/auth/connect", json={
+            client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "localhost", "user": "A", "password": "A"},
             })
         with patch.object(DataConnector, "_get_identity", return_value="user-b"):
-            client.post("/api/connectors/mock_db/auth/connect", json={
+            client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "localhost", "user": "B", "password": "B"},
             })

         # Both should be connected
         with patch.object(DataConnector, "_get_identity", return_value="user-a"):
-            resp = client.get("/api/connectors/mock_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
             assert resp.get_json()["connected"] is True
         with patch.object(DataConnector, "_get_identity", return_value="user-b"):
-            resp = client.get("/api/connectors/mock_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
             assert resp.get_json()["connected"] is True

-    def test_disconnect_does_not_affect_other_user(self, client):
+    def test_delete_does_not_affect_other_user(self, client):
+        """Deleting credentials for user-a doesn't affect user-b's session."""
         with patch.object(DataConnector, "_get_identity", return_value="user-a"):
-            client.post("/api/connectors/mock_db/auth/connect", json={
+            client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "localhost", "user": "A", "password": "A"},
             })
         with patch.object(DataConnector, "_get_identity", return_value="user-b"):
-            client.post("/api/connectors/mock_db/auth/connect", json={
+            client.post("/api/connectors/connect", json={
+                "connector_id": "mock_db",
                 "params": {"host": "localhost", "user": "B", "password": "B"},
             })

-        # Disconnect user-a
-        with patch.object(DataConnector, "_get_identity", return_value="user-a"):
-            client.post("/api/connectors/mock_db/auth/disconnect")
-            resp = client.get("/api/connectors/mock_db/auth/status")
-            assert resp.get_json()["connected"] is False
-
-        # user-b should still be connected
+        # user-b should still be connected regardless of user-a's state
         with patch.object(DataConnector, "_get_identity", return_value="user-b"):
-            resp = client.get("/api/connectors/mock_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "mock_db"})
             assert resp.get_json()["connected"] is True

@@ -606,13 +626,13 @@ class TestScopePinning:

     def test_pinned_database_skips_database_level(self, app, source_pinned):
         """When database is pinned, ls([]) should start at schema level."""
-        app.register_blueprint(source_pinned.create_blueprint())
         c = app.test_client()
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            c.post("/api/connectors/mock_pinned/auth/connect", json={
+            c.post("/api/connectors/connect", json={
+                "connector_id": "mock_pinned",
                 "params": {"user": "test", "password": "test"},
             })
-            resp = c.post("/api/connectors/mock_pinned/catalog/ls", json={"path": []})
+            resp = c.post("/api/connectors/get-catalog", json={"connector_id": "mock_pinned", "path": []})
         data = resp.get_json()
         # Should skip database level and show schemas directly
         eff_keys = [h["key"] for h in data["effective_hierarchy"]]
@@ -623,10 +643,10 @@ def test_pinned_database_skips_database_level(self, app, source_pinned):
         assert nodes[0]["node_type"] == "namespace"

     def test_pinned_scope_in_connect_response(self, app, source_pinned):
-        app.register_blueprint(source_pinned.create_blueprint())
         c = app.test_client()
         with patch.object(DataConnector, "_get_identity", return_value="test-user"):
-            resp = c.post("/api/connectors/mock_pinned/auth/connect", json={
+            resp = c.post("/api/connectors/connect", json={
+                "connector_id": "mock_pinned",
                 "params": {"user": "test", "password": "test"},
             })
         data = resp.get_json()
diff --git a/tests/backend/unit/test_data_connector_vault.py b/tests/backend/unit/test_data_connector_vault.py
index 380c556c..2d362f6c 100644
--- a/tests/backend/unit/test_data_connector_vault.py
+++ b/tests/backend/unit/test_data_connector_vault.py
@@ -25,6 +25,7 @@
 from data_formulator.data_connector import (
     DATA_CONNECTORS,
     DataConnector,
+    connectors_bp,
 )
 from data_formulator.credential_vault.base import CredentialVault
 from data_formulator.data_loader.external_data_loader import (
@@ -111,14 +112,26 @@
 def vault():
     return InMemoryVault()


+@pytest.fixture(autouse=True)
+def _clean_data_connectors():
+    """Reset the global DATA_CONNECTORS dict between tests."""
+    old = dict(DATA_CONNECTORS)
+    DATA_CONNECTORS.clear()
+    yield
+    DATA_CONNECTORS.clear()
+    DATA_CONNECTORS.update(old)
+
+
 @pytest.fixture
 def source():
-    return DataConnector.from_loader(
+    s = DataConnector.from_loader(
         MockLoader,
         source_id="test_db",
         display_name="Test DB",
         default_params={"host": "localhost"},
     )
+    DATA_CONNECTORS["test_db"] = s
+    return s


 @pytest.fixture
@@ -126,8 +139,7 @@ def app(source):
     _app = flask.Flask(__name__)
     _app.config["TESTING"] = True
     _app.secret_key = "test-secret"
-    bp = source.create_blueprint()
-    _app.register_blueprint(bp)
+    _app.register_blueprint(connectors_bp)
     return _app

@@ -216,7 +228,8 @@ def test_persist_credentials_stores_in_vault(self, source, vault):

     def test_connect_via_route_stores_in_vault(self, client, source, vault):
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=vault):
-            resp = client.post("/api/connectors/test_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "test_db",
                 "params": {"password": "secret"},
             })
         data = resp.get_json()
@@ -230,7 +243,8 @@ def test_connect_via_route_stores_in_vault(self, client, source, vault):

     def test_connect_via_route_persist_false(self, client, source, vault):
         """Route with persist=false should not store in vault."""
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=vault):
-            resp = client.post("/api/connectors/test_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "test_db",
                 "params": {"password": "secret"},
                 "persist": False,
             })
@@ -245,7 +259,8 @@ def test_connect_persist_false_clears_old_vault_entry(self, client, source, vault):
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=vault):
             # First connect with persist=true
-            resp = client.post("/api/connectors/test_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "test_db",
                 "params": {"password": "secret"},
                 "persist": True,
             })
@@ -253,7 +268,8 @@ def test_connect_persist_false_clears_old_vault_entry(self, client, source, vault):
             assert vault.retrieve(IDENTITY, "test_db") is not None

             # Reconnect with persist=false — old entry must be deleted
-            resp = client.post("/api/connectors/test_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "test_db",
                 "params": {"password": "secret"},
                 "persist": False,
             })
@@ -265,33 +281,20 @@ def test_connect_persist_false_clears_old_vault_entry(self, client, source, vault):

 # ==================================================================
 # Tests: Disconnect deletes credentials
 # ==================================================================

-class TestDisconnectDeletesCredentials:
+class TestDeleteCredentials:

-    def test_disconnect_clears_vault(self, source, vault):
+    def test_delete_credentials_clears_vault(self, source, vault):
+        """_delete_credentials clears both in-memory loader AND vault."""
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=vault):
             source._connect({"password": "secret"})
             source._persist_credentials({"password": "secret"})
             assert vault.retrieve(IDENTITY, "test_db") is not None

-            source._disconnect()
+            source._delete_credentials()
             assert vault.retrieve(IDENTITY, "test_db") is None
             assert source._get_loader(IDENTITY) is None

-    def test_disconnect_via_route_clears_vault(self, client, source, vault):
-        with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
-            patch.object(DataConnector, "_get_vault", return_value=vault):
-            # Connect first
-            client.post("/api/connectors/test_db/auth/connect", json={
-                "params": {"password": "secret"},
-            })
-            assert vault.retrieve(IDENTITY, "test_db") is not None
-
-            # Disconnect
-            resp = client.post("/api/connectors/test_db/auth/disconnect")
-            assert resp.get_json()["status"] == "disconnected"
-            assert vault.retrieve(IDENTITY, "test_db") is None
-

 # ==================================================================
 # Tests: Auto-reconnect from vault
@@ -349,8 +352,8 @@ def test_auto_reconnect_exception_cleans_stale_creds(self, source, vault):
         assert loader is None
         assert vault.retrieve(IDENTITY, "test_db") is None

-    def test_auth_status_triggers_auto_reconnect(self, client, source, vault):
-        """GET /auth/status should auto-reconnect from vault if no in-memory loader."""
+    def test_status_reports_stored_credentials(self, client, source, vault):
+        """POST /get-status is side-effect-free: reports has_stored_credentials but does not reconnect."""
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=vault):
             # Store credentials in vault (simulating a previous session)
@@ -361,17 +364,17 @@
             # Clear any in-memory loader
             source._loaders.clear()

-            resp = client.get("/api/connectors/test_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "test_db"})
         data = resp.get_json()
-        assert data["connected"] is True
-        assert data["persisted"] is True
+        assert data["connected"] is False
+        assert data["has_stored_credentials"] is True

     def test_auth_status_not_connected_no_vault(self, client, source):
-        """GET /auth/status with no loader and no vault = not connected."""
+        """POST /get-status with no loader and no vault = not connected."""
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=None):
             source._loaders.clear()
-            resp = client.get("/api/connectors/test_db/auth/status")
+            resp = client.post("/api/connectors/get-status", json={"connector_id": "test_db"})
         data = resp.get_json()
         assert data["connected"] is False
         assert data.get("has_stored_credentials") is False
@@ -395,7 +398,8 @@ def test_connect_route_without_vault_not_persisted(self, client, source):
         """Route returns persisted=False when no vault."""
         with patch.object(DataConnector, "_get_identity", return_value=IDENTITY), \
             patch.object(DataConnector, "_get_vault", return_value=None):
-            resp = client.post("/api/connectors/test_db/auth/connect", json={
+            resp = client.post("/api/connectors/connect", json={
+                "connector_id": "test_db",
                 "params": {"password": "secret"},
             })
         data = resp.get_json()
@@ -433,8 +437,8 @@ def test_different_users_separate_vault_entries(self, source, vault):
         assert alice_creds["user_params"]["password"] == "alice-pw"
         assert bob_creds["user_params"]["password"] == "bob-pw"

-    def test_disconnect_only_affects_own_user(self, source, vault):
-        """Disconnecting one user doesn't affect another."""
+    def test_delete_only_affects_own_user(self, source, vault):
+        """Deleting one user's credentials doesn't affect another's."""
         with patch.object(DataConnector, "_get_vault", return_value=vault):
             with patch.object(DataConnector, "_get_identity", return_value="user:alice"):
                 source._connect({"password": "alice-pw"})
@@ -444,7 +448,10 @@
                 source._persist_credentials({"password": "bob-pw"})

             with patch.object(DataConnector, "_get_identity", return_value="user:alice"):
-                source._disconnect()
+                source._delete_credentials()

+            # Alice's credentials are gone (memory + vault), Bob's are untouched
             assert vault.retrieve("user:alice", "test_db") is None
             assert vault.retrieve("user:bob", "test_db") is not None
+            assert source._get_loader("user:alice") is None
+            assert source._get_loader("user:bob") is not None
diff --git a/tests/superset/README.md b/tests/superset/README.md
index 43ba7ba0..db9354ae 100644
--- a/tests/superset/README.md
+++ b/tests/superset/README.md
@@ -36,6 +36,20 @@ Spin up a local Apache Superset instance with sample data and connect it to Data

 Plus Superset's built-in example datasets (if `load_examples` succeeds).

+### Test Dashboard: "DF Filter Test"
+
+A dashboard named **DF Filter Test** (slug: `df-filter-test`) is automatically created with native filters configured on the sample datasets:
+
+| Filter | Type | Dataset | Column | Multi | Default |
+|--------|------|---------|--------|-------|---------|
+| Region | `filter_select` | df_test_sales | region | Yes | North, South |
+| Product | `filter_select` | df_test_sales | product | No | Widget A |
+| Sale Date | `filter_time` | df_test_sales | date | — | Last quarter |
+| Quantity Range | `filter_range` | df_test_sales | quantity | — | [5, 30] |
+| Department | `filter_select` | df_test_employees | department | Yes | Engineering |
+
+This dashboard enables end-to-end testing of the predefined filter feature in the Superset plugin.
+
 ## Testing the Plugin
 
 1. Start both services: `./tests/superset/start.sh`
@@ -45,6 +59,20 @@ Plus Superset's built-in example datasets (if `load_examples` succeeds).
 5. Click it, then log in with `admin` / `admin`
 6. Browse datasets and load one into Data Formulator
 
+### Testing Predefined Filters
+
+1. Start both services: `./tests/superset/start.sh`
+2. Open http://localhost:5567 and connect to Superset (log in with `admin` / `admin`)
+3. In the Superset plugin panel, switch to the **Dashboards** view
+4. Select the **DF Filter Test** dashboard
+5. Pick a dataset (e.g. `df_test_sales`)
+6. The filter dialog should appear, showing the predefined filters:
+   - **Region**: multi-select dropdown with options loaded from Superset
+   - **Product**: single-select dropdown
+   - **Sale Date**: time/date range picker
+   - **Quantity Range**: numeric range input
+7. Adjust filter values, click **Load**, and verify the loaded data is filtered correctly
+
 ### Token-based Login (via Superset)
 
 The test Docker mounts a custom `superset_config.py` that adds a `/df-sso-bridge/` endpoint.
This lets you test the delegated login flow where DF obtains a JWT token by having the user log in directly on Superset: diff --git a/tests/superset/sample_data.py b/tests/superset/sample_data.py index 61055388..dac3495e 100644 --- a/tests/superset/sample_data.py +++ b/tests/superset/sample_data.py @@ -1,22 +1,23 @@ #!/usr/bin/env python3 -"""Load sample datasets into the Superset test instance's default SQLite DB. +"""Create sample tables and add native filters to the Sales Dashboard. -This runs inside the Superset container after `superset init`. -It creates small, self-contained tables useful for testing the DF plugin: - - df_test_sales (100 rows, mixed types) - - df_test_employees (30 rows, names and departments) - - df_test_weather (365 rows, daily temps) +Runs inside the Superset container after ``superset load_examples``. +All operations use plain sqlite3 — no Superset imports needed. + +Phase 1: Write df_test_* tables into the *examples* SQLite database. +Phase 2: Patch the Sales Dashboard's json_metadata in the *metadata* + database so it has native filter definitions for testing. """ +import json import random import datetime import sqlite3 import os -DB_PATH = os.path.expanduser("~/.superset/superset.db") -# Fallback: newer Superset images may use a different path -if not os.path.exists(DB_PATH): - DB_PATH = "/app/superset_home/superset.db" +# -- paths inside the container -- +EXAMPLES_DB = "/app/superset_home/examples.db" +METADATA_DB = "/app/superset_home/superset.db" def create_tables(conn: sqlite3.Connection) -> None: @@ -123,57 +124,108 @@ def create_tables(conn: sqlite3.Connection) -> None: print(f"[sample_data] Created df_test_sales (100), df_test_employees (30), df_test_weather (365)") -def register_datasets_in_superset() -> None: - """Register our tables as Superset datasets via the Superset Python API. +def add_native_filters_to_sales_dashboard() -> None: + """Inject native filter configuration into the Sales Dashboard. 
- This runs inside the Superset process context so we can use - superset's own SQLAlchemy models. + The built-in Sales Dashboard (slug='sales-dashboard') ships with no + native filters. We patch its ``json_metadata`` to add select filters + on the ``cleaned_sales_data`` dataset columns so the DF filter UI + has something to work with. """ - try: - from superset.app import create_app - from superset.connectors.sqla.models import SqlaTable - from superset.extensions import db as superset_db - - app = create_app() - with app.app_context(): - # Find the default "examples" database - from superset.models.core import Database - examples_db = superset_db.session.query(Database).filter_by( - database_name="examples" - ).first() - - if not examples_db: - print("[sample_data] Warning: 'examples' database not found, skipping dataset registration") - return - - for table_name in ["df_test_sales", "df_test_employees", "df_test_weather"]: - existing = superset_db.session.query(SqlaTable).filter_by( - table_name=table_name, database_id=examples_db.id - ).first() - if existing: - print(f"[sample_data] Dataset '{table_name}' already registered") - continue - - dataset = SqlaTable( - table_name=table_name, - database_id=examples_db.id, - schema=None, - ) - superset_db.session.add(dataset) - print(f"[sample_data] Registered dataset '{table_name}'") - - superset_db.session.commit() - - except Exception as e: - print(f"[sample_data] Dataset registration failed (non-fatal): {e}") - print("[sample_data] Tables exist in SQLite but may need manual registration in Superset UI") + if not os.path.exists(METADATA_DB): + print("[sample_data] Metadata DB not found, skipping filter injection") + return + conn = sqlite3.connect(METADATA_DB) + cur = conn.cursor() -if __name__ == "__main__": - # Step 1: Create the tables in the examples SQLite database - conn = sqlite3.connect(DB_PATH) - create_tables(conn) + # Find the Sales Dashboard + cur.execute( + "SELECT id, json_metadata FROM dashboards " + 
"WHERE dashboard_title = 'Sales Dashboard' OR slug = 'sales-dashboard' " + "LIMIT 1" + ) + row = cur.fetchone() + if not row: + print("[sample_data] Sales Dashboard not found, skipping filter injection") + conn.close() + return + + dash_id, raw_meta = row + meta = json.loads(raw_meta) if raw_meta else {} + + # Already has filters? Skip. + if meta.get("native_filter_configuration"): + print(f"[sample_data] Sales Dashboard (id={dash_id}) already has native filters") + conn.close() + return + + # Find the cleaned_sales_data dataset id + cur.execute( + "SELECT id FROM tables WHERE table_name = 'cleaned_sales_data' LIMIT 1" + ) + ds_row = cur.fetchone() + if not ds_row: + print("[sample_data] cleaned_sales_data dataset not found, skipping filter injection") + conn.close() + return + ds_id = ds_row[0] + + # Build native filters + native_filters = [ + { + "id": "NATIVE_FILTER-status", + "name": "Order Status", + "filterType": "filter_select", + "targets": [{"datasetId": ds_id, "column": {"name": "status"}}], + "controlValues": {"multiSelect": True, "enableEmptyFilter": False}, + "defaultDataMask": {"filterState": {"value": ["Shipped", "In Progress"]}}, + "scope": {"rootPath": ["ROOT_ID"], "excluded": []}, + "type": "NATIVE_FILTER", + "required": False, + }, + { + "id": "NATIVE_FILTER-product_line", + "name": "Product Line", + "filterType": "filter_select", + "targets": [{"datasetId": ds_id, "column": {"name": "product_line"}}], + "controlValues": {"multiSelect": True, "enableEmptyFilter": False}, + "defaultDataMask": {"filterState": {}}, + "scope": {"rootPath": ["ROOT_ID"], "excluded": []}, + "type": "NATIVE_FILTER", + "required": False, + }, + { + "id": "NATIVE_FILTER-deal_size", + "name": "Deal Size", + "filterType": "filter_select", + "targets": [{"datasetId": ds_id, "column": {"name": "deal_size"}}], + "controlValues": {"multiSelect": False, "enableEmptyFilter": False}, + "defaultDataMask": {"filterState": {}}, + "scope": {"rootPath": ["ROOT_ID"], "excluded": []}, + 
"type": "NATIVE_FILTER", + "required": False, + }, + ] + + meta["native_filter_configuration"] = native_filters + cur.execute( + "UPDATE dashboards SET json_metadata = ? WHERE id = ?", + (json.dumps(meta), dash_id), + ) + conn.commit() conn.close() + print(f"[sample_data] Added {len(native_filters)} native filters to Sales Dashboard (id={dash_id})") - # Step 2: Register as Superset datasets - register_datasets_in_superset() + +if __name__ == "__main__": + # 1. Create custom tables in the examples database + if os.path.exists(EXAMPLES_DB): + conn = sqlite3.connect(EXAMPLES_DB) + create_tables(conn) + conn.close() + else: + print(f"[sample_data] Warning: {EXAMPLES_DB} not found, skipping table creation") + + # 2. Add native filters to the Sales Dashboard + add_native_filters_to_sales_dashboard() diff --git a/tests/superset/superset_config.py b/tests/superset/superset_config.py index 50787076..98f72172 100644 --- a/tests/superset/superset_config.py +++ b/tests/superset/superset_config.py @@ -76,3 +76,10 @@ def df_sso_bridge(): # CORS is configured via environment variables in docker-compose.yml # (SUPERSET_CORS_ENABLED / SUPERSET_CORS_ORIGINS). # Do NOT set ENABLE_CORS here — the official image lacks flask-cors. + +# Feature flags — ensure native dashboard filters are enabled for filter testing. 
+FEATURE_FLAGS = { + "DASHBOARD_NATIVE_FILTERS": True, + "DASHBOARD_CROSS_FILTERS": True, + "DASHBOARD_NATIVE_FILTERS_SET": True, +} diff --git a/uv.lock b/uv.lock index 045aec9a..9c6a21ff 100644 --- a/uv.lock +++ b/uv.lock @@ -1,5 +1,5 @@ version = 1 -revision = 3 +revision = 2 requires-python = ">=3.11" resolution-markers = [ "python_full_version >= '3.14'", @@ -19,7 +19,7 @@ wheels = [ [[package]] name = "aiohttp" -version = "3.13.4" +version = "3.13.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiohappyeyeballs" }, @@ -30,93 +30,93 @@ dependencies = [ { name = "propcache" }, { name = "yarl" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/45/4a/064321452809dae953c1ed6e017504e72551a26b6f5708a5a80e4bf556ff/aiohttp-3.13.4.tar.gz", hash = "sha256:d97a6d09c66087890c2ab5d49069e1e570583f7ac0314ecf98294c1b6aaebd38", size = 7859748, upload-time = "2026-03-28T17:19:40.6Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d4/7e/cb94129302d78c46662b47f9897d642fd0b33bdfef4b73b20c6ced35aa4c/aiohttp-3.13.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8ea0c64d1bcbf201b285c2246c51a0c035ba3bbd306640007bc5844a3b4658c1", size = 760027, upload-time = "2026-03-28T17:15:33.022Z" }, - { url = "https://files.pythonhosted.org/packages/5e/cd/2db3c9397c3bd24216b203dd739945b04f8b87bb036c640da7ddb63c75ef/aiohttp-3.13.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6f742e1fa45c0ed522b00ede565e18f97e4cf8d1883a712ac42d0339dfb0cce7", size = 508325, upload-time = "2026-03-28T17:15:34.714Z" }, - { url = "https://files.pythonhosted.org/packages/36/a3/d28b2722ec13107f2e37a86b8a169897308bab6a3b9e071ecead9d67bd9b/aiohttp-3.13.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6dcfb50ee25b3b7a1222a9123be1f9f89e56e67636b561441f0b304e25aaef8f", size = 502402, upload-time = "2026-03-28T17:15:36.409Z" }, - { url = 
"https://files.pythonhosted.org/packages/fa/d6/acd47b5f17c4430e555590990a4746efbcb2079909bb865516892bf85f37/aiohttp-3.13.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3262386c4ff370849863ea93b9ea60fd59c6cf56bf8f93beac625cf4d677c04d", size = 1771224, upload-time = "2026-03-28T17:15:38.223Z" }, - { url = "https://files.pythonhosted.org/packages/98/af/af6e20113ba6a48fd1cd9e5832c4851e7613ef50c7619acdaee6ec5f1aff/aiohttp-3.13.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:473bb5aa4218dd254e9ae4834f20e31f5a0083064ac0136a01a62ddbae2eaa42", size = 1731530, upload-time = "2026-03-28T17:15:39.988Z" }, - { url = "https://files.pythonhosted.org/packages/81/16/78a2f5d9c124ad05d5ce59a9af94214b6466c3491a25fb70760e98e9f762/aiohttp-3.13.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e56423766399b4c77b965f6aaab6c9546617b8994a956821cc507d00b91d978c", size = 1827925, upload-time = "2026-03-28T17:15:41.944Z" }, - { url = "https://files.pythonhosted.org/packages/2a/1f/79acf0974ced805e0e70027389fccbb7d728e6f30fcac725fb1071e63075/aiohttp-3.13.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8af249343fafd5ad90366a16d230fc265cf1149f26075dc9fe93cfd7c7173942", size = 1923579, upload-time = "2026-03-28T17:15:44.071Z" }, - { url = "https://files.pythonhosted.org/packages/af/53/29f9e2054ea6900413f3b4c3eb9d8331f60678ec855f13ba8714c47fd48d/aiohttp-3.13.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bc0a5cf4f10ef5a2c94fdde488734b582a3a7a000b131263e27c9295bd682d9", size = 1767655, upload-time = "2026-03-28T17:15:45.911Z" }, - { url = "https://files.pythonhosted.org/packages/f3/57/462fe1d3da08109ba4aa8590e7aed57c059af2a7e80ec21f4bac5cfe1094/aiohttp-3.13.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:5c7ff1028e3c9fc5123a865ce17df1cb6424d180c503b8517afbe89aa566e6be", size = 1630439, upload-time = "2026-03-28T17:15:48.11Z" }, - { url = "https://files.pythonhosted.org/packages/d7/4b/4813344aacdb8127263e3eec343d24e973421143826364fa9fc847f6283f/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ba5cf98b5dcb9bddd857da6713a503fa6d341043258ca823f0f5ab7ab4a94ee8", size = 1745557, upload-time = "2026-03-28T17:15:50.13Z" }, - { url = "https://files.pythonhosted.org/packages/d4/01/1ef1adae1454341ec50a789f03cfafe4c4ac9c003f6a64515ecd32fe4210/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:d85965d3ba21ee4999e83e992fecb86c4614d6920e40705501c0a1f80a583c12", size = 1741796, upload-time = "2026-03-28T17:15:52.351Z" }, - { url = "https://files.pythonhosted.org/packages/22/04/8cdd99af988d2aa6922714d957d21383c559835cbd43fbf5a47ddf2e0f05/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:49f0b18a9b05d79f6f37ddd567695943fcefb834ef480f17a4211987302b2dc7", size = 1805312, upload-time = "2026-03-28T17:15:54.407Z" }, - { url = "https://files.pythonhosted.org/packages/fb/7f/b48d5577338d4b25bbdbae35c75dbfd0493cb8886dc586fbfb2e90862239/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7f78cb080c86fbf765920e5f1ef35af3f24ec4314d6675d0a21eaf41f6f2679c", size = 1621751, upload-time = "2026-03-28T17:15:56.564Z" }, - { url = "https://files.pythonhosted.org/packages/bc/89/4eecad8c1858e6d0893c05929e22343e0ebe3aec29a8a399c65c3cc38311/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:67a3ec705534a614b68bbf1c70efa777a21c3da3895d1c44510a41f5a7ae0453", size = 1826073, upload-time = "2026-03-28T17:15:58.489Z" }, - { url = "https://files.pythonhosted.org/packages/f5/5c/9dc8293ed31b46c39c9c513ac7ca152b3c3d38e0ea111a530ad12001b827/aiohttp-3.13.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d6630ec917e85c5356b2295744c8a97d40f007f96a1c76bf1928dc2e27465393", size = 1760083, upload-time = 
"2026-03-28T17:16:00.677Z" }, - { url = "https://files.pythonhosted.org/packages/1e/19/8bbf6a4994205d96831f97b7d21a0feed120136e6267b5b22d229c6dc4dc/aiohttp-3.13.4-cp311-cp311-win32.whl", hash = "sha256:54049021bc626f53a5394c29e8c444f726ee5a14b6e89e0ad118315b1f90f5e3", size = 439690, upload-time = "2026-03-28T17:16:02.902Z" }, - { url = "https://files.pythonhosted.org/packages/0c/f5/ac409ecd1007528d15c3e8c3a57d34f334c70d76cfb7128a28cffdebd4c1/aiohttp-3.13.4-cp311-cp311-win_amd64.whl", hash = "sha256:c033f2bc964156030772d31cbf7e5defea181238ce1f87b9455b786de7d30145", size = 463824, upload-time = "2026-03-28T17:16:05.058Z" }, - { url = "https://files.pythonhosted.org/packages/1e/bd/ede278648914cabbabfdf95e436679b5d4156e417896a9b9f4587169e376/aiohttp-3.13.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ee62d4471ce86b108b19c3364db4b91180d13fe3510144872d6bad5401957360", size = 752158, upload-time = "2026-03-28T17:16:06.901Z" }, - { url = "https://files.pythonhosted.org/packages/90/de/581c053253c07b480b03785196ca5335e3c606a37dc73e95f6527f1591fe/aiohttp-3.13.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c0fd8f41b54b58636402eb493afd512c23580456f022c1ba2db0f810c959ed0d", size = 501037, upload-time = "2026-03-28T17:16:08.82Z" }, - { url = "https://files.pythonhosted.org/packages/fa/f9/a5ede193c08f13cc42c0a5b50d1e246ecee9115e4cf6e900d8dbd8fd6acb/aiohttp-3.13.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4baa48ce49efd82d6b1a0be12d6a36b35e5594d1dd42f8bfba96ea9f8678b88c", size = 501556, upload-time = "2026-03-28T17:16:10.63Z" }, - { url = "https://files.pythonhosted.org/packages/d6/10/88ff67cd48a6ec36335b63a640abe86135791544863e0cfe1f065d6cef7a/aiohttp-3.13.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d738ebab9f71ee652d9dbd0211057690022201b11197f9a7324fd4dba128aa97", size = 1757314, upload-time = "2026-03-28T17:16:12.498Z" }, - { url = 
"https://files.pythonhosted.org/packages/8b/15/fdb90a5cf5a1f52845c276e76298c75fbbcc0ac2b4a86551906d54529965/aiohttp-3.13.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0ce692c3468fa831af7dceed52edf51ac348cebfc8d3feb935927b63bd3e8576", size = 1731819, upload-time = "2026-03-28T17:16:14.558Z" }, - { url = "https://files.pythonhosted.org/packages/ec/df/28146785a007f7820416be05d4f28cc207493efd1e8c6c1068e9bdc29198/aiohttp-3.13.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8e08abcfe752a454d2cb89ff0c08f2d1ecd057ae3e8cc6d84638de853530ebab", size = 1793279, upload-time = "2026-03-28T17:16:16.594Z" }, - { url = "https://files.pythonhosted.org/packages/10/47/689c743abf62ea7a77774d5722f220e2c912a77d65d368b884d9779ef41b/aiohttp-3.13.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5977f701b3fff36367a11087f30ea73c212e686d41cd363c50c022d48b011d8d", size = 1891082, upload-time = "2026-03-28T17:16:18.71Z" }, - { url = "https://files.pythonhosted.org/packages/b0/b6/f7f4f318c7e58c23b761c9b13b9a3c9b394e0f9d5d76fbc6622fa98509f6/aiohttp-3.13.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:54203e10405c06f8b6020bd1e076ae0fe6c194adcee12a5a78af3ffa3c57025e", size = 1773938, upload-time = "2026-03-28T17:16:21.125Z" }, - { url = "https://files.pythonhosted.org/packages/aa/06/f207cb3121852c989586a6fc16ff854c4fcc8651b86c5d3bd1fc83057650/aiohttp-3.13.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:358a6af0145bc4dda037f13167bef3cce54b132087acc4c295c739d05d16b1c3", size = 1579548, upload-time = "2026-03-28T17:16:23.588Z" }, - { url = "https://files.pythonhosted.org/packages/6c/58/e1289661a32161e24c1fe479711d783067210d266842523752869cc1d9c2/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:898ea1850656d7d61832ef06aa9846ab3ddb1621b74f46de78fbc5e1a586ba83", size = 1714669, upload-time = "2026-03-28T17:16:25.713Z" }, - { url = "https://files.pythonhosted.org/packages/96/0a/3e86d039438a74a86e6a948a9119b22540bae037d6ba317a042ae3c22711/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:7bc30cceb710cf6a44e9617e43eebb6e3e43ad855a34da7b4b6a73537d8a6763", size = 1754175, upload-time = "2026-03-28T17:16:28.18Z" }, - { url = "https://files.pythonhosted.org/packages/f4/30/e717fc5df83133ba467a560b6d8ef20197037b4bb5d7075b90037de1018e/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4a31c0c587a8a038f19a4c7e60654a6c899c9de9174593a13e7cc6e15ff271f9", size = 1762049, upload-time = "2026-03-28T17:16:30.941Z" }, - { url = "https://files.pythonhosted.org/packages/e4/28/8f7a2d4492e336e40005151bdd94baf344880a4707573378579f833a64c1/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:2062f675f3fe6e06d6113eb74a157fb9df58953ffed0cdb4182554b116545758", size = 1570861, upload-time = "2026-03-28T17:16:32.953Z" }, - { url = "https://files.pythonhosted.org/packages/78/45/12e1a3d0645968b1c38de4b23fdf270b8637735ea057d4f84482ff918ad9/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:3d1ba8afb847ff80626d5e408c1fdc99f942acc877d0702fe137015903a220a9", size = 1790003, upload-time = "2026-03-28T17:16:35.468Z" }, - { url = "https://files.pythonhosted.org/packages/eb/0f/60374e18d590de16dcb39d6ff62f39c096c1b958e6f37727b5870026ea30/aiohttp-3.13.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b08149419994cdd4d5eecf7fd4bc5986b5a9380285bcd01ab4c0d6bfca47b79d", size = 1737289, upload-time = "2026-03-28T17:16:38.187Z" }, - { url = "https://files.pythonhosted.org/packages/02/bf/535e58d886cfbc40a8b0013c974afad24ef7632d645bca0b678b70033a60/aiohttp-3.13.4-cp312-cp312-win32.whl", hash = "sha256:fc432f6a2c4f720180959bc19aa37259651c1a4ed8af8afc84dd41c60f15f791", size = 434185, upload-time = "2026-03-28T17:16:40.735Z" 
}, - { url = "https://files.pythonhosted.org/packages/1e/1a/d92e3325134ebfff6f4069f270d3aac770d63320bd1fcd0eca023e74d9a8/aiohttp-3.13.4-cp312-cp312-win_amd64.whl", hash = "sha256:6148c9ae97a3e8bff9a1fc9c757fa164116f86c100468339730e717590a3fb77", size = 461285, upload-time = "2026-03-28T17:16:42.713Z" }, - { url = "https://files.pythonhosted.org/packages/e3/ac/892f4162df9b115b4758d615f32ec63d00f3084c705ff5526630887b9b42/aiohttp-3.13.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:63dd5e5b1e43b8fb1e91b79b7ceba1feba588b317d1edff385084fcc7a0a4538", size = 745744, upload-time = "2026-03-28T17:16:44.67Z" }, - { url = "https://files.pythonhosted.org/packages/97/a9/c5b87e4443a2f0ea88cb3000c93a8fdad1ee63bffc9ded8d8c8e0d66efc6/aiohttp-3.13.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:746ac3cc00b5baea424dacddea3ec2c2702f9590de27d837aa67004db1eebc6e", size = 498178, upload-time = "2026-03-28T17:16:46.766Z" }, - { url = "https://files.pythonhosted.org/packages/94/42/07e1b543a61250783650df13da8ddcdc0d0a5538b2bd15cef6e042aefc61/aiohttp-3.13.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:bda8f16ea99d6a6705e5946732e48487a448be874e54a4f73d514660ff7c05d3", size = 498331, upload-time = "2026-03-28T17:16:48.9Z" }, - { url = "https://files.pythonhosted.org/packages/20/d6/492f46bf0328534124772d0cf58570acae5b286ea25006900650f69dae0e/aiohttp-3.13.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4b061e7b5f840391e3f64d0ddf672973e45c4cfff7a0feea425ea24e51530fc2", size = 1744414, upload-time = "2026-03-28T17:16:50.968Z" }, - { url = "https://files.pythonhosted.org/packages/e2/4d/e02627b2683f68051246215d2d62b2d2f249ff7a285e7a858dc47d6b6a14/aiohttp-3.13.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b252e8d5cd66184b570d0d010de742736e8a4fab22c58299772b0c5a466d4b21", size = 1719226, upload-time = "2026-03-28T17:16:53.173Z" }, - { url = 
"https://files.pythonhosted.org/packages/7b/6c/5d0a3394dd2b9f9aeba6e1b6065d0439e4b75d41f1fb09a3ec010b43552b/aiohttp-3.13.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:20af8aad61d1803ff11152a26146d8d81c266aa8c5aa9b4504432abb965c36a0", size = 1782110, upload-time = "2026-03-28T17:16:55.362Z" }, - { url = "https://files.pythonhosted.org/packages/0d/2d/c20791e3437700a7441a7edfb59731150322424f5aadf635602d1d326101/aiohttp-3.13.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:13a5cc924b59859ad2adb1478e31f410a7ed46e92a2a619d6d1dd1a63c1a855e", size = 1884809, upload-time = "2026-03-28T17:16:57.734Z" }, - { url = "https://files.pythonhosted.org/packages/c8/94/d99dbfbd1924a87ef643833932eb2a3d9e5eee87656efea7d78058539eff/aiohttp-3.13.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:534913dfb0a644d537aebb4123e7d466d94e3be5549205e6a31f72368980a81a", size = 1764938, upload-time = "2026-03-28T17:17:00.221Z" }, - { url = "https://files.pythonhosted.org/packages/49/61/3ce326a1538781deb89f6cf5e094e2029cd308ed1e21b2ba2278b08426f6/aiohttp-3.13.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:320e40192a2dcc1cf4b5576936e9652981ab596bf81eb309535db7e2f5b5672f", size = 1570697, upload-time = "2026-03-28T17:17:02.985Z" }, - { url = "https://files.pythonhosted.org/packages/b6/77/4ab5a546857bb3028fbaf34d6eea180267bdab022ee8b1168b1fcde4bfdd/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9e587fcfce2bcf06526a43cb705bdee21ac089096f2e271d75de9c339db3100c", size = 1702258, upload-time = "2026-03-28T17:17:05.28Z" }, - { url = "https://files.pythonhosted.org/packages/79/63/d8f29021e39bc5af8e5d5e9da1b07976fb9846487a784e11e4f4eeda4666/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9eb9c2eea7278206b5c6c1441fdd9dc420c278ead3f3b2cc87f9b693698cc500", size = 1740287, upload-time 
= "2026-03-28T17:17:07.712Z" }, - { url = "https://files.pythonhosted.org/packages/55/3a/cbc6b3b124859a11bc8055d3682c26999b393531ef926754a3445b99dfef/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:29be00c51972b04bf9d5c8f2d7f7314f48f96070ca40a873a53056e652e805f7", size = 1753011, upload-time = "2026-03-28T17:17:10.053Z" }, - { url = "https://files.pythonhosted.org/packages/e0/30/836278675205d58c1368b21520eab9572457cf19afd23759216c04483048/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:90c06228a6c3a7c9f776fe4fc0b7ff647fffd3bed93779a6913c804ae00c1073", size = 1566359, upload-time = "2026-03-28T17:17:12.433Z" }, - { url = "https://files.pythonhosted.org/packages/50/b4/8032cc9b82d17e4277704ba30509eaccb39329dc18d6a35f05e424439e32/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:a533ec132f05fd9a1d959e7f34184cd7d5e8511584848dab85faefbaac573069", size = 1785537, upload-time = "2026-03-28T17:17:14.721Z" }, - { url = "https://files.pythonhosted.org/packages/17/7d/5873e98230bde59f493bf1f7c3e327486a4b5653fa401144704df5d00211/aiohttp-3.13.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1c946f10f413836f82ea4cfb90200d2a59578c549f00857e03111cf45ad01ca5", size = 1740752, upload-time = "2026-03-28T17:17:17.387Z" }, - { url = "https://files.pythonhosted.org/packages/7b/f2/13e46e0df051494d7d3c68b7f72d071f48c384c12716fc294f75d5b1a064/aiohttp-3.13.4-cp313-cp313-win32.whl", hash = "sha256:48708e2706106da6967eff5908c78ca3943f005ed6bcb75da2a7e4da94ef8c70", size = 433187, upload-time = "2026-03-28T17:17:19.523Z" }, - { url = "https://files.pythonhosted.org/packages/ea/c0/649856ee655a843c8f8664592cfccb73ac80ede6a8c8db33a25d810c12db/aiohttp-3.13.4-cp313-cp313-win_amd64.whl", hash = "sha256:74a2eb058da44fa3a877a49e2095b591d4913308bb424c418b77beb160c55ce3", size = 459778, upload-time = "2026-03-28T17:17:21.964Z" }, - { url = 
"https://files.pythonhosted.org/packages/6d/29/6657cc37ae04cacc2dbf53fb730a06b6091cc4cbe745028e047c53e6d840/aiohttp-3.13.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:e0a2c961fc92abeff61d6444f2ce6ad35bb982db9fc8ff8a47455beacf454a57", size = 749363, upload-time = "2026-03-28T17:17:24.044Z" }, - { url = "https://files.pythonhosted.org/packages/90/7f/30ccdf67ca3d24b610067dc63d64dcb91e5d88e27667811640644aa4a85d/aiohttp-3.13.4-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:153274535985a0ff2bff1fb6c104ed547cec898a09213d21b0f791a44b14d933", size = 499317, upload-time = "2026-03-28T17:17:26.199Z" }, - { url = "https://files.pythonhosted.org/packages/93/13/e372dd4e68ad04ee25dafb050c7f98b0d91ea643f7352757e87231102555/aiohttp-3.13.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:351f3171e2458da3d731ce83f9e6b9619e325c45cbd534c7759750cabf453ad7", size = 500477, upload-time = "2026-03-28T17:17:28.279Z" }, - { url = "https://files.pythonhosted.org/packages/e5/fe/ee6298e8e586096fb6f5eddd31393d8544f33ae0792c71ecbb4c2bef98ac/aiohttp-3.13.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f989ac8bc5595ff761a5ccd32bdb0768a117f36dd1504b1c2c074ed5d3f4df9c", size = 1737227, upload-time = "2026-03-28T17:17:30.587Z" }, - { url = "https://files.pythonhosted.org/packages/b0/b9/a7a0463a09e1a3fe35100f74324f23644bfc3383ac5fd5effe0722a5f0b7/aiohttp-3.13.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d36fc1709110ec1e87a229b201dd3ddc32aa01e98e7868083a794609b081c349", size = 1694036, upload-time = "2026-03-28T17:17:33.29Z" }, - { url = "https://files.pythonhosted.org/packages/57/7c/8972ae3fb7be00a91aee6b644b2a6a909aedb2c425269a3bfd90115e6f8f/aiohttp-3.13.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:42adaeea83cbdf069ab94f5103ce0787c21fb1a0153270da76b59d5578302329", size = 1786814, upload-time = 
"2026-03-28T17:17:36.035Z" }, - { url = "https://files.pythonhosted.org/packages/93/01/c81e97e85c774decbaf0d577de7d848934e8166a3a14ad9f8aa5be329d28/aiohttp-3.13.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:92deb95469928cc41fd4b42a95d8012fa6df93f6b1c0a83af0ffbc4a5e218cde", size = 1866676, upload-time = "2026-03-28T17:17:38.441Z" }, - { url = "https://files.pythonhosted.org/packages/5a/5f/5b46fe8694a639ddea2cd035bf5729e4677ea882cb251396637e2ef1590d/aiohttp-3.13.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0c0c7c07c4257ef3a1df355f840bc62d133bcdef5c1c5ba75add3c08553e2eed", size = 1740842, upload-time = "2026-03-28T17:17:40.783Z" }, - { url = "https://files.pythonhosted.org/packages/20/a2/0d4b03d011cca6b6b0acba8433193c1e484efa8d705ea58295590fe24203/aiohttp-3.13.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f062c45de8a1098cb137a1898819796a2491aec4e637a06b03f149315dff4d8f", size = 1566508, upload-time = "2026-03-28T17:17:43.235Z" }, - { url = "https://files.pythonhosted.org/packages/98/17/e689fd500da52488ec5f889effd6404dece6a59de301e380f3c64f167beb/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:76093107c531517001114f0ebdb4f46858ce818590363e3e99a4a2280334454a", size = 1700569, upload-time = "2026-03-28T17:17:46.165Z" }, - { url = "https://files.pythonhosted.org/packages/d8/0d/66402894dbcf470ef7db99449e436105ea862c24f7ea4c95c683e635af35/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:6f6ec32162d293b82f8b63a16edc80769662fbd5ae6fbd4936d3206a2c2cc63b", size = 1707407, upload-time = "2026-03-28T17:17:48.825Z" }, - { url = "https://files.pythonhosted.org/packages/2f/eb/af0ab1a3650092cbd8e14ef29e4ab0209e1460e1c299996c3f8288b3f1ff/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5903e2db3d202a00ad9f0ec35a122c005e85d90c9836ab4cda628f01edf425e2", size = 1752214, upload-time = 
"2026-03-28T17:17:51.206Z" }, - { url = "https://files.pythonhosted.org/packages/5a/bf/72326f8a98e4c666f292f03c385545963cc65e358835d2a7375037a97b57/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2d5bea57be7aca98dbbac8da046d99b5557c5cf4e28538c4c786313078aca09e", size = 1562162, upload-time = "2026-03-28T17:17:53.634Z" }, - { url = "https://files.pythonhosted.org/packages/67/9f/13b72435f99151dd9a5469c96b3b5f86aa29b7e785ca7f35cf5e538f74c0/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:bcf0c9902085976edc0232b75006ef38f89686901249ce14226b6877f88464fb", size = 1768904, upload-time = "2026-03-28T17:17:55.991Z" }, - { url = "https://files.pythonhosted.org/packages/18/bc/28d4970e7d5452ac7776cdb5431a1164a0d9cf8bd2fffd67b4fb463aa56d/aiohttp-3.13.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c3295f98bfeed2e867cab588f2a146a9db37a85e3ae9062abf46ba062bd29165", size = 1723378, upload-time = "2026-03-28T17:17:58.348Z" }, - { url = "https://files.pythonhosted.org/packages/53/74/b32458ca1a7f34d65bdee7aef2036adbe0438123d3d53e2b083c453c24dd/aiohttp-3.13.4-cp314-cp314-win32.whl", hash = "sha256:a598a5c5767e1369d8f5b08695cab1d8160040f796c4416af76fd773d229b3c9", size = 438711, upload-time = "2026-03-28T17:18:00.728Z" }, - { url = "https://files.pythonhosted.org/packages/40/b2/54b487316c2df3e03a8f3435e9636f8a81a42a69d942164830d193beb56a/aiohttp-3.13.4-cp314-cp314-win_amd64.whl", hash = "sha256:c555db4bc7a264bead5a7d63d92d41a1122fcd39cc62a4db815f45ad46f9c2c8", size = 464977, upload-time = "2026-03-28T17:18:03.367Z" }, - { url = "https://files.pythonhosted.org/packages/47/fb/e41b63c6ce71b07a59243bb8f3b457ee0c3402a619acb9d2c0d21ef0e647/aiohttp-3.13.4-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:45abbbf09a129825d13c18c7d3182fecd46d9da3cfc383756145394013604ac1", size = 781549, upload-time = "2026-03-28T17:18:05.779Z" }, - { url = 
"https://files.pythonhosted.org/packages/97/53/532b8d28df1e17e44c4d9a9368b78dcb6bf0b51037522136eced13afa9e8/aiohttp-3.13.4-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:74c80b2bc2c2adb7b3d1941b2b60701ee2af8296fc8aad8b8bc48bc25767266c", size = 514383, upload-time = "2026-03-28T17:18:08.096Z" }, - { url = "https://files.pythonhosted.org/packages/1b/1f/62e5d400603e8468cd635812d99cb81cfdc08127a3dc474c647615f31339/aiohttp-3.13.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c97989ae40a9746650fa196894f317dafc12227c808c774929dda0ff873a5954", size = 518304, upload-time = "2026-03-28T17:18:10.642Z" }, - { url = "https://files.pythonhosted.org/packages/90/57/2326b37b10896447e3c6e0cbef4fe2486d30913639a5cfd1332b5d870f82/aiohttp-3.13.4-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dae86be9811493f9990ef44fff1685f5c1a3192e9061a71a109d527944eed551", size = 1893433, upload-time = "2026-03-28T17:18:13.121Z" }, - { url = "https://files.pythonhosted.org/packages/d2/b4/a24d82112c304afdb650167ef2fe190957d81cbddac7460bedd245f765aa/aiohttp-3.13.4-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:1db491abe852ca2fa6cc48a3341985b0174b3741838e1341b82ac82c8bd9e871", size = 1755901, upload-time = "2026-03-28T17:18:16.21Z" }, - { url = "https://files.pythonhosted.org/packages/9e/2d/0883ef9d878d7846287f036c162a951968f22aabeef3ac97b0bea6f76d5d/aiohttp-3.13.4-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0e5d701c0aad02a7dce72eef6b93226cf3734330f1a31d69ebbf69f33b86666e", size = 1876093, upload-time = "2026-03-28T17:18:18.703Z" }, - { url = "https://files.pythonhosted.org/packages/ad/52/9204bb59c014869b71971addad6778f005daa72a96eed652c496789d7468/aiohttp-3.13.4-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8ac32a189081ae0a10ba18993f10f338ec94341f0d5df8fff348043962f3c6f8", size = 
1970815, upload-time = "2026-03-28T17:18:21.858Z" }, - { url = "https://files.pythonhosted.org/packages/d6/b5/e4eb20275a866dde0f570f411b36c6b48f7b53edfe4f4071aa1b0728098a/aiohttp-3.13.4-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:98e968cdaba43e45c73c3f306fca418c8009a957733bac85937c9f9cf3f4de27", size = 1816223, upload-time = "2026-03-28T17:18:24.729Z" }, - { url = "https://files.pythonhosted.org/packages/d8/23/e98075c5bb146aa61a1239ee1ac7714c85e814838d6cebbe37d3fe19214a/aiohttp-3.13.4-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca114790c9144c335d538852612d3e43ea0f075288f4849cf4b05d6cd2238ce7", size = 1649145, upload-time = "2026-03-28T17:18:27.269Z" }, - { url = "https://files.pythonhosted.org/packages/d6/c1/7bad8be33bb06c2bb224b6468874346026092762cbec388c3bdb65a368ee/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ea2e071661ba9cfe11eabbc81ac5376eaeb3061f6e72ec4cc86d7cdd1ffbdbbb", size = 1816562, upload-time = "2026-03-28T17:18:29.847Z" }, - { url = "https://files.pythonhosted.org/packages/5c/10/c00323348695e9a5e316825969c88463dcc24c7e9d443244b8a2c9cf2eae/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:34e89912b6c20e0fd80e07fa401fd218a410aa1ce9f1c2f1dad6db1bd0ce0927", size = 1800333, upload-time = "2026-03-28T17:18:32.269Z" }, - { url = "https://files.pythonhosted.org/packages/84/43/9b2147a1df3559f49bd723e22905b46a46c068a53adb54abdca32c4de180/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0e217cf9f6a42908c52b46e42c568bd57adc39c9286ced31aaace614b6087965", size = 1820617, upload-time = "2026-03-28T17:18:35.238Z" }, - { url = "https://files.pythonhosted.org/packages/a9/7f/b3481a81e7a586d02e99387b18c6dafff41285f6efd3daa2124c01f87eae/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:0c296f1221e21ba979f5ac1964c3b78cfde15c5c5f855ffd2caab337e9cd9182", size = 1643417, upload-time = 
"2026-03-28T17:18:37.949Z" }, - { url = "https://files.pythonhosted.org/packages/8f/72/07181226bc99ce1124e0f89280f5221a82d3ae6a6d9d1973ce429d48e52b/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:d99a9d168ebaffb74f36d011750e490085ac418f4db926cce3989c8fe6cb6b1b", size = 1849286, upload-time = "2026-03-28T17:18:40.534Z" }, - { url = "https://files.pythonhosted.org/packages/1a/e6/1b3566e103eca6da5be4ae6713e112a053725c584e96574caf117568ffef/aiohttp-3.13.4-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:cb19177205d93b881f3f89e6081593676043a6828f59c78c17a0fd6c1fbed2ba", size = 1782635, upload-time = "2026-03-28T17:18:43.073Z" }, - { url = "https://files.pythonhosted.org/packages/37/58/1b11c71904b8d079eb0c39fe664180dd1e14bebe5608e235d8bfbadc8929/aiohttp-3.13.4-cp314-cp314t-win32.whl", hash = "sha256:c606aa5656dab6552e52ca368e43869c916338346bfaf6304e15c58fb113ea30", size = 472537, upload-time = "2026-03-28T17:18:46.286Z" }, - { url = "https://files.pythonhosted.org/packages/bc/8f/87c56a1a1977d7dddea5b31e12189665a140fdb48a71e9038ff90bb564ec/aiohttp-3.13.4-cp314-cp314t-win_amd64.whl", hash = "sha256:014dcc10ec8ab8db681f0d68e939d1e9286a5aa2b993cbbdb0db130853e02144", size = 506381, upload-time = "2026-03-28T17:18:48.74Z" }, +sdist = { url = "https://files.pythonhosted.org/packages/50/42/32cf8e7704ceb4481406eb87161349abb46a57fee3f008ba9cb610968646/aiohttp-3.13.3.tar.gz", hash = "sha256:a949eee43d3782f2daae4f4a2819b2cb9b0c5d3b7f7a927067cc84dafdbb9f88", size = 7844556, upload-time = "2026-01-03T17:33:05.204Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f1/4c/a164164834f03924d9a29dc3acd9e7ee58f95857e0b467f6d04298594ebb/aiohttp-3.13.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5b6073099fb654e0a068ae678b10feff95c5cae95bbfcbfa7af669d361a8aa6b", size = 746051, upload-time = "2026-01-03T17:29:43.287Z" }, + { url = 
"https://files.pythonhosted.org/packages/82/71/d5c31390d18d4f58115037c432b7e0348c60f6f53b727cad33172144a112/aiohttp-3.13.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cb93e166e6c28716c8c6aeb5f99dfb6d5ccf482d29fe9bf9a794110e6d0ab64", size = 499234, upload-time = "2026-01-03T17:29:44.822Z" }, + { url = "https://files.pythonhosted.org/packages/0e/c9/741f8ac91e14b1d2e7100690425a5b2b919a87a5075406582991fb7de920/aiohttp-3.13.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:28e027cf2f6b641693a09f631759b4d9ce9165099d2b5d92af9bd4e197690eea", size = 494979, upload-time = "2026-01-03T17:29:46.405Z" }, + { url = "https://files.pythonhosted.org/packages/75/b5/31d4d2e802dfd59f74ed47eba48869c1c21552c586d5e81a9d0d5c2ad640/aiohttp-3.13.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3b61b7169ababd7802f9568ed96142616a9118dd2be0d1866e920e77ec8fa92a", size = 1748297, upload-time = "2026-01-03T17:29:48.083Z" }, + { url = "https://files.pythonhosted.org/packages/1a/3e/eefad0ad42959f226bb79664826883f2687d602a9ae2941a18e0484a74d3/aiohttp-3.13.3-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:80dd4c21b0f6237676449c6baaa1039abae86b91636b6c91a7f8e61c87f89540", size = 1707172, upload-time = "2026-01-03T17:29:49.648Z" }, + { url = "https://files.pythonhosted.org/packages/c5/3a/54a64299fac2891c346cdcf2aa6803f994a2e4beeaf2e5a09dcc54acc842/aiohttp-3.13.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:65d2ccb7eabee90ce0503c17716fc77226be026dcc3e65cce859a30db715025b", size = 1805405, upload-time = "2026-01-03T17:29:51.244Z" }, + { url = "https://files.pythonhosted.org/packages/6c/70/ddc1b7169cf64075e864f64595a14b147a895a868394a48f6a8031979038/aiohttp-3.13.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5b179331a481cb5529fca8b432d8d3c7001cb217513c94cd72d668d1248688a3", size = 
1899449, upload-time = "2026-01-03T17:29:53.938Z" }, + { url = "https://files.pythonhosted.org/packages/a1/7e/6815aab7d3a56610891c76ef79095677b8b5be6646aaf00f69b221765021/aiohttp-3.13.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d4c940f02f49483b18b079d1c27ab948721852b281f8b015c058100e9421dd1", size = 1748444, upload-time = "2026-01-03T17:29:55.484Z" }, + { url = "https://files.pythonhosted.org/packages/6b/f2/073b145c4100da5511f457dc0f7558e99b2987cf72600d42b559db856fbc/aiohttp-3.13.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f9444f105664c4ce47a2a7171a2418bce5b7bae45fb610f4e2c36045d85911d3", size = 1606038, upload-time = "2026-01-03T17:29:57.179Z" }, + { url = "https://files.pythonhosted.org/packages/0a/c1/778d011920cae03ae01424ec202c513dc69243cf2db303965615b81deeea/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:694976222c711d1d00ba131904beb60534f93966562f64440d0c9d41b8cdb440", size = 1724156, upload-time = "2026-01-03T17:29:58.914Z" }, + { url = "https://files.pythonhosted.org/packages/0e/cb/3419eabf4ec1e9ec6f242c32b689248365a1cf621891f6f0386632525494/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f33ed1a2bf1997a36661874b017f5c4b760f41266341af36febaf271d179f6d7", size = 1722340, upload-time = "2026-01-03T17:30:01.962Z" }, + { url = "https://files.pythonhosted.org/packages/7a/e5/76cf77bdbc435bf233c1f114edad39ed4177ccbfab7c329482b179cff4f4/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e636b3c5f61da31a92bf0d91da83e58fdfa96f178ba682f11d24f31944cdd28c", size = 1783041, upload-time = "2026-01-03T17:30:03.609Z" }, + { url = "https://files.pythonhosted.org/packages/9d/d4/dd1ca234c794fd29c057ce8c0566b8ef7fd6a51069de5f06fa84b9a1971c/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5d2d94f1f5fcbe40838ac51a6ab5704a6f9ea42e72ceda48de5e6b898521da51", size = 1596024, upload-time = 
"2026-01-03T17:30:05.132Z" }, + { url = "https://files.pythonhosted.org/packages/55/58/4345b5f26661a6180afa686c473620c30a66afdf120ed3dd545bbc809e85/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2be0e9ccf23e8a94f6f0650ce06042cefc6ac703d0d7ab6c7a917289f2539ad4", size = 1804590, upload-time = "2026-01-03T17:30:07.135Z" }, + { url = "https://files.pythonhosted.org/packages/7b/06/05950619af6c2df7e0a431d889ba2813c9f0129cec76f663e547a5ad56f2/aiohttp-3.13.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9af5e68ee47d6534d36791bbe9b646d2a7c7deb6fc24d7943628edfbb3581f29", size = 1740355, upload-time = "2026-01-03T17:30:09.083Z" }, + { url = "https://files.pythonhosted.org/packages/3e/80/958f16de79ba0422d7c1e284b2abd0c84bc03394fbe631d0a39ffa10e1eb/aiohttp-3.13.3-cp311-cp311-win32.whl", hash = "sha256:a2212ad43c0833a873d0fb3c63fa1bacedd4cf6af2fee62bf4b739ceec3ab239", size = 433701, upload-time = "2026-01-03T17:30:10.869Z" }, + { url = "https://files.pythonhosted.org/packages/dc/f2/27cdf04c9851712d6c1b99df6821a6623c3c9e55956d4b1e318c337b5a48/aiohttp-3.13.3-cp311-cp311-win_amd64.whl", hash = "sha256:642f752c3eb117b105acbd87e2c143de710987e09860d674e068c4c2c441034f", size = 457678, upload-time = "2026-01-03T17:30:12.719Z" }, + { url = "https://files.pythonhosted.org/packages/a0/be/4fc11f202955a69e0db803a12a062b8379c970c7c84f4882b6da17337cc1/aiohttp-3.13.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b903a4dfee7d347e2d87697d0713be59e0b87925be030c9178c5faa58ea58d5c", size = 739732, upload-time = "2026-01-03T17:30:14.23Z" }, + { url = "https://files.pythonhosted.org/packages/97/2c/621d5b851f94fa0bb7430d6089b3aa970a9d9b75196bc93bb624b0db237a/aiohttp-3.13.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a45530014d7a1e09f4a55f4f43097ba0fd155089372e105e4bff4ca76cb1b168", size = 494293, upload-time = "2026-01-03T17:30:15.96Z" }, + { url = 
"https://files.pythonhosted.org/packages/5d/43/4be01406b78e1be8320bb8316dc9c42dbab553d281c40364e0f862d5661c/aiohttp-3.13.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:27234ef6d85c914f9efeb77ff616dbf4ad2380be0cda40b4db086ffc7ddd1b7d", size = 493533, upload-time = "2026-01-03T17:30:17.431Z" }, + { url = "https://files.pythonhosted.org/packages/8d/a8/5a35dc56a06a2c90d4742cbf35294396907027f80eea696637945a106f25/aiohttp-3.13.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d32764c6c9aafb7fb55366a224756387cd50bfa720f32b88e0e6fa45b27dcf29", size = 1737839, upload-time = "2026-01-03T17:30:19.422Z" }, + { url = "https://files.pythonhosted.org/packages/bf/62/4b9eeb331da56530bf2e198a297e5303e1c1ebdceeb00fe9b568a65c5a0c/aiohttp-3.13.3-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b1a6102b4d3ebc07dad44fbf07b45bb600300f15b552ddf1851b5390202ea2e3", size = 1703932, upload-time = "2026-01-03T17:30:21.756Z" }, + { url = "https://files.pythonhosted.org/packages/7c/f6/af16887b5d419e6a367095994c0b1332d154f647e7dc2bd50e61876e8e3d/aiohttp-3.13.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c014c7ea7fb775dd015b2d3137378b7be0249a448a1612268b5a90c2d81de04d", size = 1771906, upload-time = "2026-01-03T17:30:23.932Z" }, + { url = "https://files.pythonhosted.org/packages/ce/83/397c634b1bcc24292fa1e0c7822800f9f6569e32934bdeef09dae7992dfb/aiohttp-3.13.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2b8d8ddba8f95ba17582226f80e2de99c7a7948e66490ef8d947e272a93e9463", size = 1871020, upload-time = "2026-01-03T17:30:26Z" }, + { url = "https://files.pythonhosted.org/packages/86/f6/a62cbbf13f0ac80a70f71b1672feba90fdb21fd7abd8dbf25c0105fb6fa3/aiohttp-3.13.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:9ae8dd55c8e6c4257eae3a20fd2c8f41edaea5992ed67156642493b8daf3cecc", size = 1755181, upload-time = "2026-01-03T17:30:27.554Z" }, + { url = "https://files.pythonhosted.org/packages/0a/87/20a35ad487efdd3fba93d5843efdfaa62d2f1479eaafa7453398a44faf13/aiohttp-3.13.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:01ad2529d4b5035578f5081606a465f3b814c542882804e2e8cda61adf5c71bf", size = 1561794, upload-time = "2026-01-03T17:30:29.254Z" }, + { url = "https://files.pythonhosted.org/packages/de/95/8fd69a66682012f6716e1bc09ef8a1a2a91922c5725cb904689f112309c4/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bb4f7475e359992b580559e008c598091c45b5088f28614e855e42d39c2f1033", size = 1697900, upload-time = "2026-01-03T17:30:31.033Z" }, + { url = "https://files.pythonhosted.org/packages/e5/66/7b94b3b5ba70e955ff597672dad1691333080e37f50280178967aff68657/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:c19b90316ad3b24c69cd78d5c9b4f3aa4497643685901185b65166293d36a00f", size = 1728239, upload-time = "2026-01-03T17:30:32.703Z" }, + { url = "https://files.pythonhosted.org/packages/47/71/6f72f77f9f7d74719692ab65a2a0252584bf8d5f301e2ecb4c0da734530a/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:96d604498a7c782cb15a51c406acaea70d8c027ee6b90c569baa6e7b93073679", size = 1740527, upload-time = "2026-01-03T17:30:34.695Z" }, + { url = "https://files.pythonhosted.org/packages/fa/b4/75ec16cbbd5c01bdaf4a05b19e103e78d7ce1ef7c80867eb0ace42ff4488/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:084911a532763e9d3dd95adf78a78f4096cd5f58cdc18e6fdbc1b58417a45423", size = 1554489, upload-time = "2026-01-03T17:30:36.864Z" }, + { url = "https://files.pythonhosted.org/packages/52/8f/bc518c0eea29f8406dcf7ed1f96c9b48e3bc3995a96159b3fc11f9e08321/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7a4a94eb787e606d0a09404b9c38c113d3b099d508021faa615d70a0131907ce", size = 
1767852, upload-time = "2026-01-03T17:30:39.433Z" }, + { url = "https://files.pythonhosted.org/packages/9d/f2/a07a75173124f31f11ea6f863dc44e6f09afe2bca45dd4e64979490deab1/aiohttp-3.13.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:87797e645d9d8e222e04160ee32aa06bc5c163e8499f24db719e7852ec23093a", size = 1722379, upload-time = "2026-01-03T17:30:41.081Z" }, + { url = "https://files.pythonhosted.org/packages/3c/4a/1a3fee7c21350cac78e5c5cef711bac1b94feca07399f3d406972e2d8fcd/aiohttp-3.13.3-cp312-cp312-win32.whl", hash = "sha256:b04be762396457bef43f3597c991e192ee7da460a4953d7e647ee4b1c28e7046", size = 428253, upload-time = "2026-01-03T17:30:42.644Z" }, + { url = "https://files.pythonhosted.org/packages/d9/b7/76175c7cb4eb73d91ad63c34e29fc4f77c9386bba4a65b53ba8e05ee3c39/aiohttp-3.13.3-cp312-cp312-win_amd64.whl", hash = "sha256:e3531d63d3bdfa7e3ac5e9b27b2dd7ec9df3206a98e0b3445fa906f233264c57", size = 455407, upload-time = "2026-01-03T17:30:44.195Z" }, + { url = "https://files.pythonhosted.org/packages/97/8a/12ca489246ca1faaf5432844adbfce7ff2cc4997733e0af120869345643a/aiohttp-3.13.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:5dff64413671b0d3e7d5918ea490bdccb97a4ad29b3f311ed423200b2203e01c", size = 734190, upload-time = "2026-01-03T17:30:45.832Z" }, + { url = "https://files.pythonhosted.org/packages/32/08/de43984c74ed1fca5c014808963cc83cb00d7bb06af228f132d33862ca76/aiohttp-3.13.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:87b9aab6d6ed88235aa2970294f496ff1a1f9adcd724d800e9b952395a80ffd9", size = 491783, upload-time = "2026-01-03T17:30:47.466Z" }, + { url = "https://files.pythonhosted.org/packages/17/f8/8dd2cf6112a5a76f81f81a5130c57ca829d101ad583ce57f889179accdda/aiohttp-3.13.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:425c126c0dc43861e22cb1c14ba4c8e45d09516d0a3ae0a3f7494b79f5f233a3", size = 490704, upload-time = "2026-01-03T17:30:49.373Z" }, + { url = 
"https://files.pythonhosted.org/packages/6d/40/a46b03ca03936f832bc7eaa47cfbb1ad012ba1be4790122ee4f4f8cba074/aiohttp-3.13.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f9120f7093c2a32d9647abcaf21e6ad275b4fbec5b55969f978b1a97c7c86bf", size = 1720652, upload-time = "2026-01-03T17:30:50.974Z" }, + { url = "https://files.pythonhosted.org/packages/f7/7e/917fe18e3607af92657e4285498f500dca797ff8c918bd7d90b05abf6c2a/aiohttp-3.13.3-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:697753042d57f4bf7122cab985bf15d0cef23c770864580f5af4f52023a56bd6", size = 1692014, upload-time = "2026-01-03T17:30:52.729Z" }, + { url = "https://files.pythonhosted.org/packages/71/b6/cefa4cbc00d315d68973b671cf105b21a609c12b82d52e5d0c9ae61d2a09/aiohttp-3.13.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6de499a1a44e7de70735d0b39f67c8f25eb3d91eb3103be99ca0fa882cdd987d", size = 1759777, upload-time = "2026-01-03T17:30:54.537Z" }, + { url = "https://files.pythonhosted.org/packages/fb/e3/e06ee07b45e59e6d81498b591fc589629be1553abb2a82ce33efe2a7b068/aiohttp-3.13.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:37239e9f9a7ea9ac5bf6b92b0260b01f8a22281996da609206a84df860bc1261", size = 1861276, upload-time = "2026-01-03T17:30:56.512Z" }, + { url = "https://files.pythonhosted.org/packages/7c/24/75d274228acf35ceeb2850b8ce04de9dd7355ff7a0b49d607ee60c29c518/aiohttp-3.13.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f76c1e3fe7d7c8afad7ed193f89a292e1999608170dcc9751a7462a87dfd5bc0", size = 1743131, upload-time = "2026-01-03T17:30:58.256Z" }, + { url = "https://files.pythonhosted.org/packages/04/98/3d21dde21889b17ca2eea54fdcff21b27b93f45b7bb94ca029c31ab59dc3/aiohttp-3.13.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:fc290605db2a917f6e81b0e1e0796469871f5af381ce15c604a3c5c7e51cb730", size = 1556863, upload-time = "2026-01-03T17:31:00.445Z" }, + { url = "https://files.pythonhosted.org/packages/9e/84/da0c3ab1192eaf64782b03971ab4055b475d0db07b17eff925e8c93b3aa5/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4021b51936308aeea0367b8f006dc999ca02bc118a0cc78c303f50a2ff6afb91", size = 1682793, upload-time = "2026-01-03T17:31:03.024Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0f/5802ada182f575afa02cbd0ec5180d7e13a402afb7c2c03a9aa5e5d49060/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:49a03727c1bba9a97d3e93c9f93ca03a57300f484b6e935463099841261195d3", size = 1716676, upload-time = "2026-01-03T17:31:04.842Z" }, + { url = "https://files.pythonhosted.org/packages/3f/8c/714d53bd8b5a4560667f7bbbb06b20c2382f9c7847d198370ec6526af39c/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3d9908a48eb7416dc1f4524e69f1d32e5d90e3981e4e37eb0aa1cd18f9cfa2a4", size = 1733217, upload-time = "2026-01-03T17:31:06.868Z" }, + { url = "https://files.pythonhosted.org/packages/7d/79/e2176f46d2e963facea939f5be2d26368ce543622be6f00a12844d3c991f/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2712039939ec963c237286113c68dbad80a82a4281543f3abf766d9d73228998", size = 1552303, upload-time = "2026-01-03T17:31:08.958Z" }, + { url = "https://files.pythonhosted.org/packages/ab/6a/28ed4dea1759916090587d1fe57087b03e6c784a642b85ef48217b0277ae/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:7bfdc049127717581866fa4708791220970ce291c23e28ccf3922c700740fdc0", size = 1763673, upload-time = "2026-01-03T17:31:10.676Z" }, + { url = "https://files.pythonhosted.org/packages/e8/35/4a3daeb8b9fab49240d21c04d50732313295e4bd813a465d840236dd0ce1/aiohttp-3.13.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8057c98e0c8472d8846b9c79f56766bcc57e3e8ac7bfd510482332366c56c591", size = 1721120, upload-time = 
"2026-01-03T17:31:12.575Z" }, + { url = "https://files.pythonhosted.org/packages/bc/9f/d643bb3c5fb99547323e635e251c609fbbc660d983144cfebec529e09264/aiohttp-3.13.3-cp313-cp313-win32.whl", hash = "sha256:1449ceddcdbcf2e0446957863af03ebaaa03f94c090f945411b61269e2cb5daf", size = 427383, upload-time = "2026-01-03T17:31:14.382Z" }, + { url = "https://files.pythonhosted.org/packages/4e/f1/ab0395f8a79933577cdd996dd2f9aa6014af9535f65dddcf88204682fe62/aiohttp-3.13.3-cp313-cp313-win_amd64.whl", hash = "sha256:693781c45a4033d31d4187d2436f5ac701e7bbfe5df40d917736108c1cc7436e", size = 453899, upload-time = "2026-01-03T17:31:15.958Z" }, + { url = "https://files.pythonhosted.org/packages/99/36/5b6514a9f5d66f4e2597e40dea2e3db271e023eb7a5d22defe96ba560996/aiohttp-3.13.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:ea37047c6b367fd4bd632bff8077449b8fa034b69e812a18e0132a00fae6e808", size = 737238, upload-time = "2026-01-03T17:31:17.909Z" }, + { url = "https://files.pythonhosted.org/packages/f7/49/459327f0d5bcd8c6c9ca69e60fdeebc3622861e696490d8674a6d0cb90a6/aiohttp-3.13.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6fc0e2337d1a4c3e6acafda6a78a39d4c14caea625124817420abceed36e2415", size = 492292, upload-time = "2026-01-03T17:31:19.919Z" }, + { url = "https://files.pythonhosted.org/packages/e8/0b/b97660c5fd05d3495b4eb27f2d0ef18dc1dc4eff7511a9bf371397ff0264/aiohttp-3.13.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c685f2d80bb67ca8c3837823ad76196b3694b0159d232206d1e461d3d434666f", size = 493021, upload-time = "2026-01-03T17:31:21.636Z" }, + { url = "https://files.pythonhosted.org/packages/54/d4/438efabdf74e30aeceb890c3290bbaa449780583b1270b00661126b8aae4/aiohttp-3.13.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e377758516d262bde50c2584fc6c578af272559c409eecbdd2bae1601184d6", size = 1717263, upload-time = "2026-01-03T17:31:23.296Z" }, + { url = 
"https://files.pythonhosted.org/packages/71/f2/7bddc7fd612367d1459c5bcf598a9e8f7092d6580d98de0e057eb42697ad/aiohttp-3.13.3-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:34749271508078b261c4abb1767d42b8d0c0cc9449c73a4df494777dc55f0687", size = 1669107, upload-time = "2026-01-03T17:31:25.334Z" }, + { url = "https://files.pythonhosted.org/packages/00/5a/1aeaecca40e22560f97610a329e0e5efef5e0b5afdf9f857f0d93839ab2e/aiohttp-3.13.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:82611aeec80eb144416956ec85b6ca45a64d76429c1ed46ae1b5f86c6e0c9a26", size = 1760196, upload-time = "2026-01-03T17:31:27.394Z" }, + { url = "https://files.pythonhosted.org/packages/f8/f8/0ff6992bea7bd560fc510ea1c815f87eedd745fe035589c71ce05612a19a/aiohttp-3.13.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2fff83cfc93f18f215896e3a190e8e5cb413ce01553901aca925176e7568963a", size = 1843591, upload-time = "2026-01-03T17:31:29.238Z" }, + { url = "https://files.pythonhosted.org/packages/e3/d1/e30e537a15f53485b61f5be525f2157da719819e8377298502aebac45536/aiohttp-3.13.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bbe7d4cecacb439e2e2a8a1a7b935c25b812af7a5fd26503a66dadf428e79ec1", size = 1720277, upload-time = "2026-01-03T17:31:31.053Z" }, + { url = "https://files.pythonhosted.org/packages/84/45/23f4c451d8192f553d38d838831ebbc156907ea6e05557f39563101b7717/aiohttp-3.13.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b928f30fe49574253644b1ca44b1b8adbd903aa0da4b9054a6c20fc7f4092a25", size = 1548575, upload-time = "2026-01-03T17:31:32.87Z" }, + { url = "https://files.pythonhosted.org/packages/6a/ed/0a42b127a43712eda7807e7892c083eadfaf8429ca8fb619662a530a3aab/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = 
"sha256:7b5e8fe4de30df199155baaf64f2fcd604f4c678ed20910db8e2c66dc4b11603", size = 1679455, upload-time = "2026-01-03T17:31:34.76Z" }, + { url = "https://files.pythonhosted.org/packages/2e/b5/c05f0c2b4b4fe2c9d55e73b6d3ed4fd6c9dc2684b1d81cbdf77e7fad9adb/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:8542f41a62bcc58fc7f11cf7c90e0ec324ce44950003feb70640fc2a9092c32a", size = 1687417, upload-time = "2026-01-03T17:31:36.699Z" }, + { url = "https://files.pythonhosted.org/packages/c9/6b/915bc5dad66aef602b9e459b5a973529304d4e89ca86999d9d75d80cbd0b/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5e1d8c8b8f1d91cd08d8f4a3c2b067bfca6ec043d3ff36de0f3a715feeedf926", size = 1729968, upload-time = "2026-01-03T17:31:38.622Z" }, + { url = "https://files.pythonhosted.org/packages/11/3b/e84581290a9520024a08640b63d07673057aec5ca548177a82026187ba73/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:90455115e5da1c3c51ab619ac57f877da8fd6d73c05aacd125c5ae9819582aba", size = 1545690, upload-time = "2026-01-03T17:31:40.57Z" }, + { url = "https://files.pythonhosted.org/packages/f5/04/0c3655a566c43fd647c81b895dfe361b9f9ad6d58c19309d45cff52d6c3b/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:042e9e0bcb5fba81886c8b4fbb9a09d6b8a00245fd8d88e4d989c1f96c74164c", size = 1746390, upload-time = "2026-01-03T17:31:42.857Z" }, + { url = "https://files.pythonhosted.org/packages/1f/53/71165b26978f719c3419381514c9690bd5980e764a09440a10bb816ea4ab/aiohttp-3.13.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2eb752b102b12a76ca02dff751a801f028b4ffbbc478840b473597fc91a9ed43", size = 1702188, upload-time = "2026-01-03T17:31:44.984Z" }, + { url = "https://files.pythonhosted.org/packages/29/a7/cbe6c9e8e136314fa1980da388a59d2f35f35395948a08b6747baebb6aa6/aiohttp-3.13.3-cp314-cp314-win32.whl", hash = "sha256:b556c85915d8efaed322bf1bdae9486aa0f3f764195a0fb6ee962e5c71ef5ce1", size = 433126, upload-time = "2026-01-03T17:31:47.463Z" 
}, + { url = "https://files.pythonhosted.org/packages/de/56/982704adea7d3b16614fc5936014e9af85c0e34b58f9046655817f04306e/aiohttp-3.13.3-cp314-cp314-win_amd64.whl", hash = "sha256:9bf9f7a65e7aa20dd764151fb3d616c81088f91f8df39c3893a536e279b4b984", size = 459128, upload-time = "2026-01-03T17:31:49.2Z" }, + { url = "https://files.pythonhosted.org/packages/6c/2a/3c79b638a9c3d4658d345339d22070241ea341ed4e07b5ac60fb0f418003/aiohttp-3.13.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:05861afbbec40650d8a07ea324367cb93e9e8cc7762e04dd4405df99fa65159c", size = 769512, upload-time = "2026-01-03T17:31:51.134Z" }, + { url = "https://files.pythonhosted.org/packages/29/b9/3e5014d46c0ab0db8707e0ac2711ed28c4da0218c358a4e7c17bae0d8722/aiohttp-3.13.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2fc82186fadc4a8316768d61f3722c230e2c1dcab4200d52d2ebdf2482e47592", size = 506444, upload-time = "2026-01-03T17:31:52.85Z" }, + { url = "https://files.pythonhosted.org/packages/90/03/c1d4ef9a054e151cd7839cdc497f2638f00b93cbe8043983986630d7a80c/aiohttp-3.13.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0add0900ff220d1d5c5ebbf99ed88b0c1bbf87aa7e4262300ed1376a6b13414f", size = 510798, upload-time = "2026-01-03T17:31:54.91Z" }, + { url = "https://files.pythonhosted.org/packages/ea/76/8c1e5abbfe8e127c893fe7ead569148a4d5a799f7cf958d8c09f3eedf097/aiohttp-3.13.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:568f416a4072fbfae453dcf9a99194bbb8bdeab718e08ee13dfa2ba0e4bebf29", size = 1868835, upload-time = "2026-01-03T17:31:56.733Z" }, + { url = "https://files.pythonhosted.org/packages/8e/ac/984c5a6f74c363b01ff97adc96a3976d9c98940b8969a1881575b279ac5d/aiohttp-3.13.3-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:add1da70de90a2569c5e15249ff76a631ccacfe198375eead4aadf3b8dc849dc", size = 1720486, upload-time = "2026-01-03T17:31:58.65Z" }, + { url = 
"https://files.pythonhosted.org/packages/b2/9a/b7039c5f099c4eb632138728828b33428585031a1e658d693d41d07d89d1/aiohttp-3.13.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:10b47b7ba335d2e9b1239fa571131a87e2d8ec96b333e68b2a305e7a98b0bae2", size = 1847951, upload-time = "2026-01-03T17:32:00.989Z" }, + { url = "https://files.pythonhosted.org/packages/3c/02/3bec2b9a1ba3c19ff89a43a19324202b8eb187ca1e928d8bdac9bbdddebd/aiohttp-3.13.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3dd4dce1c718e38081c8f35f323209d4c1df7d4db4bab1b5c88a6b4d12b74587", size = 1941001, upload-time = "2026-01-03T17:32:03.122Z" }, + { url = "https://files.pythonhosted.org/packages/37/df/d879401cedeef27ac4717f6426c8c36c3091c6e9f08a9178cc87549c537f/aiohttp-3.13.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34bac00a67a812570d4a460447e1e9e06fae622946955f939051e7cc895cfab8", size = 1797246, upload-time = "2026-01-03T17:32:05.255Z" }, + { url = "https://files.pythonhosted.org/packages/8d/15/be122de1f67e6953add23335c8ece6d314ab67c8bebb3f181063010795a7/aiohttp-3.13.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a19884d2ee70b06d9204b2727a7b9f983d0c684c650254679e716b0b77920632", size = 1627131, upload-time = "2026-01-03T17:32:07.607Z" }, + { url = "https://files.pythonhosted.org/packages/12/12/70eedcac9134cfa3219ab7af31ea56bc877395b1ac30d65b1bc4b27d0438/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ca7f2bb6ba8348a3614c7918cc4bb73268c5ac2a207576b7afea19d3d9f64", size = 1795196, upload-time = "2026-01-03T17:32:09.59Z" }, + { url = "https://files.pythonhosted.org/packages/32/11/b30e1b1cd1f3054af86ebe60df96989c6a414dd87e27ad16950eee420bea/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:b0d95340658b9d2f11d9697f59b3814a9d3bb4b7a7c20b131df4bcef464037c0", size = 1782841, 
upload-time = "2026-01-03T17:32:11.445Z" }, + { url = "https://files.pythonhosted.org/packages/88/0d/d98a9367b38912384a17e287850f5695c528cff0f14f791ce8ee2e4f7796/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:a1e53262fd202e4b40b70c3aff944a8155059beedc8a89bba9dc1f9ef06a1b56", size = 1795193, upload-time = "2026-01-03T17:32:13.705Z" }, + { url = "https://files.pythonhosted.org/packages/43/a5/a2dfd1f5ff5581632c7f6a30e1744deda03808974f94f6534241ef60c751/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d60ac9663f44168038586cab2157e122e46bdef09e9368b37f2d82d354c23f72", size = 1621979, upload-time = "2026-01-03T17:32:15.965Z" }, + { url = "https://files.pythonhosted.org/packages/fa/f0/12973c382ae7c1cccbc4417e129c5bf54c374dfb85af70893646e1f0e749/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:90751b8eed69435bac9ff4e3d2f6b3af1f57e37ecb0fbeee59c0174c9e2d41df", size = 1822193, upload-time = "2026-01-03T17:32:18.219Z" }, + { url = "https://files.pythonhosted.org/packages/3c/5f/24155e30ba7f8c96918af1350eb0663e2430aad9e001c0489d89cd708ab1/aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fc353029f176fd2b3ec6cfc71be166aba1936fe5d73dd1992ce289ca6647a9aa", size = 1769801, upload-time = "2026-01-03T17:32:20.25Z" }, + { url = "https://files.pythonhosted.org/packages/eb/f8/7314031ff5c10e6ece114da79b338ec17eeff3a079e53151f7e9f43c4723/aiohttp-3.13.3-cp314-cp314t-win32.whl", hash = "sha256:2e41b18a58da1e474a057b3d35248d8320029f61d70a37629535b16a0c8f3767", size = 466523, upload-time = "2026-01-03T17:32:22.215Z" }, + { url = "https://files.pythonhosted.org/packages/b4/63/278a98c715ae467624eafe375542d8ba9b4383a016df8fdefe0ae28382a7/aiohttp-3.13.3-cp314-cp314t-win_amd64.whl", hash = "sha256:44531a36aa2264a1860089ffd4dce7baf875ee5a6079d5fb42e261c704ef7344", size = 499694, upload-time = "2026-01-03T17:32:24.546Z" }, ] [[package]] @@ -574,14 +574,14 @@ wheels = [ [[package]] name = "click" -version = "8.3.1" 
+version = "8.1.8" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "colorama", marker = "sys_platform == 'win32'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" } +sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593, upload-time = "2024-12-21T18:38:44.339Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" }, + { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188, upload-time = "2024-12-21T18:38:41.666Z" }, ] [[package]] @@ -1526,14 +1526,14 @@ wheels = [ [[package]] name = "importlib-metadata" -version = "8.7.1" +version = "8.5.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "zipp" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/f3/49/3b30cad09e7771a4982d9975a8cbf64f00d4a1ececb53297f1d9a7be1b10/importlib_metadata-8.7.1.tar.gz", hash = "sha256:49fef1ae6440c182052f407c8d34a68f72efc36db9ca90dc0113398f2fdde8bb", size = 57107, upload-time = "2025-12-21T10:00:19.278Z" } +sdist = { url = "https://files.pythonhosted.org/packages/cd/12/33e59336dca5be0c398a7482335911a33aa0e20776128f038019f1a95f1b/importlib_metadata-8.5.0.tar.gz", hash = 
"sha256:71522656f0abace1d072b9e5481a48f07c138e00f079c38c8f883823f9c26bd7", size = 55304, upload-time = "2024-09-11T14:56:08.937Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/fa/5e/f8e9a1d23b9c20a551a8a02ea3637b4642e22c2626e3a13a9a29cdea99eb/importlib_metadata-8.7.1-py3-none-any.whl", hash = "sha256:5a1f80bf1daa489495071efbb095d75a634cf28a8bc299581244063b53176151", size = 27865, upload-time = "2025-12-21T10:00:18.329Z" }, + { url = "https://files.pythonhosted.org/packages/a0/d9/a1e041c5e7caa9a05c925f4bdbdfb7f006d1f74996af53467bc394c97be7/importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b", size = 26514, upload-time = "2024-09-11T14:56:07.019Z" }, ] [[package]] @@ -1796,7 +1796,7 @@ wheels = [ [[package]] name = "jsonschema" -version = "4.26.0" +version = "4.23.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, @@ -1804,9 +1804,9 @@ dependencies = [ { name = "referencing" }, { name = "rpds-py" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/b3/fc/e067678238fa451312d4c62bf6e6cf5ec56375422aee02f9cb5f909b3047/jsonschema-4.26.0.tar.gz", hash = "sha256:0c26707e2efad8aa1bfc5b7ce170f3fccc2e4918ff85989ba9ffa9facb2be326", size = 366583, upload-time = "2026-01-07T13:41:07.246Z" } +sdist = { url = "https://files.pythonhosted.org/packages/38/2e/03362ee4034a4c917f697890ccd4aec0800ccf9ded7f511971c75451deec/jsonschema-4.23.0.tar.gz", hash = "sha256:d71497fef26351a33265337fa77ffeb82423f3ea21283cd9467bb03999266bc4", size = 325778, upload-time = "2024-07-08T18:40:05.546Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/69/90/f63fb5873511e014207a475e2bb4e8b2e570d655b00ac19a9a0ca0a385ee/jsonschema-4.26.0-py3-none-any.whl", hash = "sha256:d489f15263b8d200f8387e64b4c3a75f06629559fb73deb8fdfb525f2dab50ce", size = 90630, upload-time = "2026-01-07T13:41:05.306Z" }, + { url = 
"https://files.pythonhosted.org/packages/69/4a/4f9dbeb84e8850557c02365a0eee0649abe5eb1d84af92a25731c6c0f922/jsonschema-4.23.0-py3-none-any.whl", hash = "sha256:fbadb6f8b144a8f8cf9f0b89ba94501d143e50411a1278633f56a7acf7fd5566", size = 88462, upload-time = "2024-07-08T18:40:00.165Z" }, ] [package.optional-dependencies] @@ -1817,7 +1817,6 @@ format-nongpl = [ { name = "jsonpointer" }, { name = "rfc3339-validator" }, { name = "rfc3986-validator" }, - { name = "rfc3987-syntax" }, { name = "uri-template" }, { name = "webcolors" }, ] @@ -2033,15 +2032,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ab/b5/36c712098e6191d1b4e349304ef73a8d06aed77e56ceaac8c0a306c7bda1/jupyterlab_widgets-3.0.16-py3-none-any.whl", hash = "sha256:45fa36d9c6422cf2559198e4db481aa243c7a32d9926b500781c830c80f7ecf8", size = 914926, upload-time = "2025-11-01T21:11:28.008Z" }, ] -[[package]] -name = "lark" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/da/34/28fff3ab31ccff1fd4f6c7c7b0ceb2b6968d8ea4950663eadcb5720591a0/lark-1.3.1.tar.gz", hash = "sha256:b426a7a6d6d53189d318f2b6236ab5d6429eaf09259f1ca33eb716eed10d2905", size = 382732, upload-time = "2025-10-27T18:25:56.653Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/82/3d/14ce75ef66813643812f3093ab17e46d3a206942ce7376d31ec2d36229e7/lark-1.3.1-py3-none-any.whl", hash = "sha256:c629b661023a014c37da873b4ff58a817398d12635d3bbb2c5a03be7fe5d1e12", size = 113151, upload-time = "2025-10-27T18:25:54.882Z" }, -] - [[package]] name = "limits" version = "5.6.0" @@ -2058,7 +2048,7 @@ wheels = [ [[package]] name = "litellm" -version = "1.83.0" +version = "1.83.8" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiohttp" }, @@ -2074,9 +2064,9 @@ dependencies = [ { name = "tiktoken" }, { name = "tokenizers" }, ] -sdist = { url = 
"https://files.pythonhosted.org/packages/22/92/6ce9737554994ca8e536e5f4f6a87cc7c4774b656c9eb9add071caf7d54b/litellm-1.83.0.tar.gz", hash = "sha256:860bebc76c4bb27b4cf90b4a77acd66dba25aced37e3db98750de8a1766bfb7a", size = 17333062, upload-time = "2026-03-31T05:08:25.331Z" } +sdist = { url = "https://files.pythonhosted.org/packages/d2/5a/a7b4b4bf9443b1f1d8fb1e1ed7d1936eca93851ff3e43113c3dad17c6556/litellm-1.83.8.tar.gz", hash = "sha256:38db022b4bf5a51cbe597a8308e6e51eb71254ae684d41aa210b76df0c827063", size = 14751978, upload-time = "2026-04-15T03:37:51.462Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/19/2c/a670cc050fcd6f45c6199eb99e259c73aea92edba8d5c2fc1b3686d36217/litellm-1.83.0-py3-none-any.whl", hash = "sha256:88c536d339248f3987571493015784671ba3f193a328e1ea6780dbebaa2094a8", size = 15610306, upload-time = "2026-03-31T05:08:21.987Z" }, + { url = "https://files.pythonhosted.org/packages/f0/02/ee86522b2cb359079596d224db9b23dc12c02d7eeaf3d458abd7a0c54444/litellm-1.83.8-py3-none-any.whl", hash = "sha256:3bc8cfeff9d73a6a11409006c0d66bafeed9a23db65f642000f72f1cdb2e9ce8", size = 16333221, upload-time = "2026-04-15T03:37:47.934Z" }, ] [[package]] @@ -2496,7 +2486,7 @@ wheels = [ [[package]] name = "openai" -version = "2.16.0" +version = "2.24.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, @@ -2508,9 +2498,9 @@ dependencies = [ { name = "tqdm" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/b1/6c/e4c964fcf1d527fdf4739e7cc940c60075a4114d50d03871d5d5b1e13a88/openai-2.16.0.tar.gz", hash = "sha256:42eaa22ca0d8ded4367a77374104d7a2feafee5bd60a107c3c11b5243a11cd12", size = 629649, upload-time = "2026-01-27T23:28:02.579Z" } +sdist = { url = "https://files.pythonhosted.org/packages/55/13/17e87641b89b74552ed408a92b231283786523edddc95f3545809fab673c/openai-2.24.0.tar.gz", hash = "sha256:1e5769f540dbd01cb33bc4716a23e67b9d695161a734aff9c5f925e2bf99a673", size = 658717, 
upload-time = "2026-02-24T20:02:07.958Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/16/83/0315bf2cfd75a2ce8a7e54188e9456c60cec6c0cf66728ed07bd9859ff26/openai-2.16.0-py3-none-any.whl", hash = "sha256:5f46643a8f42899a84e80c38838135d7038e7718333ce61396994f887b09a59b", size = 1068612, upload-time = "2026-01-27T23:28:00.356Z" }, + { url = "https://files.pythonhosted.org/packages/c9/30/844dc675ee6902579b8eef01ed23917cc9319a1c9c0c14ec6e39340c96d0/openai-2.24.0-py3-none-any.whl", hash = "sha256:fed30480d7d6c884303287bde864980a4b137b60553ffbcf9ab4a233b7a73d94", size = 1120122, upload-time = "2026-02-24T20:02:05.669Z" }, ] [[package]] @@ -3285,11 +3275,11 @@ wheels = [ [[package]] name = "python-dotenv" -version = "1.2.1" +version = "1.0.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f0/26/19cadc79a718c5edbec86fd4919a6b6d3f681039a2f6d66d14be94e75fb9/python_dotenv-1.2.1.tar.gz", hash = "sha256:42667e897e16ab0d66954af0e60a9caa94f0fd4ecf3aaf6d2d260eec1aa36ad6", size = 44221, upload-time = "2025-10-26T15:12:10.434Z" } +sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115, upload-time = "2024-01-23T06:33:00.505Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/14/1b/a298b06749107c305e1fe0f814c6c74aea7b2f1e10989cb30f544a1b3253/python_dotenv-1.2.1-py3-none-any.whl", hash = "sha256:b81ee9561e9ca4004139c6cbba3a238c32b03e4894671e181b671e8cb8425d61", size = 21230, upload-time = "2025-10-26T15:12:09.109Z" }, + { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863, upload-time = "2024-01-23T06:32:58.246Z" 
}, ] [[package]] @@ -3591,18 +3581,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/9e/51/17023c0f8f1869d8806b979a2bffa3f861f26a3f1a66b094288323fba52f/rfc3986_validator-0.1.1-py2.py3-none-any.whl", hash = "sha256:2f235c432ef459970b4306369336b9d5dbdda31b510ca1e327636e01f528bfa9", size = 4242, upload-time = "2019-10-28T16:00:13.976Z" }, ] -[[package]] -name = "rfc3987-syntax" -version = "1.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "lark" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/2c/06/37c1a5557acf449e8e406a830a05bf885ac47d33270aec454ef78675008d/rfc3987_syntax-1.1.0.tar.gz", hash = "sha256:717a62cbf33cffdd16dfa3a497d81ce48a660ea691b1ddd7be710c22f00b4a0d", size = 14239, upload-time = "2025-07-18T01:05:05.015Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/71/44ce230e1b7fadd372515a97e32a83011f906ddded8d03e3c6aafbdedbb7/rfc3987_syntax-1.1.0-py3-none-any.whl", hash = "sha256:6c3d97604e4c5ce9f714898e05401a0445a641cfa276432b0a648c80856f6a3f", size = 8046, upload-time = "2025-07-18T01:05:03.843Z" }, -] - [[package]] name = "rpds-py" version = "0.30.0" diff --git a/yarn.lock b/yarn.lock index 746457fc..59dea60a 100644 --- a/yarn.lock +++ b/yarn.lock @@ -131,6 +131,16 @@ "@babel/helper-string-parser" "^7.27.1" "@babel/helper-validator-identifier" "^7.28.5" +"@base-ui/utils@^0.2.6": + version "0.2.7" + resolved "https://registry.npmjs.org/@base-ui/utils/-/utils-0.2.7.tgz#a7a57d08af6d02a905c0ca5e87b6dde85597046d" + integrity sha512-nXYKhiL/0JafyJE8PfcflipGftOftlIwKd72rU15iZ1M5yqgg5J9P8NHU71GReDuXco5MJA/eVQqUT5WRqX9sA== + dependencies: + "@babel/runtime" "^7.29.2" + "@floating-ui/utils" "^0.2.11" + reselect "^5.1.1" + use-sync-external-store "^1.6.0" + "@bramus/specificity@^2.4.2": version "2.4.2" resolved "https://registry.npmjs.org/@bramus/specificity/-/specificity-2.4.2.tgz" @@ -278,11 +288,121 @@ resolved 
"https://registry.npmjs.org/@emotion/weak-memoize/-/weak-memoize-0.4.0.tgz" integrity sha512-snKqtPW01tN0ui7yu9rGv69aJXr/a/Ywvl11sUjNtEcRc+ng/mQriFL0wLXMef74iHa/EkftbDzU9F8iFbH+zg== +"@esbuild/aix-ppc64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.21.5.tgz#c7184a326533fcdf1b8ee0733e21c713b975575f" + integrity sha512-1SDgH6ZSPTlggy1yI6+Dbkiz8xzpHJEVAlF/AM1tHPLsf5STom9rwtjE4hKAF20FfXXNTFqEYXyJNWh1GiZedQ== + +"@esbuild/android-arm64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.21.5.tgz#09d9b4357780da9ea3a7dfb833a1f1ff439b4052" + integrity sha512-c0uX9VAUBQ7dTDCjq+wdyGLowMdtR/GoC2U5IYk/7D1H1JYC0qseD7+11iMP2mRLN9RcCMRcjC4YMclCzGwS/A== + +"@esbuild/android-arm@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.21.5.tgz#9b04384fb771926dfa6d7ad04324ecb2ab9b2e28" + integrity sha512-vCPvzSjpPHEi1siZdlvAlsPxXl7WbOVUBBAowWug4rJHb68Ox8KualB+1ocNvT5fjv6wpkX6o/iEpbDrf68zcg== + +"@esbuild/android-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.21.5.tgz#29918ec2db754cedcb6c1b04de8cd6547af6461e" + integrity sha512-D7aPRUUNHRBwHxzxRvp856rjUHRFW1SdQATKXH2hqA0kAZb1hKmi02OpYRacl0TxIGz/ZmXWlbZgjwWYaCakTA== + "@esbuild/darwin-arm64@0.21.5": version "0.21.5" resolved "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.21.5.tgz" integrity sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ== +"@esbuild/darwin-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.21.5.tgz#c13838fa57372839abdddc91d71542ceea2e1e22" + integrity sha512-se/JjF8NlmKVG4kNIuyWMV/22ZaerB+qaSi5MdrXtd6R08kvs2qCN4C09miupktDitvh8jRFflwGFBQcxZRjbw== + +"@esbuild/freebsd-arm64@0.21.5": + version "0.21.5" + resolved 
"https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.21.5.tgz#646b989aa20bf89fd071dd5dbfad69a3542e550e" + integrity sha512-5JcRxxRDUJLX8JXp/wcBCy3pENnCgBR9bN6JsY4OmhfUtIHe3ZW0mawA7+RDAcMLrMIZaf03NlQiX9DGyB8h4g== + +"@esbuild/freebsd-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.21.5.tgz#aa615cfc80af954d3458906e38ca22c18cf5c261" + integrity sha512-J95kNBj1zkbMXtHVH29bBriQygMXqoVQOQYA+ISs0/2l3T9/kj42ow2mpqerRBxDJnmkUDCaQT/dfNXWX/ZZCQ== + +"@esbuild/linux-arm64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.21.5.tgz#70ac6fa14f5cb7e1f7f887bcffb680ad09922b5b" + integrity sha512-ibKvmyYzKsBeX8d8I7MH/TMfWDXBF3db4qM6sy+7re0YXya+K1cem3on9XgdT2EQGMu4hQyZhan7TeQ8XkGp4Q== + +"@esbuild/linux-arm@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.21.5.tgz#fc6fd11a8aca56c1f6f3894f2bea0479f8f626b9" + integrity sha512-bPb5AHZtbeNGjCKVZ9UGqGwo8EUu4cLq68E95A53KlxAPRmUyYv2D6F0uUI65XisGOL1hBP5mTronbgo+0bFcA== + +"@esbuild/linux-ia32@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.21.5.tgz#3271f53b3f93e3d093d518d1649d6d68d346ede2" + integrity sha512-YvjXDqLRqPDl2dvRODYmmhz4rPeVKYvppfGYKSNGdyZkA01046pLWyRKKI3ax8fbJoK5QbxblURkwK/MWY18Tg== + +"@esbuild/linux-loong64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.21.5.tgz#ed62e04238c57026aea831c5a130b73c0f9f26df" + integrity sha512-uHf1BmMG8qEvzdrzAqg2SIG/02+4/DHB6a9Kbya0XDvwDEKCoC8ZRWI5JJvNdUjtciBGFQ5PuBlpEOXQj+JQSg== + +"@esbuild/linux-mips64el@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.21.5.tgz#e79b8eb48bf3b106fadec1ac8240fb97b4e64cbe" + integrity sha512-IajOmO+KJK23bj52dFSNCMsz1QP1DqM6cwLUv3W1QwyxkyIWecfafnI555fvSGqEKwjMXVLokcV5ygHW5b3Jbg== + +"@esbuild/linux-ppc64@0.21.5": + 
version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.21.5.tgz#5f2203860a143b9919d383ef7573521fb154c3e4" + integrity sha512-1hHV/Z4OEfMwpLO8rp7CvlhBDnjsC3CttJXIhBi+5Aj5r+MBvy4egg7wCbe//hSsT+RvDAG7s81tAvpL2XAE4w== + +"@esbuild/linux-riscv64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.21.5.tgz#07bcafd99322d5af62f618cb9e6a9b7f4bb825dc" + integrity sha512-2HdXDMd9GMgTGrPWnJzP2ALSokE/0O5HhTUvWIbD3YdjME8JwvSCnNGBnTThKGEB91OZhzrJ4qIIxk/SBmyDDA== + +"@esbuild/linux-s390x@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.21.5.tgz#b7ccf686751d6a3e44b8627ababc8be3ef62d8de" + integrity sha512-zus5sxzqBJD3eXxwvjN1yQkRepANgxE9lgOW2qLnmr8ikMTphkjgXu1HR01K4FJg8h1kEEDAqDcZQtbrRnB41A== + +"@esbuild/linux-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.21.5.tgz#6d8f0c768e070e64309af8004bb94e68ab2bb3b0" + integrity sha512-1rYdTpyv03iycF1+BhzrzQJCdOuAOtaqHTWJZCWvijKD2N5Xu0TtVC8/+1faWqcP9iBCWOmjmhoH94dH82BxPQ== + +"@esbuild/netbsd-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.21.5.tgz#bbe430f60d378ecb88decb219c602667387a6047" + integrity sha512-Woi2MXzXjMULccIwMnLciyZH4nCIMpWQAs049KEeMvOcNADVxo0UBIQPfSmxB3CWKedngg7sWZdLvLczpe0tLg== + +"@esbuild/openbsd-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.21.5.tgz#99d1cf2937279560d2104821f5ccce220cb2af70" + integrity sha512-HLNNw99xsvx12lFBUwoT8EVCsSvRNDVxNpjZ7bPn947b8gJPzeHWyNVhFsaerc0n3TsbOINvRP2byTZ5LKezow== + +"@esbuild/sunos-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.21.5.tgz#08741512c10d529566baba837b4fe052c8f3487b" + integrity sha512-6+gjmFpfy0BHU5Tpptkuh8+uw3mnrvgs+dSPQXQOv3ekbordwnzTVEb4qnIvQcYXq6gzkyTnoZ9dZG+D4garKg== + 
+"@esbuild/win32-arm64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.21.5.tgz#675b7385398411240735016144ab2e99a60fc75d" + integrity sha512-Z0gOTd75VvXqyq7nsl93zwahcTROgqvuAcYDUr+vOv8uHhNSKROyU961kgtCD1e95IqPKSQKH7tBTslnS3tA8A== + +"@esbuild/win32-ia32@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.21.5.tgz#1bfc3ce98aa6ca9a0969e4d2af72144c59c1193b" + integrity sha512-SWXFF1CL2RVNMaVs+BBClwtfZSvDgtL//G/smwAc5oVK/UPu2Gu9tIaRgFmYFFKrmg3SyAjSrElf0TiJ1v8fYA== + +"@esbuild/win32-x64@0.21.5": + version "0.21.5" + resolved "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.21.5.tgz#acad351d582d157bb145535db2a6ff53dd514b5c" + integrity sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw== + "@eslint-community/eslint-utils@^4.8.0", "@eslint-community/eslint-utils@^4.9.1": version "4.9.1" resolved "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz" @@ -333,7 +453,7 @@ minimatch "^3.1.5" strip-json-comments "^3.1.1" -"@eslint/js@^9.15.0", "@eslint/js@9.39.4": +"@eslint/js@9.39.4", "@eslint/js@^9.15.0": version "9.39.4" resolved "https://registry.npmjs.org/@eslint/js/-/js-9.39.4.tgz" integrity sha512-nE7DEIchvtiFTwBw4Lfbu59PG+kCofhjsKaCWzxTpt4lfRjRMqG6uMBzKXuEcyXhOHoUp9riAm7/aWYGhXZ9cw== @@ -544,6 +664,25 @@ dependencies: "@babel/runtime" "^7.28.6" +"@mui/types@^9.0.0": + version "9.0.0" + resolved "https://registry.npmjs.org/@mui/types/-/types-9.0.0.tgz#92d8c64e72cb863ee59108cb20cc476d648a3ab9" + integrity sha512-i1cuFCAWN44b3AJWO7mh7tuh1sqbQSeVr/94oG0TX5uXivac8XalgE4/6fQZcmGZigzbQ35IXxj/4jLpRIBYZg== + dependencies: + "@babel/runtime" "^7.29.2" + +"@mui/utils@9.0.0": + version "9.0.0" + resolved "https://registry.npmjs.org/@mui/utils/-/utils-9.0.0.tgz#25b563ccbf537feba5f89c37a00cb8e6eea45ad0" + integrity 
sha512-bQcqyg/gjULUqTuyUjSAFr6LQGLvtkNtDbJerAtoUn9kGZ0hg5QJiN1PLHMLbeFpe3te1831uq7GFl2ITokGdg== + dependencies: + "@babel/runtime" "^7.29.2" + "@mui/types" "^9.0.0" + "@types/prop-types" "^15.7.15" + clsx "^2.1.1" + prop-types "^15.8.1" + react-is "^19.2.4" + "@mui/utils@^7.3.9": version "7.3.9" resolved "https://registry.npmjs.org/@mui/utils/-/utils-7.3.9.tgz" @@ -556,16 +695,107 @@ prop-types "^15.8.1" react-is "^19.2.3" +"@mui/x-internals@^9.0.0": + version "9.0.0" + resolved "https://registry.npmjs.org/@mui/x-internals/-/x-internals-9.0.0.tgz#8851a058e09b719690b4f398319805239e923855" + integrity sha512-E/4rdg69JjhyybpPGypCjAKSKLLnSdCFM+O6P/nkUg47+qt3uftxQEhjQO53rcn6ahHl6du/uNZ9BLgeY6kYxQ== + dependencies: + "@babel/runtime" "^7.28.6" + "@mui/utils" "9.0.0" + reselect "^5.1.1" + use-sync-external-store "^1.6.0" + +"@mui/x-tree-view@^9.0.1": + version "9.0.1" + resolved "https://registry.npmjs.org/@mui/x-tree-view/-/x-tree-view-9.0.1.tgz#0bc1feb262ef5648ec44f00b6e1125ddb34bcbb8" + integrity sha512-vUSxqg5dAbaHJsifc/PjFhf57vQWjMSK9cJKPy1SXcAFHWhXSJ1oCHpUwfQ7hjKvkeqf8emKxtCVMXNVRRhIWw== + dependencies: + "@babel/runtime" "^7.28.6" + "@base-ui/utils" "^0.2.6" + "@mui/utils" "9.0.0" + "@mui/x-internals" "^9.0.0" + "@types/react-transition-group" "^4.4.12" + clsx "^2.1.1" + prop-types "^15.8.1" + react-transition-group "^4.4.5" + +"@napi-rs/wasm-runtime@^1.1.1": + version "1.1.4" + resolved "https://registry.npmjs.org/@napi-rs/wasm-runtime/-/wasm-runtime-1.1.4.tgz#a46bbfedc29751b7170c5d23bc1d8ee8c7e3c1e1" + integrity sha512-3NQNNgA1YSlJb/kMH1ildASP9HW7/7kYnRI2szWJaofaS1hWmbGI4H+d3+22aGzXXN9IJ+n+GiFVcGipJP18ow== + dependencies: + "@tybys/wasm-util" "^0.10.1" + "@oxc-project/types@=0.122.0": version "0.122.0" resolved "https://registry.npmjs.org/@oxc-project/types/-/types-0.122.0.tgz" integrity sha512-oLAl5kBpV4w69UtFZ9xqcmTi+GENWOcPF7FCrczTiBbmC0ibXxCwyvZGbO39rCVEuLGAZM84DH0pUIyyv/YJzA== +"@parcel/watcher-android-arm64@2.5.6": + version "2.5.6" + resolved 
"https://registry.npmjs.org/@parcel/watcher-android-arm64/-/watcher-android-arm64-2.5.6.tgz#5f32e0dba356f4ac9a11068d2a5c134ca3ba6564" + integrity sha512-YQxSS34tPF/6ZG7r/Ih9xy+kP/WwediEUsqmtf0cuCV5TPPKw/PQHRhueUo6JdeFJaqV3pyjm0GdYjZotbRt/A== + "@parcel/watcher-darwin-arm64@2.5.6": version "2.5.6" resolved "https://registry.npmjs.org/@parcel/watcher-darwin-arm64/-/watcher-darwin-arm64-2.5.6.tgz" integrity sha512-Z2ZdrnwyXvvvdtRHLmM4knydIdU9adO3D4n/0cVipF3rRiwP+3/sfzpAwA/qKFL6i1ModaabkU7IbpeMBgiVEA== +"@parcel/watcher-darwin-x64@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-darwin-x64/-/watcher-darwin-x64-2.5.6.tgz#bf05d76a78bc15974f15ec3671848698b0838063" + integrity sha512-HgvOf3W9dhithcwOWX9uDZyn1lW9R+7tPZ4sug+NGrGIo4Rk1hAXLEbcH1TQSqxts0NYXXlOWqVpvS1SFS4fRg== + +"@parcel/watcher-freebsd-x64@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-freebsd-x64/-/watcher-freebsd-x64-2.5.6.tgz#8bc26e9848e7303ac82922a5ae1b1ef1bdb48a53" + integrity sha512-vJVi8yd/qzJxEKHkeemh7w3YAn6RJCtYlE4HPMoVnCpIXEzSrxErBW5SJBgKLbXU3WdIpkjBTeUNtyBVn8TRng== + +"@parcel/watcher-linux-arm-glibc@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-arm-glibc/-/watcher-linux-arm-glibc-2.5.6.tgz#1328fee1deb0c2d7865079ef53a2ba4cc2f8b40a" + integrity sha512-9JiYfB6h6BgV50CCfasfLf/uvOcJskMSwcdH1PHH9rvS1IrNy8zad6IUVPVUfmXr+u+Km9IxcfMLzgdOudz9EQ== + +"@parcel/watcher-linux-arm-musl@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-arm-musl/-/watcher-linux-arm-musl-2.5.6.tgz#bad0f45cb3e2157746db8b9d22db6a125711f152" + integrity sha512-Ve3gUCG57nuUUSyjBq/MAM0CzArtuIOxsBdQ+ftz6ho8n7s1i9E1Nmk/xmP323r2YL0SONs1EuwqBp2u1k5fxg== + +"@parcel/watcher-linux-arm64-glibc@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-arm64-glibc/-/watcher-linux-arm64-glibc-2.5.6.tgz#b75913fbd501d9523c5f35d420957bf7d0204809" + integrity 
sha512-f2g/DT3NhGPdBmMWYoxixqYr3v/UXcmLOYy16Bx0TM20Tchduwr4EaCbmxh1321TABqPGDpS8D/ggOTaljijOA== + +"@parcel/watcher-linux-arm64-musl@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-arm64-musl/-/watcher-linux-arm64-musl-2.5.6.tgz#da5621a6a576070c8c0de60dea8b46dc9c3827d4" + integrity sha512-qb6naMDGlbCwdhLj6hgoVKJl2odL34z2sqkC7Z6kzir8b5W65WYDpLB6R06KabvZdgoHI/zxke4b3zR0wAbDTA== + +"@parcel/watcher-linux-x64-glibc@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-x64-glibc/-/watcher-linux-x64-glibc-2.5.6.tgz#ce437accdc4b30f93a090b4a221fd95cd9b89639" + integrity sha512-kbT5wvNQlx7NaGjzPFu8nVIW1rWqV780O7ZtkjuWaPUgpv2NMFpjYERVi0UYj1msZNyCzGlaCWEtzc+exjMGbQ== + +"@parcel/watcher-linux-x64-musl@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-linux-x64-musl/-/watcher-linux-x64-musl-2.5.6.tgz#02400c54b4a67efcc7e2327b249711920ac969e2" + integrity sha512-1JRFeC+h7RdXwldHzTsmdtYR/Ku8SylLgTU/reMuqdVD7CtLwf0VR1FqeprZ0eHQkO0vqsbvFLXUmYm/uNKJBg== + +"@parcel/watcher-win32-arm64@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-win32-arm64/-/watcher-win32-arm64-2.5.6.tgz#caae3d3c7583ca0a7171e6bd142c34d20ea1691e" + integrity sha512-3ukyebjc6eGlw9yRt678DxVF7rjXatWiHvTXqphZLvo7aC5NdEgFufVwjFfY51ijYEWpXbqF5jtrK275z52D4Q== + +"@parcel/watcher-win32-ia32@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-win32-ia32/-/watcher-win32-ia32-2.5.6.tgz#9ac922550896dfe47bfc5ae3be4f1bcaf8155d6d" + integrity sha512-k35yLp1ZMwwee3Ez/pxBi5cf4AoBKYXj00CZ80jUz5h8prpiaQsiRPKQMxoLstNuqe2vR4RNPEAEcjEFzhEz/g== + +"@parcel/watcher-win32-x64@2.5.6": + version "2.5.6" + resolved "https://registry.npmjs.org/@parcel/watcher-win32-x64/-/watcher-win32-x64-2.5.6.tgz#73fdafba2e21c448f0e456bbe13178d8fe11739d" + integrity sha512-hbQlYcCq5dlAX9Qx+kFb0FHue6vbjlf0FrNzSKdYK2APUf7tGfGxQCk2ihEREmbR6ZMc0MVAD5RIX/41gpUzTw== + "@parcel/watcher@^2.4.1": 
version "2.5.6" resolved "https://registry.npmjs.org/@parcel/watcher/-/watcher-2.5.6.tgz" @@ -630,11 +860,83 @@ resolved "https://registry.npmjs.org/@remix-run/router/-/router-1.23.2.tgz" integrity sha512-Ic6m2U/rMjTkhERIa/0ZtXJP17QUi2CbWE7cqx4J58M8aA3QTfW+2UlQ4psvTX9IO1RfNVhK3pcpdjej7L+t2w== +"@rolldown/binding-android-arm64@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-android-arm64/-/binding-android-arm64-1.0.0-rc.11.tgz#25a584227ed97239fd564451c0db2c359751b42a" + integrity sha512-SJ+/g+xNnOh6NqYxD0V3uVN4W3VfnrGsC9/hoglicgTNfABFG9JjISvkkU0dNY84MNHLWyOgxP9v9Y9pX4S7+A== + "@rolldown/binding-darwin-arm64@1.0.0-rc.11": version "1.0.0-rc.11" resolved "https://registry.npmjs.org/@rolldown/binding-darwin-arm64/-/binding-darwin-arm64-1.0.0-rc.11.tgz" integrity sha512-7WQgR8SfOPwmDZGFkThUvsmd/nwAWv91oCO4I5LS7RKrssPZmOt7jONN0cW17ydGC1n/+puol1IpoieKqQidmg== +"@rolldown/binding-darwin-x64@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-darwin-x64/-/binding-darwin-x64-1.0.0-rc.11.tgz#6e751ea2067cacee0c94f0e8b087761dde62f9ea" + integrity sha512-39Ks6UvIHq4rEogIfQBoBRusj0Q0nPVWIvqmwBLaT6aqQGIakHdESBVOPRRLacy4WwUPIx4ZKzfZ9PMW+IeyUQ== + +"@rolldown/binding-freebsd-x64@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-freebsd-x64/-/binding-freebsd-x64-1.0.0-rc.11.tgz#b7582b959398c5871034b94ba0a8ecde0425a8e7" + integrity sha512-jfsm0ZHfhiqrvWjJAmzsqiIFPz5e7mAoCOPBNTcNgkiid/LaFKiq92+0ojH+nmJmKYkre4t71BWXUZDNp7vsag== + +"@rolldown/binding-linux-arm-gnueabihf@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-arm-gnueabihf/-/binding-linux-arm-gnueabihf-1.0.0-rc.11.tgz#3b8c5e071d6a0ed1cb1880c1948c6fece553502a" + integrity sha512-zjQaUtSyq1nVe3nxmlSCuR96T1LPlpvmJ0SZy0WJFEsV4kFbXcq2u68L4E6O0XeFj4aex9bEauqjW8UQBeAvfQ== + +"@rolldown/binding-linux-arm64-gnu@1.0.0-rc.11": + version "1.0.0-rc.11" + 
resolved "https://registry.npmjs.org/@rolldown/binding-linux-arm64-gnu/-/binding-linux-arm64-gnu-1.0.0-rc.11.tgz#2533165620137b077ae4ede92b752a63cd85cfcb" + integrity sha512-WMW1yE6IOnehTcFE9eipFkm3XN63zypWlrJQ2iF7NrQ9b2LDRjumFoOGJE8RJJTJCTBAdmLMnJ8uVitACUUo1Q== + +"@rolldown/binding-linux-arm64-musl@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-arm64-musl/-/binding-linux-arm64-musl-1.0.0-rc.11.tgz#b04cf5b806a012027a4e8b139e0f86b2ff7621c0" + integrity sha512-jfndI9tsfm4APzjNt6QdBkYwre5lRPUgHeDHoI7ydKUuJvz3lZeCfMsI56BZj+7BYqiKsJm7cfd/6KYV7ubrBg== + +"@rolldown/binding-linux-ppc64-gnu@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-ppc64-gnu/-/binding-linux-ppc64-gnu-1.0.0-rc.11.tgz#bda9c11fe03482033d5dac6a943802b3e7579550" + integrity sha512-ZlFgw46NOAGMgcdvdYwAGu2Q+SLFA9LzbJLW+iyMOJyhj5wk6P3KEE9Gct4xWwSzFoPI7JCdYmYMzVtlgQ+zfw== + +"@rolldown/binding-linux-s390x-gnu@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-s390x-gnu/-/binding-linux-s390x-gnu-1.0.0-rc.11.tgz#55daa2d35f92f62e958fc44e12db1c16e1f271c5" + integrity sha512-hIOYmuT6ofM4K04XAZd3OzMySEO4K0/nc9+jmNcxNAxRi6c5UWpqfw3KMFV4MVFWL+jQsSh+bGw2VqmaPMTLyw== + +"@rolldown/binding-linux-x64-gnu@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-x64-gnu/-/binding-linux-x64-gnu-1.0.0-rc.11.tgz#8ca1abf607bbe2f7fdd6f6416192937dc9ea1e54" + integrity sha512-qXBQQO9OvkjjQPLdUVr7Nr2t3QTZI7s4KZtfw7HzBgjbmAPSFwSv4rmET9lLSgq3rH/ndA3ngv3Qb8l2njoPNA== + +"@rolldown/binding-linux-x64-musl@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-linux-x64-musl/-/binding-linux-x64-musl-1.0.0-rc.11.tgz#36a52beee8ac97a79d1ed8f1b94fab677e3e4d11" + integrity sha512-/tpFfoSTzUkH9LPY+cYbqZBDyyX62w5fICq9qzsHLL8uTI6BHip3Q9Uzft0wylk/i8OOwKik8OxW+QAhDmzwmg== + 
+"@rolldown/binding-openharmony-arm64@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-openharmony-arm64/-/binding-openharmony-arm64-1.0.0-rc.11.tgz#91c74fd23b3f3f3942fe4b3aefc9428ecbaa55fd" + integrity sha512-mcp3Rio2w72IvdZG0oQ4bM2c2oumtwHfUfKncUM6zGgz0KgPz4YmDPQfnXEiY5t3+KD/i8HG2rOB/LxdmieK2g== + +"@rolldown/binding-wasm32-wasi@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-wasm32-wasi/-/binding-wasm32-wasi-1.0.0-rc.11.tgz#6520bafe57ff1cd2fb45f8f22b1cb6d57be44e79" + integrity sha512-LXk5Hii1Ph9asuGRjBuz8TUxdc1lWzB7nyfdoRgI0WGPZKmCxvlKk8KfYysqtr4MfGElu/f/pEQRh8fcEgkrWw== + dependencies: + "@napi-rs/wasm-runtime" "^1.1.1" + +"@rolldown/binding-win32-arm64-msvc@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-win32-arm64-msvc/-/binding-win32-arm64-msvc-1.0.0-rc.11.tgz#73dd1c4737473c8270b61cd2e42b05a34453ffc0" + integrity sha512-dDwf5otnx0XgRY1yqxOC4ITizcdzS/8cQ3goOWv3jFAo4F+xQYni+hnMuO6+LssHHdJW7+OCVL3CoU4ycnh35Q== + +"@rolldown/binding-win32-x64-msvc@1.0.0-rc.11": + version "1.0.0-rc.11" + resolved "https://registry.npmjs.org/@rolldown/binding-win32-x64-msvc/-/binding-win32-x64-msvc-1.0.0-rc.11.tgz#4d922aa6dd6bf27c73eba93fec9a0aed62549095" + integrity sha512-LN4/skhSggybX71ews7dAj6r2geaMJfm3kMbK2KhFMg9B10AZXnKoLCVVgzhMHL0S+aKtr4p8QbAW8k+w95bAA== + "@rolldown/pluginutils@1.0.0-beta.27": version "1.0.0-beta.27" resolved "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz" @@ -645,11 +947,136 @@ resolved "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-rc.11.tgz" integrity sha512-xQO9vbwBecJRv9EUcQ/y0dzSTJgA7Q6UVN7xp6B81+tBGSLVAK03yJ9NkJaUA7JFD91kbjxRSC/mDnmvXzbHoQ== +"@rollup/rollup-android-arm-eabi@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.60.0.tgz#7e158ddfc16f78da99c0d5ccbae6cae403ef3284" + 
integrity sha512-WOhNW9K8bR3kf4zLxbfg6Pxu2ybOUbB2AjMDHSQx86LIF4rH4Ft7vmMwNt0loO0eonglSNy4cpD3MKXXKQu0/A== + +"@rollup/rollup-android-arm64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.60.0.tgz#49f4ae0e22b6f9ffbcd3818b9a0758fa2d10b1cd" + integrity sha512-u6JHLll5QKRvjciE78bQXDmqRqNs5M/3GVqZeMwvmjaNODJih/WIrJlFVEihvV0MiYFmd+ZyPr9wxOVbPAG2Iw== + "@rollup/rollup-darwin-arm64@4.60.0": version "4.60.0" resolved "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.60.0.tgz" integrity sha512-qEF7CsKKzSRc20Ciu2Zw1wRrBz4g56F7r/vRwY430UPp/nt1x21Q/fpJ9N5l47WWvJlkNCPJz3QRVw008fi7yA== +"@rollup/rollup-darwin-x64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.60.0.tgz#1bf7a92b27ebdd5e0d1d48503c7811160773be1a" + integrity sha512-WADYozJ4QCnXCH4wPB+3FuGmDPoFseVCUrANmA5LWwGmC6FL14BWC7pcq+FstOZv3baGX65tZ378uT6WG8ynTw== + +"@rollup/rollup-freebsd-arm64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.60.0.tgz#5ccf537b99c5175008444702193ad0b1c36f7f16" + integrity sha512-6b8wGHJlDrGeSE3aH5mGNHBjA0TTkxdoNHik5EkvPHCt351XnigA4pS7Wsj/Eo9Y8RBU6f35cjN9SYmCFBtzxw== + +"@rollup/rollup-freebsd-x64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.60.0.tgz#1196ecd7bf4e128624ef83cd1f9d785114474a77" + integrity sha512-h25Ga0t4jaylMB8M/JKAyrvvfxGRjnPQIR8lnCayyzEjEOx2EJIlIiMbhpWxDRKGKF8jbNH01NnN663dH638mA== + +"@rollup/rollup-linux-arm-gnueabihf@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.60.0.tgz#cc147633a4af229fee83a737bf2334fbac3dc28e" + integrity sha512-RzeBwv0B3qtVBWtcuABtSuCzToo2IEAIQrcyB/b2zMvBWVbjo8bZDjACUpnaafaxhTw2W+imQbP2BD1usasK4g== + +"@rollup/rollup-linux-arm-musleabihf@4.60.0": + version 
"4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.60.0.tgz#3559f9f060153ea54594a42c3b87a297bedcc26e" + integrity sha512-Sf7zusNI2CIU1HLzuu9Tc5YGAHEZs5Lu7N1ssJG4Tkw6e0MEsN7NdjUDDfGNHy2IU+ENyWT+L2obgWiguWibWQ== + +"@rollup/rollup-linux-arm64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.60.0.tgz#e91f887b154123485cfc4b59befe2080fcd8f2df" + integrity sha512-DX2x7CMcrJzsE91q7/O02IJQ5/aLkVtYFryqCjduJhUfGKG6yJV8hxaw8pZa93lLEpPTP/ohdN4wFz7yp/ry9A== + +"@rollup/rollup-linux-arm64-musl@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.60.0.tgz#660752f040df9ba44a24765df698928917c0bf21" + integrity sha512-09EL+yFVbJZlhcQfShpswwRZ0Rg+z/CsSELFCnPt3iK+iqwGsI4zht3secj5vLEs957QvFFXnzAT0FFPIxSrkQ== + +"@rollup/rollup-linux-loong64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-loong64-gnu/-/rollup-linux-loong64-gnu-4.60.0.tgz#cb0e939a5fa479ccef264f3f45b31971695f869c" + integrity sha512-i9IcCMPr3EXm8EQg5jnja0Zyc1iFxJjZWlb4wr7U2Wx/GrddOuEafxRdMPRYVaXjgbhvqalp6np07hN1w9kAKw== + +"@rollup/rollup-linux-loong64-musl@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-loong64-musl/-/rollup-linux-loong64-musl-4.60.0.tgz#42f86fbc82cd1a81be2d346476dd3231cf5ee442" + integrity sha512-DGzdJK9kyJ+B78MCkWeGnpXJ91tK/iKA6HwHxF4TAlPIY7GXEvMe8hBFRgdrR9Ly4qebR/7gfUs9y2IoaVEyog== + +"@rollup/rollup-linux-ppc64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.60.0.tgz#39776a647a789dc95ea049277c5ef8f098df77f9" + integrity sha512-RwpnLsqC8qbS8z1H1AxBA1H6qknR4YpPR9w2XX0vo2Sz10miu57PkNcnHVaZkbqyw/kUWfKMI73jhmfi9BRMUQ== + +"@rollup/rollup-linux-ppc64-musl@4.60.0": + version "4.60.0" + resolved 
"https://registry.npmjs.org/@rollup/rollup-linux-ppc64-musl/-/rollup-linux-ppc64-musl-4.60.0.tgz#466f20029a8e8b3bb2954c7ddebc9586420cac2c" + integrity sha512-Z8pPf54Ly3aqtdWC3G4rFigZgNvd+qJlOE52fmko3KST9SoGfAdSRCwyoyG05q1HrrAblLbk1/PSIV+80/pxLg== + +"@rollup/rollup-linux-riscv64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.60.0.tgz#cff9877c78f12e7aa6246f6902ad913e99edb2b7" + integrity sha512-3a3qQustp3COCGvnP4SvrMHnPQ9d1vzCakQVRTliaz8cIp/wULGjiGpbcqrkv0WrHTEp8bQD/B3HBjzujVWLOA== + +"@rollup/rollup-linux-riscv64-musl@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.60.0.tgz#9a762fb99b5a82a921017f56491b7e892b9fb17d" + integrity sha512-pjZDsVH/1VsghMJ2/kAaxt6dL0psT6ZexQVrijczOf+PeP2BUqTHYejk3l6TlPRydggINOeNRhvpLa0AYpCWSQ== + +"@rollup/rollup-linux-s390x-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.60.0.tgz#9d25ad8ac7dab681935baf78ac5ea92d14629cdf" + integrity sha512-3ObQs0BhvPgiUVZrN7gqCSvmFuMWvWvsjG5ayJ3Lraqv+2KhOsp+pUbigqbeWqueGIsnn+09HBw27rJ+gYK4VQ== + +"@rollup/rollup-linux-x64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.60.0.tgz#5e5139e11819fa38a052368da79422cb4afcf466" + integrity sha512-EtylprDtQPdS5rXvAayrNDYoJhIz1/vzN2fEubo3yLE7tfAw+948dO0g4M0vkTVFhKojnF+n6C8bDNe+gDRdTg== + +"@rollup/rollup-linux-x64-gnu@^4.24.4": + version "4.60.1" + resolved "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.60.1.tgz#56a6a0d9076f2a05a976031493b24a20ddcc0e77" + integrity sha512-77PpsFQUCOiZR9+LQEFg9GClyfkNXj1MP6wRnzYs0EeWbPcHs02AXu4xuUbM1zhwn3wqaizle3AEYg5aeoohhg== + +"@rollup/rollup-linux-x64-musl@4.60.0": + version "4.60.0" + resolved 
"https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.60.0.tgz#b6211d46e11b1f945f5504cc794fce839331ed08" + integrity sha512-k09oiRCi/bHU9UVFqD17r3eJR9bn03TyKraCrlz5ULFJGdJGi7VOmm9jl44vOJvRJ6P7WuBi/s2A97LxxHGIdw== + +"@rollup/rollup-openbsd-x64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-openbsd-x64/-/rollup-openbsd-x64-4.60.0.tgz#e6e09eebaa7012bb9c7331b437a9e992bd94ca35" + integrity sha512-1o/0/pIhozoSaDJoDcec+IVLbnRtQmHwPV730+AOD29lHEEo4F5BEUB24H0OBdhbBBDwIOSuf7vgg0Ywxdfiiw== + +"@rollup/rollup-openharmony-arm64@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-openharmony-arm64/-/rollup-openharmony-arm64-4.60.0.tgz#f7d99ae857032498e57a5e7259fb7100fd24a87e" + integrity sha512-pESDkos/PDzYwtyzB5p/UoNU/8fJo68vcXM9ZW2V0kjYayj1KaaUfi1NmTUTUpMn4UhU4gTuK8gIaFO4UGuMbA== + +"@rollup/rollup-win32-arm64-msvc@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.60.0.tgz#41e392f5d9f3bf1253fdaf2f6d6f6b1bfc452856" + integrity sha512-hj1wFStD7B1YBeYmvY+lWXZ7ey73YGPcViMShYikqKT1GtstIKQAtfUI6yrzPjAy/O7pO0VLXGmUVWXQMaYgTQ== + +"@rollup/rollup-win32-ia32-msvc@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.60.0.tgz#f41b0490be0e5d3cf459b4dc076a192b532adea9" + integrity sha512-SyaIPFoxmUPlNDq5EHkTbiKzmSEmq/gOYFI/3HHJ8iS/v1mbugVa7dXUzcJGQfoytp9DJFLhHH4U3/eTy2Bq4w== + +"@rollup/rollup-win32-x64-gnu@4.60.0": + version "4.60.0" + resolved "https://registry.npmjs.org/@rollup/rollup-win32-x64-gnu/-/rollup-win32-x64-gnu-4.60.0.tgz#0fcf9f1fcb750f0317b13aac3b3231687e6397a5" + integrity sha512-RdcryEfzZr+lAr5kRm2ucN9aVlCCa2QNq4hXelZxb8GG0NJSazq44Z3PCCc8wISRuCVnGs0lQJVX5Vp6fKA+IA== + +"@rollup/rollup-win32-x64-msvc@4.60.0": + version "4.60.0" + resolved 
"https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.60.0.tgz#3afdb30405f6d4248df5e72e1ca86c5eab55fab8" + integrity sha512-PrsWNQ8BuE00O3Xsx3ALh2Df8fAj9+cvvX9AIA6o4KpATR98c9mud4XtDWVvsEuyia5U4tVSTKygawyJkjm60w== + "@standard-schema/spec@^1.1.0": version "1.1.0" resolved "https://registry.npmjs.org/@standard-schema/spec/-/spec-1.1.0.tgz" @@ -660,6 +1087,61 @@ resolved "https://registry.npmjs.org/@swc/core-darwin-arm64/-/core-darwin-arm64-1.15.21.tgz" integrity sha512-SA8SFg9dp0qKRH8goWsax6bptFE2EdmPf2YRAQW9WoHGf3XKM1bX0nd5UdwxmC5hXsBUZAYf7xSciCler6/oyA== +"@swc/core-darwin-x64@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-darwin-x64/-/core-darwin-x64-1.15.21.tgz#05ff28c00a7045d9760c847e19604fff02b6e3ea" + integrity sha512-//fOVntgowz9+V90lVsNCtyyrtbHp3jWH6Rch7MXHXbcvbLmbCTmssl5DeedUWLLGiAAW1wksBdqdGYOTjaNLw== + +"@swc/core-linux-arm-gnueabihf@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-arm-gnueabihf/-/core-linux-arm-gnueabihf-1.15.21.tgz#d52a0fac1933fe4e4180a196417053571d6c255f" + integrity sha512-meNI4Sh6h9h8DvIfEc0l5URabYMSuNvyisLmG6vnoYAS43s8ON3NJR8sDHvdP7NJTrLe0q/x2XCn6yL/BeHcZg== + +"@swc/core-linux-arm64-gnu@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-arm64-gnu/-/core-linux-arm64-gnu-1.15.21.tgz#32cd1b9d0d4be4d53ccfbc122ac61289f37735b9" + integrity sha512-QrXlNQnHeXqU2EzLlnsPoWEh8/GtNJLvfMiPsDhk+ht6Xv8+vhvZ5YZ/BokNWSIZiWPKLAqR0M7T92YF5tmD3g== + +"@swc/core-linux-arm64-musl@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-arm64-musl/-/core-linux-arm64-musl-1.15.21.tgz#0993e8b2ffac4f1141fa7b158e8dd982c2476c1a" + integrity sha512-8/yGCMO333ultDaMQivE5CjO6oXDPeeg1IV4sphojPkb0Pv0i6zvcRIkgp60xDB+UxLr6VgHgt+BBgqS959E9g== + +"@swc/core-linux-ppc64-gnu@1.15.21": + version "1.15.21" + resolved 
"https://registry.npmjs.org/@swc/core-linux-ppc64-gnu/-/core-linux-ppc64-gnu-1.15.21.tgz#5f6765d9a36235d95fd5c69f6d848973e85d8180" + integrity sha512-ucW0HzPx0s1dgRvcvuLSPSA/2Kk/VYTv9st8qe1Kc22Gu0Q0rH9+6TcBTmMuNIp0Xs4BPr1uBttmbO1wEGI49Q== + +"@swc/core-linux-s390x-gnu@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-s390x-gnu/-/core-linux-s390x-gnu-1.15.21.tgz#f96779dc2ba8d47298bca3ceaa961e0f460aa0bd" + integrity sha512-ulTnOGc5I7YRObE/9NreAhQg94QkiR5qNhhcUZ1iFAYjzg/JGAi1ch+s/Ixe61pMIr8bfVrF0NOaB0f8wjaAfA== + +"@swc/core-linux-x64-gnu@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-x64-gnu/-/core-linux-x64-gnu-1.15.21.tgz#0ffe779d5fd060bfb7992176f51d317c81c6aaaf" + integrity sha512-D0RokxtM+cPvSqJIKR6uja4hbD+scI9ezo95mBhfSyLUs9wnPPl26sLp1ZPR/EXRdYm3F3S6RUtVi+8QXhT24Q== + +"@swc/core-linux-x64-musl@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-linux-x64-musl/-/core-linux-x64-musl-1.15.21.tgz#2ea9fab26555d27c715aed6a08604a8296e4af50" + integrity sha512-nER8u7VeRfmU6fMDzl1NQAbbB/G7O2avmvCOwIul1uGkZ2/acbPH+DCL9h5+0yd/coNcxMBTL6NGepIew+7C2w== + +"@swc/core-win32-arm64-msvc@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-win32-arm64-msvc/-/core-win32-arm64-msvc-1.15.21.tgz#b401f34f38d744ca2b800bf2574ef5f7b20ca52f" + integrity sha512-+/AgNBnjYugUA8C0Do4YzymgvnGbztv7j8HKSQLvR/DQgZPoXQ2B3PqB2mTtGh/X5DhlJWiqnunN35JUgWcAeQ== + +"@swc/core-win32-ia32-msvc@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-win32-ia32-msvc/-/core-win32-ia32-msvc-1.15.21.tgz#c761e981725d137abd7abcecff88d1dc2d76baad" + integrity sha512-IkSZj8PX/N4HcaFhMQtzmkV8YSnuNoJ0E6OvMwFiOfejPhiKXvl7CdDsn1f4/emYEIDO3fpgZW9DTaCRMDxaDA== + +"@swc/core-win32-x64-msvc@1.15.21": + version "1.15.21" + resolved "https://registry.npmjs.org/@swc/core-win32-x64-msvc/-/core-win32-x64-msvc-1.15.21.tgz#4878cd851b4f98033e19fca78953201aef736edd" + integrity 
sha512-zUyWso7OOENB6e1N1hNuNn8vbvLsTdKQ5WKLgt/JcBNfJhKy/6jmBmqI3GXk/MyvQKd5SLvP7A0F36p7TeDqvw== + "@swc/core@^1.12.11": version "1.15.21" resolved "https://registry.npmjs.org/@swc/core/-/core-1.15.21.tgz" @@ -926,6 +1408,13 @@ "@tiptap/extensions" "^3.22.2" "@tiptap/pm" "^3.22.2" +"@tybys/wasm-util@^0.10.1": + version "0.10.1" + resolved "https://registry.npmjs.org/@tybys/wasm-util/-/wasm-util-0.10.1.tgz#ecddd3205cf1e2d5274649ff0eedd2991ed7f414" + integrity sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg== + dependencies: + tslib "^2.4.0" + "@types/aria-query@^5.0.1": version "5.0.4" resolved "https://registry.npmjs.org/@types/aria-query/-/aria-query-5.0.4.tgz" @@ -1161,7 +1650,7 @@ dependencies: dompurify "*" -"@types/estree@^1.0.0", "@types/estree@^1.0.6", "@types/estree@^1.0.8", "@types/estree@1.0.8": +"@types/estree@1.0.8", "@types/estree@^1.0.0", "@types/estree@^1.0.6", "@types/estree@^1.0.8": version "1.0.8" resolved "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz" integrity sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w== @@ -1303,7 +1792,7 @@ resolved "https://registry.npmjs.org/@types/validator/-/validator-13.15.10.tgz" integrity sha512-T8L6i7wCuyoK8A/ZeLYt1+q0ty3Zb9+qbSSvrIVitzT3YjZqkTZ40IbRsPanlB4h1QB3JVL1SYCdR6ngtFYcuA== -"@typescript-eslint/eslint-plugin@^8.16.0", "@typescript-eslint/eslint-plugin@8.57.2": +"@typescript-eslint/eslint-plugin@8.57.2", "@typescript-eslint/eslint-plugin@^8.16.0": version "8.57.2" resolved "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.57.2.tgz" integrity sha512-NZZgp0Fm2IkD+La5PR81sd+g+8oS6JwJje+aRWsDocxHkjyRw0J5L5ZTlN3LI1LlOcGL7ph3eaIUmTXMIjLk0w== @@ -1317,7 +1806,7 @@ natural-compare "^1.4.0" ts-api-utils "^2.4.0" -"@typescript-eslint/parser@^8.16.0", "@typescript-eslint/parser@8.57.2": +"@typescript-eslint/parser@8.57.2", "@typescript-eslint/parser@^8.16.0": version "8.57.2" resolved 
"https://registry.npmjs.org/@typescript-eslint/parser/-/parser-8.57.2.tgz" integrity sha512-30ScMRHIAD33JJQkgfGW1t8CURZtjc2JpTrq5n2HFhOefbAhb7ucc7xJwdWcrEtqUIYJ73Nybpsggii6GtAHjA== @@ -1345,7 +1834,7 @@ "@typescript-eslint/types" "8.57.2" "@typescript-eslint/visitor-keys" "8.57.2" -"@typescript-eslint/tsconfig-utils@^8.57.2", "@typescript-eslint/tsconfig-utils@8.57.2": +"@typescript-eslint/tsconfig-utils@8.57.2", "@typescript-eslint/tsconfig-utils@^8.57.2": version "8.57.2" resolved "https://registry.npmjs.org/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.57.2.tgz" integrity sha512-3Lm5DSM+DCowsUOJC+YqHHnKEfFh5CoGkj5Z31NQSNF4l5wdOwqGn99wmwN/LImhfY3KJnmordBq/4+VDe2eKw== @@ -1361,7 +1850,7 @@ debug "^4.4.3" ts-api-utils "^2.4.0" -"@typescript-eslint/types@^8.57.2", "@typescript-eslint/types@8.57.2": +"@typescript-eslint/types@8.57.2", "@typescript-eslint/types@^8.57.2": version "8.57.2" resolved "https://registry.npmjs.org/@typescript-eslint/types/-/types-8.57.2.tgz" integrity sha512-/iZM6FnM4tnx9csuTxspMW4BOSegshwX5oBDznJ7S4WggL7Vczz5d2W11ecc4vRrQMQHXRSxzrCsyG5EsPPTbA== @@ -1576,11 +2065,6 @@ argparse@^2.0.1: resolved "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz" integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q== -aria-query@^5.0.0, aria-query@^5.3.2: - version "5.3.2" - resolved "https://registry.npmjs.org/aria-query/-/aria-query-5.3.2.tgz" - integrity sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw== - aria-query@5.3.0: version "5.3.0" resolved "https://registry.npmjs.org/aria-query/-/aria-query-5.3.0.tgz" @@ -1588,6 +2072,11 @@ aria-query@5.3.0: dependencies: dequal "^2.0.3" +aria-query@^5.0.0, aria-query@^5.3.2: + version "5.3.2" + resolved "https://registry.npmjs.org/aria-query/-/aria-query-5.3.2.tgz" + integrity sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw== + array-buffer-byte-length@^1.0.1, 
array-buffer-byte-length@^1.0.2: version "1.0.2" resolved "https://registry.npmjs.org/array-buffer-byte-length/-/array-buffer-byte-length-1.0.2.tgz" @@ -1930,11 +2419,6 @@ color-name@~1.1.4: resolved "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz" integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA== -commander@^8.3.0: - version "8.3.0" - resolved "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz" - integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww== - commander@2: version "2.20.3" resolved "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz" @@ -1945,6 +2429,11 @@ commander@7: resolved "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz" integrity sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw== +commander@^8.3.0: + version "8.3.0" + resolved "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz" + integrity sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww== + compress-commons@^4.1.2: version "4.1.2" resolved "https://registry.npmjs.org/compress-commons/-/compress-commons-4.1.2.tgz" @@ -2043,7 +2532,7 @@ culori@^4.0.2: resolved "https://registry.npmjs.org/culori/-/culori-4.0.2.tgz" integrity sha512-1+BhOB8ahCn4O0cep0Sh2l9KCOfOdY+BXJnKMHFFzDEouSr/el18QwXEMRlOj9UY5nCeA8UN3a/82rUWRBeyBw== -d3-array@^3.2.0, d3-array@^3.2.4, "d3-array@1 - 3", "d3-array@2 - 3", "d3-array@2.10.0 - 3", "d3-array@2.5.0 - 3", d3-array@3, d3-array@3.2.4: +"d3-array@1 - 3", "d3-array@2 - 3", "d3-array@2.10.0 - 3", "d3-array@2.5.0 - 3", d3-array@3, d3-array@3.2.4, d3-array@^3.2.0, d3-array@^3.2.4: version "3.2.4" resolved "https://registry.npmjs.org/d3-array/-/d3-array-3.2.4.tgz" integrity sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg== @@ -2073,7 +2562,7 @@ d3-chord@3: dependencies: d3-path "1 - 3" 
-d3-color@^3.1.0, "d3-color@1 - 3", d3-color@3: +"d3-color@1 - 3", d3-color@3, d3-color@^3.1.0: version "3.1.0" resolved "https://registry.npmjs.org/d3-color/-/d3-color-3.1.0.tgz" integrity sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA== @@ -2085,7 +2574,7 @@ d3-contour@4: dependencies: d3-array "^3.2.0" -d3-delaunay@^6.0.4, d3-delaunay@6: +d3-delaunay@6, d3-delaunay@^6.0.4: version "6.0.4" resolved "https://registry.npmjs.org/d3-delaunay/-/d3-delaunay-6.0.4.tgz" integrity sha512-mdjtIZ1XLAM8bm/hx3WwjfHt6Sggek7qH043O8KEjDXN40xi3vx/6pYSVTwLjEgiXQTbvaouWKynLBiUZ6SK6A== @@ -2105,7 +2594,7 @@ d3-delaunay@^6.0.4, d3-delaunay@6: d3-dispatch "1 - 3" d3-selection "3" -d3-dsv@^3.0.1, "d3-dsv@1 - 3", d3-dsv@3: +"d3-dsv@1 - 3", d3-dsv@3, d3-dsv@^3.0.1: version "3.0.1" resolved "https://registry.npmjs.org/d3-dsv/-/d3-dsv-3.0.1.tgz" integrity sha512-UG6OvdI5afDIFP9w4G0mNq50dSOsXHJaRE8arAS5o9ApWnIElp8GZw1Dun8vP8OyHOZ/QJUKUJwxiiCCnUwm+Q== @@ -2126,7 +2615,7 @@ d3-fetch@3: dependencies: d3-dsv "1 - 3" -d3-force@^3.0.0, d3-force@3: +d3-force@3, d3-force@^3.0.0: version "3.0.0" resolved "https://registry.npmjs.org/d3-force/-/d3-force-3.0.0.tgz" integrity sha512-zxV/SsA+U4yte8051P4ECydjD/S+qeYtnaIyAs9tgHCqfguma/aAQDjo85A9Z6EKhBirHRJHXIgJUlffT4wdLg== @@ -2135,7 +2624,7 @@ d3-force@^3.0.0, d3-force@3: d3-quadtree "1 - 3" d3-timer "1 - 3" -d3-format@^3.1.0, "d3-format@1 - 3", d3-format@3: +"d3-format@1 - 3", d3-format@3, d3-format@^3.1.0: version "3.1.2" resolved "https://registry.npmjs.org/d3-format/-/d3-format-3.1.2.tgz" integrity sha512-AJDdYOdnyRDV5b6ArilzCPPwc1ejkHcoyFarqlPqT7zRYjhavcT3uSrqcMvsgh2CgoPbK3RCwyHaVyxYcP2Arg== @@ -2149,26 +2638,26 @@ d3-geo-projection@^4.0.0: d3-array "1 - 3" d3-geo "1.12.0 - 3" -d3-geo@^3.1.1, "d3-geo@1.12.0 - 3", d3-geo@3: +"d3-geo@1.12.0 - 3", d3-geo@3, d3-geo@^3.1.1: version "3.1.1" resolved "https://registry.npmjs.org/d3-geo/-/d3-geo-3.1.1.tgz" integrity 
sha512-637ln3gXKXOwhalDzinUgY83KzNWZRKbYubaG+fGVuc/dxO64RRljtCTnf5ecMyE1RIdtqpkVcq0IbtU2S8j2Q== dependencies: d3-array "2.5.0 - 3" -d3-hierarchy@^3.1.2, d3-hierarchy@3: +d3-hierarchy@3, d3-hierarchy@^3.1.2: version "3.1.2" resolved "https://registry.npmjs.org/d3-hierarchy/-/d3-hierarchy-3.1.2.tgz" integrity sha512-FX/9frcub54beBdugHjDCdikxThEqjnR93Qt7PvQTOHxyiNCAlvMrHhclk3cD5VeAaq9fxmfRp+CnWw9rEMBuA== -d3-interpolate@^3.0.1, "d3-interpolate@1 - 3", "d3-interpolate@1.2.0 - 3", d3-interpolate@3: +"d3-interpolate@1 - 3", "d3-interpolate@1.2.0 - 3", d3-interpolate@3, d3-interpolate@^3.0.1: version "3.0.1" resolved "https://registry.npmjs.org/d3-interpolate/-/d3-interpolate-3.0.1.tgz" integrity sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g== dependencies: d3-color "1 - 3" -d3-path@^3.1.0, "d3-path@1 - 3", d3-path@3: +"d3-path@1 - 3", d3-path@3, d3-path@^3.1.0: version "3.1.0" resolved "https://registry.npmjs.org/d3-path/-/d3-path-3.1.0.tgz" integrity sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ== @@ -2188,7 +2677,7 @@ d3-random@3: resolved "https://registry.npmjs.org/d3-random/-/d3-random-3.0.1.tgz" integrity sha512-FXMe9GfxTxqd5D6jFsQ+DJ8BJS4E/fT5mqqdjovykEB2oFbTMDVdg1MGFxfQW+FBOGoB++k8swBrgwSHT1cUXQ== -d3-scale-chromatic@^3.1.0, d3-scale-chromatic@3: +d3-scale-chromatic@3, d3-scale-chromatic@^3.1.0: version "3.1.0" resolved "https://registry.npmjs.org/d3-scale-chromatic/-/d3-scale-chromatic-3.1.0.tgz" integrity sha512-A3s5PWiZ9YCXFye1o246KoscMWqf8BsD9eRiJ3He7C9OBaxKhAd5TFCdEx/7VbKtxxTsu//1mMJFrEt572cEyQ== @@ -2196,7 +2685,7 @@ d3-scale-chromatic@^3.1.0, d3-scale-chromatic@3: d3-color "1 - 3" d3-interpolate "1 - 3" -d3-scale@^4.0.2, d3-scale@4: +d3-scale@4, d3-scale@^4.0.2: version "4.0.2" resolved "https://registry.npmjs.org/d3-scale/-/d3-scale-4.0.2.tgz" integrity sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ== @@ -2212,28 
+2701,28 @@ d3-scale@^4.0.2, d3-scale@4: resolved "https://registry.npmjs.org/d3-selection/-/d3-selection-3.0.0.tgz" integrity sha512-fmTRWbNMmsmWq6xJV8D19U/gw/bwrHfNXxrIN+HfZgnzqTHp9jOmKMhsTUjXOJnZOdZY9Q28y4yebKzqDKlxlQ== -d3-shape@^3.2.0, d3-shape@3: +d3-shape@3, d3-shape@^3.2.0: version "3.2.0" resolved "https://registry.npmjs.org/d3-shape/-/d3-shape-3.2.0.tgz" integrity sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA== dependencies: d3-path "^3.1.0" -d3-time-format@^4.1.0, "d3-time-format@2 - 4", d3-time-format@4: +"d3-time-format@2 - 4", d3-time-format@4, d3-time-format@^4.1.0: version "4.1.0" resolved "https://registry.npmjs.org/d3-time-format/-/d3-time-format-4.1.0.tgz" integrity sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg== dependencies: d3-time "1 - 3" -d3-time@^3.1.0, "d3-time@1 - 3", "d3-time@2.1.1 - 3", d3-time@3: +"d3-time@1 - 3", "d3-time@2.1.1 - 3", d3-time@3, d3-time@^3.1.0: version "3.1.0" resolved "https://registry.npmjs.org/d3-time/-/d3-time-3.1.0.tgz" integrity sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q== dependencies: d3-array "2 - 3" -d3-timer@^3.0.1, "d3-timer@1 - 3", d3-timer@3: +"d3-timer@1 - 3", d3-timer@3, d3-timer@^3.0.1: version "3.0.1" resolved "https://registry.npmjs.org/d3-timer/-/d3-timer-3.0.1.tgz" integrity sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA== @@ -3241,7 +3730,7 @@ inflight@^1.0.4: once "^1.3.0" wrappy "1" -inherits@^2.0.3, inherits@^2.0.4, inherits@~2.0.0, inherits@~2.0.3, inherits@2: +inherits@2, inherits@^2.0.3, inherits@^2.0.4, inherits@~2.0.0, inherits@~2.0.3: version "2.0.4" resolved "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz" integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ== @@ -3620,6 +4109,13 @@ levn@^0.4.1: prelude-ls "^1.2.1" type-check "~0.4.0" 
+lie@3.1.1: + version "3.1.1" + resolved "https://registry.npmjs.org/lie/-/lie-3.1.1.tgz" + integrity sha512-RiNhHysUjhrDQntfYSfY4MU24coXXdEOgw9WGcKHNeEwffDYbF//u87M1EWaMGzuFoSbqW0C9C6lEEhDOAswfw== + dependencies: + immediate "~3.0.5" + lie@~3.3.0: version "3.3.0" resolved "https://registry.npmjs.org/lie/-/lie-3.3.0.tgz" @@ -3627,18 +4123,61 @@ lie@~3.3.0: dependencies: immediate "~3.0.5" -lie@3.1.1: - version "3.1.1" - resolved "https://registry.npmjs.org/lie/-/lie-3.1.1.tgz" - integrity sha512-RiNhHysUjhrDQntfYSfY4MU24coXXdEOgw9WGcKHNeEwffDYbF//u87M1EWaMGzuFoSbqW0C9C6lEEhDOAswfw== - dependencies: - immediate "~3.0.5" +lightningcss-android-arm64@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-android-arm64/-/lightningcss-android-arm64-1.32.0.tgz#f033885116dfefd9c6f54787523e3514b61e1968" + integrity sha512-YK7/ClTt4kAK0vo6w3X+Pnm0D2cf2vPHbhOXdoNti1Ga0al1P4TBZhwjATvjNwLEBCnKvjJc2jQgHXH0NEwlAg== lightningcss-darwin-arm64@1.32.0: version "1.32.0" resolved "https://registry.npmjs.org/lightningcss-darwin-arm64/-/lightningcss-darwin-arm64-1.32.0.tgz" integrity sha512-RzeG9Ju5bag2Bv1/lwlVJvBE3q6TtXskdZLLCyfg5pt+HLz9BqlICO7LZM7VHNTTn/5PRhHFBSjk5lc4cmscPQ== +lightningcss-darwin-x64@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-darwin-x64/-/lightningcss-darwin-x64-1.32.0.tgz#35f3e97332d130b9ca181e11b568ded6aebc6d5e" + integrity sha512-U+QsBp2m/s2wqpUYT/6wnlagdZbtZdndSmut/NJqlCcMLTWp5muCrID+K5UJ6jqD2BFshejCYXniPDbNh73V8w== + +lightningcss-freebsd-x64@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-freebsd-x64/-/lightningcss-freebsd-x64-1.32.0.tgz#9777a76472b64ed6ff94342ad64c7bafd794a575" + integrity sha512-JCTigedEksZk3tHTTthnMdVfGf61Fky8Ji2E4YjUTEQX14xiy/lTzXnu1vwiZe3bYe0q+SpsSH/CTeDXK6WHig== + +lightningcss-linux-arm-gnueabihf@1.32.0: + version "1.32.0" + resolved 
"https://registry.npmjs.org/lightningcss-linux-arm-gnueabihf/-/lightningcss-linux-arm-gnueabihf-1.32.0.tgz#13ae652e1ab73b9135d7b7da172f666c410ad53d" + integrity sha512-x6rnnpRa2GL0zQOkt6rts3YDPzduLpWvwAF6EMhXFVZXD4tPrBkEFqzGowzCsIWsPjqSK+tyNEODUBXeeVHSkw== + +lightningcss-linux-arm64-gnu@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-linux-arm64-gnu/-/lightningcss-linux-arm64-gnu-1.32.0.tgz#417858795a94592f680123a1b1f9da8a0e1ef335" + integrity sha512-0nnMyoyOLRJXfbMOilaSRcLH3Jw5z9HDNGfT/gwCPgaDjnx0i8w7vBzFLFR1f6CMLKF8gVbebmkUN3fa/kQJpQ== + +lightningcss-linux-arm64-musl@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-linux-arm64-musl/-/lightningcss-linux-arm64-musl-1.32.0.tgz#6be36692e810b718040802fd809623cffe732133" + integrity sha512-UpQkoenr4UJEzgVIYpI80lDFvRmPVg6oqboNHfoH4CQIfNA+HOrZ7Mo7KZP02dC6LjghPQJeBsvXhJod/wnIBg== + +lightningcss-linux-x64-gnu@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-linux-x64-gnu/-/lightningcss-linux-x64-gnu-1.32.0.tgz#0b7803af4eb21cfd38dd39fe2abbb53c7dd091f6" + integrity sha512-V7Qr52IhZmdKPVr+Vtw8o+WLsQJYCTd8loIfpDaMRWGUZfBOYEJeyJIkqGIDMZPwPx24pUMfwSxxI8phr/MbOA== + +lightningcss-linux-x64-musl@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-linux-x64-musl/-/lightningcss-linux-x64-musl-1.32.0.tgz#88dc8ba865ddddb1ac5ef04b0f161804418c163b" + integrity sha512-bYcLp+Vb0awsiXg/80uCRezCYHNg1/l3mt0gzHnWV9XP1W5sKa5/TCdGWaR/zBM2PeF/HbsQv/j2URNOiVuxWg== + +lightningcss-win32-arm64-msvc@1.32.0: + version "1.32.0" + resolved "https://registry.npmjs.org/lightningcss-win32-arm64-msvc/-/lightningcss-win32-arm64-msvc-1.32.0.tgz#4f30ba3fa5e925f5b79f945e8cc0d176c3b1ab38" + integrity sha512-8SbC8BR40pS6baCM8sbtYDSwEVQd4JlFTOlaD3gWGHfThTcABnNDBda6eTZeqbofalIJhFx0qKzgHJmcPTnGdw== + +lightningcss-win32-x64-msvc@1.32.0: + version "1.32.0" + resolved 
"https://registry.npmjs.org/lightningcss-win32-x64-msvc/-/lightningcss-win32-x64-msvc-1.32.0.tgz#141aa5605645064928902bb4af045fa7d9f4220a" + integrity sha512-Amq9B/SoZYdDi1kFrojnoqPLxYhQ4Wo5XiL8EVJrVsB8ARoC1PWW6VGtT0WKCemjy8aC+louJnjS7U18x3b06Q== + lightningcss@^1.32.0: version "1.32.0" resolved "https://registry.npmjs.org/lightningcss/-/lightningcss-1.32.0.tgz" @@ -4110,7 +4649,7 @@ pathe@^2.0.3: resolved "https://registry.npmjs.org/pathe/-/pathe-2.0.3.tgz" integrity sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w== -picocolors@^1.1.1, picocolors@1.1.1: +picocolors@1.1.1, picocolors@^1.1.1: version "1.1.1" resolved "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz" integrity sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA== @@ -4442,6 +4981,11 @@ react-is@^19.2.3: resolved "https://registry.npmjs.org/react-is/-/react-is-19.2.4.tgz" integrity sha512-W+EWGn2v0ApPKgKKCy/7s7WHXkboGcsrXE+2joLyVxkbyVQfO3MUEaUQDHoSmb8TFFrSKYa9mw64WZHNHSDzYA== +react-is@^19.2.4: + version "19.2.5" + resolved "https://registry.npmjs.org/react-is/-/react-is-19.2.5.tgz#7e7b54143e9313fed787b23fd4295d5a23872ad9" + integrity sha512-Dn0t8IQhCmeIT3wu+Apm1/YVsJXsGWi6k4sPdnBIdqMVtHtv0IGi6dcpNpNkNac0zB2uUAqNX3MHzN8c+z2rwQ== + react-katex@^3.1.0: version "3.1.0" resolved "https://registry.npmjs.org/react-katex/-/react-katex-3.1.0.tgz" @@ -4518,33 +5062,7 @@ react@^18.2.0: dependencies: loose-envify "^1.1.0" -readable-stream@^2.0.0: - version "2.3.8" - resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" - integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== - dependencies: - core-util-is "~1.0.0" - inherits "~2.0.3" - isarray "~1.0.0" - process-nextick-args "~2.0.0" - safe-buffer "~5.1.1" - string_decoder "~1.1.1" - util-deprecate "~1.0.1" - -readable-stream@^2.0.2: - version "2.3.8" - resolved 
"https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" - integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== - dependencies: - core-util-is "~1.0.0" - inherits "~2.0.3" - isarray "~1.0.0" - process-nextick-args "~2.0.0" - safe-buffer "~5.1.1" - string_decoder "~1.1.1" - util-deprecate "~1.0.1" - -readable-stream@^2.0.5: +readable-stream@^2.0.0, readable-stream@^2.0.2, readable-stream@^2.0.5, readable-stream@~2.3.6: version "2.3.8" resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== @@ -4566,19 +5084,6 @@ readable-stream@^3.1.1, readable-stream@^3.4.0, readable-stream@^3.6.0: string_decoder "^1.1.1" util-deprecate "^1.0.1" -readable-stream@~2.3.6: - version "2.3.8" - resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz" - integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== - dependencies: - core-util-is "~1.0.0" - inherits "~2.0.3" - isarray "~1.0.0" - process-nextick-args "~2.0.0" - safe-buffer "~5.1.1" - string_decoder "~1.1.1" - util-deprecate "~1.0.1" - readdir-glob@^1.1.2: version "1.1.3" resolved "https://registry.npmjs.org/readdir-glob/-/readdir-glob-1.1.3.tgz" @@ -4652,6 +5157,11 @@ reselect@^4.1.8: resolved "https://registry.npmjs.org/reselect/-/reselect-4.1.8.tgz" integrity sha512-ab9EmR80F/zQTMNeneUr4cv+jSwPJgIlvEmVwLerwrWVbpLlBuls9XHzIeTFy4cegU2NHBp3va0LKOzU5qFEYQ== +reselect@^5.1.1: + version "5.1.1" + resolved "https://registry.npmjs.org/reselect/-/reselect-5.1.1.tgz#c766b1eb5d558291e5e550298adb0becc24bb72e" + integrity sha512-K/BG6eIky/SBpzfHZv/dd+9JBFiS4SWV7FIujVyJRux6e45+73RaUHXLmIR1f7WOMaQ0U1km6qwklRQxpJJY0w== + resolve-from@^4.0.0: version "4.0.0" resolved "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz" @@ -4774,7 +5284,7 @@ 
safe-array-concat@^1.1.3: has-symbols "^1.1.0" isarray "^2.0.5" -safe-buffer@^5.0.1: +safe-buffer@^5.0.1, safe-buffer@~5.2.0: version "5.2.1" resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz" integrity sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== @@ -4784,11 +5294,6 @@ safe-buffer@~5.1.0, safe-buffer@~5.1.1: resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz" integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g== -safe-buffer@~5.2.0: - version "5.2.1" - resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz" - integrity sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== - safe-push-apply@^1.0.0: version "1.0.0" resolved "https://registry.npmjs.org/safe-push-apply/-/safe-push-apply-1.0.0.tgz" @@ -4979,7 +5484,7 @@ solid-js@^1.9.5: seroval "~1.5.0" seroval-plugins "~1.5.0" -source-map-js@^1.2.1, "source-map-js@>=0.6.2 <2.0.0": +"source-map-js@>=0.6.2 <2.0.0", source-map-js@^1.2.1: version "1.2.1" resolved "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz" integrity sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA== @@ -5012,20 +5517,6 @@ stop-iteration-iterator@^1.1.0: es-errors "^1.3.0" internal-slot "^1.1.0" -string_decoder@^1.1.1: - version "1.3.0" - resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz" - integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA== - dependencies: - safe-buffer "~5.2.0" - -string_decoder@~1.1.1: - version "1.1.1" - resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz" - integrity sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg== - dependencies: - safe-buffer "~5.1.0" - string-width@^7.0.0, string-width@^7.2.0: version 
"7.2.0" resolved "https://registry.npmjs.org/string-width/-/string-width-7.2.0.tgz" @@ -5103,6 +5594,20 @@ string.prototype.trimstart@^1.0.8: define-properties "^1.2.1" es-object-atoms "^1.0.0" +string_decoder@^1.1.1: + version "1.3.0" + resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz" + integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA== + dependencies: + safe-buffer "~5.2.0" + +string_decoder@~1.1.1: + version "1.1.1" + resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz" + integrity sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg== + dependencies: + safe-buffer "~5.1.0" + strip-ansi@^7.1.0: version "7.2.0" resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.2.0.tgz" @@ -5258,16 +5763,16 @@ ts-api-utils@^2.4.0: resolved "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.5.0.tgz" integrity sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA== -tslib@^2.8.1, tslib@~2.8.1: - version "2.8.1" - resolved "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz" - integrity sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w== - tslib@2.3.0: version "2.3.0" resolved "https://registry.npmjs.org/tslib/-/tslib-2.3.0.tgz" integrity sha512-N82ooyxVNm6h1riLCoyS9e3fuJ3AMG2zIZs2Gd1ATcSFjSA23Q0fzjjZeh0jbJvWVDZ0cJT8yaNNaaXHzueNjg== +tslib@^2.4.0, tslib@^2.8.1, tslib@~2.8.1: + version "2.8.1" + resolved "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz" + integrity sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w== + tunnel-agent@^0.6.0: version "0.6.0" resolved "https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.6.0.tgz" @@ -5447,6 +5952,18 @@ vega-dataflow@^6.1.0, vega-dataflow@~6.1.0: vega-loader "^5.1.0" vega-util "^2.1.0" +vega-embed@6.5.1: + version "6.5.1" + resolved 
"https://registry.npmjs.org/vega-embed/-/vega-embed-6.5.1.tgz" + integrity sha512-yz/L1bN3+fLOpgXVb/8sCRv4GlZpD2/ngeKJAFRiHTIRm5zK6W0KuqZZvyGaO7E4s7RuYjW1TWhRIOqh5rS5hA== + dependencies: + fast-json-patch "^3.0.0-1" + json-stringify-pretty-compact "^2.0.0" + semver "^7.1.3" + vega-schema-url-parser "^1.1.0" + vega-themes "^2.8.2" + vega-tooltip "^0.22.0" + vega-embed@^6.21.0: version "6.29.0" resolved "https://registry.npmjs.org/vega-embed/-/vega-embed-6.29.0.tgz" @@ -5461,18 +5978,6 @@ vega-embed@^6.21.0: vega-themes "^2.15.0" vega-tooltip "^0.35.2" -vega-embed@6.5.1: - version "6.5.1" - resolved "https://registry.npmjs.org/vega-embed/-/vega-embed-6.5.1.tgz" - integrity sha512-yz/L1bN3+fLOpgXVb/8sCRv4GlZpD2/ngeKJAFRiHTIRm5zK6W0KuqZZvyGaO7E4s7RuYjW1TWhRIOqh5rS5hA== - dependencies: - fast-json-patch "^3.0.0-1" - json-stringify-pretty-compact "^2.0.0" - semver "^7.1.3" - vega-schema-url-parser "^1.1.0" - vega-themes "^2.8.2" - vega-tooltip "^0.22.0" - vega-encode@~5.1.0: version "5.1.0" resolved "https://registry.npmjs.org/vega-encode/-/vega-encode-5.1.0.tgz" @@ -5735,17 +6240,7 @@ vega-typings@~2.1.0: vega-expression "^6.1.0" vega-util "^2.1.0" -vega-util@^1.13.1: - version "1.17.4" - resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz" - integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA== - -vega-util@^1.17.2: - version "1.17.4" - resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz" - integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA== - -vega-util@^1.17.4: +vega-util@^1.13.1, vega-util@^1.17.2, vega-util@^1.17.4: version "1.17.4" resolved "https://registry.npmjs.org/vega-util/-/vega-util-1.17.4.tgz" integrity sha512-+y3ZW7dEqM8Ck+KRsd+jkMfxfE7MrQxUyIpNjkfhIpGEreym+aTn7XUw1DKXqclr8mqTQvbilPo16B3lnBr0wA== From 66ef7a08881613a3050878c094f59903b7b91070 Mon Sep 17 00:00:00 2001 From: Chenglong Wang Date: Wed, 15 Apr 2026 
22:19:16 -0700 Subject: [PATCH 5/6] cleanup --- .../9-generalized-data-source-plugins.md | 102 +-- py-src/data_formulator/app.py | 49 +- .../superset/routes => auth}/__init__.py | 6 +- .../gateways}/__init__.py | 0 .../gateways}/github_gateway.py | 0 .../{security/auth.py => auth/identity.py} | 4 +- .../providers}/__init__.py | 0 .../providers}/azure_easyauth.py | 0 .../providers}/base.py | 0 .../providers}/github_oauth.py | 0 .../providers}/oidc.py | 0 .../vault}/README.md | 0 .../vault}/__init__.py | 0 .../{credential_vault => auth/vault}/base.py | 0 .../vault}/local_vault.py | 0 py-src/data_formulator/data_connector.py | 61 +- .../superset_auth_bridge.py} | 5 +- .../superset_client.py | 5 +- .../data_loader/superset_data_loader.py | 25 +- py-src/data_formulator/plugins/__init__.py | 108 ---- py-src/data_formulator/plugins/auth_base.py | 262 -------- py-src/data_formulator/plugins/base.py | 99 --- py-src/data_formulator/plugins/data_writer.py | 119 ---- .../plugins/superset/__init__.py | 126 ---- .../plugins/superset/catalog.py | 605 ------------------ .../plugins/superset/routes/auth.py | 127 ---- .../plugins/superset/routes/catalog.py | 216 ------- .../plugins/superset/routes/data.py | 389 ----------- .../plugins/superset/session_helpers.py | 138 ---- py-src/data_formulator/routes/__init__.py | 2 + .../{agent_routes.py => routes/agents.py} | 2 +- .../credentials.py} | 4 +- .../demo_stream.py} | 0 .../{session_routes.py => routes/sessions.py} | 2 +- .../{tables_routes.py => routes/tables.py} | 2 +- py-src/data_formulator/security/__init__.py | 17 - pyproject.toml | 2 +- pytest.ini | 8 +- src/app/dfSlice.tsx | 1 - src/app/tableThunks.ts | 40 -- src/index.tsx | 3 - src/plugins/PluginHost.tsx | 72 --- src/plugins/index.ts | 6 - src/plugins/registry.ts | 69 -- src/plugins/superset/SupersetCatalog.tsx | 304 --------- src/plugins/superset/SupersetDashboards.tsx | 348 ---------- src/plugins/superset/SupersetFilterDialog.tsx | 511 --------------- 
src/plugins/superset/SupersetLogin.tsx | 264 -------- src/plugins/superset/SupersetPanel.tsx | 98 --- src/plugins/superset/api.ts | 182 ------ src/plugins/superset/index.tsx | 34 - src/plugins/superset/locales/en.json | 76 --- src/plugins/superset/locales/zh.json | 76 --- src/plugins/types.ts | 60 -- src/views/UnifiedDataUploadDialog.tsx | 36 +- tests/backend/README.md | 62 +- .../test_agent_diagnostics.py | 0 .../test_agent_utils_sql_table_names.py | 0 .../test_client_image_strip.py | 0 .../test_duckdb_notes_prompt.py | 0 .../{unit => agents}/test_model_registry.py | 0 .../test_provenance_models.py | 0 tests/backend/{security => auth}/test_auth.py | 6 +- .../test_auth_info_endpoint.py | 8 +- .../test_auth_provider_chain.py | 11 +- .../test_azure_easyauth_provider.py | 2 +- .../{unit => auth}/test_credential_vault.py | 2 +- .../test_credential_vault_factory.py | 44 +- .../test_flask_session_config.py | 0 .../test_github_oauth_provider.py | 2 +- .../{unit => auth}/test_oidc_provider.py | 4 +- tests/backend/contract/README.md | 16 - .../test_all_loader_verification.py | 0 .../test_atomic_metadata_update.py | 0 .../test_data_connector_config.py | 0 .../test_data_connector_framework.py | 2 +- .../test_data_connector_vault.py | 2 +- .../test_excel_fixture_parsing.py | 0 .../test_external_data_loader_table_names.py | 0 .../test_file_manager_encoding.py | 0 .../test_file_manager_table_names.py | 0 .../test_json_chinese_serialization.py | 0 .../test_parquet_utils_table_names.py | 0 .../{unit => data}/test_safe_data_filename.py | 0 .../test_table_name_contracts.py | 2 +- .../test_unicode_table_name_sanitization.py | 0 .../test_workspace_fresh_names.py | 0 .../{unit => data}/test_workspace_manager.py | 0 .../test_workspace_source_file_ops.py | 0 tests/backend/integration/README.md | 30 - .../integration/test_plugin_app_config.py | 182 ------ .../test_plugin_auth_with_vault.py | 170 ----- .../test_agent_diagnostics_wiring.py | 0 .../test_create_table_replace_source.py | 4 +- 
.../test_create_table_xls_upload.py | 4 +- .../test_credential_routes.py | 14 +- .../test_csv_encoding_roundtrip.py | 4 +- .../test_derive_data_repair_loop.py | 4 +- .../test_list_global_models_api.py | 10 +- .../test_parse_file_endpoint.py | 2 +- .../test_same_basename_upload.py | 4 +- .../test_session_routes_migration.py | 14 +- .../security/test_global_model_security.py | 12 +- .../{integration => security}/test_sandbox.py | 0 tests/backend/unit/README.md | 20 - tests/backend/unit/test_plugin_data_writer.py | 166 ----- tests/backend/unit/test_plugin_discovery.py | 289 --------- tests/backend/unit/test_superset_plugin.py | 369 ----------- tests/{plugin => database-dockers}/README.md | 0 .../bigquery}/Dockerfile | 0 .../bigquery}/README.md | 0 .../bigquery}/init_data.yaml | 0 .../bigquery}/test_bigquery_loader.py | 0 .../mongodb}/Dockerfile | 0 .../mongodb}/README.md | 0 .../mongodb}/init_data.js | 0 .../mongodb}/test_mongodb_loader.py | 0 .../mysql}/Dockerfile | 0 .../mysql}/README.md | 0 .../mysql}/init.sql | 0 .../mysql}/test_mysql_datalake.py | 0 .../mysql}/test_mysql_loader.py | 0 .../postgres}/Dockerfile | 0 .../postgres}/README.md | 0 .../postgres}/init.sql | 0 .../postgres}/test_postgresql_loader.py | 0 .../superset/.env.superset | 0 .../{ => database-dockers}/superset/README.md | 0 .../superset/docker-compose.yml | 0 .../superset/init-superset.sh | 0 .../superset/sample_data.py | 0 .../{ => database-dockers}/superset/start.sh | 0 .../superset/superset_config.py | 0 .../superset}/test_superset_data_connector.py | 12 +- .../test_mysql/test_mysql_data_connector.py | 263 -------- .../test_postgresql_data_connector.py | 430 ------------- 136 files changed, 226 insertions(+), 6605 deletions(-) rename py-src/data_formulator/{plugins/superset/routes => auth}/__init__.py (57%) rename py-src/data_formulator/{auth_gateways => auth/gateways}/__init__.py (100%) rename py-src/data_formulator/{auth_gateways => auth/gateways}/github_gateway.py (100%) rename 
py-src/data_formulator/{security/auth.py => auth/identity.py} (98%) rename py-src/data_formulator/{auth_providers => auth/providers}/__init__.py (100%) rename py-src/data_formulator/{auth_providers => auth/providers}/azure_easyauth.py (100%) rename py-src/data_formulator/{auth_providers => auth/providers}/base.py (100%) rename py-src/data_formulator/{auth_providers => auth/providers}/github_oauth.py (100%) rename py-src/data_formulator/{auth_providers => auth/providers}/oidc.py (100%) rename py-src/data_formulator/{credential_vault => auth/vault}/README.md (100%) rename py-src/data_formulator/{credential_vault => auth/vault}/__init__.py (100%) rename py-src/data_formulator/{credential_vault => auth/vault}/base.py (100%) rename py-src/data_formulator/{credential_vault => auth/vault}/local_vault.py (100%) rename py-src/data_formulator/{plugins/superset/auth_bridge.py => data_loader/superset_auth_bridge.py} (91%) rename py-src/data_formulator/{plugins/superset => data_loader}/superset_client.py (95%) delete mode 100644 py-src/data_formulator/plugins/__init__.py delete mode 100644 py-src/data_formulator/plugins/auth_base.py delete mode 100644 py-src/data_formulator/plugins/base.py delete mode 100644 py-src/data_formulator/plugins/data_writer.py delete mode 100644 py-src/data_formulator/plugins/superset/__init__.py delete mode 100644 py-src/data_formulator/plugins/superset/catalog.py delete mode 100644 py-src/data_formulator/plugins/superset/routes/auth.py delete mode 100644 py-src/data_formulator/plugins/superset/routes/catalog.py delete mode 100644 py-src/data_formulator/plugins/superset/routes/data.py delete mode 100644 py-src/data_formulator/plugins/superset/session_helpers.py create mode 100644 py-src/data_formulator/routes/__init__.py rename py-src/data_formulator/{agent_routes.py => routes/agents.py} (99%) rename py-src/data_formulator/{credential_routes.py => routes/credentials.py} (92%) rename py-src/data_formulator/{demo_stream_routes.py => 
routes/demo_stream.py} (100%) rename py-src/data_formulator/{session_routes.py => routes/sessions.py} (99%) rename py-src/data_formulator/{tables_routes.py => routes/tables.py} (99%) delete mode 100644 src/plugins/PluginHost.tsx delete mode 100644 src/plugins/index.ts delete mode 100644 src/plugins/registry.ts delete mode 100644 src/plugins/superset/SupersetCatalog.tsx delete mode 100644 src/plugins/superset/SupersetDashboards.tsx delete mode 100644 src/plugins/superset/SupersetFilterDialog.tsx delete mode 100644 src/plugins/superset/SupersetLogin.tsx delete mode 100644 src/plugins/superset/SupersetPanel.tsx delete mode 100644 src/plugins/superset/api.ts delete mode 100644 src/plugins/superset/index.tsx delete mode 100644 src/plugins/superset/locales/en.json delete mode 100644 src/plugins/superset/locales/zh.json delete mode 100644 src/plugins/types.ts rename tests/backend/{unit => agents}/test_agent_diagnostics.py (100%) rename tests/backend/{unit => agents}/test_agent_utils_sql_table_names.py (100%) rename tests/backend/{unit => agents}/test_client_image_strip.py (100%) rename tests/backend/{unit => agents}/test_duckdb_notes_prompt.py (100%) rename tests/backend/{unit => agents}/test_model_registry.py (100%) rename tests/backend/{unit => agents}/test_provenance_models.py (100%) rename tests/backend/{security => auth}/test_auth.py (96%) rename tests/backend/{integration => auth}/test_auth_info_endpoint.py (89%) rename tests/backend/{security => auth}/test_auth_provider_chain.py (94%) rename tests/backend/{unit => auth}/test_azure_easyauth_provider.py (94%) rename tests/backend/{unit => auth}/test_credential_vault.py (96%) rename tests/backend/{unit => auth}/test_credential_vault_factory.py (72%) rename tests/backend/{unit => auth}/test_flask_session_config.py (100%) rename tests/backend/{unit => auth}/test_github_oauth_provider.py (95%) rename tests/backend/{unit => auth}/test_oidc_provider.py (96%) delete mode 100644 tests/backend/contract/README.md rename 
tests/backend/{unit => data}/test_all_loader_verification.py (100%) rename tests/backend/{unit => data}/test_atomic_metadata_update.py (100%) rename tests/backend/{unit => data}/test_data_connector_config.py (100%) rename tests/backend/{unit => data}/test_data_connector_framework.py (99%) rename tests/backend/{unit => data}/test_data_connector_vault.py (99%) rename tests/backend/{integration => data}/test_excel_fixture_parsing.py (100%) rename tests/backend/{unit => data}/test_external_data_loader_table_names.py (100%) rename tests/backend/{unit => data}/test_file_manager_encoding.py (100%) rename tests/backend/{unit => data}/test_file_manager_table_names.py (100%) rename tests/backend/{unit => data}/test_json_chinese_serialization.py (100%) rename tests/backend/{unit => data}/test_parquet_utils_table_names.py (100%) rename tests/backend/{unit => data}/test_safe_data_filename.py (100%) rename tests/backend/{contract => data}/test_table_name_contracts.py (89%) rename tests/backend/{unit => data}/test_unicode_table_name_sanitization.py (100%) rename tests/backend/{unit => data}/test_workspace_fresh_names.py (100%) rename tests/backend/{unit => data}/test_workspace_manager.py (100%) rename tests/backend/{unit => data}/test_workspace_source_file_ops.py (100%) delete mode 100644 tests/backend/integration/README.md delete mode 100644 tests/backend/integration/test_plugin_app_config.py delete mode 100644 tests/backend/integration/test_plugin_auth_with_vault.py rename tests/backend/{unit => routes}/test_agent_diagnostics_wiring.py (100%) rename tests/backend/{integration => routes}/test_create_table_replace_source.py (94%) rename tests/backend/{integration => routes}/test_create_table_xls_upload.py (95%) rename tests/backend/{integration => routes}/test_credential_routes.py (88%) rename tests/backend/{integration => routes}/test_csv_encoding_roundtrip.py (95%) rename tests/backend/{integration => routes}/test_derive_data_repair_loop.py (97%) rename tests/backend/{unit => 
routes}/test_list_global_models_api.py (85%) rename tests/backend/{integration => routes}/test_parse_file_endpoint.py (94%) rename tests/backend/{contract => routes}/test_same_basename_upload.py (96%) rename tests/backend/{unit => routes}/test_session_routes_migration.py (77%) rename tests/backend/{integration => security}/test_sandbox.py (100%) delete mode 100644 tests/backend/unit/README.md delete mode 100644 tests/backend/unit/test_plugin_data_writer.py delete mode 100644 tests/backend/unit/test_plugin_discovery.py delete mode 100644 tests/backend/unit/test_superset_plugin.py rename tests/{plugin => database-dockers}/README.md (100%) rename tests/{plugin/test_bigquery => database-dockers/bigquery}/Dockerfile (100%) rename tests/{plugin/test_bigquery => database-dockers/bigquery}/README.md (100%) rename tests/{plugin/test_bigquery => database-dockers/bigquery}/init_data.yaml (100%) rename tests/{plugin/test_bigquery => database-dockers/bigquery}/test_bigquery_loader.py (100%) rename tests/{plugin/test_mongodb => database-dockers/mongodb}/Dockerfile (100%) rename tests/{plugin/test_mongodb => database-dockers/mongodb}/README.md (100%) rename tests/{plugin/test_mongodb => database-dockers/mongodb}/init_data.js (100%) rename tests/{plugin/test_mongodb => database-dockers/mongodb}/test_mongodb_loader.py (100%) rename tests/{plugin/test_mysql => database-dockers/mysql}/Dockerfile (100%) rename tests/{plugin/test_mysql => database-dockers/mysql}/README.md (100%) rename tests/{plugin/test_mysql => database-dockers/mysql}/init.sql (100%) rename tests/{plugin => database-dockers/mysql}/test_mysql_datalake.py (100%) rename tests/{plugin/test_mysql => database-dockers/mysql}/test_mysql_loader.py (100%) rename tests/{plugin/test_postgres => database-dockers/postgres}/Dockerfile (100%) rename tests/{plugin/test_postgres => database-dockers/postgres}/README.md (100%) rename tests/{plugin/test_postgres => database-dockers/postgres}/init.sql (100%) rename 
tests/{plugin/test_postgres => database-dockers/postgres}/test_postgresql_loader.py (100%) rename tests/{ => database-dockers}/superset/.env.superset (100%) rename tests/{ => database-dockers}/superset/README.md (100%) rename tests/{ => database-dockers}/superset/docker-compose.yml (100%) rename tests/{ => database-dockers}/superset/init-superset.sh (100%) rename tests/{ => database-dockers}/superset/sample_data.py (100%) rename tests/{ => database-dockers}/superset/start.sh (100%) rename tests/{ => database-dockers}/superset/superset_config.py (100%) rename tests/{backend/integration => database-dockers/superset}/test_superset_data_connector.py (97%) delete mode 100644 tests/plugin/test_mysql/test_mysql_data_connector.py delete mode 100644 tests/plugin/test_postgres/test_postgresql_data_connector.py diff --git a/design-docs/9-generalized-data-source-plugins.md b/design-docs/9-generalized-data-source-plugins.md index 861e08e0..ed3e0fa7 100644 --- a/design-docs/9-generalized-data-source-plugins.md +++ b/design-docs/9-generalized-data-source-plugins.md @@ -1,6 +1,6 @@ # Generalized Data Source Plugins — Unifying DataLoader + Plugin into a Lifecycle-Managed Connection -## Status: Phase 3 complete (legacy data-loader endpoints + plugin system removed) +## Status: Phase 3 complete (legacy plugins removed, backend restructured) ## 1. 
Problem @@ -1568,14 +1568,59 @@ For sources that can't filter server-side (e.g., some REST APIs), the framework - ✅ Removed legacy "Database" tab from UI #### 3d: Remove legacy plugin system ✅ -- ✅ Relocated `SupersetClient` + `SupersetAuthBridge` from `plugins/superset/` to `data_loader/superset/` (used by `SupersetLoader`) +- ✅ Relocated `SupersetClient` + `SupersetAuthBridge` from `plugins/superset/` to `data_loader/` (used by `SupersetLoader`) - ✅ Deleted `py-src/data_formulator/plugins/` directory (base classes, discovery engine, Superset plugin, all routes) - ✅ Deleted `src/plugins/` directory (frontend plugin host, registry, Superset UI components) - ✅ Removed plugin registration from `app.py` (`discover_and_register`, `ENABLED_PLUGINS`) - ✅ Removed frontend plugin imports (`getEnabledPlugins`, `PluginHost`, `registerPluginTranslations`) - ✅ Deleted legacy plugin tests + +#### 3e: Backend restructuring ✅ +- ✅ Created `auth/` package — merged `security/auth.py` → `auth/identity.py`, `auth_providers/` → `auth/providers/`, `auth_gateways/` → `auth/gateways/`, `credential_vault/` → `auth/vault/` +- ✅ Created `routes/` package — moved `tables_routes.py` → `routes/tables.py`, `agent_routes.py` → `routes/agents.py`, `session_routes.py` → `routes/sessions.py`, `credential_routes.py` → `routes/credentials.py`, `demo_stream_routes.py` → `routes/demo_stream.py` +- ✅ `security/` kept for non-auth concerns: `code_signing.py`, `sanitize.py`, `url_allowlist.py` +- ✅ Updated all import paths + patch targets across ~30 files +- ✅ Improved `_sanitize_error()` to preserve actionable detail in connector error messages +- ✅ Moved Docker-gated integration tests to `tests/database-dockers/` (mysql, postgres, bigquery, mongodb, superset) +- ✅ Fixed `test_auth_provider_chain` missing `_localhost_identity` reset - [ ] Integrate with unified data source panel ([doc #8](8-unified-data-source-panel.md)) +#### Post-restructuring backend layout + +``` +py-src/data_formulator/ +├── app.py 
← Flask app + bootstrap +├── __main__.py ← CLI entry point +├── data_connector.py ← DataConnector framework + shared routes +├── workspace_factory.py ← Workspace resolution +├── model_registry.py ← AI model config +├── example_datasets_config.py ← Sample dataset config +│ +├── auth/ ← Identity, providers, gateways, vault +│ ├── identity.py ← init_auth, get_identity_id, get_active_provider +│ ├── providers/ ← AuthProvider subclasses (github, oidc, azure) +│ ├── gateways/ ← OAuth callback routes (github) +│ └── vault/ ← Fernet-encrypted credential storage +│ +├── routes/ ← Flask blueprints +│ ├── tables.py ← Table CRUD, file upload, parsing +│ ├── agents.py ← AI agent endpoints +│ ├── sessions.py ← Workspace session management +│ ├── credentials.py ← Vault API routes +│ └── demo_stream.py ← ISS demo + streaming +│ +├── security/ ← Non-auth security utilities +│ ├── code_signing.py ← HMAC signing for AI-generated code +│ ├── sanitize.py ← Error message scrubbing +│ └── url_allowlist.py ← API base URL validation +│ +├── agents/ ← AI agent implementations +├── data_loader/ ← ExternalDataLoader drivers (10 sources) +├── datalake/ ← Workspace storage layer +├── sandbox/ ← Code execution sandboxes +└── workflows/ ← Chart/viz generation +``` + ### Sub-doc Summary (9.1–9.3) | Doc | Title | Status | Key Deliverables | @@ -1596,45 +1641,14 @@ For sources that can't filter server-side (e.g., some REST APIs), the framework ### Q1: What happens to `DataSourcePlugin` and the `plugins/` directory? -They go away after migration. The auth and lifecycle components move into DF core: +**Done.** The entire `plugins/` directory and `DataSourcePlugin` base class have been removed (Phase 3d). 
The architecture is now: -``` -py-src/data_formulator/ - auth/ ← NEW: DF's auth layer (extracted from plugins/) - __init__.py - credentials.py ← encrypt/decrypt passwords & tokens at rest - connection_store.py ← read/write workspace/connections/{id}.json - token_manager.py ← token refresh, expiry checking (for token-mode sources) - sso.py ← SSO/OIDC provider (AuthProvider, extracted from plugins/) - data_loader/ ← EXISTING: all ExternalDataLoader subclasses - external_data_loader.py ← revised interface (§3.5) - mysql_data_loader.py - postgresql_data_loader.py - ... - data_connector.py ← NEW: DataConnector framework - (route generation, form computation, lifecycle) - plugins/ ← REMOVED after Phase 3 -``` - -Post-migration architecture: - -``` -ExternalDataLoader (driver) ← each source type implements this - ↓ -DataConnector (framework) ← generic lifecycle wrapper, one implementation - ↓ uses auth/ for credentials, tokens, SSO -data-sources.yml / auto-discovery ← config, not code -``` - -There are no "plugins" anymore — just **loaders** (the driver layer) and the **framework** (the lifecycle layer). The `plugins/` directory, `DataSourcePlugin` base class, and per-plugin `__init__.py` files are all removed. 
- -**What's reused** from the current plugin system (relocated to `auth/`): -- Credential encryption patterns → `auth/credentials.py` -- Session helpers, token refresh logic → `auth/token_manager.py` -- SSO bridge patterns (for token passthrough) → `auth/sso.py` -- Workspace connection persistence → `auth/connection_store.py` (new) -- Error isolation (one source failure doesn't crash others) — stays in framework -- Frontend error boundaries — stays in frontend +- **`data_loader/`** — all `ExternalDataLoader` subclasses (the driver layer), including `SupersetLoader` with its `superset_client.py` and `superset_auth_bridge.py` +- **`data_connector.py`** — generic lifecycle wrapper with shared routes (the framework layer) +- **`auth/`** — identity, providers, gateways, credential vault (the auth layer) +- **`routes/`** — all Flask blueprints + +There are no "plugins" anymore — just loaders, the connector framework, and config-driven registration. ### Q2: Multiple connections to the same source type? @@ -1688,13 +1702,7 @@ The `DataConnector` auth layer should support: ### Q6: Should the old `db-manager` endpoints remain? -The existing `POST /api/db-manager/load-table` is a stateless, one-shot endpoint. Once `DataConnector` plugins exist, it's redundant. But we should keep it for backward compatibility and deprecate it gradually. - -``` -Phase 1-2: Both endpoints work -Phase 3: /api/db-manager/* shows deprecation warning in logs -Phase 4: Remove (or keep as thin wrapper that delegates to plugin) -``` +**Done.** The legacy `/api/tables/data-loader/*` endpoints were removed in Phase 3a. All data loading flows through `/api/connectors/*` now. ## 9. 
Summary diff --git a/py-src/data_formulator/app.py b/py-src/data_formulator/app.py index d1c21f6b..669d9912 100644 --- a/py-src/data_formulator/app.py +++ b/py-src/data_formulator/app.py @@ -118,18 +118,18 @@ def _register_blueprints(): return _blueprints_registered = True # Import tables routes (imports database connectors) - print(" Loading data connectors...", flush=True) - from data_formulator.tables_routes import tables_bp + print(" Loading data loader drivers...", flush=True) + from data_formulator.routes.tables import tables_bp # Import agent routes (imports AI/ML libraries: litellm, sklearn, etc.) print(" Loading AI agents...", flush=True) - from data_formulator.agent_routes import agent_bp + from data_formulator.routes.agents import agent_bp # Import session routes - from data_formulator.session_routes import session_bp + from data_formulator.routes.sessions import session_bp # Import demo stream routes - from data_formulator.demo_stream_routes import demo_stream_bp, limiter as demo_stream_limiter, start_iss_collector + from data_formulator.routes.demo_stream import demo_stream_bp, limiter as demo_stream_limiter, start_iss_collector demo_stream_limiter.init_app(app) # Register blueprints @@ -142,24 +142,19 @@ def _register_blueprints(): start_iss_collector() # Initialise pluggable authentication (reads AUTH_PROVIDER env var) - from data_formulator.security.auth import init_auth, get_active_provider + from data_formulator.auth.identity import init_auth, get_active_provider init_auth(app) # Register auth gateway blueprints for stateful providers (e.g. 
GitHub OAuth) provider = get_active_provider() if provider and provider.name == "github": - from data_formulator.auth_gateways.github_gateway import github_bp + from data_formulator.auth.gateways.github_gateway import github_bp app.register_blueprint(github_bp) # Register credential vault API (safe even when vault is not configured) - from data_formulator.credential_routes import credential_bp + from data_formulator.routes.credentials import credential_bp app.register_blueprint(credential_bp) - # Auto-discover and register data source plugins - print(" Loading plugins...", flush=True) - from data_formulator.plugins import discover_and_register - discover_and_register(app) - # Auto-register all installed data loaders as DataConnector instances if not app.config['CLI_ARGS'].get('disable_data_connectors'): print(" Loading data connectors...", flush=True) @@ -202,7 +197,7 @@ def get_auth_info(): The response tells the frontend how to initiate login based on the active provider (OIDC PKCE, GitHub redirect, transparent, or none). """ - from data_formulator.security.auth import get_active_provider + from data_formulator.auth.identity import get_active_provider provider = get_active_provider() if provider: return flask.jsonify(provider.get_auth_info()) @@ -231,7 +226,7 @@ def get_app_config(): from data_formulator.datalake.workspace import get_data_formulator_home config["DATA_FORMULATOR_HOME"] = str(get_data_formulator_home()) - from data_formulator.security.auth import get_active_provider + from data_formulator.auth.identity import get_active_provider provider = get_active_provider() if provider: config["AUTH_PROVIDER"] = provider.name @@ -241,7 +236,7 @@ def get_app_config(): # For localhost mode this is the fixed local: identity; # for anonymous mode the server echoes back the browser-provided UUID. 
try: - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id identity = get_identity_id() id_type, _, id_value = identity.partition(':') config["IDENTITY"] = {"type": id_type, "id": id_value} @@ -249,27 +244,9 @@ def get_app_config(): pass # No identity available (e.g. during startup) # Expose credential vault availability to the frontend - from data_formulator.credential_vault import get_credential_vault + from data_formulator.auth.vault import get_credential_vault config["CREDENTIAL_VAULT_ENABLED"] = get_credential_vault() is not None - # Expose enabled data source plugins to the frontend - from data_formulator.plugins import ENABLED_PLUGINS - if ENABLED_PLUGINS: - plugins_info: dict[str, dict] = {} - for pid, plugin in ENABLED_PLUGINS.items(): - manifest = plugin.manifest() - frontend_cfg = plugin.get_frontend_config() - plugins_info[pid] = { - "id": manifest.get("id", pid), - "name": manifest.get("name", pid), - "icon": manifest.get("icon"), - "description": manifest.get("description"), - "capabilities": manifest.get("capabilities", []), - "auth_modes": manifest.get("auth_modes", []), - **frontend_cfg, - } - config["PLUGINS"] = plugins_info - # Expose data connectors to the frontend from data_formulator.data_connector import DATA_CONNECTORS if DATA_CONNECTORS: @@ -281,7 +258,7 @@ def get_app_config(): # Tell the frontend which connectors the current user has vault credentials for # so it can render "Connected" vs "Available" without N status calls. 
     try:
-        from data_formulator.security.auth import get_identity_id
+        from data_formulator.auth.identity import get_identity_id
         identity = get_identity_id()
         connected_ids: list[str] = []
         for sid, src in DATA_CONNECTORS.items():
diff --git a/py-src/data_formulator/plugins/superset/routes/__init__.py b/py-src/data_formulator/auth/__init__.py
similarity index 57%
rename from py-src/data_formulator/plugins/superset/routes/__init__.py
rename to py-src/data_formulator/auth/__init__.py
index e8eac8c4..59e481eb 100644
--- a/py-src/data_formulator/plugins/superset/routes/__init__.py
+++ b/py-src/data_formulator/auth/__init__.py
@@ -1,4 +1,2 @@
-# Copyright (c) Microsoft Corporation.
-# Licensed under the MIT License.
-
-"""Route sub-package for the Superset plugin."""
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
diff --git a/py-src/data_formulator/auth_gateways/__init__.py b/py-src/data_formulator/auth/gateways/__init__.py
similarity index 100%
rename from py-src/data_formulator/auth_gateways/__init__.py
rename to py-src/data_formulator/auth/gateways/__init__.py
diff --git a/py-src/data_formulator/auth_gateways/github_gateway.py b/py-src/data_formulator/auth/gateways/github_gateway.py
similarity index 100%
rename from py-src/data_formulator/auth_gateways/github_gateway.py
rename to py-src/data_formulator/auth/gateways/github_gateway.py
diff --git a/py-src/data_formulator/security/auth.py b/py-src/data_formulator/auth/identity.py
similarity index 98%
rename from py-src/data_formulator/security/auth.py
rename to py-src/data_formulator/auth/identity.py
index e31f1cc7..4c28df4b 100644
--- a/py-src/data_formulator/security/auth.py
+++ b/py-src/data_formulator/auth/identity.py
@@ -25,11 +25,11 @@
 from flask import Flask, g, request

-from data_formulator.auth_providers import (
+from data_formulator.auth.providers import (
     get_provider_class,
     list_available_providers,
 )
-from data_formulator.auth_providers.base import (
+from data_formulator.auth.providers.base import (
     AuthenticationError,
     AuthProvider,
     AuthResult,
diff --git a/py-src/data_formulator/auth_providers/__init__.py b/py-src/data_formulator/auth/providers/__init__.py
similarity index 100%
rename from py-src/data_formulator/auth_providers/__init__.py
rename to py-src/data_formulator/auth/providers/__init__.py
diff --git a/py-src/data_formulator/auth_providers/azure_easyauth.py b/py-src/data_formulator/auth/providers/azure_easyauth.py
similarity index 100%
rename from py-src/data_formulator/auth_providers/azure_easyauth.py
rename to py-src/data_formulator/auth/providers/azure_easyauth.py
diff --git a/py-src/data_formulator/auth_providers/base.py b/py-src/data_formulator/auth/providers/base.py
similarity index 100%
rename from py-src/data_formulator/auth_providers/base.py
rename to py-src/data_formulator/auth/providers/base.py
diff --git a/py-src/data_formulator/auth_providers/github_oauth.py b/py-src/data_formulator/auth/providers/github_oauth.py
similarity index 100%
rename from py-src/data_formulator/auth_providers/github_oauth.py
rename to py-src/data_formulator/auth/providers/github_oauth.py
diff --git a/py-src/data_formulator/auth_providers/oidc.py b/py-src/data_formulator/auth/providers/oidc.py
similarity index 100%
rename from py-src/data_formulator/auth_providers/oidc.py
rename to py-src/data_formulator/auth/providers/oidc.py
diff --git a/py-src/data_formulator/credential_vault/README.md b/py-src/data_formulator/auth/vault/README.md
similarity index 100%
rename from py-src/data_formulator/credential_vault/README.md
rename to py-src/data_formulator/auth/vault/README.md
diff --git a/py-src/data_formulator/credential_vault/__init__.py b/py-src/data_formulator/auth/vault/__init__.py
similarity index 100%
rename from py-src/data_formulator/credential_vault/__init__.py
rename to py-src/data_formulator/auth/vault/__init__.py
diff --git a/py-src/data_formulator/credential_vault/base.py b/py-src/data_formulator/auth/vault/base.py
similarity index 100%
rename from py-src/data_formulator/credential_vault/base.py
rename to py-src/data_formulator/auth/vault/base.py
diff --git a/py-src/data_formulator/credential_vault/local_vault.py b/py-src/data_formulator/auth/vault/local_vault.py
similarity index 100%
rename from py-src/data_formulator/credential_vault/local_vault.py
rename to py-src/data_formulator/auth/vault/local_vault.py
diff --git a/py-src/data_formulator/data_connector.py b/py-src/data_formulator/data_connector.py
index 6ab82ae6..5214294c 100644
--- a/py-src/data_formulator/data_connector.py
+++ b/py-src/data_formulator/data_connector.py
@@ -22,6 +22,7 @@
 import dataclasses
 import json as _json
 import logging
+from pathlib import Path
 from typing import Any

 from flask import Blueprint, Flask, jsonify, request
@@ -42,19 +43,37 @@
 # ---------------------------------------------------------------------------

 def _sanitize_error(error: Exception) -> tuple[str, int]:
-    """Return a safe error message + HTTP status code.
+    """Return a user-facing error message + HTTP status code.

-    Never leaks internal details to the client.
+    Preserves actionable detail from known error categories while
+    stripping internal stack traces and implementation details.
""" logger.error("DataConnector error", exc_info=error) - msg = str(error).lower() - if "required" in msg or "invalid" in msg: - return "Invalid connection parameters", 400 - if "permission" in msg or "access" in msg: - return "Access denied", 403 - if "connect" in msg or "refused" in msg: - return "Connection failed", 502 - return "An unexpected error occurred", 500 + raw = str(error) + msg = raw.lower() + + # Auth / credential errors — tell the user what went wrong + if any(kw in msg for kw in ("authenticat", "login", "credential", + "unauthorized", "401", "forbidden", "403")): + return f"Authentication failed: {raw}", 401 + if any(kw in msg for kw in ("expired", "token")): + return f"Token expired or invalid: {raw}", 401 + if any(kw in msg for kw in ("permission", "access denied", "denied")): + return f"Access denied: {raw}", 403 + + # Connection errors — actionable (wrong host, port, firewall) + if any(kw in msg for kw in ("connect", "refused", "unreachable", + "resolve", "dns", "network", "socket")): + return f"Connection failed: {raw}", 502 + if "timeout" in msg or "timed out" in msg: + return f"Connection timed out: {raw}", 504 + + # Validation errors + if "required" in msg or "invalid" in msg or "missing" in msg: + return f"Invalid parameters: {raw}", 400 + + # Unknown — still include the raw message so the user can report it + return f"Unexpected error: {raw}", 500 def _node_to_dict(node: CatalogNode) -> dict[str, Any]: @@ -122,7 +141,7 @@ def from_loader( icon=icon, ) - # -- DataSourcePlugin interface ---------------------------------------- + # -- Manifest / config interface ---------------------------------------- def _manifest(self) -> dict[str, Any]: return { @@ -195,13 +214,13 @@ def _resolve_delegated_login(self) -> dict[str, Any] | None: @staticmethod def _get_identity() -> str: - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id return get_identity_id() @staticmethod def 
_get_vault(): """Return the credential vault (or None if unavailable).""" - from data_formulator.credential_vault import get_credential_vault + from data_formulator.auth.vault import get_credential_vault return get_credential_vault() def _vault_store(self, identity: str, user_params: dict[str, Any]) -> bool: @@ -383,7 +402,7 @@ def list_data_loaders(): @connectors_bp.route("/api/connectors", methods=["GET"]) def list_connectors(): """List all registered connector instances (admin + user) with connection status.""" - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id try: identity = get_identity_id() @@ -440,7 +459,7 @@ def create_connector(): Persists to ``DATA_FORMULATOR_HOME/users//connectors.yaml``. """ from data_formulator.data_loader import DATA_LOADERS - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id data = request.get_json() or {} loader_type = data.get("loader_type") @@ -553,7 +572,7 @@ def delete_connector(connector_id: str): Admin connectors cannot be deleted (returns 403). 
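To illustrate the keyword-based mapping that the patched `_sanitize_error` applies above, here is a hypothetical standalone sketch (names and keyword lists abbreviated; not the real module, which also logs the exception and handles auth/token categories):

```python
def sanitize_error(error: Exception) -> tuple[str, int]:
    """Map an exception to a (user-facing message, HTTP status) pair."""
    raw = str(error)
    msg = raw.lower()
    # Order matters: auth keywords are checked before the generic
    # "invalid"/"required" validation bucket, so "invalid credentials"
    # maps to 401 rather than 400.
    if any(kw in msg for kw in ("authenticat", "unauthorized", "credential")):
        return f"Authentication failed: {raw}", 401
    if any(kw in msg for kw in ("connect", "refused", "unreachable")):
        return f"Connection failed: {raw}", 502
    if "timeout" in msg or "timed out" in msg:
        return f"Connection timed out: {raw}", 504
    if "required" in msg or "invalid" in msg or "missing" in msg:
        return f"Invalid parameters: {raw}", 400
    return f"Unexpected error: {raw}", 500


msg, code = sanitize_error(ConnectionRefusedError("connection refused by host"))
print(code)  # 502
```

Unlike the pre-patch version, the raw message is preserved in the response, which trades a little opacity for actionable feedback (wrong host, expired token, missing field).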
""" - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id if connector_id in _ADMIN_CONNECTOR_IDS: return jsonify({"status": "error", "message": "Admin connectors cannot be deleted"}), 403 @@ -773,7 +792,7 @@ def connector_import_data(): table_name = data.get("table_name") import_options = data.get("import_options", {}) - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id from data_formulator.workspace_factory import get_workspace from data_formulator.datalake.parquet_utils import sanitize_table_name @@ -815,7 +834,7 @@ def connector_refresh_data(): if not table_name: return jsonify({"status": "error", "message": "table_name is required"}), 400 - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id from data_formulator.workspace_factory import get_workspace workspace = get_workspace(get_identity_id()) @@ -898,7 +917,7 @@ def connector_import_group(): source_filters = data.get("source_filters", []) group_name = data.get("group_name", "") - from data_formulator.security.auth import get_identity_id + from data_formulator.auth.identity import get_identity_id from data_formulator.workspace_factory import get_workspace from data_formulator.datalake.parquet_utils import sanitize_table_name @@ -986,7 +1005,7 @@ def _resolve_env_refs(params: dict[str, Any]) -> dict[str, Any]: return resolved -def _get_df_home() -> "Path": +def _get_df_home() -> Path: """Return DATA_FORMULATOR_HOME as a Path.""" from data_formulator.datalake.workspace import get_data_formulator_home return get_data_formulator_home() @@ -1013,7 +1032,6 @@ def _load_connectors_yaml(path: "Path") -> list[dict]: def _save_user_connectors(identity: str, specs: list[SourceSpec]) -> None: """Write user-created connectors to DATA_FORMULATOR_HOME/users//connectors.yaml.""" - from pathlib import Path from 
data_formulator.datalake.workspace import get_user_home user_dir = get_user_home(identity) user_dir.mkdir(parents=True, exist_ok=True) @@ -1044,7 +1062,6 @@ def _save_user_connectors(identity: str, specs: list[SourceSpec]) -> None: def _load_admin_specs() -> list[SourceSpec]: """Load admin connectors from DATA_FORMULATOR_HOME/connectors.yaml + env vars.""" import os - from pathlib import Path specs: list[SourceSpec] = [] diff --git a/py-src/data_formulator/plugins/superset/auth_bridge.py b/py-src/data_formulator/data_loader/superset_auth_bridge.py similarity index 91% rename from py-src/data_formulator/plugins/superset/auth_bridge.py rename to py-src/data_formulator/data_loader/superset_auth_bridge.py index dccde5d6..671ff5ee 100644 --- a/py-src/data_formulator/plugins/superset/auth_bridge.py +++ b/py-src/data_formulator/data_loader/superset_auth_bridge.py @@ -1,10 +1,7 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. -"""Authenticate users via the Superset REST API (JWT). - -Migrated verbatim from data-formulator 0.6 ``superset/auth_bridge.py``. -""" +"""Authenticate users via the Superset REST API (JWT).""" from __future__ import annotations diff --git a/py-src/data_formulator/plugins/superset/superset_client.py b/py-src/data_formulator/data_loader/superset_client.py similarity index 95% rename from py-src/data_formulator/plugins/superset/superset_client.py rename to py-src/data_formulator/data_loader/superset_client.py index 89f0a597..c907ca28 100644 --- a/py-src/data_formulator/plugins/superset/superset_client.py +++ b/py-src/data_formulator/data_loader/superset_client.py @@ -1,10 +1,7 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. -"""Thin wrapper around the Superset public REST API. - -Migrated verbatim from data-formulator 0.6 ``superset/superset_client.py``. 
-""" +"""Thin wrapper around the Superset public REST API.""" from __future__ import annotations diff --git a/py-src/data_formulator/data_loader/superset_data_loader.py b/py-src/data_formulator/data_loader/superset_data_loader.py index 00c044fb..521d6080 100644 --- a/py-src/data_formulator/data_loader/superset_data_loader.py +++ b/py-src/data_formulator/data_loader/superset_data_loader.py @@ -7,8 +7,7 @@ dashboard (table_group) → dataset (table) Authentication is JWT-based (``auth_mode() = "token"``). Data is fetched -via Superset's SQL Lab API, reusing the existing ``SupersetClient`` and -``SupersetAuthBridge`` from the legacy plugin. +via Superset's SQL Lab API. """ import json @@ -22,25 +21,14 @@ CatalogNode, ExternalDataLoader, ) +from data_formulator.data_loader.superset_client import SupersetClient +from data_formulator.data_loader.superset_auth_bridge import SupersetAuthBridge logger = logging.getLogger(__name__) -# Lazy-imported Superset helpers (only if the plugin deps are available) -_SupersetClient = None -_SupersetAuthBridge = None - - -def _ensure_imports(): - global _SupersetClient, _SupersetAuthBridge - if _SupersetClient is None: - from data_formulator.plugins.superset.superset_client import SupersetClient - from data_formulator.plugins.superset.auth_bridge import SupersetAuthBridge - _SupersetClient = SupersetClient - _SupersetAuthBridge = SupersetAuthBridge - # --------------------------------------------------------------------------- -# SQL building helpers (extracted from plugins/superset/routes/data.py) +# SQL building helpers # --------------------------------------------------------------------------- def _quote_identifier(name: str) -> str: @@ -133,7 +121,6 @@ def catalog_hierarchy() -> list[dict[str, str]]: ] def __init__(self, params: dict[str, Any]): - _ensure_imports() self.params = params self.url = (params.get("url") or "").rstrip("/") @@ -145,8 +132,8 @@ def __init__(self, params: dict[str, Any]): if not self.url: raise 
ValueError("Superset URL is required") - self._client = _SupersetClient(self.url) - self._bridge = _SupersetAuthBridge(self.url) + self._client = SupersetClient(self.url) + self._bridge = SupersetAuthBridge(self.url) # Authenticate immediately self._access_token: str | None = params.get("access_token") diff --git a/py-src/data_formulator/plugins/__init__.py b/py-src/data_formulator/plugins/__init__.py deleted file mode 100644 index f473595d..00000000 --- a/py-src/data_formulator/plugins/__init__.py +++ /dev/null @@ -1,108 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""Data source plugin auto-discovery and registration. - -On import, this module does **not** scan. Call :func:`discover_and_register` -once from ``app.py`` after the Flask app is created. - -Discovery walks every **sub-package** under ``plugins/``, looking for a -module-level ``plugin_class`` attribute that is a concrete -:class:`DataSourcePlugin` subclass. Enablement is gated by the -``required_env`` list in the plugin's ``manifest()``: all listed -environment variables must be set for the plugin to activate. - -Adding a new plugin: - -1. Create a sub-directory under ``plugins/`` -2. In its ``__init__.py``, set ``plugin_class = YourPlugin`` -3. Set the required env vars in ``.env`` -4. Restart — no core code changes needed -""" - -from __future__ import annotations - -import importlib -import logging -import os -import pkgutil -from typing import Any - -from data_formulator.plugins.base import DataSourcePlugin - -_log = logging.getLogger(__name__) - -ENABLED_PLUGINS: dict[str, DataSourcePlugin] = {} -DISABLED_PLUGINS: dict[str, str] = {} - - -def discover_and_register(app: Any) -> None: - """Scan ``plugins/`` sub-packages and register enabled plugins. - - Called once in :func:`data_formulator.app._register_blueprints`. 
- """ - for _finder, pkg_name, ispkg in pkgutil.iter_modules(__path__): - if not ispkg: - continue - - try: - mod = importlib.import_module(f"data_formulator.plugins.{pkg_name}") - except ImportError as exc: - DISABLED_PLUGINS[pkg_name] = f"Missing dependency: {exc.name}" - _log.info( - "Plugin '%s' disabled (import error): %s", pkg_name, exc, - ) - continue - - plugin_cls = getattr(mod, "plugin_class", None) - if plugin_cls is None: - continue - if not (isinstance(plugin_cls, type) and issubclass(plugin_cls, DataSourcePlugin)): - _log.warning( - "Plugin '%s': plugin_class is not a DataSourcePlugin subclass, skipped", - pkg_name, - ) - continue - - try: - manifest = plugin_cls.manifest() - except Exception as exc: - DISABLED_PLUGINS[pkg_name] = f"manifest() failed: {exc}" - _log.error("Plugin '%s' manifest() failed: %s", pkg_name, exc) - continue - - plugin_id = manifest.get("id", pkg_name) - required_env = manifest.get("required_env", []) - - missing_env = [e for e in required_env if not os.environ.get(e)] - if missing_env: - DISABLED_PLUGINS[plugin_id] = ( - f"Not configured: {', '.join(missing_env)}" - ) - _log.info( - "Plugin '%s' disabled: missing env %s", - plugin_id, - ", ".join(missing_env), - ) - continue - - try: - plugin: DataSourcePlugin = plugin_cls() - bp = plugin.create_blueprint() - app.register_blueprint(bp) - plugin.on_enable(app) - - ENABLED_PLUGINS[plugin_id] = plugin - _log.info( - "Plugin '%s' enabled (from plugins/%s/)", - plugin_id, - pkg_name, - ) - except Exception as exc: - DISABLED_PLUGINS[plugin_id] = str(exc) - _log.error( - "Plugin '%s' failed to initialise: %s", - plugin_id, - exc, - exc_info=True, - ) diff --git a/py-src/data_formulator/plugins/auth_base.py b/py-src/data_formulator/plugins/auth_base.py deleted file mode 100644 index f0670f14..00000000 --- a/py-src/data_formulator/plugins/auth_base.py +++ /dev/null @@ -1,262 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. 
- -"""Base class for plugin authentication with built-in Credential Vault lifecycle. - -Plugin authors subclass :class:`PluginAuthHandler` and implement four methods: - -* ``do_login(username, password)`` — authenticate with the external system -* ``do_clear_session()`` — clear Flask session keys for this plugin -* ``get_session_auth()`` — check if the current session is authenticated -* ``get_current_user()`` — return the current user dict from session - -The base class enforces: - -* **Vault store/delete** — encrypted credential storage, keyed by identity -* **Three-mode auth negotiation** — Session → Vault auto-login → Manual -* **Logout cleanup** — always clears both Session *and* Vault -* **Standard route generation** — ``/login``, ``/logout``, ``/status``, ``/me`` - -Usage:: - - class MyAuthHandler(PluginAuthHandler): - def do_login(self, username, password): - ... - def do_clear_session(self): - ... - def get_session_auth(self): - ... - def get_current_user(self): - ... - - _handler = MyAuthHandler("my_plugin") - auth_bp = _handler.create_auth_blueprint("/api/plugins/my_plugin/auth") - - # Add plugin-specific routes to auth_bp if needed - @auth_bp.route("/custom", methods=["POST"]) - def custom_route(): ... -""" -from __future__ import annotations - -import logging -from abc import ABC, abstractmethod -from typing import Any, Optional - -from flask import Blueprint, jsonify, request -from requests.exceptions import ConnectionError as RequestsConnectionError, HTTPError - -logger = logging.getLogger(__name__) - - -class PluginAuthHandler(ABC): - """Base class for plugin auth with built-in Vault lifecycle. - - Standard routes (login, logout, status, me) are generated by - :meth:`create_auth_blueprint`. Vault store / delete / auto-login - is handled automatically — plugin authors cannot forget it. 
- """ - - def __init__(self, plugin_id: str): - self.plugin_id = plugin_id - - # ------------------------------------------------------------------ - # Abstract: plugin author implements these - # ------------------------------------------------------------------ - - @abstractmethod - def do_login(self, username: str, password: str) -> dict[str, Any]: - """Authenticate with the external system and save to Flask session. - - Must call the external system's login API, persist the resulting - token / user in the Flask session, and return user info. - - Returns:: - - {"user": {"id": ..., "username": ..., "first_name": ..., "last_name": ...}} - - Raises: - Exception on authentication failure. - """ - ... - - @abstractmethod - def do_clear_session(self) -> None: - """Clear all plugin-specific keys from the Flask session.""" - ... - - @abstractmethod - def get_session_auth(self) -> Optional[dict[str, Any]]: - """Check the Flask session for existing authentication. - - Returns a status dict if authenticated:: - - {"authenticated": True, "mode": "session", "user": {...}} - - Returns ``None`` if no valid session exists. - """ - ... - - @abstractmethod - def get_current_user(self) -> Optional[dict[str, Any]]: - """Return the current user dict from the session, or ``None``.""" - ... 
-
-    # ------------------------------------------------------------------
-    # Vault helpers — built-in, plugin authors should not override
-    # ------------------------------------------------------------------
-
-    def _get_vault(self):
-        from data_formulator.credential_vault import get_credential_vault
-        return get_credential_vault()
-
-    def _get_identity(self) -> str:
-        from data_formulator.security.auth import get_identity_id
-        return get_identity_id()
-
-    def vault_store(self, credentials: dict) -> None:
-        """Store credentials in the Vault (best-effort, never raises)."""
-        vault = self._get_vault()
-        if not vault:
-            return
-        try:
-            vault.store(self._get_identity(), self.plugin_id, credentials)
-        except Exception:
-            logger.warning("Failed to store credentials in vault for %s", self.plugin_id, exc_info=True)
-
-    def vault_delete(self) -> None:
-        """Delete credentials from the Vault (best-effort, never raises)."""
-        vault = self._get_vault()
-        if not vault:
-            return
-        try:
-            vault.delete(self._get_identity(), self.plugin_id)
-        except Exception:
-            logger.warning("Failed to delete credentials from vault for %s", self.plugin_id, exc_info=True)
-
-    def vault_retrieve(self) -> Optional[dict]:
-        """Retrieve credentials from the Vault, or ``None``."""
-        vault = self._get_vault()
-        if not vault:
-            return None
-        try:
-            return vault.retrieve(self._get_identity(), self.plugin_id)
-        except Exception:
-            logger.warning("Failed to retrieve credentials from vault for %s", self.plugin_id, exc_info=True)
-            return None
-
-    def try_vault_login(self) -> Optional[dict[str, Any]]:
-        """Attempt auto-login with Vault credentials.
-
-        Returns ``None`` if no Vault or no stored credentials.
-        Otherwise returns a status dict (success or vault_stale).
- """ - stored = self.vault_retrieve() - if not stored: - return None - - username = stored.get("username", "") - password = stored.get("password", "") - if not username or not password: - return None - - try: - result = self.do_login(username, password) - return { - "status": "ok", - "authenticated": True, - "mode": "vault", - "user": result["user"], - } - except Exception as exc: - logger.info("Vault credentials stale for %s: %s", self.plugin_id, exc) - return { - "status": "ok", - "authenticated": False, - "vault_stale": True, - } - - # ------------------------------------------------------------------ - # Blueprint factory - # ------------------------------------------------------------------ - - def create_auth_blueprint(self, url_prefix: str) -> Blueprint: - """Create a Flask Blueprint with standard auth routes. - - Generated routes: - - - ``POST /login`` — authenticate + Vault store/delete - - ``POST /logout`` — clear session + Vault delete (enforced) - - ``GET /status`` — three-mode negotiation - - ``GET /me`` — current user - - Returns the Blueprint so plugin-specific routes can be added:: - - auth_bp = handler.create_auth_blueprint(prefix) - @auth_bp.route("/my-custom-route", methods=["POST"]) - def custom(): ... 
- """ - bp_name = f"plugin_{self.plugin_id}_auth" - bp = Blueprint(bp_name, self.__class__.__module__, url_prefix=url_prefix) - handler = self - - @bp.route("/login", methods=["POST"]) - def login(): - data = request.get_json(force=True) - username = data.get("username", "") - password = data.get("password", "") - remember = data.get("remember", False) - - if not username or not password: - return jsonify({"status": "error", "message": "Missing credentials"}), 400 - - try: - result = handler.do_login(username, password) - except Exception as exc: - logger.warning("Login failed for %s: %s", username, exc) - if isinstance(exc, HTTPError) and exc.response is not None: - code = exc.response.status_code - if code in (401, 403): - msg = "Invalid username or password" - elif code == 429: - msg = "Too many login attempts, please try again later" - else: - msg = "Authentication service unavailable" - elif isinstance(exc, (RequestsConnectionError, OSError)): - msg = "Unable to connect to authentication service" - else: - msg = "Login failed" - return jsonify({"status": "error", "message": msg}), 401 - - if remember: - handler.vault_store({"username": username, "password": password}) - else: - handler.vault_delete() - - return jsonify({"status": "ok", **result}) - - @bp.route("/logout", methods=["POST"]) - def logout(): - handler.do_clear_session() - handler.vault_delete() - return jsonify({"status": "ok"}) - - @bp.route("/status", methods=["GET"]) - def auth_status(): - session_result = handler.get_session_auth() - if session_result is not None: - return jsonify({"status": "ok", **session_result}) - - vault_result = handler.try_vault_login() - if vault_result is not None: - return jsonify(vault_result) - - return jsonify({"status": "ok", "authenticated": False}) - - @bp.route("/me", methods=["GET"]) - def me(): - user = handler.get_current_user() - if not user: - return jsonify({"status": "error", "message": "Not authenticated"}), 401 - return jsonify({"status": "ok", 
"user": user}) - - return bp diff --git a/py-src/data_formulator/plugins/base.py b/py-src/data_formulator/plugins/base.py deleted file mode 100644 index 354aa640..00000000 --- a/py-src/data_formulator/plugins/base.py +++ /dev/null @@ -1,99 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""Base classes for the data source plugin system. - -A ``DataSourcePlugin`` is a self-contained integration with one external -BI platform (Superset, Metabase, Grafana, …). Each plugin ships: - -* **Backend** — Flask Blueprint with auth / catalog / data routes -* **Frontend** — React panel rendered inside the data upload dialog -* **Manifest** — declarative metadata that the framework uses for - auto-discovery, enablement gating, and frontend configuration - -Plugins are auto-discovered by :func:`data_formulator.plugins.discover_and_register`. -""" - -from __future__ import annotations - -from abc import ABC, abstractmethod -from typing import Any, Optional - -from flask import Blueprint, Flask - - -class DataSourcePlugin(ABC): - """Abstract base for data source plugins. - - Lifecycle (managed by ``discover_and_register``): - - 1. ``manifest()`` — framework reads ``required_env`` to decide enablement - 2. ``__init__()`` — instantiate the plugin - 3. ``create_blueprint()`` — obtain the Flask Blueprint - 4. ``on_enable(app)`` — called once after Blueprint registration - 5. Per-request: routes in the Blueprint handle auth / catalog / data - 6. ``on_disable()`` — called on teardown (future) - - Subclass contract: - - * ``manifest()`` must be a **staticmethod** returning a dict. - * ``create_blueprint()`` must return a Blueprint whose ``url_prefix`` - is ``/api/plugins//``. - * ``get_frontend_config()`` must **never** include secrets. - """ - - @staticmethod - @abstractmethod - def manifest() -> dict[str, Any]: - """Declarative plugin metadata. - - Required keys:: - - id — unique slug (e.g. 
``"superset"``) - name — human-readable display name - env_prefix — e.g. ``"PLG_SUPERSET"`` - required_env — list of env vars that must be set to enable - - Optional keys:: - - icon, description, version, optional_env, - auth_modes — e.g. ``["sso", "jwt", "password"]`` - capabilities — e.g. ``["datasets", "dashboards", "filters"]`` - """ - ... - - @abstractmethod - def create_blueprint(self) -> Blueprint: - """Return a Flask Blueprint for this plugin's HTTP routes. - - The blueprint's ``url_prefix`` **must** be - ``/api/plugins//``. - """ - ... - - @abstractmethod - def get_frontend_config(self) -> dict[str, Any]: - """Non-sensitive configuration sent to the frontend. - - Merged with ``manifest()`` and included in ``/api/app-config`` - under ``PLUGINS.``. Must **not** contain secrets. - """ - ... - - # -- lifecycle hooks --------------------------------------------------- - - def on_enable(self, app: Flask) -> None: - """Called once after the Blueprint is registered.""" - - def on_disable(self) -> None: - """Called on application teardown (reserved for future use).""" - - # -- optional capabilities --------------------------------------------- - - def get_auth_status(self, session: dict) -> Optional[dict[str, Any]]: - """Return current user's auth status for this plugin, or ``None``.""" - return None - - def supports_sso_passthrough(self) -> bool: - """Whether this plugin can use the DF user's SSO token directly.""" - return False diff --git a/py-src/data_formulator/plugins/data_writer.py b/py-src/data_formulator/plugins/data_writer.py deleted file mode 100644 index ce99f073..00000000 --- a/py-src/data_formulator/plugins/data_writer.py +++ /dev/null @@ -1,119 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""High-level helper for plugins to write data into a user's Workspace. 
-
-Instead of calling ``workspace.write_parquet()`` directly, plugins use
-:class:`PluginDataWriter` which handles:
-
-* Identity-scoped workspace resolution (via ``get_identity_id()``)
-* Table name sanitisation
-* Automatic ``source_info`` stamping (``loader_type = "plugin:<plugin_id>"``)
-* ``overwrite=False`` collision avoidance (auto-suffix ``_1``, ``_2``, …)
-"""
-
-from __future__ import annotations
-
-import logging
-from typing import Any, Optional
-
-import pandas as pd
-
-from data_formulator.datalake.parquet_utils import sanitize_table_name
-from data_formulator.security.auth import get_identity_id
-from data_formulator.workspace_factory import get_workspace
-
-logger = logging.getLogger(__name__)
-
-
-class PluginDataWriter:
-    """Write DataFrames into the current user's active Workspace.
-
-    Parameters
-    ----------
-    plugin_id:
-        Slug that identifies the plugin (e.g. ``"superset"``).
-        Stored as ``loader_type = "plugin:<plugin_id>"``.
-    """
-
-    def __init__(self, plugin_id: str) -> None:
-        self._plugin_id = plugin_id
-
-    # ------------------------------------------------------------------
-    # Public API
-    # ------------------------------------------------------------------
-
-    def write_dataframe(
-        self,
-        df: pd.DataFrame,
-        table_name: str,
-        *,
-        overwrite: bool = True,
-        source_metadata: Optional[dict[str, Any]] = None,
-    ) -> dict[str, Any]:
-        """Write *df* as a Parquet table in the user's workspace.
-
-        Returns a dict suitable for a JSON response::
-
-            {
-                "table_name": "sales_data",
-                "row_count": 1234,
-                "columns": [...],
-                "is_renamed": False,
-            }
-        """
-        identity = get_identity_id()
-        workspace = get_workspace(identity)
-
-        safe_name = sanitize_table_name(table_name)
-
-        if not overwrite:
-            safe_name = self._unique_name(safe_name, workspace)
-
-        source_info = self._build_source_info(
-            safe_name, source_metadata,
-        )
-
-        table_meta = workspace.write_parquet(
-            df, safe_name, source_info=source_info,
-        )
-
-        is_renamed = safe_name != sanitize_table_name(table_name)
-
-        return {
-            "table_name": table_meta.name,
-            "row_count": table_meta.row_count,
-            "columns": [
-                {"name": c.name, "type": c.dtype}
-                for c in (table_meta.columns or [])
-            ],
-            "is_renamed": is_renamed,
-        }
-
-    # ------------------------------------------------------------------
-    # Internal helpers
-    # ------------------------------------------------------------------
-
-    def _build_source_info(
-        self,
-        table_name: str,
-        source_metadata: Optional[dict[str, Any]],
-    ) -> dict[str, Any]:
-        meta: dict[str, Any] = {
-            "loader_type": f"plugin:{self._plugin_id}",
-            "source_table": table_name,
-        }
-        if source_metadata:
-            meta["loader_params"] = source_metadata
-        return meta
-
-    @staticmethod
-    def _unique_name(base: str, workspace: Any) -> str:
-        """Append ``_1``, ``_2``, … until the name doesn't collide."""
-        existing = set(workspace.list_tables())
-        if base not in existing:
-            return base
-        idx = 1
-        while f"{base}_{idx}" in existing:
-            idx += 1
-        return f"{base}_{idx}"
diff --git a/py-src/data_formulator/plugins/superset/__init__.py b/py-src/data_formulator/plugins/superset/__init__.py
deleted file mode 100644
index a40ee461..00000000
--- a/py-src/data_formulator/plugins/superset/__init__.py
+++ /dev/null
@@ -1,126 +0,0 @@
-# Copyright (c) Microsoft Corporation.
-# Licensed under the MIT License.
-
-"""Superset data source plugin for Data Formulator.
-
-.. deprecated::
-    This legacy plugin is superseded by ``SupersetLoader`` registered as a
-    ``DataConnector`` (see ``data_loader/superset_data_loader.py``).
-    It will be removed in Phase 3 of the generalized-plugin migration.
-    New deployments should use the DataConnector route at
-    ``/api/connectors/superset/`` instead of ``/api/plugins/superset/``.
-
-Provides:
-- Password / SSO authentication against a Superset instance
-- Dataset & dashboard catalog browsing with native filter support
-- Dataset loading via Superset SQL Lab → Workspace Parquet
-
-Activation requires ``PLG_SUPERSET_URL`` to be set.
-"""
-
-from __future__ import annotations
-
-import logging
-import os
-import warnings
-from typing import Any
-
-from flask import Blueprint, Flask
-
-from data_formulator.plugins.base import DataSourcePlugin
-from data_formulator.plugins.superset.auth_bridge import SupersetAuthBridge
-from data_formulator.plugins.superset.catalog import SupersetCatalog
-from data_formulator.plugins.superset.superset_client import SupersetClient
-
-
-class SupersetPlugin(DataSourcePlugin):
-    """Concrete ``DataSourcePlugin`` for Apache Superset.
-
-    .. deprecated::
-        Superseded by ``SupersetLoader`` + ``DataConnector``.
-        Routes at ``/api/plugins/superset/`` will be removed in Phase 3.
-        Use ``/api/connectors/superset/`` instead.
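The collision-avoidance helper deleted with `PluginDataWriter` above is small enough to exercise on its own. A sketch with the workspace replaced by a plain set of existing table names (illustrative only — the real helper calls `workspace.list_tables()`):

```python
def unique_name(base: str, existing: set[str]) -> str:
    """Append _1, _2, … until the candidate name is free."""
    if base not in existing:
        return base
    idx = 1
    while f"{base}_{idx}" in existing:
        idx += 1
    return f"{base}_{idx}"


# "sales" and "sales_1" are taken, so the next free suffix is picked
print(unique_name("sales", {"sales", "sales_1"}))  # sales_2
```

Note the subtlety the loop handles: a previous import may already have produced suffixed copies, so the search must skip every taken suffix rather than blindly appending `_1`.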
- """ - - @staticmethod - def manifest() -> dict[str, Any]: - return { - "id": "superset", - "name": "Apache Superset", - "env_prefix": "PLG_SUPERSET", - "required_env": ["PLG_SUPERSET_URL"], - "icon": "superset", - "description": "Connect to an Apache Superset instance to browse and load datasets.", - "auth_modes": ["password", "sso"], - "capabilities": ["datasets", "dashboards", "filters"], - } - - def create_blueprint(self) -> Blueprint: - """Assemble a parent Blueprint that nests auth / catalog / data sub-blueprints.""" - parent = Blueprint("plugin_superset", __name__) - - from data_formulator.plugins.superset.routes.auth import auth_bp - from data_formulator.plugins.superset.routes.catalog import catalog_bp - from data_formulator.plugins.superset.routes.data import data_bp - - parent.register_blueprint(auth_bp) - parent.register_blueprint(catalog_bp) - parent.register_blueprint(data_bp) - - return parent - - def get_frontend_config(self) -> dict[str, Any]: - superset_url = os.environ.get("PLG_SUPERSET_URL", "") - sso_login_url = os.environ.get("PLG_SUPERSET_SSO_LOGIN_URL", "") - if not sso_login_url and superset_url: - sso_login_url = f"{superset_url.rstrip('/')}/df-sso-bridge/" - return { - "base_url": superset_url, - "sso_login_url": sso_login_url, - "guest_enabled": True, - "auth_url": "/api/plugins/superset/auth/login", - "status_url": "/api/plugins/superset/auth/status", - "catalog_url": "/api/plugins/superset/catalog/datasets", - "load_url": "/api/plugins/superset/data/load-dataset", - } - - def on_enable(self, app: Flask) -> None: - """Create shared service objects and store them as Flask extensions.""" - logger = logging.getLogger(__name__) - logger.warning( - "SupersetPlugin (legacy) is deprecated. " - "Use the DataConnector at /api/connectors/superset/ instead. " - "This plugin will be removed in Phase 3." 
- ) - warnings.warn( - "SupersetPlugin is deprecated; use SupersetLoader via DataConnector", - DeprecationWarning, - stacklevel=2, - ) - superset_url = os.environ["PLG_SUPERSET_URL"].rstrip("/") - cache_ttl = int(os.environ.get("PLG_SUPERSET_CACHE_TTL", "300")) - - bridge = SupersetAuthBridge(superset_url) - client = SupersetClient(superset_url) - catalog = SupersetCatalog(client, cache_ttl=cache_ttl) - - app.extensions["plugin_superset_bridge"] = bridge - app.extensions["plugin_superset_client"] = client - app.extensions["plugin_superset_catalog"] = catalog - - def get_auth_status(self, session: dict) -> dict[str, Any] | None: - from data_formulator.plugins.superset.session_helpers import KEY_USER - user = session.get(KEY_USER) - if user: - return { - "authenticated": True, - "user": { - "id": user.get("id"), - "username": user.get("username", ""), - }, - } - return {"authenticated": False} - - -# Plugin class attribute required by the discovery system -plugin_class = SupersetPlugin diff --git a/py-src/data_formulator/plugins/superset/catalog.py b/py-src/data_formulator/plugins/superset/catalog.py deleted file mode 100644 index 6cca5654..00000000 --- a/py-src/data_formulator/plugins/superset/catalog.py +++ /dev/null @@ -1,605 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""Two-tier Superset dataset catalog with TTL caching. - -Tier 1 -- summary: lightweight list for browsing. -Tier 2 -- detail: full column descriptions, types, extra metadata. - -Migrated verbatim from data-formulator 0.6 ``superset/catalog.py``. 
-""" - -from __future__ import annotations - -import json -import logging -import time -from typing import Any - -logger = logging.getLogger(__name__) - - -class SupersetCatalog: - - def __init__(self, superset_client: Any, cache_ttl: int = 300): - self.client = superset_client - self.cache_ttl = cache_ttl - self._cache: dict[str, dict] = {} - - # -- tier 1: summary ------------------------------------------------- - - def _fetch_all_datasets(self, access_token: str | None, page_size: int = 1000) -> list[dict]: - """Fetch all dataset pages from Superset (auto-pagination).""" - all_results: list[dict] = [] - page = 0 - while True: - raw = self.client.list_datasets(access_token, page=page, page_size=page_size) - batch = raw.get("result", []) - all_results.extend(batch) - total = raw.get("count", len(all_results)) - if len(all_results) >= total or len(batch) < page_size: - break - page += 1 - return all_results - - def get_catalog_summary( - self, - access_token: str | None, - user_id: int | None, - ) -> list[dict]: - """Lightweight dataset list (cached per user).""" - cache_key = f"summary_{user_id}" - cached = self._cache.get(cache_key) - if cached and time.time() - cached["ts"] < self.cache_ttl: - return cached["data"] - - all_raw = self._fetch_all_datasets(access_token) - datasets: list[dict] = [] - for ds in all_raw: - columns = ds.get("columns") or [] - if not columns and ds.get("id") is not None: - try: - detail = self.client.get_dataset_detail(access_token, ds["id"]) - columns = detail.get("columns") or [] - except Exception: - logger.debug("Failed to fetch dataset detail for %s", ds.get("id"), exc_info=True) - datasets.append( - { - "id": ds["id"], - "name": ds.get("table_name") or "", - "schema": ds.get("schema") or "", - "database": (ds.get("database") or {}).get("database_name", "") or "", - "description": ds.get("description") or "", - "column_count": len(columns), - "column_names": [c.get("column_name") or "" for c in columns], - "row_count": 
ds.get("row_count"), - } - ) - - self._cache[cache_key] = {"data": datasets, "ts": time.time()} - return datasets - - # -- tier 2: detail -------------------------------------------------- - - def get_dataset_detail( - self, - access_token: str, - dataset_id: int, - ) -> dict: - return self.client.get_dataset_detail(access_token, dataset_id) - - @staticmethod - def _load_json_blob(value: Any) -> dict[str, Any]: - if isinstance(value, dict): - return value - if isinstance(value, str) and value.strip(): - try: - parsed = json.loads(value) - if isinstance(parsed, dict): - return parsed - except Exception: - logger.debug("Failed to parse Superset json_metadata", exc_info=True) - return {} - - @staticmethod - def _quote_identifier(name: str) -> str: - escaped = (name or "").replace('"', '""') - return f'"{escaped}"' - - @staticmethod - def _sql_literal(value: Any) -> str: - if value is None: - return "NULL" - if isinstance(value, bool): - return "TRUE" if value else "FALSE" - if isinstance(value, (int, float)) and not isinstance(value, bool): - return str(value) - escaped = str(value).replace("'", "''") - return f"'{escaped}'" - - @staticmethod - def _normalize_column_type(column: dict[str, Any] | None) -> str: - if not column: - return "STRING" - raw = ( - column.get("type_generic") - or column.get("type") - or column.get("python_date_format") - or column.get("expressionType") - or "" - ) - normalized = str(raw).upper() - if column.get("is_dttm"): - return "TEMPORAL" - if any(token in normalized for token in ("DATE", "TIME", "TEMPORAL")): - return "TEMPORAL" - if any(token in normalized for token in ("INT", "FLOAT", "DOUBLE", "NUMERIC", "DECIMAL", "BIGINT")): - return "NUMERIC" - if "BOOL" in normalized: - return "BOOLEAN" - return "STRING" - - @staticmethod - def _infer_input_type(filter_type: str, column_type: str) -> str: - normalized_filter = (filter_type or "").lower() - if "time" in normalized_filter or column_type == "TEMPORAL": - return "time" - if "number" in 
normalized_filter or "range" in normalized_filter or column_type == "NUMERIC": - return "numeric" - if "select" in normalized_filter: - return "select" - return "text" - - @staticmethod - def _build_dataset_sql(detail: dict[str, Any]) -> tuple[int, str, str]: - database = detail.get("database") or {} - db_id = database["id"] - table_name = detail.get("table_name") or "" - schema = detail.get("schema", "") or "" - dataset_sql = (detail.get("sql") or "").strip() - dataset_kind = (detail.get("kind") or "").lower() - - if dataset_kind == "virtual" and dataset_sql: - return db_id, schema, f"SELECT * FROM ({dataset_sql.rstrip(';')}) AS _vds" - - prefix = f'"{schema}".' if schema else "" - return db_id, schema, f'SELECT * FROM {prefix}"{table_name}"' - - def _get_dataset_detail_cached( - self, - access_token: str, - dataset_id: int, - cache: dict[int, dict[str, Any]], - ) -> dict[str, Any]: - if dataset_id not in cache: - cache[dataset_id] = self.client.get_dataset_detail(access_token, dataset_id) - return cache[dataset_id] - - # -- dashboards ------------------------------------------------------ - - def _fetch_all_dashboards(self, access_token: str | None, page_size: int = 1000) -> list[dict]: - """Fetch all dashboard pages from Superset (auto-pagination).""" - all_results: list[dict] = [] - page = 0 - while True: - raw = self.client.list_dashboards(access_token, page=page, page_size=page_size) - batch = raw.get("result", []) - all_results.extend(batch) - total = raw.get("count", len(all_results)) - if len(all_results) >= total or len(batch) < page_size: - break - page += 1 - return all_results - - def get_dashboard_summary( - self, - access_token: str | None, - user_id: int | None, - ) -> list[dict]: - """Lightweight dashboard list (cached per user).""" - cache_key = f"dashboards_{user_id}" - cached = self._cache.get(cache_key) - if cached and time.time() - cached["ts"] < self.cache_ttl: - return cached["data"] - - all_raw = self._fetch_all_dashboards(access_token) - 
dashboards: list[dict] = [] - for db in all_raw: - owners = db.get("owners") or [] - dashboards.append( - { - "id": db["id"], - "title": db.get("dashboard_title") or "", - "slug": db.get("slug") or "", - "status": db.get("status") or "published", - "url": db.get("url") or "", - "changed_on_delta_humanized": db.get("changed_on_delta_humanized") or "", - "owners": [ - (o.get("first_name") or "") + " " + (o.get("last_name") or "") - for o in owners - ], - } - ) - - self._cache[cache_key] = {"data": dashboards, "ts": time.time()} - return dashboards - - def get_dashboard_datasets( - self, - access_token: str, - dashboard_id: int, - ) -> list[dict]: - """Return datasets used by a specific dashboard.""" - raw = self.client.get_dashboard_datasets(access_token, dashboard_id) - datasets: list[dict] = [] - for ds in raw.get("result", []): - columns = ds.get("columns") or [] - datasets.append( - { - "id": ds.get("id"), - "name": ds.get("table_name") or ds.get("name") or "", - "schema": ds.get("schema") or "", - "database": ((ds.get("database") or {}).get("database_name", "") - if isinstance(ds.get("database"), dict) - else ds.get("database_name") or "") or "", - "description": ds.get("description") or "", - "column_count": len(columns), - "column_names": [c.get("column_name") or "" for c in columns], - "row_count": ds.get("row_count"), - } - ) - return datasets - - def get_dashboard_filters( - self, - access_token: str, - dashboard_id: int, - dataset_id: int | None = None, - ) -> list[dict]: - """Return native filter definitions for a dashboard.""" - detail = self.client.get_dashboard_detail(access_token, dashboard_id) - metadata = self._load_json_blob(detail.get("json_metadata")) - raw_filters = ( - metadata.get("native_filter_configuration") - or metadata.get("filter_configuration") - or [] - ) - if isinstance(raw_filters, str): - try: - raw_filters = json.loads(raw_filters) - except Exception: - raw_filters = [] - - dataset_cache: dict[int, dict[str, Any]] = {} - 
filter_defs: list[dict] = [] - seen: set[tuple[str, int, str]] = set() - - is_time_filter_type = lambda ft: any( - tok in (ft or "").lower() for tok in ("time", "date", "temporal") - ) - - def _extract_default_value(rf: dict) -> Any: - """Pull the default filter value from Superset's native filter config.""" - dm = rf.get("defaultDataMask") or {} - fs = dm.get("filterState") or {} - return fs.get("value") - - for raw_filter in raw_filters: - if not isinstance(raw_filter, dict): - continue - targets = raw_filter.get("targets") or [] - control_values = raw_filter.get("controlValues") or {} - filter_id = str(raw_filter.get("id") or raw_filter.get("name") or f"filter-{len(filter_defs)}") - filter_name = raw_filter.get("name") or raw_filter.get("description") or "Unnamed filter" - filter_type = str(raw_filter.get("filterType") or raw_filter.get("type") or "") - multi = bool( - control_values.get("multiSelect") - or control_values.get("enableMultiple") - or control_values.get("multi_select") - ) - required = bool(raw_filter.get("required")) - time_filter = is_time_filter_type(filter_type) - - if time_filter and dataset_id is not None: - requested_dataset_id = int(dataset_id) - dataset_detail = self._get_dataset_detail_cached(access_token, requested_dataset_id, dataset_cache) - columns = dataset_detail.get("columns") or [] - column_name = "" - - for target in targets: - if not isinstance(target, dict): - continue - target_dataset_id = target.get("datasetId") or target.get("dataset_id") - if target_dataset_id and int(target_dataset_id) != requested_dataset_id: - continue - column_obj = target.get("column") or {} - column_name = ( - column_obj.get("name") - or target.get("column_name") - or target.get("columnName") - or "" - ) - if column_name: - break - - if not column_name: - main_dttm = (dataset_detail.get("main_dttm_col") or "").strip() - if main_dttm: - column_name = main_dttm - else: - for col in columns: - if col.get("is_dttm"): - column_name = col.get("column_name") 
or col.get("name") or "" - if column_name: - break - - if not column_name: - continue - - dedupe_key = (filter_id, requested_dataset_id, column_name) - if dedupe_key in seen: - continue - seen.add(dedupe_key) - - column_meta = next( - ( - col for col in columns - if (col.get("column_name") or col.get("name") or "") == column_name - ), - None, - ) - column_type = self._normalize_column_type(column_meta) - input_type = self._infer_input_type(filter_type, column_type) - default_val = _extract_default_value(raw_filter) - filter_defs.append( - { - "id": filter_id, - "name": filter_name, - "filter_type": filter_type or input_type, - "input_type": input_type, - "dataset_id": requested_dataset_id, - "dataset_name": dataset_detail.get("table_name") or "", - "column_name": column_name, - "column_type": column_type, - "multi": multi, - "required": required, - "supports_search": False, - "default_value": default_val, - } - ) - continue - - effective_targets = list(targets) - if not effective_targets and time_filter and dataset_id is not None: - effective_targets = [{"datasetId": dataset_id}] - - for target in effective_targets: - if not isinstance(target, dict): - continue - target_dataset_id = target.get("datasetId") or target.get("dataset_id") - if not target_dataset_id: - continue - target_dataset_id = int(target_dataset_id) - if dataset_id is not None and target_dataset_id != dataset_id: - continue - - column_obj = target.get("column") or {} - column_name = ( - column_obj.get("name") - or target.get("column_name") - or target.get("columnName") - or "" - ) - - dataset_detail = self._get_dataset_detail_cached(access_token, target_dataset_id, dataset_cache) - columns = dataset_detail.get("columns") or [] - - if not column_name and time_filter: - main_dttm = (dataset_detail.get("main_dttm_col") or "").strip() - if main_dttm: - column_name = main_dttm - else: - for col in columns: - if col.get("is_dttm"): - column_name = col.get("column_name") or col.get("name") or "" - break - 
- if not column_name: - continue - - dedupe_key = (filter_id, target_dataset_id, column_name) - if dedupe_key in seen: - continue - seen.add(dedupe_key) - - column_meta = next( - ( - col for col in columns - if (col.get("column_name") or col.get("name") or "") == column_name - ), - None, - ) - column_type = self._normalize_column_type(column_meta) - input_type = self._infer_input_type(filter_type, column_type) - default_val = _extract_default_value(raw_filter) - filter_defs.append( - { - "id": filter_id, - "name": filter_name, - "filter_type": filter_type or input_type, - "input_type": input_type, - "dataset_id": target_dataset_id, - "dataset_name": dataset_detail.get("table_name") or "", - "column_name": column_name, - "column_type": column_type, - "multi": multi, - "required": required, - "supports_search": input_type == "select", - "default_value": default_val, - } - ) - - return filter_defs - - def get_filter_options( - self, - access_token: str, - dataset_id: int, - column_name: str, - keyword: str = "", - limit: int = 50, - offset: int = 0, - ) -> dict[str, Any]: - """Fetch distinct values for a dataset column.""" - detail = self.client.get_dataset_detail(access_token, dataset_id) - columns = detail.get("columns") or [] - valid_columns = { - (column.get("column_name") or column.get("name") or ""): column - for column in columns - } - if column_name not in valid_columns: - raise ValueError(f"Unknown column: {column_name}") - - safe_limit = max(1, min(int(limit), 200)) - safe_offset = max(0, int(offset)) - trimmed_keyword = keyword.strip() - - def _normalize_raw_options(payload: Any) -> list[dict[str, Any]]: - result = payload.get("result", payload) if isinstance(payload, dict) else payload - if isinstance(result, dict): - if isinstance(result.get("values"), list): - result = result.get("values") - elif isinstance(result.get("data"), list): - result = result.get("data") - else: - result = [result] - - normalized: list[dict[str, Any]] = [] - if not 
isinstance(result, list): - return normalized - - for item in result: - raw = item - label = None - if isinstance(item, dict): - if "value" in item: - raw = item.get("value") - label = item.get("label") - elif "label" in item: - raw = item.get("label") - label = item.get("label") - elif column_name in item: - raw = item.get(column_name) - elif item: - raw = next(iter(item.values())) - elif isinstance(item, (list, tuple)): - raw = item[0] if item else None - - if raw is None: - continue - if trimmed_keyword and trimmed_keyword.lower() not in str(raw).lower(): - continue - normalized.append({ - "label": "" if label is None else str(label), - "value": raw, - }) - - deduped: list[dict[str, Any]] = [] - seen: set[str] = set() - for item in normalized: - key = repr(item["value"]) - if key in seen: - continue - seen.add(key) - if not item["label"]: - item["label"] = str(item["value"]) - deduped.append(item) - return deduped - - for fetcher in ( - lambda: self.client.get_datasource_column_values(access_token, dataset_id, column_name), - lambda: self.client.get_dataset_distinct_values(access_token, column_name), - ): - try: - raw_options = _normalize_raw_options(fetcher()) - sliced = raw_options[safe_offset : safe_offset + safe_limit + 1] - return { - "dataset_id": dataset_id, - "column_name": column_name, - "options": sliced[:safe_limit], - "has_more": len(sliced) > safe_limit, - } - except Exception: - logger.debug( - "Superset native option endpoint failed for dataset=%s column=%s", - dataset_id, - column_name, - exc_info=True, - ) - - db_id, schema, base_sql = self._build_dataset_sql(detail) - quoted_column = self._quote_identifier(column_name) - - where_clauses = [f"{quoted_column} IS NOT NULL"] - if trimmed_keyword: - where_clauses.append( - f"CAST({quoted_column} AS VARCHAR) ILIKE {self._sql_literal(f'%{trimmed_keyword}%')}" - ) - - sql = ( - f"SELECT DISTINCT {quoted_column} " - f"FROM ({base_sql}) AS _src " - f"WHERE {' AND '.join(where_clauses)} " - f"ORDER BY 1 " 
- f"LIMIT {safe_limit + 1} OFFSET {safe_offset}" - ) - sql_session = self.client.create_sql_session(access_token) - result = self.client.execute_sql_with_session( - sql_session, - db_id, - sql, - schema, - row_limit=safe_limit + 1, - ) - rows = result.get("data", []) or [] - has_more = len(rows) > safe_limit - rows = rows[:safe_limit] - - result_columns = result.get("columns") or [] - col_key = None - if result_columns: - col_key = ( - result_columns[0].get("column_name") - or result_columns[0].get("name") - or result_columns[0].get("label") - ) - if not col_key and rows: - col_key = next(iter(rows[0]), None) if isinstance(rows[0], dict) else None - - options = [] - for row in rows: - if isinstance(row, dict): - raw = row.get(col_key) if col_key else next(iter(row.values()), None) - elif isinstance(row, (list, tuple)): - raw = row[0] if row else None - else: - raw = row - options.append({ - "label": "" if raw is None else str(raw), - "value": raw, - }) - return { - "dataset_id": dataset_id, - "column_name": column_name, - "options": options, - "has_more": has_more, - } - - # -- cache management ------------------------------------------------ - - def invalidate(self, user_id: int | None = None) -> None: - if user_id is not None: - self._cache.pop(f"summary_{user_id}", None) - self._cache.pop(f"dashboards_{user_id}", None) - else: - self._cache.clear() diff --git a/py-src/data_formulator/plugins/superset/routes/auth.py b/py-src/data_formulator/plugins/superset/routes/auth.py deleted file mode 100644 index d5514188..00000000 --- a/py-src/data_formulator/plugins/superset/routes/auth.py +++ /dev/null @@ -1,127 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""Authentication routes for the Superset plugin. - -Uses :class:`PluginAuthHandler` for standard auth lifecycle (login, logout, -status, me) with built-in Credential Vault support. Superset-specific routes -(guest login, SSO save-tokens) are added to the same Blueprint. 
-""" -from __future__ import annotations - -import logging -from typing import Any, Optional - -from flask import current_app, jsonify, request - -from data_formulator.plugins.auth_base import PluginAuthHandler -from data_formulator.plugins.superset.session_helpers import ( - clear_session, - get_user, - require_auth, - save_session, - user_from_jwt_fallback, -) - -logger = logging.getLogger(__name__) - - -def _bridge(): - return current_app.extensions["plugin_superset_bridge"] - - -def _format_user(user: dict, *, is_guest: bool = False) -> dict: - info = { - "id": user.get("id"), - "username": user.get("username", ""), - "first_name": user.get("first_name", ""), - "last_name": user.get("last_name", ""), - } - if is_guest: - info["is_guest"] = True - return info - - -# ------------------------------------------------------------------ -# Superset-specific auth handler -# ------------------------------------------------------------------ - -class SupersetAuthHandler(PluginAuthHandler): - """Superset auth: proxies login to Superset JWT API.""" - - def do_login(self, username: str, password: str) -> dict[str, Any]: - result = _bridge().login(username, password) - access_token = result["access_token"] - refresh_token = result.get("refresh_token") - - try: - user_info = _bridge().get_user_info(access_token) - except Exception as exc: - logger.warning("Superset /api/v1/me unavailable, using JWT fallback: %s", exc) - user_info = user_from_jwt_fallback(access_token, username) - - save_session(access_token, user_info, refresh_token) - return {"user": _format_user(user_info)} - - def do_clear_session(self) -> None: - clear_session() - - def get_session_auth(self) -> Optional[dict[str, Any]]: - token, user = require_auth() - if token and user: - return {"authenticated": True, "mode": "session", "user": _format_user(user)} - if user: - return { - "authenticated": True, - "mode": "session", - "user": _format_user(user, is_guest=user.get("_guest", False)), - } - return None - - 
def get_current_user(self) -> Optional[dict[str, Any]]: - user = get_user() - return user if user else None - - -# ------------------------------------------------------------------ -# Blueprint (standard routes via base class + Superset extras) -# ------------------------------------------------------------------ - -_handler = SupersetAuthHandler("superset") -auth_bp = _handler.create_auth_blueprint("/api/plugins/superset/auth") - - -@auth_bp.route("/guest", methods=["POST"]) -def guest_login(): - """Enter guest mode — browse public data without Superset credentials.""" - guest_user = {"id": None, "username": "guest", "first_name": "Guest", "last_name": "", "_guest": True} - save_session("", guest_user, None) - - return jsonify({ - "status": "ok", - "user": _format_user(guest_user, is_guest=True), - }) - - -@auth_bp.route("/sso/save-tokens", methods=["POST"]) -def sso_save_tokens(): - """Receive Superset JWT tokens obtained via the Popup SSO flow.""" - data = request.get_json(force=True) - access_token = data.get("access_token") - refresh_token = data.get("refresh_token") - user_from_popup = data.get("user", {}) - - if not access_token: - return jsonify({"status": "error", "message": "Missing access_token"}), 400 - - try: - user_info = _bridge().get_user_info(access_token) - except Exception: - user_info = user_from_popup - - if not user_info or not user_info.get("id"): - user_info = user_from_jwt_fallback(access_token, user_from_popup.get("username", "")) - - save_session(access_token, user_info, refresh_token) - - return jsonify({"status": "ok", "user": _format_user(user_info)}) diff --git a/py-src/data_formulator/plugins/superset/routes/catalog.py b/py-src/data_formulator/plugins/superset/routes/catalog.py deleted file mode 100644 index 0dd20794..00000000 --- a/py-src/data_formulator/plugins/superset/routes/catalog.py +++ /dev/null @@ -1,216 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. 
-
-"""Catalog routes for the Superset plugin.
-
-Migrated from 0.6 ``superset/catalog_routes.py`` with:
-- Plugin-namespaced session helpers
-- Routes under ``/api/plugins/superset/catalog/``
-- Extensions keyed as ``plugin_superset_*``
-"""
-
-from __future__ import annotations
-
-import logging
-
-from flask import Blueprint, current_app, jsonify, request
-from requests.exceptions import HTTPError
-
-from data_formulator.plugins.superset.session_helpers import (
-    require_auth,
-    try_refresh,
-)
-from data_formulator.security.sanitize import safe_error_response
-
-logger = logging.getLogger(__name__)
-
-catalog_bp = Blueprint(
-    "plugin_superset_catalog",
-    __name__,
-    url_prefix="/api/plugins/superset/catalog",
-)
-
-
-def _catalog():
-    return current_app.extensions["plugin_superset_catalog"]
-
-
-def _with_retry(fn, *args, **kwargs):
-    """Call *fn* and retry once with a refreshed token on 401."""
-    try:
-        return fn(*args, **kwargs)
-    except HTTPError as e:
-        if e.response is not None and e.response.status_code == 401:
-            new_token = try_refresh()
-            if new_token:
-                new_args = (new_token,) + args[1:]
-                return fn(*new_args, **kwargs)
-            return None
-        raise
-
-
-@catalog_bp.route("/datasets", methods=["GET"])
-def list_datasets():
-    """List datasets visible to the current user."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-    user_id = user["id"]
-
-    catalog = _catalog()
-    try:
-        if token:
-            datasets = _with_retry(catalog.get_catalog_summary, token, user_id)
-        else:
-            datasets = catalog.get_catalog_summary(None, user_id)
-        if datasets is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to list datasets")
-
-    return jsonify({"status": "ok", "datasets": datasets, "count": len(datasets)})
-
-
-@catalog_bp.route("/dashboards", methods=["GET"])
-def list_dashboards():
-    """List dashboards visible to the current user."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-    user_id = user["id"]
-
-    catalog = _catalog()
-    try:
-        if token:
-            dashboards = _with_retry(catalog.get_dashboard_summary, token, user_id)
-        else:
-            dashboards = catalog.get_dashboard_summary(None, user_id)
-        if dashboards is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to list dashboards")
-
-    return jsonify({"status": "ok", "dashboards": dashboards, "count": len(dashboards)})
-
-
-@catalog_bp.route("/dashboards/<int:dashboard_id>/datasets", methods=["GET"])
-def get_dashboard_datasets(dashboard_id: int):
-    """Get datasets used by a specific dashboard."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-
-    catalog = _catalog()
-    try:
-        if token:
-            datasets = _with_retry(catalog.get_dashboard_datasets, token, dashboard_id)
-        else:
-            datasets = catalog.get_dashboard_datasets(None, dashboard_id)
-        if datasets is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to get dashboard datasets")
-
-    return jsonify({"status": "ok", "datasets": datasets, "count": len(datasets)})
-
-
-@catalog_bp.route("/dashboards/<int:dashboard_id>/filters", methods=["GET"])
-def get_dashboard_filters(dashboard_id: int):
-    """Get native filters defined for a dashboard."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-
-    dataset_id_raw = request.args.get("dataset_id")
-    dataset_id = int(dataset_id_raw) if dataset_id_raw else None
-
-    catalog = _catalog()
-    try:
-        if token:
-            filters = _with_retry(catalog.get_dashboard_filters, token, dashboard_id, dataset_id)
-        else:
-            filters = catalog.get_dashboard_filters(None, dashboard_id, dataset_id)
-        if filters is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to get dashboard filters")
-
-    return jsonify({
-        "status": "ok",
-        "dashboard_id": dashboard_id,
-        "dataset_id": dataset_id,
-        "filters": filters,
-        "count": len(filters),
-    })
-
-
-@catalog_bp.route("/filters/options", methods=["GET"])
-def get_filter_options():
-    """Get option values for a dashboard filter field."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-
-    dataset_id_raw = request.args.get("dataset_id")
-    column_name = (request.args.get("column_name") or "").strip()
-    keyword = (request.args.get("keyword") or "").strip()
-    limit = int(request.args.get("limit", 50))
-    offset = int(request.args.get("offset", 0))
-
-    if not dataset_id_raw or not column_name:
-        return jsonify({"status": "error", "message": "dataset_id and column_name are required"}), 400
-
-    dataset_id = int(dataset_id_raw)
-    catalog = _catalog()
-    try:
-        if token:
-            payload = _with_retry(
-                catalog.get_filter_options, token, dataset_id, column_name, keyword, limit, offset,
-            )
-        else:
-            payload = catalog.get_filter_options(None, dataset_id, column_name, keyword, limit, offset)
-        if payload is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except ValueError as e:
-        return safe_error_response(
-            e, 400,
-            client_message="Invalid filter options request",
-            log_message="Invalid filter options request",
-        )
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to get filter options")
-
-    return jsonify({"status": "ok", **payload})
-
-
-@catalog_bp.route("/datasets/<int:dataset_id>", methods=["GET"])
-def get_dataset_detail(dataset_id: int):
-    """Full detail for a single dataset."""
-    token, user = require_auth()
-    if not user:
-        return jsonify({"status": "error", "message": "Authentication required"}), 401
-
-    catalog = _catalog()
-    try:
-        if token:
-            detail = _with_retry(catalog.get_dataset_detail, token, dataset_id)
-        else:
-            detail = catalog.get_dataset_detail(None, dataset_id)
-        if detail is None:
-            return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401
-    except HTTPError as e:
-        return safe_error_response(e, 502, log_message="Superset API call failed")
-    except Exception as e:
-        return safe_error_response(e, 500, log_message="Failed to get dataset detail")
-
-    return jsonify({"status": "ok", "dataset": detail})
diff --git a/py-src/data_formulator/plugins/superset/routes/data.py b/py-src/data_formulator/plugins/superset/routes/data.py
deleted file mode 100644
index 6a288aec..00000000
--- a/py-src/data_formulator/plugins/superset/routes/data.py
+++ /dev/null
@@ -1,389 +0,0 @@
-# Copyright (c) Microsoft Corporation.
-# Licensed under the MIT License.
-
-"""Data routes for the Superset plugin.
- -Migrated from 0.6 ``superset/data_routes.py`` with: -- **PluginDataWriter** replaces DuckDB ``db_manager`` -- Plugin-namespaced session helpers -- Routes under ``/api/plugins/superset/data/`` -""" - -from __future__ import annotations - -import json -import logging -import re -from typing import Any - -import pandas as pd -from flask import Blueprint, Response, current_app, jsonify, request, stream_with_context -from requests.exceptions import HTTPError - -from data_formulator.plugins.data_writer import PluginDataWriter -from data_formulator.plugins.superset.session_helpers import ( - require_auth, - try_refresh, -) -from data_formulator.security.sanitize import safe_error_response - -logger = logging.getLogger(__name__) - -data_bp = Blueprint( - "plugin_superset_data", - __name__, - url_prefix="/api/plugins/superset/data", -) - - -# ------------------------------------------------------------------ -# SQL building helpers (lifted from 0.6) -# ------------------------------------------------------------------ - -def _sanitize_table_name(raw: str) -> str: - """Normalize a raw table name to a safe identifier.""" - name = (raw or "").lower().replace("-", "_").replace(" ", "_") - name = re.sub(r"[^\w]", "_", name, flags=re.UNICODE) - name = re.sub(r"_+", "_", name).strip("_") - if not name or not name[0].isalpha(): - name = f"table_{name}" - return name - - -def _quote_identifier(name: str) -> str: - escaped = (name or "").replace('"', '""') - return f'"{escaped}"' - - -def _column_ref(name: str) -> str: - stripped = (name or "").strip() - if re.fullmatch(r"\w+", stripped, flags=re.UNICODE): - return stripped - return _quote_identifier(stripped) - - -def _sql_literal(value: Any) -> str: - if value is None: - return "NULL" - if isinstance(value, bool): - return "TRUE" if value else "FALSE" - if isinstance(value, (int, float)) and not isinstance(value, bool): - return str(value) - escaped = str(value).replace("'", "''") - return f"'{escaped}'" - - -def 
_build_dataset_sql(detail: dict) -> tuple[int, str, str]: - db_id = detail["database"]["id"] - table_name = detail["table_name"] - schema = detail.get("schema", "") or "" - dataset_sql = (detail.get("sql") or "").strip() - dataset_kind = (detail.get("kind") or "").lower() - - if dataset_kind == "virtual" and dataset_sql: - return db_id, schema, f"SELECT * FROM ({dataset_sql.rstrip(';')}) AS _vds" - - prefix = f'"{schema}".' if schema else "" - return db_id, schema, f'SELECT * FROM {prefix}"{table_name}"' - - -def _build_column_map(detail: dict) -> dict[str, str]: - """Build a mapping: any known column identifier → actual SQL column reference. - - Superset columns can be referenced by column_name, name, verbose_name, - or expression. The SQL-safe reference is always the ``column_name`` - (the physical name in the database). - - Lookups are case-insensitive: lowercase variants of every key are also - stored so that ``column_map.get(name)`` works regardless of casing - differences between Superset metadata fields (e.g. ``main_dttm_col`` - may be upper-case while the physical column is lower-case). 
- """ - col_map: dict[str, str] = {} - - for col in (detail.get("columns") or []): - physical = (col.get("column_name") or col.get("name") or "").strip() - if not physical: - continue - col_map[physical] = physical - for alias_key in ("name", "verbose_name"): - alias = (col.get(alias_key) or "").strip() - if alias and alias not in col_map: - col_map[alias] = physical - expr = (col.get("expression") or "").strip() - if expr and expr not in col_map: - col_map[expr] = physical - - main_dttm = (detail.get("main_dttm_col") or "").strip() - if main_dttm and main_dttm not in col_map: - # First try: case-insensitive match against column_name - matched = False - for col in (detail.get("columns") or []): - cn = (col.get("column_name") or col.get("name") or "").strip() - if cn.lower() == main_dttm.lower(): - col_map[main_dttm] = cn - matched = True - break - if not matched: - # main_dttm_col refers to a source-table column that doesn't - # appear in the virtual dataset output (e.g. "TS" is used inside - # the SQL as bl.TS but the output is aliased to "出库日期"). - # Fall back to the first column marked is_dttm=True. 
- for col in (detail.get("columns") or []): - if col.get("is_dttm"): - cn = (col.get("column_name") or col.get("name") or "").strip() - if cn: - col_map[main_dttm] = cn - matched = True - break - if not matched: - # Last resort: look for a column whose type looks like a date - for col in (detail.get("columns") or []): - col_type = (col.get("type") or "").upper() - if any(kw in col_type for kw in ("DATE", "TIME", "TIMESTAMP", "DATETIME")): - cn = (col.get("column_name") or col.get("name") or "").strip() - if cn: - col_map[main_dttm] = cn - matched = True - break - if not matched: - logger.warning( - "main_dttm_col=%r does not match any dataset column; " - "filters using this name will be rejected.", - main_dttm, - ) - - for metric in (detail.get("metrics") or []): - for key in ("metric_name", "verbose_name"): - val = (metric.get(key) or "").strip() - expr = (metric.get("expression") or "").strip() - if val and val not in col_map: - col_map[val] = expr if expr else val - - lower_extras: dict[str, str] = {} - for k, v in col_map.items(): - lk = k.lower() - if lk not in col_map and lk not in lower_extras: - lower_extras[lk] = v - col_map.update(lower_extras) - - return col_map - - -def _resolve_column(column: str, column_map: dict[str, str]) -> str | None: - """Look up a filter column in the map, with case-insensitive fallback.""" - return column_map.get(column) or column_map.get(column.lower()) - - -def _build_where_clauses(filters: list[dict], column_map: dict[str, str]) -> list[str]: - clauses: list[str] = [] - allowed_ops = { - "IN", "NOT_IN", "EQ", "NEQ", "GT", "GTE", "LT", "LTE", - "BETWEEN", "LIKE", "ILIKE", "IS_NULL", "IS_NOT_NULL", - } - compare_op_map = { - "EQ": "=", "NEQ": "!=", "GT": ">", "GTE": ">=", - "LT": "<", "LTE": "<=", "LIKE": "LIKE", "ILIKE": "ILIKE", - } - - for raw_filter in filters: - if not isinstance(raw_filter, dict): - raise ValueError("Invalid filter payload") - column = (raw_filter.get("column") or raw_filter.get("column_name") or 
"").strip() - operator = str(raw_filter.get("operator") or "").upper() - value = raw_filter.get("value") - if not column or operator not in allowed_ops: - raise ValueError(f"Invalid filter definition: {raw_filter}") - physical = _resolve_column(column, column_map) - if not physical: - raise ValueError(f"Unknown filter column: {column}") - - quoted_column = _column_ref(physical) - if operator == "IS_NULL": - clauses.append(f"{quoted_column} IS NULL") - continue - if operator == "IS_NOT_NULL": - clauses.append(f"{quoted_column} IS NOT NULL") - continue - if operator in {"IN", "NOT_IN"}: - values = value if isinstance(value, list) else [value] - values = [v for v in values if v not in (None, "")] - if not values: - continue - joined = ", ".join(_sql_literal(v) for v in values) - keyword = "NOT IN" if operator == "NOT_IN" else "IN" - clauses.append(f"{quoted_column} {keyword} ({joined})") - continue - if operator == "BETWEEN": - if not isinstance(value, list) or len(value) != 2: - raise ValueError(f"BETWEEN requires two values for column {column}") - if value[0] in (None, "") or value[1] in (None, ""): - continue - clauses.append( - f"{quoted_column} BETWEEN {_sql_literal(value[0])} AND {_sql_literal(value[1])}" - ) - continue - - if value in (None, ""): - continue - if operator in {"LIKE", "ILIKE"}: - text_value = str(value) - if "%" not in text_value and "_" not in text_value: - text_value = f"%{text_value}%" - clauses.append(f"{quoted_column} {compare_op_map[operator]} {_sql_literal(text_value)}") - continue - clauses.append(f"{quoted_column} {compare_op_map[operator]} {_sql_literal(value)}") - - return clauses - - -# ------------------------------------------------------------------ -# Routes -# ------------------------------------------------------------------ - -@data_bp.route("/load-dataset", methods=["POST"]) -def load_dataset(): - """Fetch data from Superset (RBAC + RLS) and write into the user's Workspace. 
- - Supports streaming progress via ``"stream": true`` in the request body. - """ - token, user = require_auth() - if not user: - return jsonify({"status": "error", "message": "Not authenticated"}), 401 - if not token: - return jsonify({"status": "error", "message": "Loading data requires login. Please sign in first."}), 401 - - data = request.get_json(force=True) - dataset_id = data.get("dataset_id") - row_limit = int(data.get("row_limit", 20_000)) - stream_mode = bool(data.get("stream", False)) - table_name_override = (data.get("table_name") or "").strip() - filters = data.get("filters") or [] - - if not dataset_id: - return jsonify({"status": "error", "message": "dataset_id required"}), 400 - - superset_client = current_app.extensions["plugin_superset_client"] - - try: - detail = superset_client.get_dataset_detail(token, dataset_id) - except HTTPError as exc: - if exc.response is not None and exc.response.status_code == 401: - token = try_refresh() - if token: - try: - detail = superset_client.get_dataset_detail(token, dataset_id) - except Exception as retry_err: - logger.warning("Auth retry failed for dataset %s: %s", dataset_id, retry_err) - return jsonify({"status": "error", "message": "Authentication failed"}), 401 - else: - return jsonify({"status": "error", "message": "Authentication expired, please log in again"}), 401 - else: - return safe_error_response(exc, 502, log_message="Failed to fetch dataset detail") - except Exception as exc: - return safe_error_response(exc, 500, log_message="Failed to fetch dataset detail") - - db_id, schema, base_sql = _build_dataset_sql(detail) - table_name = detail["table_name"] - column_map = _build_column_map(detail) - logger.info( - "Dataset %s columns raw: %s", - dataset_id, - [(c.get("column_name"), c.get("name"), c.get("verbose_name"), - c.get("expression"), c.get("is_dttm"), c.get("type")) - for c in (detail.get("columns") or [])], - ) - logger.info("Dataset %s main_dttm_col=%s", dataset_id, 
detail.get("main_dttm_col")) - logger.info("Dataset %s column_map=%s", dataset_id, column_map) - try: - where_clauses = _build_where_clauses(filters, column_map) - except ValueError as exc: - return safe_error_response( - exc, 400, - client_message="Invalid filter definition", - log_message="Invalid filter definition", - ) - - final_table_name = _sanitize_table_name(table_name_override or table_name) - writer = PluginDataWriter("superset") - - def _generate(): - try: - sql_session = superset_client.create_sql_session(token) - where_sql = f" WHERE {' AND '.join(where_clauses)}" if where_clauses else "" - full_sql = f"SELECT * FROM ({base_sql}) AS _src{where_sql} LIMIT {row_limit}" - logger.info( - "Superset load dataset_id=%s filters=%s sql=%s", - dataset_id, filters, full_sql, - ) - result = superset_client.execute_sql_with_session( - sql_session, db_id, full_sql, schema, row_limit, - ) - all_rows = result.get("data", []) or [] - columns = [c.get("column_name", c.get("name", "")) for c in result.get("columns", [])] - - if stream_mode: - yield json.dumps({ - "type": "progress", - "loaded_batches": 1, - "total_loaded_rows": len(all_rows), - }, ensure_ascii=False) + "\n" - - write_result: dict[str, Any] = {} - if all_rows: - df = pd.DataFrame(all_rows) - write_result = writer.write_dataframe( - df, - final_table_name, - source_metadata={ - "plugin": "superset", - "dataset_id": dataset_id, - "filters": filters, - "row_limit": row_limit, - }, - ) - else: - write_result = { - "table_name": final_table_name, - "row_count": 0, - "columns": [], - "is_renamed": False, - } - - done_payload = { - "status": "ok", - "table_name": write_result.get("table_name", final_table_name), - "row_count": write_result.get("row_count", 0), - "columns": columns, - } - - if stream_mode: - yield json.dumps({"type": "done", **done_payload}, ensure_ascii=False) + "\n" - else: - yield json.dumps(done_payload, ensure_ascii=False) - - except Exception as exc: - logger.error("Failed to load dataset 
%s", dataset_id, exc_info=exc) - if isinstance(exc, ValueError): - msg = "Invalid request parameters for dataset loading" - elif isinstance(exc, TypeError): - msg = "Data type error while processing dataset" - else: - msg = "An unexpected error occurred while loading the dataset" - err = {"status": "error", "message": msg} - if stream_mode: - yield json.dumps({"type": "error", **err}, ensure_ascii=False) + "\n" - else: - yield json.dumps(err, ensure_ascii=False) - - if stream_mode: - return Response( - stream_with_context(_generate()), - content_type="text/x-ndjson; charset=utf-8", - headers={"Cache-Control": "no-cache", "X-Accel-Buffering": "no"}, - ) - - payload_text = "".join(_generate()) - parsed = json.loads(payload_text) - status_code = 500 if parsed.get("status") == "error" else 200 - return Response(payload_text, status=status_code, content_type="application/json") diff --git a/py-src/data_formulator/plugins/superset/session_helpers.py b/py-src/data_formulator/plugins/superset/session_helpers.py deleted file mode 100644 index b68974a7..00000000 --- a/py-src/data_formulator/plugins/superset/session_helpers.py +++ /dev/null @@ -1,138 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -"""Plugin-namespaced Flask session helpers for the Superset plugin. - -All session keys are prefixed with ``plugin_superset_`` to avoid -collisions with other plugins or the core app. 
-""" - -from __future__ import annotations - -import base64 -import json -import logging -import time -from typing import Any, Optional - -from flask import current_app, session - -logger = logging.getLogger(__name__) - -# Session key prefix — isolates Superset plugin state from other plugins -_PREFIX = "plugin_superset_" - -# Session keys -KEY_TOKEN = f"{_PREFIX}token" -KEY_REFRESH_TOKEN = f"{_PREFIX}refresh_token" -KEY_USER = f"{_PREFIX}user" - - -def get_token() -> Optional[str]: - return session.get(KEY_TOKEN) - - -def get_refresh_token() -> Optional[str]: - return session.get(KEY_REFRESH_TOKEN) - - -def get_user() -> Optional[dict]: - return session.get(KEY_USER) - - -def save_session( - access_token: str, - user_info: dict[str, Any], - refresh_token: Optional[str] = None, -) -> None: - """Persist Superset auth state in the Flask session.""" - session[KEY_TOKEN] = access_token - session[KEY_USER] = user_info - if refresh_token is not None: - session[KEY_REFRESH_TOKEN] = refresh_token - session.permanent = True - - -def clear_session() -> None: - """Remove all Superset plugin keys from the session.""" - for key in (KEY_TOKEN, KEY_REFRESH_TOKEN, KEY_USER): - session.pop(key, None) - - -def is_token_expired(token: str, buffer_seconds: int = 60) -> bool: - """Decode the JWT exp claim and check if it's expired (or about to). - Returns True on parse failure (conservative: prefer refresh over stale use).""" - try: - payload = token.split(".")[1] - payload += "=" * (-len(payload) % 4) - claims = json.loads(base64.urlsafe_b64decode(payload)) - return time.time() > claims.get("exp", 0) - buffer_seconds - except Exception: - return True - - -def try_refresh() -> Optional[str]: - """Attempt to refresh the Superset access_token. 
Returns the new token - on success, or None on failure.""" - refresh_tok = get_refresh_token() - if not refresh_tok: - logger.warning("Superset access_token expired with no refresh_token") - return None - try: - bridge = current_app.extensions["plugin_superset_bridge"] - result = bridge.refresh_token(refresh_tok) - new_token = result.get("access_token") - if new_token: - session[KEY_TOKEN] = new_token - logger.info("Superset access_token refreshed automatically") - return new_token - except Exception as e: - logger.warning("Superset token refresh failed: %s", e) - return None - - -def require_auth() -> tuple[Optional[str], Optional[dict]]: - """Return ``(token, user)`` or ``(None, None)`` if no session exists. - - Guest sessions have an empty token but a valid user dict with - ``_guest=True``. In that case ``(None, user)`` is returned so that - catalog routes can call Superset without an Authorization header. - - For normal sessions, the token is automatically refreshed if expired. - """ - token = get_token() - user = get_user() - if not user: - return None, None - - # Guest mode: user present but token is empty - if not token: - return None, user - - if is_token_expired(token): - token = try_refresh() - if not token: - return None, None - - return token, user - - -def user_from_jwt_fallback(access_token: str, username: str) -> dict: - """Build minimal user info from JWT claims when /api/v1/me is unavailable.""" - try: - parts = access_token.split(".") - if len(parts) < 2: - return {"id": None, "username": username, "first_name": "", "last_name": ""} - payload = parts[1] - padding = "=" * (-len(payload) % 4) - decoded = base64.urlsafe_b64decode(payload + padding).decode("utf-8") - claims = json.loads(decoded) - user_id = claims.get("sub") - try: - user_id = int(user_id) if user_id is not None else None - except (TypeError, ValueError): - user_id = None - return {"id": user_id, "username": username, "first_name": "", "last_name": ""} - except Exception: - 
logger.debug("JWT fallback parse failed", exc_info=True) - return {"id": None, "username": username, "first_name": "", "last_name": ""} diff --git a/py-src/data_formulator/routes/__init__.py b/py-src/data_formulator/routes/__init__.py new file mode 100644 index 00000000..59e481eb --- /dev/null +++ b/py-src/data_formulator/routes/__init__.py @@ -0,0 +1,2 @@ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. diff --git a/py-src/data_formulator/agent_routes.py b/py-src/data_formulator/routes/agents.py similarity index 99% rename from py-src/data_formulator/agent_routes.py rename to py-src/data_formulator/routes/agents.py index 9b1342d5..31d19a3e 100644 --- a/py-src/data_formulator/agent_routes.py +++ b/py-src/data_formulator/routes/agents.py @@ -23,7 +23,7 @@ from data_formulator.agents.agent_sort_data import SortDataAgent from data_formulator.agents.agent_simple import SimpleAgents -from data_formulator.security.auth import get_identity_id +from data_formulator.auth.identity import get_identity_id from data_formulator.security.code_signing import sign_result, verify_code, MAX_CODE_SIZE from data_formulator.datalake.workspace import Workspace from data_formulator.workspace_factory import get_workspace diff --git a/py-src/data_formulator/credential_routes.py b/py-src/data_formulator/routes/credentials.py similarity index 92% rename from py-src/data_formulator/credential_routes.py rename to py-src/data_formulator/routes/credentials.py index 498d0ae7..d48d2abe 100644 --- a/py-src/data_formulator/credential_routes.py +++ b/py-src/data_formulator/routes/credentials.py @@ -14,8 +14,8 @@ from flask import Blueprint, jsonify, request -from data_formulator.credential_vault import get_credential_vault -from data_formulator.security.auth import get_identity_id +from data_formulator.auth.vault import get_credential_vault +from data_formulator.auth.identity import get_identity_id logger = logging.getLogger(__name__) diff --git 
a/py-src/data_formulator/demo_stream_routes.py b/py-src/data_formulator/routes/demo_stream.py similarity index 100% rename from py-src/data_formulator/demo_stream_routes.py rename to py-src/data_formulator/routes/demo_stream.py diff --git a/py-src/data_formulator/session_routes.py b/py-src/data_formulator/routes/sessions.py similarity index 99% rename from py-src/data_formulator/session_routes.py rename to py-src/data_formulator/routes/sessions.py index 2d2022d2..15a1e517 100644 --- a/py-src/data_formulator/session_routes.py +++ b/py-src/data_formulator/routes/sessions.py @@ -32,7 +32,7 @@ from flask import Blueprint, request, jsonify, send_file -from data_formulator.security.auth import get_identity_id +from data_formulator.auth.identity import get_identity_id from data_formulator.workspace_factory import ( get_workspace, get_workspace_manager, diff --git a/py-src/data_formulator/tables_routes.py b/py-src/data_formulator/routes/tables.py similarity index 99% rename from py-src/data_formulator/tables_routes.py rename to py-src/data_formulator/routes/tables.py index d24b1208..0be5e4e9 100644 --- a/py-src/data_formulator/tables_routes.py +++ b/py-src/data_formulator/routes/tables.py @@ -13,7 +13,7 @@ from flask import request, jsonify, Blueprint, Response import pandas as pd from pathlib import Path -from data_formulator.security.auth import get_identity_id +from data_formulator.auth.identity import get_identity_id from data_formulator.datalake.workspace import Workspace from data_formulator.workspace_factory import get_workspace as _create_workspace from data_formulator.datalake.parquet_utils import sanitize_table_name as parquet_sanitize_table_name, safe_data_filename diff --git a/py-src/data_formulator/security/__init__.py b/py-src/data_formulator/security/__init__.py index 8b62ed97..59e481eb 100644 --- a/py-src/data_formulator/security/__init__.py +++ b/py-src/data_formulator/security/__init__.py @@ -1,19 +1,2 @@ # Copyright (c) Microsoft Corporation. 
 # Licensed under the MIT License.
-
-"""Security utilities: authentication, code signing, sanitization, URL allowlist."""
-
-from data_formulator.security.auth import get_identity_id
-from data_formulator.security.code_signing import sign_code, sign_result, verify_code, MAX_CODE_SIZE
-from data_formulator.security.sanitize import sanitize_error_message
-from data_formulator.security.url_allowlist import validate_api_base
-
-__all__ = [
-    "get_identity_id",
-    "sign_code",
-    "sign_result",
-    "verify_code",
-    "MAX_CODE_SIZE",
-    "sanitize_error_message",
-    "validate_api_base",
-]
diff --git a/pyproject.toml b/pyproject.toml
index 738cd963..38cd8a89 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -56,7 +56,7 @@ dependencies = [
     "db-dtypes", # bigquery
     # SSO / Auth deps
    "PyJWT[crypto]>=2.8.0", # OIDC JWT verification (includes cryptography)
-    "requests", # GitHub OAuth code exchange, plugin HTTP calls
+    "requests", # GitHub OAuth code exchange, Superset API calls
 ]
 
 [project.urls]
diff --git a/pytest.ini b/pytest.ini
index d8644fb0..209f4201 100644
--- a/pytest.ini
+++ b/pytest.ini
@@ -6,11 +6,9 @@ python_files = test_*.py
 python_classes = Test*
 python_functions = test_*
 markers =
-    backend: backend unit or integration tests
-    frontend: frontend-related contract or integration tests
-    contract: boundary contract tests
-    security: security-focused tests (sandbox confinement, auth, signing)
+    backend: backend tests
+    frontend: frontend tests
+    security: security tests (sandbox, auth, signing)
     auth: authentication provider tests
-    plugin: pluggable provider / data-source plugin tests
     vault: credential vault tests
     xfail_known_bug: regression tests for known bugs
diff --git a/src/app/dfSlice.tsx b/src/app/dfSlice.tsx
index c4a9a1df..a1c69f64 100644
--- a/src/app/dfSlice.tsx
+++ b/src/app/dfSlice.tsx
@@ -55,7 +55,6 @@ export interface ServerConfig {
         label?: string;
         [key: string]: unknown;
     };
-    PLUGINS?: Record<string, PluginConfig>;
     CONNECTORS?: Array<{
         source_id: string;
         source_type: string;
diff --git a/src/app/tableThunks.ts b/src/app/tableThunks.ts
index a080d730..e531ff57 100644
--- a/src/app/tableThunks.ts
+++ b/src/app/tableThunks.ts
@@ -409,46 +409,6 @@ export function buildDictTableFromWorkspace(
     };
 }
 
-/**
- * Load a table that a plugin has already written to the workspace.
- *
- * This is the canonical way for **any** data-source plugin to surface its
- * output in the app. It fetches the workspace table listing, builds a
- * proper {@link DictTable} (with type inference, source config, etc.),
- * and dispatches it through the standard `loadTable` path.
- *
- * @param tableName The workspace table name returned by the plugin backend.
- * @param pluginId Plugin identifier (used in the `source` config).
- */
-export const loadPluginTable = createAsyncThunk<
-    LoadTableResult | null,
-    { tableName: string; pluginId: string },
-    { state: DataFormulatorState }
->(
-    'dataFormulator/loadPluginTable',
-    async ({ tableName, pluginId }, { dispatch }) => {
-        const listResp = await fetchWithIdentity(getUrls().LIST_TABLES, { method: 'GET' });
-        const listData = await listResp.json();
-        if (listData.status !== 'success') return null;
-
-        const wsTable = listData.tables.find((t: any) => t.name === tableName);
-        if (!wsTable) return null;
-
-        const source: DataSourceConfig = {
-            type: 'database',
-            databaseTable: wsTable.name,
-            canRefresh: false,
-            lastRefreshed: Date.now(),
-            originalTableName: wsTable.name,
-        };
-        const tableObj = buildDictTableFromWorkspace(wsTable, source);
-
-        const result = await dispatch(loadTable({ table: tableObj })).unwrap();
-        return result;
-    },
-);
-
-
 /**
  * Check if any ancestor table of a given table is local-only (no virtual field).
  * A table without `virtual` has all its data in the browser only, not on the server.
diff --git a/src/index.tsx b/src/index.tsx
index 790cfdf8..580ec0d6 100644
--- a/src/index.tsx
+++ b/src/index.tsx
@@ -5,9 +5,6 @@ import React from 'react';
 import './index.css';
 import './i18n';
 
-import { registerPluginTranslations } from './plugins/registry';
-
-registerPluginTranslations();
 
 import store, { persistor } from './app/store'
 import { Provider } from 'react-redux'
diff --git a/src/plugins/PluginHost.tsx b/src/plugins/PluginHost.tsx
deleted file mode 100644
index 98587cb3..00000000
--- a/src/plugins/PluginHost.tsx
+++ /dev/null
@@ -1,72 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-/**
- * PluginHost — renders a plugin's Panel component inside the upload dialog.
- *
- * Wraps the plugin with error boundaries and provides the standard
- * {@link PluginHostCallbacks} interface.
- */
-
-import React, { Component, FC, useCallback } from 'react';
-import { Box, Typography, Alert } from '@mui/material';
-import { useTranslation } from 'react-i18next';
-
-import type { DataSourcePluginModule, PluginConfig, PluginHostCallbacks, DataLoadedInfo } from './types';
-
-interface PluginHostProps {
-    module: DataSourcePluginModule;
-    config: PluginConfig;
-    onDataLoaded: (info: DataLoadedInfo) => void;
-    onClose: () => void;
-}
-
-interface ErrorBoundaryState {
-    error: Error | null;
-}
-
-class PluginErrorBoundary extends Component<
-    { pluginId: string; children: React.ReactNode },
-    ErrorBoundaryState
-> {
-    state: ErrorBoundaryState = { error: null };
-
-    static getDerivedStateFromError(error: Error) {
-        return { error };
-    }
-
-    render() {
-        if (this.state.error) {
-            return (
-
-
-                    Plugin "{this.props.pluginId}" crashed: {this.state.error.message}
-
-
-            );
-        }
-        return this.props.children;
-    }
-}
-
-export const PluginHost: FC<PluginHostProps> = ({
-    module,
-    config,
-    onDataLoaded,
-    onClose,
-}) => {
-    const callbacks: PluginHostCallbacks = {
-        onDataLoaded,
-        onClose,
-    };
-
-    const { Panel } = module;
-
-    return (
-
-
-
-
-
-    );
-};
diff --git a/src/plugins/index.ts b/src/plugins/index.ts
deleted file mode 100644
index f4e54611..00000000
--- a/src/plugins/index.ts
+++ /dev/null
@@ -1,6 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-export type { DataSourcePluginModule, PluginConfig, PluginHostCallbacks, PluginPanelProps, DataLoadedInfo } from './types';
-export { getEnabledPlugins, getPluginModule } from './registry';
-export { PluginHost } from './PluginHost';
diff --git a/src/plugins/registry.ts b/src/plugins/registry.ts
deleted file mode 100644
index fdb09cbf..00000000
--- a/src/plugins/registry.ts
+++ /dev/null
@@ -1,69 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-/**
- * Plugin registry — build-time discovery via `import.meta.glob`.
- *
- * Every sub-directory under `src/plugins/` that contains an `index.ts`
- * exporting a {@link DataSourcePluginModule} is automatically discovered.
- * The registry merges these modules with the backend `PLUGINS` config
- * from `/api/app-config` so only **enabled** plugins are surfaced.
- */
-
-import type { DataSourcePluginModule, PluginConfig } from './types';
-import i18n from '../i18n';
-
-// Build-time eager import of every `plugins/*/index.{ts,tsx}`
-const pluginModules = import.meta.glob<{ default: DataSourcePluginModule }>(
-    ['./**/index.ts', './**/index.tsx'],
-    { eager: true },
-);
-
-/** All frontend plugin modules keyed by `id`. */
-const _modules: Map<string, DataSourcePluginModule> = new Map();
-
-for (const [path, mod] of Object.entries(pluginModules)) {
-    if (path === './index.ts' || path === './index.tsx') continue;
-    const plugin = mod.default;
-    if (plugin?.id) {
-        _modules.set(plugin.id, plugin);
-    }
-}
-
-/**
- * Return the list of plugins that are both:
- * 1. Discovered on the frontend (have an `index.ts`)
- * 2. Enabled on the backend (present in `PLUGINS` from app-config)
- */
-export function getEnabledPlugins(
-    backendPlugins: Record<string, PluginConfig> | undefined,
-): Array<{ module: DataSourcePluginModule; config: PluginConfig }> {
-    if (!backendPlugins) return [];
-
-    const result: Array<{ module: DataSourcePluginModule; config: PluginConfig }> = [];
-    for (const [id, config] of Object.entries(backendPlugins)) {
-        const mod = _modules.get(id);
-        if (mod) {
-            result.push({ module: mod, config });
-        }
-    }
-    return result;
-}
-
-/** Get a single plugin module by id (for direct rendering). */
-export function getPluginModule(id: string): DataSourcePluginModule | undefined {
-    return _modules.get(id);
-}
-
-/**
- * Merge each plugin's self-contained translations into the i18next
- * `translation` namespace. Call once at app startup, after i18n.init().
- */
-export function registerPluginTranslations(): void {
-    for (const [, mod] of _modules) {
-        if (!mod.locales) continue;
-        for (const [lang, bundle] of Object.entries(mod.locales)) {
-            i18n.addResourceBundle(lang, 'translation', bundle, true, true);
-        }
-    }
-}
diff --git a/src/plugins/superset/SupersetCatalog.tsx b/src/plugins/superset/SupersetCatalog.tsx
deleted file mode 100644
index 1d7f55c0..00000000
--- a/src/plugins/superset/SupersetCatalog.tsx
+++ /dev/null
@@ -1,304 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-import React, { FC, useState, useEffect, useCallback } from 'react';
-import {
-    Box, Typography, Button, TextField, CircularProgress, IconButton,
-    Tooltip, Chip, Paper, Divider, Alert, useTheme, alpha,
-    InputAdornment, LinearProgress, Dialog, DialogTitle, DialogContent,
-    DialogActions, Select, MenuItem,
-} from '@mui/material';
-import StorageIcon from '@mui/icons-material/Storage';
-import RefreshIcon from '@mui/icons-material/Refresh';
-import SearchIcon from '@mui/icons-material/Search';
-import DownloadIcon from '@mui/icons-material/Download';
-import AddIcon from '@mui/icons-material/Add';
-import TableRowsIcon from '@mui/icons-material/TableRows';
-import { useTranslation } from 'react-i18next';
-
-import { fetchDatasets as apiFetchDatasets, loadDataset as apiLoadDataset, SupersetDataset } from './api';
-
-const MAX_COLUMN_DISPLAY = 60;
-const MAX_TOOLTIP_ROWS = 12;
-const ROW_LIMIT_OPTIONS = [20000, 50000, 100000, 200000, 500000];
-
-interface SupersetCatalogProps {
-    onDatasetLoaded?: (tableName: string, rowCount: number) => void;
-}
-
-const ColumnChip: FC<{ columns: string[] }> = ({ columns }) => {
-    const { t } = useTranslation();
-    const joined = columns.join(', ');
-    const truncated = joined.length > MAX_COLUMN_DISPLAY;
-    const display = truncated ? joined.slice(0, MAX_COLUMN_DISPLAY) + '…' : joined;
-
-    const chip = (
-
-    );
-
-    return (
-
-                {columns.map((col, i) => (
-
-                        {col}
-
-                ))}
-
-            }
-            placement="top" arrow
-        >
-            {chip}
-
-    );
-};
-
-export const SupersetCatalog: FC<SupersetCatalogProps> = ({ onDatasetLoaded }) => {
-    const theme = useTheme();
-    const { t } = useTranslation();
-
-    const [datasets, setDatasets] = useState<SupersetDataset[]>([]);
-    const [loading, setLoading] = useState(false);
-    const [searchQuery, setSearchQuery] = useState('');
-    const [loadingDatasetId, setLoadingDatasetId] = useState<number | null>(null);
-    const [error, setError] = useState<string | null>(null);
-    const [successMessage, setSuccessMessage] = useState<string | null>(null);
-    const [rowLimit, setRowLimit] = useState(20000);
-    const [suffixDialogOpen, setSuffixDialogOpen] = useState(false);
-    const [suffixDialogDs, setSuffixDialogDs] = useState<SupersetDataset | null>(null);
-    const [suffixInput, setSuffixInput] = useState('');
-
-    const doFetchDatasets = useCallback(async () => {
-        setLoading(true);
-        setError(null);
-        try {
-            const ds = await apiFetchDatasets();
-            setDatasets(ds);
-        } catch (err: any) {
-            setError(err.message || 'Network error');
-        } finally {
-            setLoading(false);
-        }
-    }, []);
-
-    useEffect(() => { doFetchDatasets(); }, [doFetchDatasets]);
-
-    const doLoadDataset = async (dataset: SupersetDataset, tableNameOverride?: string) => {
-        setLoadingDatasetId(dataset.id);
-        setError(null);
-        setSuccessMessage(null);
-        try {
-            const result = await apiLoadDataset({
-                dataset_id: dataset.id,
-                row_limit: rowLimit,
-                table_name: tableNameOverride,
-            });
-            setSuccessMessage(t('plugin.superset.loadSuccess', {
-                name: tableNameOverride || dataset.name,
-                count: result.row_count,
-            }));
-            onDatasetLoaded?.(result.table_name, result.row_count);
-        } catch (err: any) {
-            setError(t('plugin.superset.loadFailed', { message: err.message }));
-        } finally {
-            setLoadingDatasetId(null);
-        }
-    };
-
-    const filteredDatasets = datasets.filter(ds => {
-        if (!searchQuery) return true;
-        const q = searchQuery.toLowerCase();
-        return (
-            (ds.name ?? '').toLowerCase().includes(q) ||
(ds.name ?? '').toLowerCase().includes(q) || - (ds.database ?? '').toLowerCase().includes(q) || - (ds.schema ?? '').toLowerCase().includes(q) || - (ds.description ?? '').toLowerCase().includes(q) || - (ds.column_names ?? []).some(c => (c ?? '').toLowerCase().includes(q)) - ); - }); - - return ( - - - - - {t('plugin.superset.datasetsTitle')} - - - - - - - - - - - - setSearchQuery(e.target.value)} - fullWidth - InputProps={{ - startAdornment: ( - - - - ), - }} - sx={{ '& .MuiOutlinedInput-root': { fontSize: 13 } }} - /> - - - - - - {error && setError(null)}>{error}} - {successMessage && setSuccessMessage(null)}>{successMessage}} - {loading && } - - - {!loading && filteredDatasets.length === 0 && ( - - {t('plugin.superset.noDatasets')} - - )} - - {filteredDatasets.map(ds => ( - - - - - - - {ds.name} - - - {ds.description && ( - 80 ? ds.description : ''} placement="top" arrow> - - {ds.description} - - - )} - - - {`${ds.database}.${ds.schema}`} - - {ds.row_count != null && ( - - )} - - {ds.column_names.length > 0 && } - - - - - - doLoadDataset(ds)} disabled={loadingDatasetId === ds.id}> - {loadingDatasetId === ds.id ? : } - - - - - - { - setSuffixDialogDs(ds); - const d = new Date(); - setSuffixInput(`${d.getFullYear()}${String(d.getMonth() + 1).padStart(2, '0')}${String(d.getDate()).padStart(2, '0')}`); - setSuffixDialogOpen(true); - }} - disabled={loadingDatasetId === ds.id} - > - - - - - - - - ))} - - - setSuffixDialogOpen(false)} maxWidth="xs" fullWidth PaperProps={{ sx: { borderRadius: 2 } }}> - - {t('plugin.superset.suffixDialogTitle')} - - - - {t('plugin.superset.suffixDialogDesc', { name: suffixDialogDs?.name ?? '' })} - - - - {suffixDialogDs?.name ?? 
''}_ - - setSuffixInput(e.target.value)} - onKeyDown={e => { - if (e.key === 'Enter' && suffixInput.trim()) { - doLoadDataset(suffixDialogDs!, `${suffixDialogDs!.name}_${suffixInput.trim()}`); - setSuffixDialogOpen(false); - } - }} - slotProps={{ input: { disableUnderline: true, sx: { fontSize: 13, px: 1, py: 0.75 } } }} - /> - - - - - - - - - ); -}; diff --git a/src/plugins/superset/SupersetDashboards.tsx b/src/plugins/superset/SupersetDashboards.tsx deleted file mode 100644 index eb2a6e7a..00000000 --- a/src/plugins/superset/SupersetDashboards.tsx +++ /dev/null @@ -1,348 +0,0 @@ -// Copyright (c) Microsoft Corporation. -// Licensed under the MIT License. - -import React, { FC, useState, useEffect, useCallback } from 'react'; -import { - Box, Typography, TextField, IconButton, Tooltip, Chip, Paper, - Divider, Alert, useTheme, alpha, InputAdornment, LinearProgress, - Collapse, Select, MenuItem, CircularProgress, -} from '@mui/material'; -import DashboardIcon from '@mui/icons-material/Dashboard'; -import RefreshIcon from '@mui/icons-material/Refresh'; -import SearchIcon from '@mui/icons-material/Search'; -import ExpandMoreIcon from '@mui/icons-material/ExpandMore'; -import ExpandLessIcon from '@mui/icons-material/ExpandLess'; -import DownloadIcon from '@mui/icons-material/Download'; -import FilterAltIcon from '@mui/icons-material/FilterAlt'; -import TableRowsIcon from '@mui/icons-material/TableRows'; -import PersonIcon from '@mui/icons-material/Person'; -import { useTranslation } from 'react-i18next'; - -import { - fetchDashboards as apiFetchDashboards, - fetchDashboardDatasets as apiFetchDashboardDatasets, - loadDataset as apiLoadDataset, - SupersetDashboard, - SupersetDataset, -} from './api'; -import { SupersetFilterDialog, FilterPayload } from './SupersetFilterDialog'; - -/* ------------------------------------------------------------------ */ -/* Constants & helpers */ -/* ------------------------------------------------------------------ */ - -const 
MAX_COLUMN_DISPLAY = 60; -const MAX_TOOLTIP_ROWS = 12; -const ROW_LIMIT_OPTIONS = [20000, 50000, 100000, 200000, 500000]; - -const ColumnChip: FC<{ columns: string[] }> = ({ columns }) => { - const { t } = useTranslation(); - const joined = columns.join(', '); - const truncated = joined.length > MAX_COLUMN_DISPLAY; - const display = truncated ? joined.slice(0, MAX_COLUMN_DISPLAY) + '…' : joined; - - return ( - - {columns.map((col, i) => ( - - {col} - - ))} - - } - placement="top" arrow - > - - - ); -}; - -/* ------------------------------------------------------------------ */ -/* Component */ -/* ------------------------------------------------------------------ */ - -interface SupersetDashboardsProps { - onDatasetLoaded?: (tableName: string, rowCount: number) => void; -} - -export const SupersetDashboards: FC = ({ onDatasetLoaded }) => { - const theme = useTheme(); - const { t } = useTranslation(); - - const [dashboards, setDashboards] = useState([]); - const [loading, setLoading] = useState(false); - const [searchQuery, setSearchQuery] = useState(''); - const [error, setError] = useState(null); - const [successMessage, setSuccessMessage] = useState(null); - - const [expandedId, setExpandedId] = useState(null); - const [datasetsMap, setDatasetsMap] = useState>({}); - const [loadingDatasetsFor, setLoadingDatasetsFor] = useState(null); - - const [loadingDatasetId, setLoadingDatasetId] = useState(null); - const [rowLimit, setRowLimit] = useState(20000); - - const [filterDialogOpen, setFilterDialogOpen] = useState(false); - const [filterDialogDashboard, setFilterDialogDashboard] = useState(null); - const [filterDialogDataset, setFilterDialogDataset] = useState(null); - - /* ---- fetch dashboards ---- */ - - const doFetchDashboards = useCallback(async () => { - setLoading(true); - setError(null); - try { - const dbs = await apiFetchDashboards(); - setDashboards(dbs); - } catch (err: any) { - setError(err.message || 'Network error'); - } finally { - setLoading(false); 
- } - }, []); - - useEffect(() => { doFetchDashboards(); }, [doFetchDashboards]); - - /* ---- expand / collapse ---- */ - - const toggleExpand = async (dashboardId: number) => { - if (expandedId === dashboardId) { setExpandedId(null); return; } - setExpandedId(dashboardId); - if (datasetsMap[dashboardId]) return; - setLoadingDatasetsFor(dashboardId); - try { - const ds = await apiFetchDashboardDatasets(dashboardId); - setDatasetsMap(prev => ({ ...prev, [dashboardId]: ds })); - } catch (err: any) { - setError(err.message); - } finally { - setLoadingDatasetsFor(null); - } - }; - - /* ---- load dataset ---- */ - - const doLoadDataset = async ( - dataset: SupersetDataset, - tableNameOverride?: string, - filters?: FilterPayload[], - ) => { - setLoadingDatasetId(dataset.id); - setError(null); - setSuccessMessage(null); - try { - const result = await apiLoadDataset({ - dataset_id: dataset.id, - row_limit: rowLimit, - table_name: tableNameOverride, - filters: filters as any, - }); - setSuccessMessage(t('plugin.superset.loadSuccess', { name: result.table_name, count: result.row_count })); - onDatasetLoaded?.(result.table_name, result.row_count); - } catch (err: any) { - setError(t('plugin.superset.loadFailed', { message: err.message })); - } finally { - setLoadingDatasetId(null); - } - }; - - /* ---- search filter ---- */ - - const filteredDashboards = dashboards.filter(db => { - if (!searchQuery) return true; - const q = searchQuery.toLowerCase(); - return ( - (db.title ?? '').toLowerCase().includes(q) || - (db.slug ?? '').toLowerCase().includes(q) || - (db.owners ?? []).some(o => (o ?? 
'').toLowerCase().includes(q)) - ); - }); - - /* ---- render ---- */ - - return ( - - {/* header */} - - - - {t('plugin.superset.dashboardsTitle')} - - - - - - - - - - - {/* search + row limit */} - - setSearchQuery(e.target.value)} - InputProps={{ - startAdornment: ( - - - - ), - }} - sx={{ '& .MuiOutlinedInput-root': { fontSize: 13 } }} - /> - - - - - - {/* alerts */} - {error && setError(null)}>{error}} - {successMessage && setSuccessMessage(null)}>{successMessage}} - {loading && } - - {/* dashboard list */} - - {!loading && filteredDashboards.length === 0 && ( - - {t('plugin.superset.noDashboards')} - - )} - - {filteredDashboards.map(db => { - const isExpanded = expandedId === db.id; - const datasets = datasetsMap[db.id]; - - return ( - - {/* dashboard row */} - toggleExpand(db.id)}> - - - - {db.title} - - - {db.owners.length > 0 && ( - } - label={db.owners.join(', ')} - sx={{ fontSize: 10, height: 18, color: 'text.secondary', borderColor: 'divider' }} - /> - )} - {db.changed_on_delta_humanized && ( - - {db.changed_on_delta_humanized} - - )} - - - {isExpanded ? : } - - - {/* expanded dataset list */} - - - - {loadingDatasetsFor === db.id && } - {datasets && datasets.length === 0 && ( - - {t('plugin.superset.noDatasetsInDashboard')} - - )} - {datasets && datasets.map(ds => ( - - - - - - {ds.name} - - - - - {`${ds.database}.${ds.schema}`} - - - {ds.row_count != null && ( - - )} - - {ds.column_names.length > 0 && ( - - )} - - - - - { setFilterDialogDashboard(db); setFilterDialogDataset(ds); setFilterDialogOpen(true); }}> - - - - - - - doLoadDataset(ds)} disabled={loadingDatasetId === ds.id}> - {loadingDatasetId === ds.id ? 
: } - - - - - - ))} - - - - ); - })} - - - {filterDialogOpen && filterDialogDashboard && filterDialogDataset && ( - { setFilterDialogOpen(false); setFilterDialogDashboard(null); setFilterDialogDataset(null); }} - onSubmit={async (filters, tableNameOverride) => { - if (!filterDialogDataset) return; - await doLoadDataset(filterDialogDataset, tableNameOverride, filters); - }} - /> - )} - - ); -}; diff --git a/src/plugins/superset/SupersetFilterDialog.tsx b/src/plugins/superset/SupersetFilterDialog.tsx deleted file mode 100644 index 2ff29900..00000000 --- a/src/plugins/superset/SupersetFilterDialog.tsx +++ /dev/null @@ -1,511 +0,0 @@ -// Copyright (c) Microsoft Corporation. -// Licensed under the MIT License. - -import React, { FC, useCallback, useEffect, useMemo, useRef, useState } from 'react'; -import { - Alert, Autocomplete, Box, Button, Chip, CircularProgress, - Dialog, DialogActions, DialogContent, DialogTitle, Divider, - FormControl, MenuItem, Select, Stack, TextField, Typography, -} from '@mui/material'; -import FilterAltIcon from '@mui/icons-material/FilterAlt'; -import { useTranslation } from 'react-i18next'; - -import { - fetchDashboardFilters, fetchFilterOptions, - DashboardFilter, FilterOption, SupersetDataset, -} from './api'; - -/* ------------------------------------------------------------------ */ -/* Types */ -/* ------------------------------------------------------------------ */ - -export interface FilterPayload { - column: string; - operator: string; - value?: unknown; -} - -interface FilterFormValue { - operator: string; - value: string | number | boolean | Array; - valueTo?: string; -} - -interface SupersetFilterDialogProps { - open: boolean; - dashboardId: number; - dashboardTitle: string; - dataset: SupersetDataset; - onClose: () => void; - onSubmit: (filters: FilterPayload[], tableNameOverride?: string) => Promise | void; -} - -/* ------------------------------------------------------------------ */ -/* Helpers */ -/* 
------------------------------------------------------------------ */
-
-const defaultOperatorForFilter = (f: DashboardFilter): string => {
-  if (f.input_type === 'time') return 'BETWEEN';
-  if (f.input_type === 'numeric') return 'EQ';
-  if (f.input_type === 'select') return f.multi ? 'IN' : 'EQ';
-  return 'ILIKE';
-};
-
-const isEmptyValue = (fv: FilterFormValue | undefined, inputType: string): boolean => {
-  if (!fv) return true;
-  if (fv.operator === 'IS_NULL' || fv.operator === 'IS_NOT_NULL') return false;
-  if (inputType === 'select') {
-    return Array.isArray(fv.value) ? fv.value.length === 0 : fv.value === '' || fv.value == null;
-  }
-  if (fv.operator === 'BETWEEN') {
-    return fv.value === '' || fv.value == null || fv.valueTo === '' || fv.valueTo == null;
-  }
-  return fv.value === '' || fv.value == null;
-};
-
-const normalizeNumericValue = (v: string): string | number => {
-  const n = Number(v);
-  return Number.isFinite(n) ? n : v;
-};
-
-/* ------------------------------------------------------------------ */
-/*  Component                                                          */
-/* ------------------------------------------------------------------ */
-
-export const SupersetFilterDialog: FC<SupersetFilterDialogProps> = ({
-  open, dashboardId, dashboardTitle, dataset, onClose, onSubmit,
-}) => {
-  const { t } = useTranslation();
-
-  const [loading, setLoading] = useState(false);
-  const [submitLoading, setSubmitLoading] = useState(false);
-  const [error, setError] = useState<string | null>(null);
-  const [filters, setFilters] = useState<DashboardFilter[]>([]);
-  const [formValues, setFormValues] = useState<Record<string, FilterFormValue>>({});
-  const [suffixInput, setSuffixInput] = useState('');
-  const [suffixManuallyEdited, setSuffixManuallyEdited] = useState(false);
-
-  const [optionsMap, setOptionsMap] = useState<Record<string, FilterOption[]>>({});
-  const [optionsMoreMap, setOptionsMoreMap] = useState<Record<string, boolean>>({});
-  const [optionSearchMap, setOptionSearchMap] = useState<Record<string, string>>({});
-  const [optionsLoadingKey, setOptionsLoadingKey] = useState<string | null>(null);
-  const searchTimersRef = useRef<Record<string, ReturnType<typeof setTimeout>>>({});
-
-  /* ---- auto-generated suffix from filter
values ---- */ - - const buildAutoSuffix = useCallback(() => { - const parts: string[] = []; - for (const f of filters) { - const fv = formValues[f.id]; - if (isEmptyValue(fv, f.input_type)) continue; - let valStr: string; - if (fv.operator === 'IS_NULL') valStr = 'null'; - else if (fv.operator === 'IS_NOT_NULL') valStr = 'notnull'; - else if (Array.isArray(fv.value)) valStr = fv.value.slice(0, 3).map(String).join('_'); - else valStr = String(fv.value); - valStr = valStr.replace(/[^a-zA-Z0-9\u4e00-\u9fff_-]/g, '_').replace(/_+/g, '_').slice(0, 20); - parts.push(valStr); - } - if (parts.length === 0) { - const d = new Date(); - return `${d.getFullYear()}${String(d.getMonth() + 1).padStart(2, '0')}${String(d.getDate()).padStart(2, '0')}`; - } - return parts.join('_'); - }, [filters, formValues]); - - useEffect(() => { - if (!suffixManuallyEdited) setSuffixInput(buildAutoSuffix()); - }, [buildAutoSuffix, suffixManuallyEdited]); - - /* ---- fetch filter definitions ---- */ - - useEffect(() => { - if (!open) return; - setLoading(true); - setError(null); - setFilters([]); - setFormValues({}); - setOptionsMap({}); - setOptionsMoreMap({}); - setOptionSearchMap({}); - setSuffixInput(''); - setSuffixManuallyEdited(false); - - fetchDashboardFilters(dashboardId, dataset.id) - .then(fs => { - setFilters(fs); - const defaults: Record = {}; - for (const f of fs) { - const op = defaultOperatorForFilter(f); - if (f.default_value != null) { - const dv = f.default_value; - if (f.multi) { - defaults[f.id] = { operator: op, value: Array.isArray(dv) ? dv as any : [dv] as any, valueTo: '' }; - } else { - defaults[f.id] = { operator: op, value: Array.isArray(dv) ? (dv[0] ?? '') : dv as any, valueTo: '' }; - } - } else { - defaults[f.id] = { operator: op, value: f.multi ? 
[] : '', valueTo: '' }; - } - } - setFormValues(defaults); - }) - .catch(() => setFilters([])) - .finally(() => setLoading(false)); - - return () => { - Object.values(searchTimersRef.current).forEach(clearTimeout); - searchTimersRef.current = {}; - }; - }, [open, dashboardId, dataset.id]); - - /* ---- options loading (with debounce) ---- */ - - const loadOptions = async (filter: DashboardFilter, keyword = '') => { - if (filter.input_type !== 'select') return; - setOptionsLoadingKey(filter.id); - try { - const { options, has_more } = await fetchFilterOptions(filter.dataset_id, filter.column_name, keyword); - setOptionsMap(prev => ({ ...prev, [filter.id]: options })); - setOptionsMoreMap(prev => ({ ...prev, [filter.id]: has_more })); - } catch (err: any) { - setError(err.message || t('plugin.superset.loadOptionsFailed')); - } finally { - setOptionsLoadingKey(cur => (cur === filter.id ? null : cur)); - } - }; - - const queueOptionsLoad = (filter: DashboardFilter, keyword = '') => { - if (searchTimersRef.current[filter.id]) clearTimeout(searchTimersRef.current[filter.id]); - searchTimersRef.current[filter.id] = setTimeout(() => loadOptions(filter, keyword), keyword ? 300 : 0); - }; - - /* ---- form helpers ---- */ - - const handleValueChange = (filterId: string, patch: Partial) => { - setFormValues(prev => ({ ...prev, [filterId]: { ...prev[filterId], ...patch } })); - }; - - const getSelectedOptions = (filter: DashboardFilter) => { - const selected = formValues[filter.id]?.value; - const opts = optionsMap[filter.id] || []; - const asOption = (raw: string | number | boolean) => { - const found = opts.find(o => String(o.value) === String(raw)); - return found || { label: String(raw), value: raw }; - }; - if (filter.multi) return Array.isArray(selected) ? 
selected.map(asOption) : []; - if (selected === '' || selected == null || Array.isArray(selected)) return null; - return asOption(selected); - }; - - /* ---- build payload ---- */ - - const buildPayload = useMemo(() => (): FilterPayload[] => { - return filters.flatMap(f => { - const fv = formValues[f.id]; - if (isEmptyValue(fv, f.input_type)) return []; - if (fv.operator === 'IS_NULL' || fv.operator === 'IS_NOT_NULL') { - return [{ column: f.column_name, operator: fv.operator }]; - } - if (f.input_type === 'numeric') { - if (fv.operator === 'BETWEEN') { - return [{ column: f.column_name, operator: fv.operator, value: [normalizeNumericValue(String(fv.value)), normalizeNumericValue(String(fv.valueTo ?? ''))] }]; - } - return [{ column: f.column_name, operator: fv.operator, value: normalizeNumericValue(String(fv.value)) }]; - } - if (f.input_type === 'time' && fv.operator === 'BETWEEN') { - return [{ column: f.column_name, operator: fv.operator, value: [String(fv.value), String(fv.valueTo ?? '')] }]; - } - return [{ column: f.column_name, operator: fv.operator, value: fv.value }]; - }); - }, [filters, formValues]); - - /* ---- submit ---- */ - - const handleSubmit = async () => { - try { - setSubmitLoading(true); - setError(null); - const fullName = suffixInput.trim() ? `${dataset.name}_${suffixInput.trim()}` : undefined; - await onSubmit(buildPayload(), fullName); - onClose(); - } catch (err: any) { - setError(err.message || t('plugin.superset.loadFailed', { message: 'Unknown error' })); - } finally { - setSubmitLoading(false); - } - }; - - /* ---- operator control (per input_type) ---- */ - - const renderOperatorControl = (filter: DashboardFilter) => { - const fv = formValues[filter.id]; - let options: Array<{ value: string; label: string }>; - if (filter.input_type === 'select') { - options = filter.multi - ? 
[{ value: 'IN', label: t('plugin.superset.op.in') }, { value: 'NOT_IN', label: t('plugin.superset.op.notIn') }] - : [{ value: 'EQ', label: t('plugin.superset.op.eq') }, { value: 'NEQ', label: t('plugin.superset.op.neq') }]; - } else if (filter.input_type === 'numeric') { - options = [ - { value: 'EQ', label: t('plugin.superset.op.eq') }, - { value: 'GT', label: t('plugin.superset.op.gt') }, - { value: 'GTE', label: t('plugin.superset.op.gte') }, - { value: 'LT', label: t('plugin.superset.op.lt') }, - { value: 'LTE', label: t('plugin.superset.op.lte') }, - { value: 'BETWEEN', label: t('plugin.superset.op.between') }, - { value: 'IS_NULL', label: t('plugin.superset.op.isNull') }, - { value: 'IS_NOT_NULL', label: t('plugin.superset.op.isNotNull') }, - ]; - } else if (filter.input_type === 'time') { - options = [ - { value: 'BETWEEN', label: t('plugin.superset.op.timeRange') }, - { value: 'EQ', label: t('plugin.superset.op.eq') }, - { value: 'GT', label: t('plugin.superset.op.gt') }, - { value: 'GTE', label: t('plugin.superset.op.gte') }, - { value: 'LT', label: t('plugin.superset.op.lt') }, - { value: 'LTE', label: t('plugin.superset.op.lte') }, - { value: 'IS_NULL', label: t('plugin.superset.op.isNull') }, - { value: 'IS_NOT_NULL', label: t('plugin.superset.op.isNotNull') }, - ]; - } else { - options = [ - { value: 'ILIKE', label: t('plugin.superset.op.contains') }, - { value: 'EQ', label: t('plugin.superset.op.eq') }, - { value: 'NEQ', label: t('plugin.superset.op.neq') }, - { value: 'IS_NULL', label: t('plugin.superset.op.isNull') }, - { value: 'IS_NOT_NULL', label: t('plugin.superset.op.isNotNull') }, - ]; - } - return ( - - - - ); - }; - - /* ---- value control (per input_type) ---- */ - - const renderValueControl = (filter: DashboardFilter) => { - const fv = formValues[filter.id]; - const operator = fv?.operator || defaultOperatorForFilter(filter); - - if (operator === 'IS_NULL' || operator === 'IS_NOT_NULL') { - return ( - - {t('plugin.superset.noValueNeeded')} 
- - ); - } - - const inputSx = { '& .MuiOutlinedInput-root': { fontSize: 12, '& input': { py: 0.75 } } }; - - // Select type - if (filter.input_type === 'select') { - return ( - - o} - isOptionEqualToValue={(a, b) => String(a.value) === String(b.value)} - getOptionLabel={o => o.label} - onOpen={() => { if (!optionsMap[filter.id]) queueOptionsLoad(filter, ''); }} - onChange={(_, val) => { - handleValueChange(filter.id, { - value: filter.multi - ? (val as FilterOption[]).map(i => i.value).filter(i => i != null) as any - : ((val as FilterOption | null)?.value ?? '') as any, - }); - setOptionSearchMap(prev => ({ ...prev, [filter.id]: '' })); - }} - inputValue={optionSearchMap[filter.id] || ''} - onInputChange={(_, val, reason) => { - if (reason === 'input') { - setOptionSearchMap(prev => ({ ...prev, [filter.id]: val })); - queueOptionsLoad(filter, val); - } - }} - slotProps={{ - listbox: { sx: { fontSize: 12 } }, - chip: { size: 'small', sx: { fontSize: 11, height: 20 } }, - }} - renderInput={params => ( - - )} - /> - {optionsMoreMap[filter.id] && ( - - {t('plugin.superset.resultsTruncated')} - - )} - - ); - } - - // Time type - if (filter.input_type === 'time') { - return ( - - handleValueChange(filter.id, { value: e.target.value })} - InputLabelProps={{ shrink: true }} sx={{ flex: 1, ...inputSx }} - /> - {operator === 'BETWEEN' && ( - handleValueChange(filter.id, { valueTo: e.target.value })} - InputLabelProps={{ shrink: true }} sx={{ flex: 1, ...inputSx }} - /> - )} - - ); - } - - // Numeric type - if (filter.input_type === 'numeric') { - return ( - - handleValueChange(filter.id, { value: e.target.value })} - placeholder={t('plugin.superset.valuePlaceholder')} sx={{ flex: 1, ...inputSx }} - /> - {operator === 'BETWEEN' && ( - handleValueChange(filter.id, { valueTo: e.target.value })} - placeholder={t('plugin.superset.valueToPlaceholder')} sx={{ flex: 1, ...inputSx }} - /> - )} - - ); - } - - // Text type (default) - return ( - handleValueChange(filter.id, { value: 
e.target.value })} - placeholder={t('plugin.superset.filterValuePlaceholder')} - sx={{ flex: 1, ...inputSx }} - /> - ); - }; - - /* ---- render ---- */ - - return ( - - - - - - {t('plugin.superset.filterDialogHeading')} - - - {dashboardTitle} / {dataset.name} - - - - - - - - {t('plugin.superset.filterDialogHint')} - - - {error && setError(null)}>{error}} - - {/* table name input */} - - - {t('plugin.superset.tableNameLabel')} - - - - {dataset.name}_ - - { setSuffixInput(e.target.value); setSuffixManuallyEdited(true); }} - slotProps={{ input: { disableUnderline: true, sx: { fontSize: 13, px: 1, py: 0.75 } } }} - /> - - {suffixInput.trim() && ( - - )} - - - - - {/* loading / empty / filter list */} - {loading ? ( - - ) : filters.length === 0 ? ( - - {t('plugin.superset.noFiltersAvailable')} - - ) : ( - - {filters.map(f => ( - - - - {f.name} - - - - {f.multi && } - {f.default_value != null && ( - - )} - - - {renderOperatorControl(f)} - {renderValueControl(f)} - - - ))} - - )} - - - - - - - - - ); -}; diff --git a/src/plugins/superset/SupersetLogin.tsx b/src/plugins/superset/SupersetLogin.tsx deleted file mode 100644 index cedf1d2f..00000000 --- a/src/plugins/superset/SupersetLogin.tsx +++ /dev/null @@ -1,264 +0,0 @@ -// Copyright (c) Microsoft Corporation. -// Licensed under the MIT License. 
- -import React, { FC, useState, useCallback, useRef } from 'react'; -import { - Box, Button, TextField, Typography, Alert, CircularProgress, - Divider, IconButton, InputAdornment, FormControlLabel, Checkbox, - alpha, useTheme, -} from '@mui/material'; -import LoginIcon from '@mui/icons-material/Login'; -import OpenInNewIcon from '@mui/icons-material/OpenInNew'; -import Visibility from '@mui/icons-material/Visibility'; -import VisibilityOff from '@mui/icons-material/VisibilityOff'; -import { useTranslation } from 'react-i18next'; - -import { supersetLogin, supersetSsoSaveTokens, supersetGuestLogin } from './api'; -import type { PluginConfig } from '../types'; - -interface SupersetLoginProps { - config: PluginConfig; - onLoginSuccess: (user: Record) => void; - vaultStale?: boolean; -} - -export const SupersetLogin: FC = ({ config, onLoginSuccess, vaultStale }) => { - const { t } = useTranslation(); - const theme = useTheme(); - const [username, setUsername] = useState(''); - const [password, setPassword] = useState(''); - const [remember, setRemember] = useState(false); - const [loading, setLoading] = useState(false); - const [error, setError] = useState(''); - const [showPassword, setShowPassword] = useState(false); - - const ssoLoginUrl = config.sso_login_url as string | undefined; - const authModes = (config.auth_modes as string[]) || []; - const showPasswordLogin = authModes.includes('password'); - const showSso = authModes.includes('sso') && !!ssoLoginUrl; - const guestEnabled = config.guest_enabled as boolean | undefined; - - const handlePasswordLogin = useCallback(async () => { - if (!username || !password) return; - setLoading(true); - setError(''); - try { - const result = await supersetLogin(username, password, remember); - if (result.status === 'ok') { - onLoginSuccess(result.user); - } else { - setError(result.message || t('plugin.superset.loginFailed')); - } - } catch (err: any) { - setError(err.message || t('plugin.superset.loginFailed')); - } finally 
{ - setLoading(false); - } - }, [username, password, remember, onLoginSuccess, t]); - - const handleGuestLogin = useCallback(async () => { - setLoading(true); - setError(''); - try { - const result = await supersetGuestLogin(); - if (result.status === 'ok') { - onLoginSuccess(result.user); - } else { - setError(result.message || t('plugin.superset.guestFailed')); - } - } catch (err: any) { - setError(err.message || t('plugin.superset.guestFailed')); - } finally { - setLoading(false); - } - }, [onLoginSuccess, t]); - - const popupRef = useRef(null); - const pollTimerRef = useRef | null>(null); - - const handleSsoLogin = useCallback(() => { - if (!ssoLoginUrl) return; - setLoading(true); - setError(''); - - const url = new URL(ssoLoginUrl); - url.searchParams.set('df_origin', window.location.origin); - - const width = 600; - const height = 700; - const left = window.screenX + (window.outerWidth - width) / 2; - const top = window.screenY + (window.outerHeight - height) / 2; - const popup = window.open( - url.toString(), - 'df-sso-login', - `width=${width},height=${height},left=${left},top=${top},toolbar=no,menubar=no`, - ); - - if (!popup) { - setError(t('plugin.superset.ssoFailed')); - setLoading(false); - return; - } - - popupRef.current = popup; - - const handler = async (event: MessageEvent) => { - if (event.data?.type !== 'df-sso-auth') return; - window.removeEventListener('message', handler); - if (pollTimerRef.current) { clearInterval(pollTimerRef.current); pollTimerRef.current = null; } - popup.close(); - - const { access_token, refresh_token, user } = event.data; - if (access_token) { - try { - const result = await supersetSsoSaveTokens(access_token, refresh_token, user); - if (result.status === 'ok') { - onLoginSuccess(result.user); - } else { - setError(result.message || t('plugin.superset.ssoFailed')); - } - } catch (err: any) { - setError(err.message || t('plugin.superset.ssoFailed')); - } - } - setLoading(false); - }; - - 
window.addEventListener('message', handler); - - pollTimerRef.current = setInterval(() => { - if (popup.closed) { - if (pollTimerRef.current) { clearInterval(pollTimerRef.current); pollTimerRef.current = null; } - window.removeEventListener('message', handler); - setLoading(false); - } - }, 1000); - }, [ssoLoginUrl, onLoginSuccess, t]); - - return ( - - - {t('plugin.superset.connectTo', { name: config.name || 'Apache Superset' })} - - - {vaultStale && ( - - {t('plugin.superset.vaultStale')} - - )} - - {error && {error}} - - {showPasswordLogin && ( - - setUsername(e.target.value)} - autoFocus - fullWidth - onKeyDown={e => e.key === 'Enter' && handlePasswordLogin()} - /> - setPassword(e.target.value)} - fullWidth - onKeyDown={e => e.key === 'Enter' && handlePasswordLogin()} - InputProps={{ - endAdornment: ( - - setShowPassword(prev => !prev)} - edge="end" - tabIndex={-1} - > - {showPassword ? : } - - - ), - }} - /> - - setRemember(e.target.checked)} - /> - } - label={ - - {t('plugin.superset.rememberCredentials')} - - } - sx={{ ml: 0 }} - /> - - {t('plugin.superset.rememberCredentialsHint')} - - - - - )} - - {showPasswordLogin && showSso && ( - - - {t('plugin.superset.or')} - - - )} - - {showSso && ( - - )} - - {guestEnabled && ( - <> - {(showPasswordLogin || showSso) && ( - - - {t('plugin.superset.or')} - - - )} - - - )} - - ); -}; diff --git a/src/plugins/superset/SupersetPanel.tsx b/src/plugins/superset/SupersetPanel.tsx deleted file mode 100644 index f3b76e67..00000000 --- a/src/plugins/superset/SupersetPanel.tsx +++ /dev/null @@ -1,98 +0,0 @@ -// Copyright (c) Microsoft Corporation. -// Licensed under the MIT License. 
-
-import React, { FC, useState, useEffect, useCallback } from 'react';
-import { Box, Tab, Tabs, Typography, Button } from '@mui/material';
-import TableRowsIcon from '@mui/icons-material/TableRows';
-import DashboardIcon from '@mui/icons-material/Dashboard';
-import LogoutIcon from '@mui/icons-material/Logout';
-import { useTranslation } from 'react-i18next';
-
-import type { PluginPanelProps } from '../types';
-import { supersetAuthStatus, supersetLogout } from './api';
-import { SupersetLogin } from './SupersetLogin';
-import { SupersetCatalog } from './SupersetCatalog';
-import { SupersetDashboards } from './SupersetDashboards';
-
-export const SupersetPanel: FC<PluginPanelProps> = ({ config, callbacks }) => {
-  const { t } = useTranslation();
-  const [tab, setTab] = useState<0 | 1>(0);
-  const [authenticated, setAuthenticated] = useState<boolean | null>(null);
-  const [user, setUser] = useState<Record<string, unknown> | null>(null);
-  const [vaultStale, setVaultStale] = useState(false);
-
-  useEffect(() => {
-    supersetAuthStatus()
-      .then(data => {
-        setAuthenticated(data.authenticated);
-        if (data.authenticated) setUser(data.user);
-        if (data.vault_stale) setVaultStale(true);
-      })
-      .catch(() => setAuthenticated(false));
-  }, []);
-
-  const handleLoginSuccess = useCallback((u: Record<string, unknown>) => {
-    setAuthenticated(true);
-    setUser(u);
-  }, []);
-
-  const handleLogout = useCallback(async () => {
-    await supersetLogout();
-    setAuthenticated(false);
-    setUser(null);
-  }, []);
-
-  const handleDatasetLoaded = useCallback((tableName: string, rowCount: number) => {
-    callbacks.onDataLoaded({
-      tableName,
-      rowCount,
-      source: 'superset',
-    });
-  }, [callbacks]);
-
-  if (authenticated === null) {
-    return
-      {t('plugin.superset.checkingAuth')}
-    ;
-  }
-
-  if (!authenticated) {
-    return ;
-  }
-
-  return (
-
-
-      setTab(v)}
-        variant="fullWidth"
-        sx={{
-          flex: 1, minHeight: 36,
-          '& .MuiTab-root': { minHeight: 36, py: 0.5, textTransform: 'none', fontSize: 13 },
-        }}
-      >
-        } iconPosition="start" label={t('plugin.superset.dashboards')} />
-        } iconPosition="start" label={t('plugin.superset.datasets')} />
-
-
-
-
-
-
-
-
-
-
-
-
-
-  );
-};
diff --git a/src/plugins/superset/api.ts b/src/plugins/superset/api.ts
deleted file mode 100644
index 5827b14f..00000000
--- a/src/plugins/superset/api.ts
+++ /dev/null
@@ -1,182 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-/**
- * API wrapper for the Superset plugin backend routes.
- *
- * All calls go through `fetchWithIdentity` so that identity headers
- * and OIDC tokens are automatically attached.
- */
-
-import { fetchWithIdentity } from '../../app/utils';
-
-const BASE = '/api/plugins/superset';
-
-// -- Auth ---------------------------------------------------------------
-
-export async function supersetLogin(username: string, password: string, remember = false) {
-  const resp = await fetchWithIdentity(`${BASE}/auth/login`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({ username, password, remember }),
-  });
-  return resp.json();
-}
-
-export async function supersetSsoSaveTokens(
-  accessToken: string,
-  refreshToken?: string,
-  user?: Record<string, unknown>,
-) {
-  const resp = await fetchWithIdentity(`${BASE}/auth/sso/save-tokens`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({ access_token: accessToken, refresh_token: refreshToken, user }),
-  });
-  return resp.json();
-}
-
-export async function supersetAuthStatus() {
-  const resp = await fetchWithIdentity(`${BASE}/auth/status`);
-  return resp.json();
-}
-
-export async function supersetMe() {
-  const resp = await fetchWithIdentity(`${BASE}/auth/me`);
-  return resp.json();
-}
-
-export async function supersetLogout() {
-  const resp = await fetchWithIdentity(`${BASE}/auth/logout`, { method: 'POST' });
-  return resp.json();
-}
-
-export async function supersetGuestLogin() {
-  const resp = await fetchWithIdentity(`${BASE}/auth/guest`, { method: 'POST' });
-  return resp.json();
-}
-
-// -- Catalog ------------------------------------------------------------
-
-export interface SupersetDataset {
-  id: number;
-  name: string;
-  schema: string;
-  database: string;
-  description: string;
-  column_count: number;
-  column_names: string[];
-  row_count: number | null;
-}
-
-export interface SupersetDashboard {
-  id: number;
-  title: string;
-  slug: string;
-  status: string;
-  url: string;
-  changed_on_delta_humanized: string;
-  owners: string[];
-}
-
-export interface DashboardFilter {
-  id: string;
-  name: string;
-  filter_type: string;
-  input_type: string;
-  dataset_id: number;
-  dataset_name: string;
-  column_name: string;
-  column_type: string;
-  multi: boolean;
-  required: boolean;
-  supports_search: boolean;
-  default_value?: unknown;
-}
-
-export interface FilterOption {
-  label: string;
-  value: unknown;
-}
-
-export async function fetchDatasets(): Promise<SupersetDataset[]> {
-  const resp = await fetchWithIdentity(`${BASE}/catalog/datasets`);
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to fetch datasets');
-  return data.datasets;
-}
-
-export async function fetchDashboards(): Promise<SupersetDashboard[]> {
-  const resp = await fetchWithIdentity(`${BASE}/catalog/dashboards`);
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to fetch dashboards');
-  return data.dashboards;
-}
-
-export async function fetchDashboardDatasets(dashboardId: number): Promise<SupersetDataset[]> {
-  const resp = await fetchWithIdentity(`${BASE}/catalog/dashboards/${dashboardId}/datasets`);
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to fetch dashboard datasets');
-  return data.datasets;
-}
-
-export async function fetchDashboardFilters(
-  dashboardId: number,
-  datasetId?: number,
-): Promise<DashboardFilter[]> {
-  const params = new URLSearchParams();
-  if (datasetId != null) params.set('dataset_id', String(datasetId));
-  const resp = await fetchWithIdentity(`${BASE}/catalog/dashboards/${dashboardId}/filters?${params}`);
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to fetch filters');
-  return data.filters;
-}
-
-export async function fetchFilterOptions(
-  datasetId: number,
-  columnName: string,
-  keyword?: string,
-  limit = 50,
-  offset = 0,
-): Promise<{ options: FilterOption[]; has_more: boolean }> {
-  const params = new URLSearchParams({
-    dataset_id: String(datasetId),
-    column_name: columnName,
-    limit: String(limit),
-    offset: String(offset),
-  });
-  if (keyword) params.set('keyword', keyword);
-  const resp = await fetchWithIdentity(`${BASE}/catalog/filters/options?${params}`);
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to fetch filter options');
-  return { options: data.options, has_more: data.has_more };
-}
-
-// -- Data ---------------------------------------------------------------
-
-export interface LoadDatasetRequest {
-  dataset_id: number;
-  row_limit?: number;
-  table_name?: string;
-  filters?: Array<{ column: string; operator: string; value: unknown }>;
-  stream?: boolean;
-}
-
-export interface LoadDatasetResult {
-  status: string;
-  table_name: string;
-  row_count: number;
-  columns: string[];
-  message?: string;
-}
-
-export async function loadDataset(req: LoadDatasetRequest): Promise<LoadDatasetResult> {
-  const resp = await fetchWithIdentity(`${BASE}/data/load-dataset`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify(req),
-  });
-  const data = await resp.json();
-  if (data.status !== 'ok') throw new Error(data.message || 'Failed to load dataset');
-  return data;
-}
diff --git a/src/plugins/superset/index.tsx b/src/plugins/superset/index.tsx
deleted file mode 100644
index ccceadce..00000000
--- a/src/plugins/superset/index.tsx
+++ /dev/null
@@ -1,34 +0,0 @@
-// Copyright (c) Microsoft Corporation.
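The removed `api.ts` wrapper above leans on a single convention: every plugin route returns a JSON envelope with a `status` field, an optional `message`, and one payload field, and the client throws on any non-`'ok'` status. A minimal sketch of that contract in Python (the helper names `ok`/`err`/`unwrap` are illustrative, not the actual Flask code):

```python
# Sketch of the response-envelope convention shared by the plugin routes
# and the TypeScript wrapper. Helper names are illustrative.

def ok(**payload):
    """Successful response: status 'ok' plus one or more payload fields."""
    return {"status": "ok", **payload}

def err(message):
    """Failure response: any non-'ok' status carries a human-readable message."""
    return {"status": "error", "message": message}

def unwrap(envelope, field):
    """Client-side mirror of `if (data.status !== 'ok') throw new Error(...)`."""
    if envelope.get("status") != "ok":
        raise RuntimeError(envelope.get("message", "request failed"))
    return envelope[field]

# e.g. what `fetchDatasets` does with `/catalog/datasets`:
datasets = unwrap(ok(datasets=[{"id": 1, "name": "orders_fact"}]), "datasets")
```

Because every endpoint follows this shape, each wrapper function reduces to one fetch plus one unwrap, and error text surfaces uniformly in the UI.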
-// Licensed under the MIT License.
-
-/**
- * Superset frontend plugin module.
- *
- * Discovered by `import.meta.glob` in `src/plugins/registry.ts`.
- */
-
-import type { DataSourcePluginModule } from '../types';
-
-import { SupersetPanel } from './SupersetPanel';
-
-import en from './locales/en.json';
-import zh from './locales/zh.json';
-
-// MUI-compatible Superset icon (inline SVG wrapped as component)
-import React from 'react';
-import SvgIcon from '@mui/material/SvgIcon';
-
-const SupersetIcon: React.FC<{ sx?: object }> = (props) => (
-
-
-);
-
-const supersetPlugin: DataSourcePluginModule = {
-  id: 'superset',
-  Icon: SupersetIcon,
-  Panel: SupersetPanel,
-  locales: { en, zh },
-};
-
-export default supersetPlugin;
diff --git a/src/plugins/superset/locales/en.json b/src/plugins/superset/locales/en.json
deleted file mode 100644
index 19aee117..00000000
--- a/src/plugins/superset/locales/en.json
+++ /dev/null
@@ -1,76 +0,0 @@
-{
-  "plugin": {
-    "superset": {
-      "description": "Connect to an Apache Superset instance to browse and load datasets.",
-      "connectTo": "Connect to {{name}}",
-      "username": "Username",
-      "password": "Password",
-      "login": "Sign In",
-      "logout": "Sign Out",
-      "or": "or",
-      "ssoLogin": "Sign in with SSO",
-      "loginFailed": "Login failed",
-      "ssoFailed": "SSO login failed",
-      "checkingAuth": "Checking authentication...",
-      "dashboards": "Dashboards",
-      "datasets": "Datasets",
-      "datasetsTitle": "Superset Datasets",
-      "dashboardsTitle": "Superset Dashboards",
-      "refresh": "Refresh",
-      "searchPlaceholder": "Search datasets...",
-      "rowLimitTip": "Max rows to load",
-      "noDatasets": "No datasets found.",
-      "noDashboards": "No dashboards found.",
-      "noDatasetsInDashboard": "No datasets in this dashboard.",
-      "columns": "{{count}} columns",
-      "rows": "{{count}} rows",
-      "loadOverwrite": "Load (overwrite if exists)",
-      "createNewDataset": "Load as new table with suffix",
-      "loadDirect": "Load dataset",
-      "loadWithFilters": "Load with Filters",
-      "loadSuccess": "Dataset \"{{name}}\" loaded ({{count}} rows).",
-      "loadFailed": "Failed to load dataset: {{message}}",
-      "suffixDialogTitle": "Enter Dataset Name Suffix",
-      "suffixDialogDesc": "Specify a suffix for dataset \"{{name}}\".",
-      "suffixPlaceholder": "e.g. 20260101",
-      "confirmLoad": "Confirm & Load",
-      "filterDialogTitle": "Filters: {{dashboard}} → {{dataset}}",
-      "filterDialogHeading": "Load with Filters",
-      "filterDialogHint": "Filters from this dashboard are shown below. Empty filters will not be applied.",
-      "noFiltersAvailable": "No native filters available for this dataset. Data will be loaded as-is.",
-      "filterValuePlaceholder": "Enter filter text...",
-      "tableNameSuffix": "Table name suffix (optional)",
-      "tableNameLabel": "Table name to import",
-      "suffixAutoGenerated": "Auto-generated from filters",
-      "defaultValue": "default",
-      "multiSelect": "multi",
-      "noValueNeeded": "No value needed for this operator",
-      "valuePlaceholder": "Value",
-      "valueToPlaceholder": "End value",
-      "searchableSelect": "Type to search values...",
-      "selectValue": "Select a value",
-      "resultsTruncated": "Results truncated — type a keyword to narrow",
-      "loadOptionsFailed": "Failed to load filter options",
-      "searchDashboards": "Search dashboards...",
-      "op.eq": "Equals",
-      "op.neq": "Not equal",
-      "op.gt": "Greater than",
-      "op.gte": "Greater or equal",
-      "op.lt": "Less than",
-      "op.lte": "Less or equal",
-      "op.in": "Any of",
-      "op.notIn": "None of",
-      "op.between": "Between",
-      "op.timeRange": "Time range",
-      "op.contains": "Contains",
-      "op.isNull": "Is null",
-      "op.isNotNull": "Is not null",
-      "rememberCredentials": "Remember credentials on this server",
-      "rememberCredentialsHint": "Allows background agents to pull data automatically; without this, agents cannot access this data source for you.",
-      "vaultStale": "Your saved credentials are no longer valid. Please enter new credentials.",
-      "browsePublic": "Skip login, browse public data",
-      "guestFailed": "Failed to enter guest mode",
-      "loginRequired": "Loading data requires login. Please sign in first."
-    }
-  }
-}
diff --git a/src/plugins/superset/locales/zh.json b/src/plugins/superset/locales/zh.json
deleted file mode 100644
index bc9e4d41..00000000
--- a/src/plugins/superset/locales/zh.json
+++ /dev/null
@@ -1,76 +0,0 @@
-{
-  "plugin": {
-    "superset": {
-      "description": "连接到 Apache Superset 实例,浏览并加载数据集。",
-      "connectTo": "连接到 {{name}}",
-      "username": "用户名",
-      "password": "密码",
-      "login": "登录",
-      "logout": "退出登录",
-      "or": "或",
-      "ssoLogin": "SSO 单点登录",
-      "loginFailed": "登录失败",
-      "ssoFailed": "SSO 登录失败",
-      "checkingAuth": "正在检查认证状态...",
-      "dashboards": "仪表盘",
-      "datasets": "数据集",
-      "datasetsTitle": "Superset 数据集",
-      "dashboardsTitle": "Superset 仪表盘",
-      "refresh": "刷新",
-      "searchPlaceholder": "搜索数据集...",
-      "rowLimitTip": "最大加载行数",
-      "noDatasets": "未找到数据集。",
-      "noDashboards": "未找到仪表盘。",
-      "noDatasetsInDashboard": "该仪表盘下没有数据集。",
-      "columns": "{{count}} 列",
-      "rows": "{{count}} 行",
-      "loadOverwrite": "加载(已存在则覆盖)",
-      "createNewDataset": "以新表名后缀加载",
-      "loadDirect": "加载数据集",
-      "loadWithFilters": "带筛选条件加载",
-      "loadSuccess": "数据集 \"{{name}}\" 加载成功({{count}} 行)。",
-      "loadFailed": "加载数据集失败:{{message}}",
-      "suffixDialogTitle": "输入数据集名称后缀",
-      "suffixDialogDesc": "为数据集 \"{{name}}\" 指定一个后缀。",
-      "suffixPlaceholder": "例如 20260101",
-      "confirmLoad": "确认加载",
-      "filterDialogTitle": "筛选条件:{{dashboard}} → {{dataset}}",
-      "filterDialogHeading": "按条件加载",
-      "filterDialogHint": "读取该仪表盘允许的筛选项。留空的筛选项不会生效。",
-      "noFiltersAvailable": "当前仪表盘没有可用于该数据集的筛选项,将按原始数据集直接加载。",
-      "filterValuePlaceholder": "输入筛选内容",
-      "tableNameSuffix": "表名后缀(可选)",
-      "tableNameLabel": "将要导入的表名",
-      "suffixAutoGenerated": "根据条件自动生成",
-      "defaultValue": "默认值",
-      "multiSelect": "多选",
-      "noValueNeeded": "此操作无需填写值",
-      "valuePlaceholder": "值",
-      "valueToPlaceholder": "结束值",
-      "searchableSelect": "读取候选值,可搜索",
-      "selectValue": "选择值",
-      "resultsTruncated": "结果已截断,输入关键字缩小范围",
-      "loadOptionsFailed": "加载筛选选项失败",
-      "searchDashboards": "搜索仪表盘...",
-      "op.eq": "等于",
-      "op.neq": "不等于",
-      "op.gt": "大于",
-      "op.gte": "大于等于",
-      "op.lt": "小于",
-      "op.lte": "小于等于",
-      "op.in": "包含任一值",
-      "op.notIn": "不包含这些值",
-      "op.between": "介于",
-      "op.timeRange": "时间范围",
-      "op.contains": "包含",
-      "op.isNull": "为空",
-      "op.isNotNull": "非空",
-      "rememberCredentials": "在此服务器上记住凭证",
-      "rememberCredentialsHint": "保存后,后台 Agent 可自动拉取数据;不保存则 Agent 无法代你访问此数据源。",
-      "vaultStale": "已保存的凭证已失效,请重新输入。",
-      "browsePublic": "跳过登录,浏览公共数据",
-      "guestFailed": "进入访客模式失败",
-      "loginRequired": "加载数据需要登录,请先登录。"
-    }
-  }
-}
diff --git a/src/plugins/types.ts b/src/plugins/types.ts
deleted file mode 100644
index e382fd00..00000000
--- a/src/plugins/types.ts
+++ /dev/null
@@ -1,60 +0,0 @@
-// Copyright (c) Microsoft Corporation.
-// Licensed under the MIT License.
-
-/**
- * Frontend type system for the data source plugin framework.
- *
- * Each plugin module exports a {@link DataSourcePluginModule} which the
- * {@link PluginHost} component uses to render the plugin's UI inside the
- * data upload dialog.
- */
-
-import type { FC } from 'react';
-
-/** Backend plugin config received via `/api/app-config` → `PLUGINS.` */
-export interface PluginConfig {
-  id: string;
-  name: string;
-  icon?: string;
-  description?: string;
-  capabilities?: string[];
-  auth_modes?: string[];
-  /** Plugin-specific fields from `get_frontend_config()` */
-  [key: string]: unknown;
-}
-
-/** Callbacks that the plugin host provides to each plugin panel. */
-export interface PluginHostCallbacks {
-  /** Called after a dataset has been loaded into the workspace. */
-  onDataLoaded: (info: DataLoadedInfo) => void;
-  /** Close the upload dialog. */
-  onClose: () => void;
-}
-
-export interface DataLoadedInfo {
-  tableName: string;
-  rowCount: number;
-  columns?: string[];
-  source: string;
-}
-
-/**
- * A frontend plugin module — the unit that `import.meta.glob` discovers.
- *
- * Each plugin's `index.ts` must default-export an object matching this shape.
- */
-export interface DataSourcePluginModule {
-  /** Must match `manifest.id` from the backend. */
-  id: string;
-  /** MUI icon component for the data source menu card. */
-  Icon: FC<{ sx?: object }>;
-  /** Main panel component rendered in the upload dialog. */
-  Panel: FC<PluginPanelProps>;
-  /** Plugin-local translations, keyed by language code (e.g. `{ en: {...}, zh: {...} }`). */
-  locales?: Record<string, Record<string, unknown>>;
-}
-
-export interface PluginPanelProps {
-  config: PluginConfig;
-  callbacks: PluginHostCallbacks;
-}
diff --git a/src/views/UnifiedDataUploadDialog.tsx b/src/views/UnifiedDataUploadDialog.tsx
index f9fa2913..82074ff2 100644
--- a/src/views/UnifiedDataUploadDialog.tsx
+++ b/src/views/UnifiedDataUploadDialog.tsx
@@ -39,7 +39,7 @@ import Backdrop from '@mui/material/Backdrop';
 import { useDispatch, useSelector } from 'react-redux';
 import { DataFormulatorState, dfActions } from '../app/dfSlice';
 import { AppDispatch } from '../app/store';
-import { loadTable, loadPluginTable } from '../app/tableThunks';
+import { loadTable } from '../app/tableThunks';
 import { DataSourceConfig, DictTable } from '../components/ComponentType';
 import { createTableFromFromObjectArray, createTableFromText, loadTextDataWrapper, loadBinaryDataWrapper, readFileText } from '../data/utils';
 import { DataLoadingChat } from './DataLoadingChat';
@@ -55,9 +55,8 @@ import FolderOpenIcon from '@mui/icons-material/FolderOpen';
 import CloudIcon from '@mui/icons-material/Cloud';
 import LanguageIcon from '@mui/icons-material/Language';
 import { useTranslation } from 'react-i18next';
-import { getEnabledPlugins, PluginHost } from '../plugins';
 
-export type UploadTabType = 'menu' | 'upload' | 'paste' | 'url' | 'database' | 'extract' | 'explore' | `plugin:${string}` | 'add-connection' | `connector:${string}`;
+export type UploadTabType = 'menu' | 'upload' | 'paste' | 'url' | 'database' | 'extract' | 'explore' | 'add-connection' | `connector:${string}`;
 
 interface TabPanelProps {
   children?: React.ReactNode;
@@ -243,8 +242,6 @@ export const DataLoadMenu: React.FC = ({
     },
   ].filter(source => !(hideSampleDatasets && source.value === 'explore'));
 
-  const enabledPlugins = getEnabledPlugins((serverConfig as any)?.PLUGINS);
-
   // All connectors get cards — connected ones show status, disconnected show type
   const liveDataSources: Array<{ value: UploadTabType; title: string; description: string; icon: React.ReactNode; disabled: boolean; dashed?: boolean }> = [
     {
@@ -273,13 +270,6 @@ export const DataLoadMenu: React.FC = ({
       disabled: false,
       dashed: true,
     },
-    ...enabledPlugins.map(({ module, config }) => ({
-      value: `plugin:${module.id}` as UploadTabType,
-      title: config.name,
-      description: t(`plugin.${module.id}.description`, { defaultValue: config.description || '' }),
-      icon: ,
-      disabled: false,
-    })),
   ];
 
   if (variant === 'page') {
@@ -1308,15 +1298,8 @@ export const UnifiedDataUploadDialog: React.FC = (
   const showUrlPreview = urlPreviewLoading || !!urlPreviewError || (urlPreviewTables && urlPreviewTables.length > 0);
   const hasPasteContent = (pasteContent || '').trim() !== '';
 
-  const enabledPluginsForDialog = getEnabledPlugins(serverConfig?.PLUGINS as any);
-
   // Get current tab title for header
   const getCurrentTabTitle = () => {
-    if (activeTab.startsWith('plugin:')) {
-      const pluginId = activeTab.slice(7);
-      const found = enabledPluginsForDialog.find(p => p.module.id === pluginId);
-      return found?.config.name || pluginId;
-    }
     if (activeTab.startsWith('connector:')) {
       const connId = activeTab.slice(10);
       const found = connectorInstances.find(c => c.id === connId);
@@ -1940,21 +1923,6 @@ export const UnifiedDataUploadDialog: React.FC = (
             />
 
-          {/* Plugin Tabs */}
-          {enabledPluginsForDialog.map(({ module, config }) => (
-
-              {
-                  dispatch(loadPluginTable({ tableName: info.tableName, pluginId: info.source }));
-                  handleClose();
-                }}
-                onClose={handleClose}
-              />
-
-          ))}
-
           {/* Extract Data Tab */}
diff --git a/tests/backend/README.md b/tests/backend/README.md
index e83be085..e71ecece 100644
--- a/tests/backend/README.md
+++ b/tests/backend/README.md
@@ -1,59 +1,15 @@
 # Backend Tests
 
-This directory contains backend Python tests, organized by responsibility.
-Try to keep future additions within the appropriate layer instead of mixing concerns.
+Organized by concern. All tests run with `uv run pytest tests/backend/`.
 
-## Directory Layout
-
-```text
+```
 tests/backend/
-  README.md
-  unit/
-    README.md
-    test_unicode_table_name_sanitization.py
-  integration/
-    README.md
-  contract/
-    README.md
-  fixtures/
-    README.md
+  auth/      ← identity, providers, vault, session config
+  routes/    ← Flask endpoint tests (tables, agents, sessions, credentials)
+  data/      ← loaders, connectors, workspace, parquet, table names
+  agents/    ← AI agents, model registry, prompts
+  security/  ← code signing, sanitization, URL allowlist, sandbox
+  fixtures/  ← shared test data (CSV, JSON, parquet)
 ```
 
-## Directory Responsibilities
-
-- `tests/backend/unit`
-  - pure function tests
-  - name sanitization
-  - utility helpers
-  - no Flask app or external service dependency
-
-- `tests/backend/integration`
-  - Flask routes
-  - workspace / datalake behavior
-  - table import, refresh, and metadata read/write flows
-  - may use temp directories, temp files, and monkeypatching
-
-- `tests/backend/contract`
-  - API boundary contract tests
-  - focused on stable input/output fields and compatibility guarantees
-  - for example, a Chinese table name should not degrade into an empty placeholder
-
-- `tests/backend/fixtures`
-  - test data files
-  - sample JSON / CSV / parquet files
-  - shared fixture documentation
-
-## Recommended Expansion Order
-
-1. Use `unit` tests to lock down sanitization behavior first.
-2. Add `contract` tests for route-level input/output guarantees next.
-3. Add `integration` tests for full import flows last.
-
-## Current Scope
-
-This first round is focused on:
-
-- Chinese table names
-- Chinese column names
-- non-ASCII identifiers
-- fallback behavior when sanitization would otherwise produce an empty name
+Docker-gated database tests live separately in `tests/database-dockers/` and are not auto-discovered by pytest.
diff --git a/tests/backend/unit/test_agent_diagnostics.py b/tests/backend/agents/test_agent_diagnostics.py
similarity index 100%
rename from tests/backend/unit/test_agent_diagnostics.py
rename to tests/backend/agents/test_agent_diagnostics.py
diff --git a/tests/backend/unit/test_agent_utils_sql_table_names.py b/tests/backend/agents/test_agent_utils_sql_table_names.py
similarity index 100%
rename from tests/backend/unit/test_agent_utils_sql_table_names.py
rename to tests/backend/agents/test_agent_utils_sql_table_names.py
diff --git a/tests/backend/unit/test_client_image_strip.py b/tests/backend/agents/test_client_image_strip.py
similarity index 100%
rename from tests/backend/unit/test_client_image_strip.py
rename to tests/backend/agents/test_client_image_strip.py
diff --git a/tests/backend/unit/test_duckdb_notes_prompt.py b/tests/backend/agents/test_duckdb_notes_prompt.py
similarity index 100%
rename from tests/backend/unit/test_duckdb_notes_prompt.py
rename to tests/backend/agents/test_duckdb_notes_prompt.py
diff --git a/tests/backend/unit/test_model_registry.py b/tests/backend/agents/test_model_registry.py
similarity index 100%
rename from tests/backend/unit/test_model_registry.py
rename to tests/backend/agents/test_model_registry.py
diff --git a/tests/backend/unit/test_provenance_models.py b/tests/backend/agents/test_provenance_models.py
similarity index 100%
rename from tests/backend/unit/test_provenance_models.py
rename to tests/backend/agents/test_provenance_models.py
diff --git a/tests/backend/security/test_auth.py b/tests/backend/auth/test_auth.py
similarity index 96%
rename from tests/backend/security/test_auth.py
rename to tests/backend/auth/test_auth.py
index 0592887c..3e4ee64b 100644
--- a/tests/backend/security/test_auth.py
+++ b/tests/backend/auth/test_auth.py
@@ -13,9 +13,9 @@
 import flask
 import pytest
 
-import data_formulator.security.auth as auth_module
-from data_formulator.security.auth import get_identity_id, _validate_identity_value
-from data_formulator.auth_providers.azure_easyauth import AzureEasyAuthProvider
+import data_formulator.auth.identity as auth_module
+from data_formulator.auth.identity import get_identity_id, _validate_identity_value
+from data_formulator.auth.providers.azure_easyauth import AzureEasyAuthProvider
 
 pytestmark = [pytest.mark.backend]
 
diff --git a/tests/backend/integration/test_auth_info_endpoint.py b/tests/backend/auth/test_auth_info_endpoint.py
similarity index 89%
rename from tests/backend/integration/test_auth_info_endpoint.py
rename to tests/backend/auth/test_auth_info_endpoint.py
index 47dee8ef..d4a5ad65 100644
--- a/tests/backend/integration/test_auth_info_endpoint.py
+++ b/tests/backend/auth/test_auth_info_endpoint.py
@@ -11,10 +11,10 @@
 import flask
 import pytest
 
-import data_formulator.security.auth as auth_module
-from data_formulator.auth_providers.azure_easyauth import AzureEasyAuthProvider
-from data_formulator.auth_providers.github_oauth import GitHubOAuthProvider
-from data_formulator.auth_providers.oidc import OIDCProvider
+import data_formulator.auth.identity as auth_module
+from data_formulator.auth.providers.azure_easyauth import AzureEasyAuthProvider
+from data_formulator.auth.providers.github_oauth import GitHubOAuthProvider
+from data_formulator.auth.providers.oidc import OIDCProvider
 
 pytestmark = [pytest.mark.backend, pytest.mark.auth]
 
diff --git a/tests/backend/security/test_auth_provider_chain.py b/tests/backend/auth/test_auth_provider_chain.py
similarity index 94%
rename from tests/backend/security/test_auth_provider_chain.py
rename to tests/backend/auth/test_auth_provider_chain.py
index a4f49a3f..8ee70301 100644
--- a/tests/backend/security/test_auth_provider_chain.py
+++ b/tests/backend/auth/test_auth_provider_chain.py
@@ -12,16 +12,16 @@
 import flask
 import pytest
 
-import data_formulator.security.auth as auth_module
-from data_formulator.security.auth import (
+import data_formulator.auth.identity as auth_module
+from data_formulator.auth.identity import (
     get_auth_result,
     get_identity_id,
     get_sso_token,
     init_auth,
 )
-from data_formulator.auth_providers.base import AuthProvider, AuthResult, AuthenticationError
-from data_formulator.auth_providers import get_provider_class, list_available_providers
-from data_formulator.security.auth import _validate_identity_value
+from data_formulator.auth.providers.base import AuthProvider, AuthResult, AuthenticationError
+from data_formulator.auth.providers import get_provider_class, list_available_providers
+from data_formulator.auth.identity import _validate_identity_value
 
 pytestmark = [pytest.mark.backend, pytest.mark.auth]
 
@@ -43,6 +43,7 @@ def _reset_auth_state(monkeypatch):
     """Ensure each test starts with a clean auth module state."""
     monkeypatch.setattr(auth_module, "_provider", None)
     monkeypatch.setattr(auth_module, "_allow_anonymous", True)
+    monkeypatch.setattr(auth_module, "_localhost_identity", None)
 
 
 # ------------------------------------------------------------------
diff --git a/tests/backend/unit/test_azure_easyauth_provider.py b/tests/backend/auth/test_azure_easyauth_provider.py
similarity index 94%
rename from tests/backend/unit/test_azure_easyauth_provider.py
rename to tests/backend/auth/test_azure_easyauth_provider.py
index f38723e7..2e50aa5d 100644
--- a/tests/backend/unit/test_azure_easyauth_provider.py
+++ b/tests/backend/auth/test_azure_easyauth_provider.py
@@ -12,7 +12,7 @@
 import flask
 import pytest
 
-from data_formulator.auth_providers.azure_easyauth import AzureEasyAuthProvider
+from data_formulator.auth.providers.azure_easyauth import AzureEasyAuthProvider
 
 pytestmark = [pytest.mark.backend, pytest.mark.auth]
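The vault tests that follow pin down a specific key-resolution order: an explicit `CREDENTIAL_VAULT_KEY` environment variable always wins; otherwise a key file under the Data Formulator home is auto-generated on first use and reused afterwards. A stdlib-only sketch of that behavior (the function and file names here are illustrative, not the project's actual factory):

```python
# Sketch of the key-resolution order the vault factory tests verify.
# Names (resolve_vault_key, vault.key) are illustrative assumptions.
import os
import secrets
from pathlib import Path

def resolve_vault_key(home: Path) -> str:
    env_key = os.environ.get("CREDENTIAL_VAULT_KEY")
    if env_key:
        return env_key  # an explicit key always takes priority
    key_file = home / "vault.key"
    if key_file.exists():
        return key_file.read_text().strip()  # reuse the previously generated key
    key = secrets.token_urlsafe(32)  # auto-generate once on first use
    key_file.write_text(key)
    return key
```

The same resolved key feeding a cached module-level singleton is what makes `test_multiple_calls_return_same_instance` and `test_reuses_existing_key_file` hold together: two processes (or two calls) that share a home directory decrypt the same stored credentials.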
diff --git a/tests/backend/unit/test_credential_vault.py b/tests/backend/auth/test_credential_vault.py
similarity index 96%
rename from tests/backend/unit/test_credential_vault.py
rename to tests/backend/auth/test_credential_vault.py
index 4b0a352c..99a12255 100644
--- a/tests/backend/unit/test_credential_vault.py
+++ b/tests/backend/auth/test_credential_vault.py
@@ -18,7 +18,7 @@
 def _make_vault(tmp_path, key=None):
-    from data_formulator.credential_vault.local_vault import LocalCredentialVault
+    from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
     if key is None:
         key = Fernet.generate_key().decode()
diff --git a/tests/backend/unit/test_credential_vault_factory.py b/tests/backend/auth/test_credential_vault_factory.py
similarity index 72%
rename from tests/backend/unit/test_credential_vault_factory.py
rename to tests/backend/auth/test_credential_vault_factory.py
index 29f528bf..f9e59402 100644
--- a/tests/backend/unit/test_credential_vault_factory.py
+++ b/tests/backend/auth/test_credential_vault_factory.py
@@ -20,7 +20,7 @@
 @pytest.fixture(autouse=True)
 def _reset_vault_singleton():
     """Reset module-level singleton state between tests."""
-    import data_formulator.credential_vault as vault_mod
+    import data_formulator.auth.vault as vault_mod
     vault_mod._vault = None
     vault_mod._initialized = False
     yield
@@ -35,12 +35,12 @@ def test_env_key_takes_priority(self, tmp_path, monkeypatch):
         monkeypatch.setenv("CREDENTIAL_VAULT_KEY", key)
         monkeypatch.setenv("CREDENTIAL_VAULT", "local")
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
         )
 
-        from data_formulator.credential_vault import get_credential_vault
-        from data_formulator.credential_vault.local_vault import LocalCredentialVault
+        from data_formulator.auth.vault import get_credential_vault
+        from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
         vault = get_credential_vault()
         assert isinstance(vault, LocalCredentialVault)
@@ -52,12 +52,12 @@ def test_default_type_is_local(self, tmp_path, monkeypatch):
         monkeypatch.setenv("CREDENTIAL_VAULT_KEY", key)
         monkeypatch.delenv("CREDENTIAL_VAULT", raising=False)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
         )
 
-        from data_formulator.credential_vault import get_credential_vault
-        from data_formulator.credential_vault.local_vault import LocalCredentialVault
+        from data_formulator.auth.vault import get_credential_vault
+        from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
         vault = get_credential_vault()
         assert isinstance(vault, LocalCredentialVault)
@@ -70,12 +70,12 @@ def test_auto_generates_key_when_no_env(self, tmp_path, monkeypatch):
         monkeypatch.delenv("CREDENTIAL_VAULT_KEY", raising=False)
         monkeypatch.delenv("CREDENTIAL_VAULT", raising=False)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
        )
 
-        from data_formulator.credential_vault import get_credential_vault
-        from data_formulator.credential_vault.local_vault import LocalCredentialVault
+        from data_formulator.auth.vault import get_credential_vault
+        from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
         vault = get_credential_vault()
         assert isinstance(vault, LocalCredentialVault)
@@ -95,12 +95,12 @@ def test_reuses_existing_key_file(self, tmp_path, monkeypatch):
 
         monkeypatch.delenv("CREDENTIAL_VAULT_KEY", raising=False)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
         )
 
-        from data_formulator.credential_vault import get_credential_vault
-        from data_formulator.credential_vault.local_vault import LocalCredentialVault
+        from data_formulator.auth.vault import get_credential_vault
+        from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
         vault = get_credential_vault()
         assert isinstance(vault, LocalCredentialVault)
@@ -111,11 +111,11 @@ def test_auto_generated_vault_is_functional(self, tmp_path, monkeypatch):
         """Auto-generated vault can actually store and retrieve credentials."""
         monkeypatch.delenv("CREDENTIAL_VAULT_KEY", raising=False)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
         )
 
-        from data_formulator.credential_vault import get_credential_vault
+        from data_formulator.auth.vault import get_credential_vault
 
         vault = get_credential_vault()
         vault.store("user:alice", "superset", {"password": "s3cret"})
@@ -129,11 +129,11 @@ def test_unknown_type_returns_none(self, tmp_path, monkeypatch):
         monkeypatch.setenv("CREDENTIAL_VAULT_KEY", key)
         monkeypatch.setenv("CREDENTIAL_VAULT", "hashicorp")
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
        )
 
-        from data_formulator.credential_vault import get_credential_vault
+        from data_formulator.auth.vault import get_credential_vault
 
         assert get_credential_vault() is None
 
@@ -144,11 +144,11 @@ def test_multiple_calls_return_same_instance(self, tmp_path, monkeypatch):
         key = Fernet.generate_key().decode()
         monkeypatch.setenv("CREDENTIAL_VAULT_KEY", key)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
             lambda: tmp_path,
        )
 
-        from data_formulator.credential_vault import get_credential_vault
+        from data_formulator.auth.vault import get_credential_vault
 
         v1 = get_credential_vault()
         v2 = get_credential_vault()
@@ -157,12 +157,12 @@ def test_multiple_calls_return_same_instance(self, tmp_path, monkeypatch):
 
     def test_singleton_is_cached(self, tmp_path, monkeypatch):
         monkeypatch.delenv("CREDENTIAL_VAULT_KEY", raising=False)
         monkeypatch.setattr(
-            "data_formulator.credential_vault.get_data_formulator_home",
+            "data_formulator.auth.vault.get_data_formulator_home",
            lambda: tmp_path,
        )
 
-        from data_formulator.credential_vault import get_credential_vault
-        import data_formulator.credential_vault as vault_mod
+        from data_formulator.auth.vault import get_credential_vault
+        import data_formulator.auth.vault as vault_mod
 
         v1 = get_credential_vault()
         assert vault_mod._initialized is True
diff --git a/tests/backend/unit/test_flask_session_config.py b/tests/backend/auth/test_flask_session_config.py
similarity index 100%
rename from tests/backend/unit/test_flask_session_config.py
rename to tests/backend/auth/test_flask_session_config.py
diff --git a/tests/backend/unit/test_github_oauth_provider.py b/tests/backend/auth/test_github_oauth_provider.py
similarity index 95%
rename from tests/backend/unit/test_github_oauth_provider.py
rename to tests/backend/auth/test_github_oauth_provider.py
index d5f20cae..69b27ba6 100644
--- a/tests/backend/unit/test_github_oauth_provider.py
+++ b/tests/backend/auth/test_github_oauth_provider.py
@@ -11,7 +11,7 @@
 import flask
 import pytest
 
-from data_formulator.auth_providers.github_oauth import GitHubOAuthProvider
+from data_formulator.auth.providers.github_oauth import GitHubOAuthProvider
 
 pytestmark = [pytest.mark.backend, pytest.mark.auth]
 
diff --git a/tests/backend/unit/test_oidc_provider.py b/tests/backend/auth/test_oidc_provider.py
similarity index 96%
rename from tests/backend/unit/test_oidc_provider.py
rename to tests/backend/auth/test_oidc_provider.py
index 1245ee05..dd43d743 100644
--- a/tests/backend/unit/test_oidc_provider.py
+++ b/tests/backend/auth/test_oidc_provider.py
@@ -18,8 +18,8 @@
 from cryptography.hazmat.primitives.asymmetric import rsa
 from cryptography.hazmat.primitives import serialization
 
-from data_formulator.auth_providers.base import AuthenticationError
-from data_formulator.auth_providers.oidc import OIDCProvider
+from data_formulator.auth.providers.base import AuthenticationError
+from data_formulator.auth.providers.oidc import OIDCProvider
 
 pytestmark = [pytest.mark.backend, pytest.mark.auth]
 
diff --git a/tests/backend/contract/README.md b/tests/backend/contract/README.md
deleted file mode 100644
index 251f397c..00000000
--- a/tests/backend/contract/README.md
+++ /dev/null
@@ -1,16 +0,0 @@
-# Backend Contract Tests
-
-This directory contains API contract tests.
-
-Contract tests focus on guarantees rather than implementation details:
-
-- what is accepted as input
-- what is returned as output
-- which fields must remain stable
-- which compatibility guarantees must not regress
-
-For the current issue, the main guarantees to lock down are:
-
-- a Chinese table name must not be sanitized into an empty string
-- Chinese column names must not disappear at the boundary layer
-- the `table_name` returned to the frontend must remain traceable to the actual stored name
diff --git a/tests/backend/unit/test_all_loader_verification.py b/tests/backend/data/test_all_loader_verification.py
similarity index 100%
rename from tests/backend/unit/test_all_loader_verification.py
rename to tests/backend/data/test_all_loader_verification.py
diff --git a/tests/backend/unit/test_atomic_metadata_update.py b/tests/backend/data/test_atomic_metadata_update.py
similarity index 100%
rename from tests/backend/unit/test_atomic_metadata_update.py
rename to tests/backend/data/test_atomic_metadata_update.py
diff --git a/tests/backend/unit/test_data_connector_config.py b/tests/backend/data/test_data_connector_config.py
similarity index 100%
rename from tests/backend/unit/test_data_connector_config.py
rename to tests/backend/data/test_data_connector_config.py
diff --git a/tests/backend/unit/test_data_connector_framework.py b/tests/backend/data/test_data_connector_framework.py
similarity index 99%
rename from tests/backend/unit/test_data_connector_framework.py
rename to tests/backend/data/test_data_connector_framework.py
index f43fd4eb..89bd0b08 100644
--- a/tests/backend/unit/test_data_connector_framework.py
+++ b/tests/backend/data/test_data_connector_framework.py
@@ -517,7 +517,7 @@ def test_import_success(self, connected_client):
         mock_meta.row_count = 5
 
         with patch.object(DataConnector, "_get_identity", return_value="test-user"), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="test-user"), \
+             patch("data_formulator.auth.identity.get_identity_id", return_value="test-user"), \
              patch("data_formulator.workspace_factory.get_workspace") as mock_ws, \
              patch.object(MockLoader, "ingest_to_workspace", return_value=mock_meta):
             resp = connected_client.post("/api/connectors/import-data", json={
diff --git a/tests/backend/unit/test_data_connector_vault.py b/tests/backend/data/test_data_connector_vault.py
similarity index 99%
rename from tests/backend/unit/test_data_connector_vault.py
rename to tests/backend/data/test_data_connector_vault.py
index 2d362f6c..4f710935 100644
--- a/tests/backend/unit/test_data_connector_vault.py
+++ b/tests/backend/data/test_data_connector_vault.py
@@ -27,7 +27,7 @@
     DataConnector,
     connectors_bp,
 )
-from data_formulator.credential_vault.base import CredentialVault
+from data_formulator.auth.vault.base import CredentialVault
 from data_formulator.data_loader.external_data_loader import (
     CatalogNode,
     ExternalDataLoader,
diff --git a/tests/backend/integration/test_excel_fixture_parsing.py b/tests/backend/data/test_excel_fixture_parsing.py
similarity index 100%
rename from tests/backend/integration/test_excel_fixture_parsing.py
rename to tests/backend/data/test_excel_fixture_parsing.py
diff --git a/tests/backend/unit/test_external_data_loader_table_names.py b/tests/backend/data/test_external_data_loader_table_names.py
similarity index 100%
rename from tests/backend/unit/test_external_data_loader_table_names.py
rename to tests/backend/data/test_external_data_loader_table_names.py
diff --git a/tests/backend/unit/test_file_manager_encoding.py b/tests/backend/data/test_file_manager_encoding.py
similarity index 100%
rename from tests/backend/unit/test_file_manager_encoding.py
rename to tests/backend/data/test_file_manager_encoding.py
diff --git a/tests/backend/unit/test_file_manager_table_names.py b/tests/backend/data/test_file_manager_table_names.py
similarity index 100%
rename from tests/backend/unit/test_file_manager_table_names.py
rename to tests/backend/data/test_file_manager_table_names.py
diff --git a/tests/backend/unit/test_json_chinese_serialization.py b/tests/backend/data/test_json_chinese_serialization.py
similarity index 100%
rename from tests/backend/unit/test_json_chinese_serialization.py
rename to tests/backend/data/test_json_chinese_serialization.py
diff --git a/tests/backend/unit/test_parquet_utils_table_names.py b/tests/backend/data/test_parquet_utils_table_names.py
similarity index 100%
rename from tests/backend/unit/test_parquet_utils_table_names.py
rename to tests/backend/data/test_parquet_utils_table_names.py
diff --git a/tests/backend/unit/test_safe_data_filename.py b/tests/backend/data/test_safe_data_filename.py
similarity index 100%
rename from tests/backend/unit/test_safe_data_filename.py
rename to tests/backend/data/test_safe_data_filename.py
diff --git a/tests/backend/contract/test_table_name_contracts.py b/tests/backend/data/test_table_name_contracts.py
similarity index 89%
rename from tests/backend/contract/test_table_name_contracts.py
rename to tests/backend/data/test_table_name_contracts.py
index ad8eab09..a2f87099 100644
--- a/tests/backend/contract/test_table_name_contracts.py
+++ b/tests/backend/data/test_table_name_contracts.py
@@ -3,7 +3,7 @@
 import pytest
 
 from data_formulator.datalake.parquet_utils import sanitize_table_name as parquet_sanitize
-from data_formulator.tables_routes import sanitize_table_name as route_sanitize
+from data_formulator.routes.tables import sanitize_table_name as route_sanitize
 
 pytestmark =
[pytest.mark.backend, pytest.mark.contract] diff --git a/tests/backend/unit/test_unicode_table_name_sanitization.py b/tests/backend/data/test_unicode_table_name_sanitization.py similarity index 100% rename from tests/backend/unit/test_unicode_table_name_sanitization.py rename to tests/backend/data/test_unicode_table_name_sanitization.py diff --git a/tests/backend/unit/test_workspace_fresh_names.py b/tests/backend/data/test_workspace_fresh_names.py similarity index 100% rename from tests/backend/unit/test_workspace_fresh_names.py rename to tests/backend/data/test_workspace_fresh_names.py diff --git a/tests/backend/unit/test_workspace_manager.py b/tests/backend/data/test_workspace_manager.py similarity index 100% rename from tests/backend/unit/test_workspace_manager.py rename to tests/backend/data/test_workspace_manager.py diff --git a/tests/backend/unit/test_workspace_source_file_ops.py b/tests/backend/data/test_workspace_source_file_ops.py similarity index 100% rename from tests/backend/unit/test_workspace_source_file_ops.py rename to tests/backend/data/test_workspace_source_file_ops.py diff --git a/tests/backend/integration/README.md b/tests/backend/integration/README.md deleted file mode 100644 index f1d66552..00000000 --- a/tests/backend/integration/README.md +++ /dev/null @@ -1,30 +0,0 @@ -# Backend Integration Tests - -This directory contains backend integration tests. - -Good candidates for this layer: - -- Flask route tests -- table create / ingest / refresh flows -- real workspace and datalake interactions -- sandbox execution (local and Docker) - -Data loader tests (MySQL, MongoDB, PostgreSQL, BigQuery) live in -`tests/plugin/` — see that directory's README for setup instructions. 
-
-## Running
-
-```bash
-# All integration tests
-pytest tests/backend/integration/ -v
-
-# Sandbox tests only
-pytest tests/backend/integration/test_sandbox.py -v
-```
-
-# Start + run all loader tests in one shot
-./tests/run_test_dbs.sh test
-
-# Tear down
-./tests/run_test_dbs.sh stop
-```
diff --git a/tests/backend/integration/test_plugin_app_config.py b/tests/backend/integration/test_plugin_app_config.py
deleted file mode 100644
index 749869a8..00000000
--- a/tests/backend/integration/test_plugin_app_config.py
+++ /dev/null
@@ -1,182 +0,0 @@
-"""Integration tests for plugin information in ``/api/app-config``.
-
-Verifies that enabled plugins appear under the ``PLUGINS`` key in the
-``/api/app-config`` response, with their manifest + frontend config merged.
-"""
-from __future__ import annotations
-
-import types
-from unittest.mock import patch
-
-import flask
-import pytest
-
-import data_formulator.plugins as plugins_module
-from data_formulator.plugins import (
-    DISABLED_PLUGINS,
-    ENABLED_PLUGINS,
-    discover_and_register,
-)
-from data_formulator.plugins.base import DataSourcePlugin
-import data_formulator.security.auth as auth_module
-
-pytestmark = [pytest.mark.backend, pytest.mark.plugin]
-
-
-# ------------------------------------------------------------------
-# Stub plugin
-# ------------------------------------------------------------------
-
-class _DemoPlugin(DataSourcePlugin):
-
-    @staticmethod
-    def manifest():
-        return {
-            "id": "demo",
-            "name": "Demo Plugin",
-            "env_prefix": "PLG_DEMO",
-            "required_env": [],
-            "capabilities": ["datasets"],
-            "auth_modes": ["password"],
-            "icon": "demo-icon",
-            "description": "A demo plugin for testing",
-        }
-
-    def create_blueprint(self):
-        return flask.Blueprint("plugin_demo", __name__, url_prefix="/api/plugins/demo/")
-
-    def get_frontend_config(self):
-        return {"base_url": "http://demo.local", "login_url": "/api/plugins/demo/login"}
-
-
-# ------------------------------------------------------------------
-# Fixtures
-# ------------------------------------------------------------------
-
-@pytest.fixture
-def app():
-    """Minimal Flask app with /api/app-config and plugin discovery."""
-    _app = flask.Flask(__name__)
-    _app.config["TESTING"] = True
-    _app.config["CLI_ARGS"] = {
-        "sandbox": "local",
-        "disable_display_keys": False,
-        "project_front_page": False,
-        "max_display_rows": 10000,
-        "dev": False,
-        "workspace_backend": "ephemeral",
-        "available_languages": ["en"],
-    }
-
-    @_app.route("/api/app-config")
-    def get_app_config():
-        args = _app.config["CLI_ARGS"]
-        config = {
-            "SANDBOX": args["sandbox"],
-            "DISABLE_DISPLAY_KEYS": args["disable_display_keys"],
-            "PROJECT_FRONT_PAGE": args["project_front_page"],
-            "MAX_DISPLAY_ROWS": args["max_display_rows"],
-            "DEV_MODE": args.get("dev", False),
-            "WORKSPACE_BACKEND": args.get("workspace_backend", "local"),
-            "AVAILABLE_LANGUAGES": args.get("available_languages", ["en"]),
-        }
-
-        from data_formulator.plugins import ENABLED_PLUGINS as ep
-        if ep:
-            plugins_info: dict[str, dict] = {}
-            for pid, plugin in ep.items():
-                manifest = plugin.manifest()
-                frontend_cfg = plugin.get_frontend_config()
-                plugins_info[pid] = {
-                    "id": manifest.get("id", pid),
-                    "name": manifest.get("name", pid),
-                    "icon": manifest.get("icon"),
-                    "description": manifest.get("description"),
-                    "capabilities": manifest.get("capabilities", []),
-                    "auth_modes": manifest.get("auth_modes", []),
-                    **frontend_cfg,
-                }
-            config["PLUGINS"] = plugins_info
-
-        return flask.jsonify(config)
-
-    return _app
-
-
-@pytest.fixture
-def client(app):
-    return app.test_client()
-
-
-@pytest.fixture(autouse=True)
-def _clean_plugin_state():
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-    yield
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-
-
-@pytest.fixture(autouse=True)
-def _reset_auth(monkeypatch):
-    monkeypatch.setattr(auth_module, "_provider", None)
-    monkeypatch.setattr(auth_module, "_allow_anonymous", True)
-
-
-# ------------------------------------------------------------------
-# Tests
-# ------------------------------------------------------------------
-
-class TestAppConfigPlugins:
-
-    def test_no_plugins_key_when_none_enabled(self, client):
-        resp = client.get("/api/app-config")
-        data = resp.get_json()
-        assert resp.status_code == 200
-        assert "PLUGINS" not in data
-
-    def test_plugin_info_exposed_when_enabled(self, app, client):
-        plugin = _DemoPlugin()
-        ENABLED_PLUGINS["demo"] = plugin
-
-        resp = client.get("/api/app-config")
-        data = resp.get_json()
-
-        assert "PLUGINS" in data
-        demo = data["PLUGINS"]["demo"]
-        assert demo["id"] == "demo"
-        assert demo["name"] == "Demo Plugin"
-        assert demo["icon"] == "demo-icon"
-        assert demo["capabilities"] == ["datasets"]
-        assert demo["auth_modes"] == ["password"]
-        assert demo["base_url"] == "http://demo.local"
-        assert demo["login_url"] == "/api/plugins/demo/login"
-
-    def test_multiple_plugins(self, app, client):
-
-        class _SecondPlugin(DataSourcePlugin):
-            @staticmethod
-            def manifest():
-                return {
-                    "id": "second",
-                    "name": "Second",
-                    "env_prefix": "PLG_SECOND",
-                    "required_env": [],
-                }
-
-            def create_blueprint(self):
-                return flask.Blueprint("plugin_second", __name__, url_prefix="/api/plugins/second/")
-
-            def get_frontend_config(self):
-                return {"mode": "read-only"}
-
-        ENABLED_PLUGINS["demo"] = _DemoPlugin()
-        ENABLED_PLUGINS["second"] = _SecondPlugin()
-
-        resp = client.get("/api/app-config")
-        data = resp.get_json()
-
-        assert len(data["PLUGINS"]) == 2
-        assert "demo" in data["PLUGINS"]
-        assert "second" in data["PLUGINS"]
-        assert data["PLUGINS"]["second"]["mode"] == "read-only"
diff --git a/tests/backend/integration/test_plugin_auth_with_vault.py b/tests/backend/integration/test_plugin_auth_with_vault.py
deleted file mode 100644
index f3e0cf38..00000000
--- a/tests/backend/integration/test_plugin_auth_with_vault.py
+++ /dev/null
@@ -1,170 +0,0 @@
-"""Integration tests for plugin authentication + CredentialVault interplay.
-
-Verifies:
-- Vault has valid credentials → plugin auto-login succeeds (mode=vault)
-- Vault has stale credentials (password changed) → returns vault_stale
-- User manually logs in with remember=true → credentials stored in Vault
-- User manually logs in with remember=false → old Vault credentials deleted
-- Vault not configured → status endpoint still works (skips Vault step)
-"""
-from __future__ import annotations
-
-from unittest.mock import MagicMock, patch
-
-import flask
-import pytest
-from cryptography.fernet import Fernet
-
-pytestmark = [pytest.mark.backend, pytest.mark.vault, pytest.mark.plugin]
-
-
-@pytest.fixture
-def vault(tmp_path):
-    from data_formulator.credential_vault.local_vault import LocalCredentialVault
-
-    key = Fernet.generate_key().decode()
-    return LocalCredentialVault(tmp_path / "test_creds.db", key)
-
-
-@pytest.fixture
-def superset_app(vault):
-    """Flask app with Superset plugin auth routes and a mocked bridge."""
-    _app = flask.Flask(__name__)
-    _app.config["TESTING"] = True
-    _app.secret_key = "test-secret"
-
-    mock_bridge = MagicMock()
-    _app.extensions = {"plugin_superset_bridge": mock_bridge}
-
-    from data_formulator.plugins.superset.routes.auth import auth_bp
-    _app.register_blueprint(auth_bp)
-
-    yield _app, mock_bridge, vault
-
-
-class TestVaultAutoLogin:
-
-    def test_vault_credentials_valid_auto_login(self, superset_app):
-        """Vault has valid credentials → auth/status returns authenticated + mode=vault."""
-        app, mock_bridge, vault = superset_app
-
-        mock_bridge.login.return_value = {
-            "access_token": "tok_valid",
-            "refresh_token": "ref_valid",
-        }
-        mock_bridge.get_user_info.return_value = {
-            "id": 1, "username": "alice", "first_name": "Alice", "last_name": "W",
-        }
-
-        vault.store("user:alice", "superset", {"username": "alice", "password": "correct_pw"})
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=vault), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.get("/api/plugins/superset/auth/status")
-            data = resp.get_json()
-
-        assert data["authenticated"] is True
-        assert data.get("mode") == "vault"
-
-    def test_vault_credentials_stale(self, superset_app):
-        """Vault credentials are stale (password changed) → vault_stale=true."""
-        app, mock_bridge, vault = superset_app
-
-        mock_bridge.login.side_effect = Exception("Invalid credentials")
-
-        vault.store("user:alice", "superset", {"username": "alice", "password": "old_pw"})
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=vault), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.get("/api/plugins/superset/auth/status")
-            data = resp.get_json()
-
-        assert data["authenticated"] is False
-        assert data["vault_stale"] is True
-
-
-class TestLoginWithRemember:
-
-    def test_remember_true_stores_in_vault(self, superset_app):
-        """Login with remember=true → credentials written to Vault."""
-        app, mock_bridge, vault = superset_app
-
-        mock_bridge.login.return_value = {"access_token": "tok", "refresh_token": "ref"}
-        mock_bridge.get_user_info.return_value = {
-            "id": 1, "username": "alice", "first_name": "Alice", "last_name": "W",
-        }
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=vault), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.post("/api/plugins/superset/auth/login", json={
-                "username": "alice",
-                "password": "new_pw",
-                "remember": True,
-            })
-            assert resp.status_code == 200
-
-        stored = vault.retrieve("user:alice", "superset")
-        assert stored is not None
-        assert stored["username"] == "alice"
-        assert stored["password"] == "new_pw"
-
-    def test_remember_false_deletes_vault_credential(self, superset_app):
-        """Login with remember=false → old Vault credential removed."""
-        app, mock_bridge, vault = superset_app
-
-        vault.store("user:alice", "superset", {"username": "alice", "password": "old"})
-
-        mock_bridge.login.return_value = {"access_token": "tok", "refresh_token": "ref"}
-        mock_bridge.get_user_info.return_value = {
-            "id": 1, "username": "alice", "first_name": "Alice", "last_name": "W",
-        }
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=vault), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.post("/api/plugins/superset/auth/login", json={
-                "username": "alice",
-                "password": "new_pw",
-                "remember": False,
-            })
-            assert resp.status_code == 200
-
-        assert vault.retrieve("user:alice", "superset") is None
-
-
-class TestVaultNotConfigured:
-
-    def test_status_works_without_vault(self, superset_app):
-        """No Vault configured → status still works, no crash."""
-        app, mock_bridge, _ = superset_app
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=None), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.get("/api/plugins/superset/auth/status")
-            data = resp.get_json()
-
-        assert data["authenticated"] is False
-        assert "vault_stale" not in data
-
-    def test_login_remember_without_vault_still_succeeds(self, superset_app):
-        """Login with remember=true but no Vault → login works, no crash."""
-        app, mock_bridge, _ = superset_app
-
-        mock_bridge.login.return_value = {"access_token": "tok", "refresh_token": "ref"}
-        mock_bridge.get_user_info.return_value = {
-            "id": 1, "username": "alice", "first_name": "Alice", "last_name": "W",
-        }
-
-        with app.test_client() as c, \
-             patch("data_formulator.credential_vault.get_credential_vault", return_value=None), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="user:alice"):
-            resp = c.post("/api/plugins/superset/auth/login", json={
-                "username": "alice",
-                "password": "pw",
-                "remember": True,
-            })
-            assert resp.status_code == 200
diff --git a/tests/backend/unit/test_agent_diagnostics_wiring.py b/tests/backend/routes/test_agent_diagnostics_wiring.py
similarity index 100%
rename from tests/backend/unit/test_agent_diagnostics_wiring.py
rename to tests/backend/routes/test_agent_diagnostics_wiring.py
diff --git a/tests/backend/integration/test_create_table_replace_source.py b/tests/backend/routes/test_create_table_replace_source.py
similarity index 94%
rename from tests/backend/integration/test_create_table_replace_source.py
rename to tests/backend/routes/test_create_table_replace_source.py
index 30587f10..e077a83f 100644
--- a/tests/backend/integration/test_create_table_replace_source.py
+++ b/tests/backend/routes/test_create_table_replace_source.py
@@ -15,7 +15,7 @@
 from flask import Flask
 
 from data_formulator.datalake.workspace import Workspace
-from data_formulator.tables_routes import tables_bp
+from data_formulator.routes.tables import tables_bp
 
 pytestmark = [pytest.mark.backend]
@@ -34,7 +34,7 @@ def client(tmp_workspace):
     app = Flask(__name__)
     app.config["TESTING"] = True
     app.register_blueprint(tables_bp)
-    with patch("data_formulator.tables_routes._get_workspace", return_value=tmp_workspace):
+    with patch("data_formulator.routes.tables._get_workspace", return_value=tmp_workspace):
         with app.test_client() as c:
             yield c
diff --git a/tests/backend/integration/test_create_table_xls_upload.py b/tests/backend/routes/test_create_table_xls_upload.py
similarity index 95%
rename from tests/backend/integration/test_create_table_xls_upload.py
rename to tests/backend/routes/test_create_table_xls_upload.py
index 3f4bfc48..b9a9fc33 100644
--- a/tests/backend/integration/test_create_table_xls_upload.py
+++ b/tests/backend/routes/test_create_table_xls_upload.py
@@ -15,7 +15,7 @@
 from flask import Flask
 
 from data_formulator.datalake.workspace import Workspace
-from data_formulator.tables_routes import tables_bp
+from data_formulator.routes.tables import tables_bp
 
 pytestmark = [pytest.mark.backend]
@@ -35,7 +35,7 @@ def client(tmp_workspace):
     app.config["TESTING"] = True
     app.register_blueprint(tables_bp)
 
-    with patch("data_formulator.tables_routes._get_workspace", return_value=tmp_workspace):
+    with patch("data_formulator.routes.tables._get_workspace", return_value=tmp_workspace):
         with app.test_client() as c:
             yield c
diff --git a/tests/backend/integration/test_credential_routes.py b/tests/backend/routes/test_credential_routes.py
similarity index 88%
rename from tests/backend/integration/test_credential_routes.py
rename to tests/backend/routes/test_credential_routes.py
index 9e3b1f39..c03f7b6d 100644
--- a/tests/backend/integration/test_credential_routes.py
+++ b/tests/backend/routes/test_credential_routes.py
@@ -19,7 +19,7 @@
 @pytest.fixture
 def vault(tmp_path):
-    from data_formulator.credential_vault.local_vault import LocalCredentialVault
+    from data_formulator.auth.vault.local_vault import LocalCredentialVault
 
     key = Fernet.generate_key().decode()
     return LocalCredentialVault(tmp_path / "test_creds.db", key)
@@ -32,11 +32,11 @@ def app_with_vault(vault):
     _app.config["TESTING"] = True
     _app.secret_key = "test-secret"
 
-    from data_formulator.credential_routes import credential_bp
+    from data_formulator.routes.credentials import credential_bp
     _app.register_blueprint(credential_bp)
 
-    with patch("data_formulator.credential_routes.get_credential_vault", return_value=vault), \
-         patch("data_formulator.credential_routes.get_identity_id") as mock_id:
+    with patch("data_formulator.routes.credentials.get_credential_vault", return_value=vault), \
+         patch("data_formulator.routes.credentials.get_identity_id") as mock_id:
         mock_id.return_value = "user:alice"
         yield _app, mock_id
@@ -48,11 +48,11 @@ def app_no_vault():
     _app.config["TESTING"] = True
     _app.secret_key = "test-secret"
 
-    from data_formulator.credential_routes import credential_bp
+    from data_formulator.routes.credentials import credential_bp
     _app.register_blueprint(credential_bp)
 
-    with patch("data_formulator.credential_routes.get_credential_vault", return_value=None), \
-         patch("data_formulator.credential_routes.get_identity_id", return_value="user:alice"):
+    with patch("data_formulator.routes.credentials.get_credential_vault", return_value=None), \
+         patch("data_formulator.routes.credentials.get_identity_id", return_value="user:alice"):
         yield _app
diff --git a/tests/backend/integration/test_csv_encoding_roundtrip.py b/tests/backend/routes/test_csv_encoding_roundtrip.py
similarity index 95%
rename from tests/backend/integration/test_csv_encoding_roundtrip.py
rename to tests/backend/routes/test_csv_encoding_roundtrip.py
index ecfddb6d..11c05aa5 100644
--- a/tests/backend/integration/test_csv_encoding_roundtrip.py
+++ b/tests/backend/routes/test_csv_encoding_roundtrip.py
@@ -15,7 +15,7 @@
 from flask import Flask
 
 from data_formulator.datalake.workspace import Workspace
-from data_formulator.tables_routes import tables_bp
+from data_formulator.routes.tables import tables_bp
 
 pytestmark = [pytest.mark.backend]
@@ -34,7 +34,7 @@ def client(tmp_workspace):
     app = Flask(__name__)
     app.config["TESTING"] = True
     app.register_blueprint(tables_bp)
-    with patch("data_formulator.tables_routes._get_workspace", return_value=tmp_workspace):
+    with patch("data_formulator.routes.tables._get_workspace", return_value=tmp_workspace):
         with app.test_client() as c:
             yield c
diff --git a/tests/backend/integration/test_derive_data_repair_loop.py b/tests/backend/routes/test_derive_data_repair_loop.py
similarity index 97%
rename from tests/backend/integration/test_derive_data_repair_loop.py
rename to tests/backend/routes/test_derive_data_repair_loop.py
index e771abe6..5cc34aaa 100644
--- a/tests/backend/integration/test_derive_data_repair_loop.py
+++ b/tests/backend/routes/test_derive_data_repair_loop.py
@@ -16,11 +16,11 @@
 import pytest
 from flask import Flask
 
-from data_formulator.agent_routes import agent_bp
+from data_formulator.routes.agents import agent_bp
 
 pytestmark = [pytest.mark.backend]
 
-MODULE = "data_formulator.agent_routes"
+MODULE = "data_formulator.routes.agents"
 
 
 # ---------------------------------------------------------------------------
diff --git a/tests/backend/unit/test_list_global_models_api.py b/tests/backend/routes/test_list_global_models_api.py
similarity index 85%
rename from tests/backend/unit/test_list_global_models_api.py
rename to tests/backend/routes/test_list_global_models_api.py
index 9547da8c..98ed29e9 100644
--- a/tests/backend/unit/test_list_global_models_api.py
+++ b/tests/backend/routes/test_list_global_models_api.py
@@ -42,7 +42,7 @@ class TestListGlobalModelsEndpoint:
     @patch.dict(os.environ, SAMPLE_ENV, clear=True)
     def test_returns_all_configured_models(self, flask_client):
         registry = ModelRegistry()
-        with patch("data_formulator.agent_routes.model_registry", registry):
+        with patch("data_formulator.routes.agents.model_registry", registry):
             resp = flask_client.get("/api/agent/list-global-models")
             assert resp.status_code == 200
             data = json.loads(resp.data)
@@ -52,7 +52,7 @@ def test_returns_all_configured_models(self, flask_client):
     def test_response_has_required_fields(self, flask_client):
         registry = ModelRegistry()
         required = {"id", "endpoint", "model", "api_base", "api_version", "is_global"}
-        with patch("data_formulator.agent_routes.model_registry", registry):
+        with patch("data_formulator.routes.agents.model_registry", registry):
             resp = flask_client.get("/api/agent/list-global-models")
             data = json.loads(resp.data)
             for item in data:
@@ -61,7 +61,7 @@
     @patch.dict(os.environ, SAMPLE_ENV, clear=True)
     def test_no_api_key_in_response(self, flask_client):
         registry = ModelRegistry()
-        with patch("data_formulator.agent_routes.model_registry", registry):
+        with patch("data_formulator.routes.agents.model_registry", registry):
             resp = flask_client.get("/api/agent/list-global-models")
             raw = resp.data.decode("utf-8")
             assert "sk-secret-key-12345" not in raw
@@ -71,7 +71,7 @@ def test_no_api_key_in_response(self, flask_client):
     @patch.dict(os.environ, SAMPLE_ENV, clear=True)
     def test_all_models_marked_global(self, flask_client):
         registry = ModelRegistry()
-        with patch("data_formulator.agent_routes.model_registry", registry):
+        with patch("data_formulator.routes.agents.model_registry", registry):
             resp = flask_client.get("/api/agent/list-global-models")
             data = json.loads(resp.data)
             assert all(m["is_global"] is True for m in data)
@@ -79,7 +79,7 @@ def test_all_models_marked_global(self, flask_client):
     @patch.dict(os.environ, {}, clear=True)
     def test_empty_env_returns_empty_list(self, flask_client):
         registry = ModelRegistry()
-        with patch("data_formulator.agent_routes.model_registry", registry):
+        with patch("data_formulator.routes.agents.model_registry", registry):
             resp = flask_client.get("/api/agent/list-global-models")
             data = json.loads(resp.data)
             assert data == []
diff --git a/tests/backend/integration/test_parse_file_endpoint.py b/tests/backend/routes/test_parse_file_endpoint.py
similarity index 94%
rename from tests/backend/integration/test_parse_file_endpoint.py
rename to tests/backend/routes/test_parse_file_endpoint.py
index 393db11e..008d57d7 100644
--- a/tests/backend/integration/test_parse_file_endpoint.py
+++ b/tests/backend/routes/test_parse_file_endpoint.py
@@ -8,7 +8,7 @@
 import pytest
 from flask import Flask
 
-from data_formulator.tables_routes import tables_bp
+from data_formulator.routes.tables import tables_bp
 
 pytestmark = [pytest.mark.backend]
diff --git a/tests/backend/contract/test_same_basename_upload.py b/tests/backend/routes/test_same_basename_upload.py
similarity index 96%
rename from tests/backend/contract/test_same_basename_upload.py
rename to tests/backend/routes/test_same_basename_upload.py
index 5e7f88c7..aca525c3 100644
--- a/tests/backend/contract/test_same_basename_upload.py
+++ b/tests/backend/routes/test_same_basename_upload.py
@@ -27,7 +27,7 @@
 from data_formulator.datalake.parquet_utils import (
     sanitize_table_name as parquet_sanitize_table_name,
 )
-from data_formulator.tables_routes import tables_bp
+from data_formulator.routes.tables import tables_bp
 
 pytestmark = [pytest.mark.backend, pytest.mark.contract]
@@ -46,7 +46,7 @@ def client(tmp_workspace):
     app = Flask(__name__)
     app.config["TESTING"] = True
     app.register_blueprint(tables_bp)
-    with patch("data_formulator.tables_routes._get_workspace", return_value=tmp_workspace):
+    with patch("data_formulator.routes.tables._get_workspace", return_value=tmp_workspace):
         with app.test_client() as c:
             yield c
diff --git a/tests/backend/unit/test_session_routes_migration.py b/tests/backend/routes/test_session_routes_migration.py
similarity index 77%
rename from tests/backend/unit/test_session_routes_migration.py
rename to tests/backend/routes/test_session_routes_migration.py
index 898b54c2..54c88529 100644
--- a/tests/backend/unit/test_session_routes_migration.py
+++ b/tests/backend/routes/test_session_routes_migration.py
@@ -8,7 +8,7 @@
 import flask
 import pytest
 
-from data_formulator.session_routes import session_bp
+from data_formulator.routes.sessions import session_bp
 
 pytestmark = [pytest.mark.backend]
@@ -36,9 +36,9 @@ def test_migrate_moves_and_cleans_source(self, client):
         target_mgr.move_workspaces_from.return_value = ["ws_a", "ws_b"]
 
         with (
-            patch("data_formulator.session_routes.get_identity_id", return_value="user:alice"),
+            patch("data_formulator.routes.sessions.get_identity_id", return_value="user:alice"),
             patch(
-                "data_formulator.session_routes.get_workspace_manager",
+                "data_formulator.routes.sessions.get_workspace_manager",
                 side_effect=[source_mgr, target_mgr],
             ),
         ):
@@ -55,7 +55,7 @@ def test_migrate_moves_and_cleans_source(self, client):
         source_mgr.delete_all_workspaces.assert_called_once()
 
     def test_migrate_rejects_non_user(self, client):
-        with patch("data_formulator.session_routes.get_identity_id", return_value="browser:abc"):
+        with patch("data_formulator.routes.sessions.get_identity_id", return_value="browser:abc"):
             resp = client.post(
                 "/api/sessions/migrate",
                 json={"source_identity": "browser:xyz"},
@@ -71,8 +71,8 @@ def test_cleanup_success(self, client):
         source_mgr.delete_all_workspaces.return_value = 3
 
         with (
-            patch("data_formulator.session_routes.get_identity_id", return_value="user:alice"),
-            patch("data_formulator.session_routes.get_workspace_manager", return_value=source_mgr),
+            patch("data_formulator.routes.sessions.get_identity_id", return_value="user:alice"),
+            patch("data_formulator.routes.sessions.get_workspace_manager", return_value=source_mgr),
         ):
             resp = client.post(
                 "/api/sessions/cleanup-anonymous",
@@ -86,7 +86,7 @@ def test_cleanup_success(self, client):
         source_mgr.delete_all_workspaces.assert_called_once()
 
     def test_cleanup_rejects_non_user(self, client):
-        with patch("data_formulator.session_routes.get_identity_id", return_value="browser:abc"):
+        with patch("data_formulator.routes.sessions.get_identity_id", return_value="browser:abc"):
             resp = client.post(
                 "/api/sessions/cleanup-anonymous",
                 json={"source_identity": "browser:xyz"},
diff --git a/tests/backend/security/test_global_model_security.py b/tests/backend/security/test_global_model_security.py
index e9330a32..d0961dac 100644
--- a/tests/backend/security/test_global_model_security.py
+++ b/tests/backend/security/test_global_model_security.py
@@ -37,8 +37,8 @@ def test_global_model_gets_real_api_key(self):
         resolved to the full config with the real api_key."""
         registry = ModelRegistry()
 
-        with patch("data_formulator.agent_routes.model_registry", registry):
-            from data_formulator.agent_routes import get_client
+        with patch("data_formulator.routes.agents.model_registry", registry):
+            from data_formulator.routes.agents import get_client
 
             client = get_client({
                 "id": "global-openai-gpt-4o",
@@ -55,8 +55,8 @@ def test_user_model_keeps_own_credentials(self):
         not touch the registry."""
         registry = ModelRegistry()
 
-        with patch("data_formulator.agent_routes.model_registry", registry):
-            from data_formulator.agent_routes import get_client
+        with patch("data_formulator.routes.agents.model_registry", registry):
+            from data_formulator.routes.agents import get_client
 
             client = get_client({
                 "id": "user-custom-model",
@@ -75,8 +75,8 @@ def test_global_model_without_registry_match_falls_through(self):
         still work (using whatever config was passed)."""
         registry = ModelRegistry()
 
-        with patch("data_formulator.agent_routes.model_registry", registry):
-            from data_formulator.agent_routes import get_client
+        with patch("data_formulator.routes.agents.model_registry", registry):
+            from data_formulator.routes.agents import get_client
 
             client = get_client({
                 "id": "global-nonexistent-model",
diff --git a/tests/backend/integration/test_sandbox.py b/tests/backend/security/test_sandbox.py
similarity index 100%
rename from tests/backend/integration/test_sandbox.py
rename to tests/backend/security/test_sandbox.py
diff --git a/tests/backend/unit/README.md b/tests/backend/unit/README.md
deleted file mode 100644
index 30f457cd..00000000
--- a/tests/backend/unit/README.md
+++ /dev/null
@@ -1,20 +0,0 @@
-# Backend Unit Tests
-
-This directory contains pure backend unit tests.
-
-Good candidates for this layer:
-
-- pure `sanitize_*` functions
-- DataFrame / schema utility helpers
-- logic that does not require a real Flask request context
-
-Naming guidelines:
-
-- prefer one file per concern
-- name files by behavior or capability, not by large source filenames
-
-Examples:
-
-- `test_unicode_table_name_sanitization.py`
-- `test_unicode_column_name_handling.py`
-- `test_workspace_name_generation.py`
diff --git a/tests/backend/unit/test_plugin_data_writer.py b/tests/backend/unit/test_plugin_data_writer.py
deleted file mode 100644
index a3924fe7..00000000
--- a/tests/backend/unit/test_plugin_data_writer.py
+++ /dev/null
@@ -1,166 +0,0 @@
-"""Tests for :class:`PluginDataWriter`.
-
-Verifies that the writer:
-
-* Resolves workspace via identity and writes Parquet via ``workspace.write_parquet``
-* Stamps ``loader_type = "plugin:"`` on all tables
-* Handles ``overwrite=True`` (default) and ``overwrite=False`` (auto-suffix)
-* Returns the correct response dict shape
-"""
-from __future__ import annotations
-
-from dataclasses import dataclass, field
-from typing import Any, Optional
-from unittest.mock import MagicMock, patch
-
-import pandas as pd
-import pytest
-
-from data_formulator.plugins.data_writer import PluginDataWriter
-
-pytestmark = [pytest.mark.backend, pytest.mark.plugin]
-
-
-# ------------------------------------------------------------------
-# Fake workspace / table metadata
-# ------------------------------------------------------------------
-
-@dataclass
-class _FakeColumnInfo:
-    name: str
-    dtype: str = "string"
-    semantic_type: Optional[str] = None
-
-
-@dataclass
-class _FakeTableMetadata:
-    name: str
-    row_count: int
-    columns: list[_FakeColumnInfo] = field(default_factory=list)
-    file_size: int = 0
-
-
-class _FakeWorkspace:
-    """Minimal stand-in for ``Workspace`` with a dict of known tables."""
-
-    def __init__(self, existing_tables: list[str] | None = None):
-        self._tables = set(existing_tables or [])
-
-    def list_tables(self) -> list[str]:
-        return list(self._tables)
-
-    def write_parquet(
-        self,
-        df: pd.DataFrame,
-        table_name: str,
-        *,
-        compression: str = "snappy",
-        source_info: dict[str, Any] | None = None,
-    ) -> _FakeTableMetadata:
-        self._tables.add(table_name)
-        return _FakeTableMetadata(
-            name=table_name,
-            row_count=len(df),
-            columns=[_FakeColumnInfo(name=c) for c in df.columns],
-        )
-
-
-# ------------------------------------------------------------------
-# Fixtures
-# ------------------------------------------------------------------
-
-@pytest.fixture
-def writer():
-    return PluginDataWriter("superset")
-
-
-@pytest.fixture
-def sample_df():
-    return pd.DataFrame({"city": ["Seattle", "Portland"], "pop": [750_000, 650_000]})
-
-
-# ------------------------------------------------------------------
-# Tests — basic write
-# ------------------------------------------------------------------
-
-class TestBasicWrite:
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:alice")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_write_returns_expected_shape(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = _FakeWorkspace()
-        mock_get_ws.return_value = ws
-
-        result = writer.write_dataframe(sample_df, "cities")
-
-        assert result["table_name"] == "cities"
-        assert result["row_count"] == 2
-        assert len(result["columns"]) == 2
-        assert result["is_renamed"] is False
-        mock_get_ws.assert_called_once_with("user:alice")
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:bob")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_source_info_stamped(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = MagicMock(spec=_FakeWorkspace)
-        ws.write_parquet.return_value = _FakeTableMetadata(
-            name="sales", row_count=2, columns=[]
-        )
-        mock_get_ws.return_value = ws
-
-        writer.write_dataframe(sample_df, "sales")
-
-        _, kwargs = ws.write_parquet.call_args
-        meta = kwargs["source_info"]
-        assert meta["loader_type"] == "plugin:superset"
-        assert meta["source_table"] == "sales"
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:carol")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_source_metadata_forwarded(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = MagicMock(spec=_FakeWorkspace)
-        ws.write_parquet.return_value = _FakeTableMetadata(
-            name="t", row_count=2, columns=[]
-        )
-        mock_get_ws.return_value = ws
-
-        writer.write_dataframe(
-            sample_df, "t", source_metadata={"dashboard_id": 42}
-        )
-
-        _, kwargs = ws.write_parquet.call_args
-        assert kwargs["source_info"]["loader_params"] == {"dashboard_id": 42}
-
-
-# ------------------------------------------------------------------
-# Tests — overwrite / collision avoidance
-# ------------------------------------------------------------------
-
-class TestCollisionAvoidance:
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:x")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_overwrite_true_replaces(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = _FakeWorkspace(existing_tables=["cities"])
-        mock_get_ws.return_value = ws
-
-        result = writer.write_dataframe(sample_df, "cities", overwrite=True)
-        assert result["table_name"] == "cities"
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:x")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_overwrite_false_auto_suffix(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = _FakeWorkspace(existing_tables=["cities", "cities_1"])
-        mock_get_ws.return_value = ws
-
-        result = writer.write_dataframe(sample_df, "cities", overwrite=False)
-        assert result["table_name"] == "cities_2"
-
-    @patch("data_formulator.plugins.data_writer.get_identity_id", return_value="user:x")
-    @patch("data_formulator.plugins.data_writer.get_workspace")
-    def test_overwrite_false_no_collision(self, mock_get_ws, _mock_id, writer, sample_df):
-        ws = _FakeWorkspace(existing_tables=[])
-        mock_get_ws.return_value = ws
-
-        result = writer.write_dataframe(sample_df, "new_table", overwrite=False)
-        assert result["table_name"] == "new_table"
diff --git a/tests/backend/unit/test_plugin_discovery.py b/tests/backend/unit/test_plugin_discovery.py
deleted file mode 100644
index f8c8951c..00000000
--- a/tests/backend/unit/test_plugin_discovery.py
+++ /dev/null
@@ -1,289 +0,0 @@
-"""Tests for the data source plugin discovery and registration system.
-
-Verifies that :func:`discover_and_register`:
-
-* Discovers concrete ``DataSourcePlugin`` sub-packages under ``plugins/``
-* Gates enablement on ``required_env`` from the manifest
-* Registers Flask Blueprints for enabled plugins
-* Populates ``ENABLED_PLUGINS`` and ``DISABLED_PLUGINS`` dicts
-* Gracefully handles missing env vars, import errors, and broken manifests
-"""
-from __future__ import annotations
-
-import types
-from unittest.mock import MagicMock, patch
-
-import flask
-import pytest
-
-from data_formulator.plugins.base import DataSourcePlugin
-import data_formulator.plugins as plugins_module
-from data_formulator.plugins import (
-    DISABLED_PLUGINS,
-    ENABLED_PLUGINS,
-    discover_and_register,
-)
-
-pytestmark = [pytest.mark.backend, pytest.mark.plugin]
-
-
-# ------------------------------------------------------------------
-# Helpers — minimal concrete plugin subclass
-# ------------------------------------------------------------------
-
-class _StubPlugin(DataSourcePlugin):
-    """Minimal concrete plugin for test purposes."""
-
-    @staticmethod
-    def manifest():
-        return {
-            "id": "stub",
-            "name": "Stub Plugin",
-            "env_prefix": "PLG_STUB",
-            "required_env": ["PLG_STUB_HOST"],
-        }
-
-    def create_blueprint(self):
-        bp = flask.Blueprint("plugin_stub", __name__, url_prefix="/api/plugins/stub/")
-
-        @bp.route("/ping")
-        def ping():
-            return flask.jsonify({"ok": True})
-
-        return bp
-
-    def get_frontend_config(self):
-        return {"base_url": "http://stub.local"}
-
-
-class _NoEnvPlugin(DataSourcePlugin):
-    """Plugin with no required_env — always enabled."""
-
-    @staticmethod
-    def manifest():
-        return {
-            "id": "no_env",
-            "name": "Always-On",
-            "env_prefix": "PLG_NOENV",
-            "required_env": [],
-        }
-
-    def create_blueprint(self):
-        return flask.Blueprint("plugin_noenv", __name__, url_prefix="/api/plugins/no_env/")
-
-    def get_frontend_config(self):
-        return {}
-
-
-# ------------------------------------------------------------------
-# Fixtures
-# ------------------------------------------------------------------
-
-@pytest.fixture
-def app():
-    _app = flask.Flask(__name__)
-    _app.config["TESTING"] = True
-    return _app
-
-
-@pytest.fixture(autouse=True)
-def _clean_plugin_state():
-    """Reset global dicts before every test."""
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-    yield
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-
-
-def _fake_iter_modules(path):
-    """Simulate pkgutil.iter_modules finding one sub-package."""
-    yield None, "stub", True
-
-
-def _fake_iter_modules_noenv(path):
-    yield None, "no_env", True
-
-
-# ------------------------------------------------------------------
-# Tests — enablement gating
-# ------------------------------------------------------------------
-
-class TestPluginEnablement:
-
-    def test_enabled_when_env_set(self, app, monkeypatch):
-        monkeypatch.setenv("PLG_STUB_HOST", "http://stub.example.com")
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _StubPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-        ):
-            discover_and_register(app)
-
-        assert "stub" in ENABLED_PLUGINS
-        assert isinstance(ENABLED_PLUGINS["stub"], _StubPlugin)
-        assert "stub" not in DISABLED_PLUGINS
-
-    def test_disabled_when_env_missing(self, app, monkeypatch):
-        monkeypatch.delenv("PLG_STUB_HOST", raising=False)
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _StubPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-        ):
-            discover_and_register(app)
-
-        assert "stub" not in ENABLED_PLUGINS
-        assert "stub" in DISABLED_PLUGINS
-        assert "PLG_STUB_HOST" in DISABLED_PLUGINS["stub"]
-
-    def test_always_enabled_when_no_required_env(self, app):
-        noenv_mod = types.ModuleType("data_formulator.plugins.no_env")
-        noenv_mod.plugin_class = _NoEnvPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules_noenv),
-            patch("importlib.import_module", return_value=noenv_mod),
-        ):
-            discover_and_register(app)
-
-        assert "no_env" in ENABLED_PLUGINS
-
-
-# ------------------------------------------------------------------
-# Tests — Blueprint registration
-# ------------------------------------------------------------------
-
-class TestBlueprintRegistration:
-
-    def test_blueprint_registered_on_app(self, app, monkeypatch):
-        monkeypatch.setenv("PLG_STUB_HOST", "http://stub.example.com")
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _StubPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-        ):
-            discover_and_register(app)
-
-        assert "plugin_stub" in app.blueprints
-
-    def test_ping_route_accessible(self, app, monkeypatch):
-        monkeypatch.setenv("PLG_STUB_HOST", "http://stub.example.com")
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _StubPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-        ):
-            discover_and_register(app)
-
-        with app.test_client() as client:
-            resp = client.get("/api/plugins/stub/ping")
-            assert resp.status_code == 200
-            assert resp.get_json() == {"ok": True}
-
-
-# ------------------------------------------------------------------
-# Tests — on_enable callback
-# ------------------------------------------------------------------
-
-class TestOnEnableCallback:
-
-    def test_on_enable_called_with_app(self, app, monkeypatch):
-        monkeypatch.setenv("PLG_STUB_HOST", "http://stub.example.com")
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _StubPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-            patch.object(_StubPlugin, "on_enable") as mock_on_enable,
-        ):
-            discover_and_register(app)
-            mock_on_enable.assert_called_once_with(app)
-
-
-# ------------------------------------------------------------------
-# Tests — error handling
-# ------------------------------------------------------------------
-
-class TestErrorHandling:
-
-    def test_import_error_is_caught(self, app):
-        def _fail_import(name):
-            raise ImportError(name="missing_lib")
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", side_effect=_fail_import),
-        ):
-            discover_and_register(app)
-
-        assert "stub" in DISABLED_PLUGINS
-        assert "Missing dependency" in DISABLED_PLUGINS["stub"]
-        assert len(ENABLED_PLUGINS) == 0
-
-    def test_broken_manifest_is_caught(self, app, monkeypatch):
-        monkeypatch.setenv("PLG_STUB_HOST", "http://stub.example.com")
-
-        class _BrokenPlugin(DataSourcePlugin):
-            @staticmethod
-            def manifest():
-                raise RuntimeError("boom")
-
-            def create_blueprint(self):
-                return flask.Blueprint("x", __name__)
-
-            def get_frontend_config(self):
-                return {}
-
-        stub_mod = types.ModuleType("data_formulator.plugins.stub")
-        stub_mod.plugin_class = _BrokenPlugin
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_fake_iter_modules),
-            patch("importlib.import_module", return_value=stub_mod),
-        ):
-            discover_and_register(app)
-
-        assert "stub" in DISABLED_PLUGINS
-        assert "manifest() failed" in DISABLED_PLUGINS["stub"]
-
-    def test_non_package_modules_skipped(self, app):
-        """Files (not sub-packages) under plugins/ should be ignored."""
-
-        def _non_pkg(_path):
-            yield None, "helper_utils", False
-
-        with patch("pkgutil.iter_modules", side_effect=_non_pkg):
-            discover_and_register(app)
-
-        assert len(ENABLED_PLUGINS) == 0
-        assert len(DISABLED_PLUGINS) == 0
-
-    def test_module_without_plugin_class_skipped(self, app):
-        empty_mod = types.ModuleType("data_formulator.plugins.empty")
-
-        def _iter(_path):
-            yield None, "empty", True
-
-        with (
-            patch("pkgutil.iter_modules", side_effect=_iter),
-            patch("importlib.import_module", return_value=empty_mod),
-        ):
-            discover_and_register(app)
-
-        assert len(ENABLED_PLUGINS) == 0
-        assert len(DISABLED_PLUGINS) == 0
diff --git a/tests/backend/unit/test_superset_plugin.py b/tests/backend/unit/test_superset_plugin.py
deleted file mode 100644
index 9ef740d7..00000000
--- a/tests/backend/unit/test_superset_plugin.py
+++ /dev/null
@@ -1,369 +0,0 @@
-"""Tests for the Superset data source plugin.
-
-Covers:
-- Plugin manifest & discovery integration
-- Blueprint registration and route accessibility
-- Session helper isolation (plugin-namespaced keys)
-- Auth routes (login, logout, me, status)
-- Catalog route contract (mocked Superset API)
-- Data route contract (mocked SQL Lab → PluginDataWriter)
-"""
-from __future__ import annotations
-
-import json
-from unittest.mock import MagicMock, patch
-
-import flask
-import pytest
-
-import data_formulator.plugins as plugins_module
-from data_formulator.plugins import (
-    DISABLED_PLUGINS,
-    ENABLED_PLUGINS,
-    discover_and_register,
-)
-from data_formulator.plugins.superset import SupersetPlugin
-from data_formulator.plugins.superset.session_helpers import (
-    KEY_REFRESH_TOKEN,
-    KEY_TOKEN,
-    KEY_USER,
-    clear_session,
-    get_token,
-    get_user,
-    save_session,
-)
-
-pytestmark = [pytest.mark.backend, pytest.mark.plugin]
-
-
-# ------------------------------------------------------------------
-# Fixtures
-# ------------------------------------------------------------------
-
-@pytest.fixture
-def app(monkeypatch):
-    """Flask app with Superset plugin registered via on_enable.
-
-    Vault is disabled (returns None) so that auth routes don't require
-    X-Identity-Id headers — Vault integration is tested separately in
-    test_plugin_auth_with_vault.py.
-    """
-    monkeypatch.setenv("PLG_SUPERSET_URL", "http://superset.test")
-
-    _app = flask.Flask(__name__)
-    _app.config["TESTING"] = True
-    _app.secret_key = "test-secret"
-
-    plugin = SupersetPlugin()
-    bp = plugin.create_blueprint()
-    _app.register_blueprint(bp)
-    plugin.on_enable(_app)
-
-    # Disable Vault so auth routes skip identity-dependent code paths
-    monkeypatch.setattr(
-        "data_formulator.credential_vault.get_credential_vault",
-        lambda: None,
-    )
-
-    return _app
-
-
-@pytest.fixture
-def client(app):
-    return app.test_client()
-
-
-@pytest.fixture(autouse=True)
-def _clean_plugin_state():
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-    yield
-    ENABLED_PLUGINS.clear()
-    DISABLED_PLUGINS.clear()
-
-
-# ------------------------------------------------------------------
-# Manifest & discovery
-# ------------------------------------------------------------------
-
-class TestSupersetManifest:
-
-    def test_manifest_keys(self):
-        m = SupersetPlugin.manifest()
-        assert m["id"] == "superset"
-        assert "PLG_SUPERSET_URL" in m["required_env"]
-        assert "datasets" in m["capabilities"]
-        assert "password" in m["auth_modes"]
-
-    def test_discovery_disabled_without_env(self, monkeypatch):
-        monkeypatch.delenv("PLG_SUPERSET_URL", raising=False)
-
-        _app = flask.Flask(__name__)
-        _app.config["TESTING"] = True
-
-        with (
-            patch("pkgutil.iter_modules") as mock_iter,
-            patch("importlib.import_module") as mock_import,
-        ):
-            import types
-            stub = types.ModuleType("data_formulator.plugins.superset")
-            stub.plugin_class = SupersetPlugin
-            mock_iter.return_value = [(None, "superset", True)]
-            mock_import.return_value = stub
-
-            discover_and_register(_app)
-
-        assert "superset" not in ENABLED_PLUGINS
-        assert "superset" in DISABLED_PLUGINS
-        assert "PLG_SUPERSET_URL" in DISABLED_PLUGINS["superset"]
-
-    def test_discovery_enabled_with_env(self, monkeypatch):
-        monkeypatch.setenv("PLG_SUPERSET_URL", "http://superset.test")
-
-        _app = flask.Flask(__name__)
-        _app.config["TESTING"] = True
-        _app.secret_key = "test"
-
-        with (
-            patch("pkgutil.iter_modules") as mock_iter,
-            patch("importlib.import_module") as mock_import,
-        ):
-            import types
-            stub = types.ModuleType("data_formulator.plugins.superset")
-            stub.plugin_class = SupersetPlugin
-            mock_iter.return_value = [(None, "superset", True)]
-            mock_import.return_value = stub
-
-            discover_and_register(_app)
-
-        assert "superset" in ENABLED_PLUGINS
-
-
-# ------------------------------------------------------------------
-# Blueprint registration
-# ------------------------------------------------------------------
-
-class TestBlueprintRegistration:
-
-    def test_auth_routes_registered(self, app):
-        rules = [r.rule for r in app.url_map.iter_rules()]
-        assert "/api/plugins/superset/auth/login" in rules
-        assert "/api/plugins/superset/auth/me" in rules
-        assert "/api/plugins/superset/auth/logout" in rules
-        assert "/api/plugins/superset/auth/status" in rules
-
-    def test_catalog_routes_registered(self, app):
-        rules = [r.rule for r in app.url_map.iter_rules()]
-        assert "/api/plugins/superset/catalog/datasets" in rules
-        assert "/api/plugins/superset/catalog/dashboards" in rules
-
-    def test_data_routes_registered(self, app):
-        rules = [r.rule for r in app.url_map.iter_rules()]
-        assert "/api/plugins/superset/data/load-dataset" in rules
-
-
-# ------------------------------------------------------------------
-# Session helpers
-# ------------------------------------------------------------------
-
-class TestSessionHelpers:
-
-    def test_save_and_read_session(self, app):
-        with app.test_request_context():
-            save_session("tok123", {"id": 1, "username": "alice"}, "refresh456")
-            assert get_token() == "tok123"
-            assert get_user()["username"] == "alice"
-
-    def test_clear_session(self, app):
-        with app.test_request_context():
-            save_session("tok", {"id": 1}, "ref")
-            clear_session()
-            assert get_token() is None
-            assert get_user() is None
-
-    def test_keys_are_namespaced(self):
-        assert "plugin_superset_" in KEY_TOKEN
-        assert "plugin_superset_" in KEY_USER
-
-
-# ------------------------------------------------------------------
-# Auth routes
-# ------------------------------------------------------------------
-
-class TestAuthRoutes:
-
-    def test_login_missing_credentials(self, client):
-        resp = client.post(
-            "/api/plugins/superset/auth/login",
-            json={"username": ""},
-        )
-        assert resp.status_code == 400
-
-    def test_login_success(self, client, app):
-        bridge = app.extensions["plugin_superset_bridge"]
-        bridge.login = MagicMock(return_value={
-            "access_token": "jwt-token",
-            "refresh_token": "refresh-token",
-        })
-        bridge.get_user_info = MagicMock(return_value={
-            "id": 1,
-            "username": "admin",
-            "first_name": "Admin",
-            "last_name": "User",
-        })
-
-        resp = client.post(
-            "/api/plugins/superset/auth/login",
-            json={"username": "admin", "password": "admin"},
-        )
-        assert resp.status_code == 200
-        data = resp.get_json()
-        assert data["status"] == "ok"
-        assert data["user"]["username"] == "admin"
-
-    def test_me_unauthenticated(self, client):
-        resp = client.get("/api/plugins/superset/auth/me")
-        assert resp.status_code == 401
-
-    def test_status_unauthenticated(self, client):
-        resp = client.get("/api/plugins/superset/auth/status")
-        assert resp.status_code == 200
-        data = resp.get_json()
-        assert data["authenticated"] is False
-
-    def test_logout(self, client, app):
-        bridge = app.extensions["plugin_superset_bridge"]
-        bridge.login = MagicMock(return_value={"access_token": "tok", "refresh_token": "ref"})
-        bridge.get_user_info = MagicMock(return_value={"id": 1, "username": "a"})
-        client.post("/api/plugins/superset/auth/login", json={"username": "a", "password": "p"})
-
-        resp = client.post("/api/plugins/superset/auth/logout")
-        assert resp.status_code == 200
-
-        resp = client.get("/api/plugins/superset/auth/me")
-        assert resp.status_code == 401
-
-
-# ------------------------------------------------------------------
-# Catalog routes (mocked)
-# ------------------------------------------------------------------
-
-class TestCatalogRoutes:
-
-    def _login(self, client, app):
-        bridge = app.extensions["plugin_superset_bridge"]
-        bridge.login = MagicMock(return_value={"access_token": "valid-jwt.eyJleHAiOjk5OTk5OTk5OTl9.sig", "refresh_token": "ref"})
-        bridge.get_user_info = MagicMock(return_value={"id": 1, "username": "admin"})
-        client.post("/api/plugins/superset/auth/login", json={"username": "admin", "password": "p"})
-
-    def test_datasets_unauthenticated(self, client):
-        resp = client.get("/api/plugins/superset/catalog/datasets")
-        assert resp.status_code == 401
-
-    def test_datasets_success(self, client, app):
-        self._login(client, app)
-        catalog = app.extensions["plugin_superset_catalog"]
-        catalog.get_catalog_summary = MagicMock(return_value=[
-            {"id": 1, "name": "sales", "column_count": 5},
-        ])
-
-        resp = client.get("/api/plugins/superset/catalog/datasets")
-        assert resp.status_code == 200
-        data = resp.get_json()
-        assert data["status"] == "ok"
-        assert data["count"] == 1
-
-
-# ------------------------------------------------------------------
-# Data routes (mocked)
-# ------------------------------------------------------------------
-
-class TestDataRoutes:

-    def _login(self, client, app):
-        bridge = app.extensions["plugin_superset_bridge"]
-        bridge.login = MagicMock(return_value={"access_token": "valid-jwt.eyJleHAiOjk5OTk5OTk5OTl9.sig", "refresh_token": "ref"})
-        bridge.get_user_info = MagicMock(return_value={"id": 1, "username": "admin"})
-        client.post("/api/plugins/superset/auth/login", json={"username": "admin", "password": "p"})
-
-    def test_load_unauthenticated(self, client):
-        resp = client.post(
-            "/api/plugins/superset/data/load-dataset",
-            json={"dataset_id": 1},
-        )
-        assert resp.status_code == 401
-
-    def test_load_missing_dataset_id(self, client, app):
-        self._login(client, app)
-        resp = client.post(
-            "/api/plugins/superset/data/load-dataset",
-            json={},
-        )
-        assert resp.status_code == 400
-
-    @patch("data_formulator.plugins.superset.routes.data.PluginDataWriter")
-    def test_load_success(self, MockWriter, client, app):
-        self._login(client, app)
-
-        sc = app.extensions["plugin_superset_client"]
-        sc.get_dataset_detail = MagicMock(return_value={
-            "table_name": "test_table",
-            "database": {"id": 1},
-            "schema": "public",
-            "kind": "physical",
-            "columns": [{"column_name": "id"}, {"column_name": "name"}],
-        })
-        sc.create_sql_session = MagicMock(return_value=MagicMock())
-        sc.execute_sql_with_session = MagicMock(return_value={
-            "data": [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}],
-            "columns": [{"column_name": "id"}, {"column_name": "name"}],
-        })
-
-        mock_writer_inst = MagicMock()
-        mock_writer_inst.write_dataframe.return_value = {
-            "table_name": "test_table",
-            "row_count": 2,
-            "columns": [{"name": "id", "type": "int64"}, {"name": "name", "type": "object"}],
-            "is_renamed": False,
-        }
-        MockWriter.return_value = mock_writer_inst
-
-        resp = client.post(
-            "/api/plugins/superset/data/load-dataset",
-            json={"dataset_id": 42},
-        )
-        assert resp.status_code == 200
-        data = resp.get_json()
-        assert data["status"] == "ok"
-        assert data["row_count"] == 2
-        assert data["table_name"] == "test_table"
-        mock_writer_inst.write_dataframe.assert_called_once()
-
-
-# ------------------------------------------------------------------
-# Frontend config
-# ------------------------------------------------------------------
-
-class TestFrontendConfig:
-
-    def test_config_has_required_urls(self, monkeypatch):
-        monkeypatch.setenv("PLG_SUPERSET_URL", "http://superset.test")
-        plugin = SupersetPlugin()
-        cfg = plugin.get_frontend_config()
-        assert cfg["base_url"] == "http://superset.test"
-        assert "/api/plugins/superset/auth/login" in cfg["auth_url"]
-        assert "/api/plugins/superset/catalog/datasets" in cfg["catalog_url"]
-
-    def test_sso_url_derived_from_base(self, monkeypatch):
-        monkeypatch.setenv("PLG_SUPERSET_URL",
"http://superset.test")
-        monkeypatch.delenv("PLG_SUPERSET_SSO_LOGIN_URL", raising=False)
-        plugin = SupersetPlugin()
-        cfg = plugin.get_frontend_config()
-        assert cfg["sso_login_url"] == "http://superset.test/df-sso-bridge/"
-
-    def test_sso_url_override(self, monkeypatch):
-        monkeypatch.setenv("PLG_SUPERSET_URL", "http://superset.test")
-        monkeypatch.setenv("PLG_SUPERSET_SSO_LOGIN_URL", "http://custom-sso.test/bridge")
-        plugin = SupersetPlugin()
-        cfg = plugin.get_frontend_config()
-        assert cfg["sso_login_url"] == "http://custom-sso.test/bridge"
diff --git a/tests/plugin/README.md b/tests/database-dockers/README.md
similarity index 100%
rename from tests/plugin/README.md
rename to tests/database-dockers/README.md
diff --git a/tests/plugin/test_bigquery/Dockerfile b/tests/database-dockers/bigquery/Dockerfile
similarity index 100%
rename from tests/plugin/test_bigquery/Dockerfile
rename to tests/database-dockers/bigquery/Dockerfile
diff --git a/tests/plugin/test_bigquery/README.md b/tests/database-dockers/bigquery/README.md
similarity index 100%
rename from tests/plugin/test_bigquery/README.md
rename to tests/database-dockers/bigquery/README.md
diff --git a/tests/plugin/test_bigquery/init_data.yaml b/tests/database-dockers/bigquery/init_data.yaml
similarity index 100%
rename from tests/plugin/test_bigquery/init_data.yaml
rename to tests/database-dockers/bigquery/init_data.yaml
diff --git a/tests/plugin/test_bigquery/test_bigquery_loader.py b/tests/database-dockers/bigquery/test_bigquery_loader.py
similarity index 100%
rename from tests/plugin/test_bigquery/test_bigquery_loader.py
rename to tests/database-dockers/bigquery/test_bigquery_loader.py
diff --git a/tests/plugin/test_mongodb/Dockerfile b/tests/database-dockers/mongodb/Dockerfile
similarity index 100%
rename from tests/plugin/test_mongodb/Dockerfile
rename to tests/database-dockers/mongodb/Dockerfile
diff --git a/tests/plugin/test_mongodb/README.md b/tests/database-dockers/mongodb/README.md
similarity index 100%
rename from tests/plugin/test_mongodb/README.md
rename to tests/database-dockers/mongodb/README.md
diff --git a/tests/plugin/test_mongodb/init_data.js b/tests/database-dockers/mongodb/init_data.js
similarity index 100%
rename from tests/plugin/test_mongodb/init_data.js
rename to tests/database-dockers/mongodb/init_data.js
diff --git a/tests/plugin/test_mongodb/test_mongodb_loader.py b/tests/database-dockers/mongodb/test_mongodb_loader.py
similarity index 100%
rename from tests/plugin/test_mongodb/test_mongodb_loader.py
rename to tests/database-dockers/mongodb/test_mongodb_loader.py
diff --git a/tests/plugin/test_mysql/Dockerfile b/tests/database-dockers/mysql/Dockerfile
similarity index 100%
rename from tests/plugin/test_mysql/Dockerfile
rename to tests/database-dockers/mysql/Dockerfile
diff --git a/tests/plugin/test_mysql/README.md b/tests/database-dockers/mysql/README.md
similarity index 100%
rename from tests/plugin/test_mysql/README.md
rename to tests/database-dockers/mysql/README.md
diff --git a/tests/plugin/test_mysql/init.sql b/tests/database-dockers/mysql/init.sql
similarity index 100%
rename from tests/plugin/test_mysql/init.sql
rename to tests/database-dockers/mysql/init.sql
diff --git a/tests/plugin/test_mysql_datalake.py b/tests/database-dockers/mysql/test_mysql_datalake.py
similarity index 100%
rename from tests/plugin/test_mysql_datalake.py
rename to tests/database-dockers/mysql/test_mysql_datalake.py
diff --git a/tests/plugin/test_mysql/test_mysql_loader.py b/tests/database-dockers/mysql/test_mysql_loader.py
similarity index 100%
rename from tests/plugin/test_mysql/test_mysql_loader.py
rename to tests/database-dockers/mysql/test_mysql_loader.py
diff --git a/tests/plugin/test_postgres/Dockerfile b/tests/database-dockers/postgres/Dockerfile
similarity index 100%
rename from tests/plugin/test_postgres/Dockerfile
rename to tests/database-dockers/postgres/Dockerfile
diff --git a/tests/plugin/test_postgres/README.md b/tests/database-dockers/postgres/README.md
similarity index 100%
rename from tests/plugin/test_postgres/README.md
rename to tests/database-dockers/postgres/README.md
diff --git a/tests/plugin/test_postgres/init.sql b/tests/database-dockers/postgres/init.sql
similarity index 100%
rename from tests/plugin/test_postgres/init.sql
rename to tests/database-dockers/postgres/init.sql
diff --git a/tests/plugin/test_postgres/test_postgresql_loader.py b/tests/database-dockers/postgres/test_postgresql_loader.py
similarity index 100%
rename from tests/plugin/test_postgres/test_postgresql_loader.py
rename to tests/database-dockers/postgres/test_postgresql_loader.py
diff --git a/tests/superset/.env.superset b/tests/database-dockers/superset/.env.superset
similarity index 100%
rename from tests/superset/.env.superset
rename to tests/database-dockers/superset/.env.superset
diff --git a/tests/superset/README.md b/tests/database-dockers/superset/README.md
similarity index 100%
rename from tests/superset/README.md
rename to tests/database-dockers/superset/README.md
diff --git a/tests/superset/docker-compose.yml b/tests/database-dockers/superset/docker-compose.yml
similarity index 100%
rename from tests/superset/docker-compose.yml
rename to tests/database-dockers/superset/docker-compose.yml
diff --git a/tests/superset/init-superset.sh b/tests/database-dockers/superset/init-superset.sh
similarity index 100%
rename from tests/superset/init-superset.sh
rename to tests/database-dockers/superset/init-superset.sh
diff --git a/tests/superset/sample_data.py b/tests/database-dockers/superset/sample_data.py
similarity index 100%
rename from tests/superset/sample_data.py
rename to tests/database-dockers/superset/sample_data.py
diff --git a/tests/superset/start.sh b/tests/database-dockers/superset/start.sh
similarity index 100%
rename from tests/superset/start.sh
rename to tests/database-dockers/superset/start.sh
diff --git a/tests/superset/superset_config.py b/tests/database-dockers/superset/superset_config.py
similarity index 100%
rename from tests/superset/superset_config.py
rename to tests/database-dockers/superset/superset_config.py
diff --git a/tests/backend/integration/test_superset_data_connector.py b/tests/database-dockers/superset/test_superset_data_connector.py
similarity index 97%
rename from tests/backend/integration/test_superset_data_connector.py
rename to tests/database-dockers/superset/test_superset_data_connector.py
index 647769c8..a35da222 100644
--- a/tests/backend/integration/test_superset_data_connector.py
+++ b/tests/database-dockers/superset/test_superset_data_connector.py
@@ -160,13 +160,13 @@ def refresh_token(self, refresh_token):
 
 @pytest.fixture(autouse=True)
 def _mock_superset_imports():
-    """Patch the lazy-imported Superset helpers."""
+    """Patch the Superset helpers in the loader module."""
     import data_formulator.data_loader.superset_data_loader as sdl
-    old_client, old_bridge = sdl._SupersetClient, sdl._SupersetAuthBridge
-    sdl._SupersetClient = MockSupersetClient
-    sdl._SupersetAuthBridge = MockAuthBridge
+    old_client, old_bridge = sdl.SupersetClient, sdl.SupersetAuthBridge
+    sdl.SupersetClient = MockSupersetClient
+    sdl.SupersetAuthBridge = MockAuthBridge
     yield
-    sdl._SupersetClient, sdl._SupersetAuthBridge = old_client, old_bridge
+    sdl.SupersetClient, sdl.SupersetAuthBridge = old_client, old_bridge
 
 
 @pytest.fixture
@@ -351,7 +351,7 @@ def test_import(self, connected_client):
         mock_meta.row_count = 3
 
         with patch.object(DataConnector, "_get_identity", return_value="test-user"), \
-             patch("data_formulator.security.auth.get_identity_id", return_value="test-user"), \
+             patch("data_formulator.auth.identity.get_identity_id", return_value="test-user"), \
              patch("data_formulator.workspace_factory.get_workspace") as mock_ws:
 
             from data_formulator.data_loader.superset_data_loader import SupersetLoader
diff --git a/tests/plugin/test_mysql/test_mysql_data_connector.py b/tests/plugin/test_mysql/test_mysql_data_connector.py
deleted file mode 100644
index d158a6de..00000000
--- a/tests/plugin/test_mysql/test_mysql_data_connector.py
+++ /dev/null
@@ -1,263 +0,0 @@
-# Copyright (c) Microsoft Corporation.
-# Licensed under the MIT License.
-
-"""End-to-end integration tests for MySQL via DataConnector routes.
-
-Tests the full lifecycle: connect → browse hierarchy → scope pinning →
-import → preview → refresh → disconnect.
-
-Requires MySQL running (e.g. ./tests/run_test_dbs.sh start mysql).
-Environment: MYSQL_HOST, MYSQL_PORT (default 3307), MYSQL_USER, MYSQL_PASSWORD, MYSQL_DATABASE.
-"""
-from __future__ import annotations
-
-import os
-import shutil
-import tempfile
-import unittest
-from pathlib import Path
-from typing import Any, Dict
-from unittest.mock import patch
-
-import flask
-import pytest
-
-pytestmark = [pytest.mark.backend, pytest.mark.plugin]
-
-
-def get_mysql_config() -> Dict[str, Any]:
-    return {
-        "host": os.getenv("MYSQL_HOST", "localhost"),
-        "port": os.getenv("MYSQL_PORT", "3307"),
-        "user": os.getenv("MYSQL_USER", "root"),
-        "password": os.getenv("MYSQL_PASSWORD", "rootpassword"),
-        "database": os.getenv("MYSQL_DATABASE", "testdb"),
-    }
-
-
-def mysql_available() -> bool:
-    import socket
-    cfg = get_mysql_config()
-    host = cfg.get("host", "localhost")
-    port = int(cfg.get("port", "3307"))
-    if host in ("localhost", "127.0.0.1"):
-        host = "127.0.0.1"
-    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-    sock.settimeout(2)
-    try:
-        sock.connect((host, port))
-        sock.close()
-        return True
-    except (socket.error, OSError):
-        return False
-
-
-def _make_app_and_client(source_id="mysql", default_params=None):
-    from data_formulator.data_connector import DataConnector
-    from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader
-
-    app = flask.Flask(__name__)
-    app.config["TESTING"] = True
-    app.secret_key = "test-secret"
-
-    source = DataConnector.from_loader(
-        MySQLDataLoader,
source_id=source_id, - display_name="Test MySQL", - default_params=default_params or {}, - ) - app.register_blueprint(source.create_blueprint()) - return app, app.test_client(), source - - -@unittest.skipUnless( - mysql_available(), - "MySQL not available (start with ./tests/run_test_dbs.sh start mysql).", -) -class TestMySQLConnectedSourceE2E(unittest.TestCase): - """End-to-end tests: DataConnector routes → real MySQL.""" - - def setUp(self): - self._workspace_root = None - self._identity = "test-user-mysql-e2e" - - def tearDown(self): - if self._workspace_root and Path(self._workspace_root).exists(): - shutil.rmtree(self._workspace_root, ignore_errors=True) - - def _workspace_root_path(self): - if self._workspace_root is None: - self._workspace_root = tempfile.mkdtemp(prefix="df_test_mysql_e2e_") - return self._workspace_root - - # ============================================================== - # Auth lifecycle - # ============================================================== - - def test_connect_success(self): - app, client, source = _make_app_and_client() - cfg = get_mysql_config() - with patch.object(type(source), "_get_identity", return_value=self._identity): - resp = client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - self.assertEqual(resp.status_code, 200) - data = resp.get_json() - self.assertEqual(data["status"], "connected") - # MySQL hierarchy: database → table - keys = [h["key"] for h in data["hierarchy"]] - self.assertEqual(keys, ["database", "table"]) - - def test_disconnect_and_status(self): - app, client, source = _make_app_and_client() - cfg = get_mysql_config() - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/mysql/auth/disconnect") - self.assertEqual(resp.get_json()["status"], "disconnected") - - resp = client.get("/api/connectors/mysql/auth/status") - 
self.assertFalse(resp.get_json()["connected"]) - - # ============================================================== - # Catalog browsing — database pinned (default test config) - # ============================================================== - - def test_browse_with_database_pinned(self): - """With database param set, ls([]) should show tables directly.""" - cfg = get_mysql_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - resp = client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - data = resp.get_json() - # database is pinned → effective_hierarchy only has "table" - eff_keys = [h["key"] for h in data["effective_hierarchy"]] - self.assertNotIn("database", eff_keys) - self.assertIn("table", eff_keys) - - # ls([]) should return tables directly - resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": []}) - nodes = resp.get_json()["nodes"] - self.assertTrue(len(nodes) > 0) - table_names = [n["name"] for n in nodes] - self.assertIn("products", table_names) - for n in nodes: - self.assertEqual(n["node_type"], "table") - - def test_browse_without_database_pinned(self): - """Without database param, ls([]) should list databases first.""" - cfg = get_mysql_config() - unpinned_cfg = {k: v for k, v in cfg.items() if k != "database"} - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - resp = client.post("/api/connectors/mysql/auth/connect", json={ - "params": unpinned_cfg, - }) - data = resp.get_json() - # Should not be pinned - eff_keys = [h["key"] for h in data["effective_hierarchy"]] - self.assertIn("database", eff_keys) - - # ls([]) → databases - resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": []}) - nodes = resp.get_json()["nodes"] - db_names = [n["name"] for n in nodes] - self.assertIn("testdb", db_names) - for n in nodes: - 
self.assertEqual(n["node_type"], "namespace") - - # ls(["testdb"]) → tables - resp = client.post("/api/connectors/mysql/catalog/ls", json={"path": ["testdb"]}) - nodes = resp.get_json()["nodes"] - table_names = [n["name"] for n in nodes] - self.assertIn("products", table_names) - - # ============================================================== - # Data preview + import - # ============================================================== - - def test_preview(self): - cfg = get_mysql_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/mysql/data/preview", json={ - "source_table": "products", - "size": 5, - }) - data = resp.get_json() - self.assertEqual(data["status"], "success") - self.assertLessEqual(data["row_count"], 5) - - def test_import_and_refresh(self): - cfg = get_mysql_config() - app, client, source = _make_app_and_client() - workspace_root = self._workspace_root_path() - - from data_formulator.datalake.workspace import Workspace - workspace = Workspace(self._identity, root_dir=workspace_root) - - with patch.object(type(source), "_get_identity", return_value=self._identity), \ - patch("data_formulator.data_connector.get_identity_id", return_value=self._identity), \ - patch("data_formulator.data_connector.get_workspace", return_value=workspace): - client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - - resp = client.post("/api/connectors/mysql/data/import", json={ - "source_table": "products", - "table_name": "mysql_products", - "import_options": {"size": 50}, - }) - data = resp.get_json() - self.assertEqual(data["status"], "success") - self.assertEqual(data["table_name"], "mysql_products") - self.assertGreater(data["row_count"], 0) - - # Verify workspace - self.assertIn("mysql_products", workspace.list_tables()) - - # Refresh - resp = 
client.post("/api/connectors/mysql/data/refresh", json={ - "table_name": "mysql_products", - }) - self.assertEqual(resp.get_json()["status"], "success") - - # ============================================================== - # Flat listing - # ============================================================== - - def test_list_tables_flat(self): - cfg = get_mysql_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/mysql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/mysql/catalog/list_tables", json={}) - data = resp.get_json() - names = [t["name"] for t in data["tables"]] - self.assertTrue(any("products" in n for n in names)) - - -class TestMySQLConnectedSourceStatic(unittest.TestCase): - - def test_hierarchy(self): - from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader - h = MySQLDataLoader.catalog_hierarchy() - keys = [l["key"] for l in h] - self.assertEqual(keys, ["database", "table"]) - - def test_frontend_config_with_pinned_database(self): - from data_formulator.data_connector import DataConnector - from data_formulator.data_loader.mysql_data_loader import MySQLDataLoader - - source = DataConnector.from_loader( - MySQLDataLoader, - source_id="mysql_test", - display_name="MySQL Test", - default_params={"host": "db.corp", "database": "analytics"}, - ) - cfg = source.get_frontend_config() - self.assertEqual(cfg["pinned_params"]["database"], "analytics") - eff_keys = [h["key"] for h in cfg["effective_hierarchy"]] - self.assertNotIn("database", eff_keys) - self.assertIn("table", eff_keys) diff --git a/tests/plugin/test_postgres/test_postgresql_data_connector.py b/tests/plugin/test_postgres/test_postgresql_data_connector.py deleted file mode 100644 index 426f8b18..00000000 --- a/tests/plugin/test_postgres/test_postgresql_data_connector.py +++ /dev/null @@ -1,430 +0,0 @@ -# Copyright (c) Microsoft 
Corporation. -# Licensed under the MIT License. - -"""End-to-end integration tests for PostgreSQL via DataConnector routes. - -Tests the full lifecycle: connect → browse hierarchy → scope pinning → -import → preview → refresh → disconnect → reconnect. - -Requires PostgreSQL running (e.g. ./tests/run_test_dbs.sh start postgres). -Environment: PG_HOST, PG_PORT (default 5433), PG_USER, PG_PASSWORD, PG_DATABASE. -""" -from __future__ import annotations - -import os -import shutil -import tempfile -import unittest -from pathlib import Path -from typing import Any, Dict -from unittest.mock import patch - -import flask -import pytest - -pytestmark = [pytest.mark.backend, pytest.mark.plugin] - -# ------------------------------------------------------------------ -# Helpers -# ------------------------------------------------------------------ - -def get_pg_config() -> Dict[str, Any]: - return { - "host": os.getenv("PG_HOST", "localhost"), - "port": os.getenv("PG_PORT", "5433"), - "user": os.getenv("PG_USER", "postgres"), - "password": os.getenv("PG_PASSWORD", "postgres"), - "database": os.getenv("PG_DATABASE", "testdb"), - } - - -def postgres_available() -> bool: - import socket - cfg = get_pg_config() - host = cfg.get("host", "localhost") - port = int(cfg.get("port", "5433")) - if host in ("localhost", "127.0.0.1"): - host = "127.0.0.1" - sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) - sock.settimeout(2) - try: - sock.connect((host, port)) - sock.close() - return True - except (socket.error, OSError): - return False - - -def _make_app_and_client(source_id="postgresql", default_params=None): - """Create a Flask test app with DataConnector for PostgreSQL.""" - from data_formulator.data_connector import DataConnector, DATA_CONNECTORS - from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader - - app = flask.Flask(__name__) - app.config["TESTING"] = True - app.secret_key = "test-secret-key" - - source = DataConnector.from_loader( - 
PostgreSQLDataLoader, - source_id=source_id, - display_name="Test PostgreSQL", - default_params=default_params or {}, - ) - bp = source.create_blueprint() - app.register_blueprint(bp) - return app, app.test_client(), source - - -# ------------------------------------------------------------------ -# Tests -# ------------------------------------------------------------------ - -@unittest.skipUnless( - postgres_available(), - "PostgreSQL not available (start with ./tests/run_test_dbs.sh start postgres).", -) -class TestPostgreSQLConnectedSourceE2E(unittest.TestCase): - """End-to-end tests: DataConnector routes → real PostgreSQL.""" - - def setUp(self): - self._workspace_root = None - self._identity = "test-user-pg-e2e" - - def tearDown(self): - if self._workspace_root and Path(self._workspace_root).exists(): - shutil.rmtree(self._workspace_root, ignore_errors=True) - - def _workspace_root_path(self): - if self._workspace_root is None: - self._workspace_root = tempfile.mkdtemp(prefix="df_test_pg_e2e_") - return self._workspace_root - - # ============================================================== - # Auth lifecycle - # ============================================================== - - def test_connect_and_status(self): - """Connect → status shows connected with hierarchy.""" - app, client, source = _make_app_and_client() - cfg = get_pg_config() - with patch.object(type(source), "_get_identity", return_value=self._identity): - resp = client.post("/api/connectors/postgresql/auth/connect", json={ - "params": cfg, - }) - self.assertEqual(resp.status_code, 200) - data = resp.get_json() - self.assertEqual(data["status"], "connected") - self.assertIn("hierarchy", data) - # PostgreSQL: database → schema → table - keys = [h["key"] for h in data["hierarchy"]] - self.assertEqual(keys, ["database", "schema", "table"]) - - # Status should show connected - resp = client.get("/api/connectors/postgresql/auth/status") - self.assertEqual(resp.status_code, 200) - 
self.assertTrue(resp.get_json()["connected"]) - - def test_connect_bad_credentials(self): - """Bad credentials return error without leaking secrets.""" - app, client, source = _make_app_and_client() - cfg = get_pg_config() - cfg["password"] = "wrong-password-xyz" - with patch.object(type(source), "_get_identity", return_value=self._identity): - resp = client.post("/api/connectors/postgresql/auth/connect", json={ - "params": cfg, - }) - self.assertIn(resp.status_code, (400, 500, 502)) - data = resp.get_json() - self.assertEqual(data["status"], "error") - # Must NOT leak the password - import json - self.assertNotIn("wrong-password-xyz", json.dumps(data)) - - def test_disconnect_and_reconnect(self): - app, client, source = _make_app_and_client() - cfg = get_pg_config() - with patch.object(type(source), "_get_identity", return_value=self._identity): - # Connect - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - - # Disconnect - resp = client.post("/api/connectors/postgresql/auth/disconnect") - self.assertEqual(resp.get_json()["status"], "disconnected") - - # Status shows disconnected - resp = client.get("/api/connectors/postgresql/auth/status") - self.assertFalse(resp.get_json()["connected"]) - - # Reconnect - resp = client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - self.assertEqual(resp.get_json()["status"], "connected") - - # ============================================================== - # Catalog browsing — full hierarchy - # ============================================================== - - def test_browse_full_hierarchy(self): - """Connect without database pinned → browse database → schema → table.""" - cfg = get_pg_config() - unpinned_cfg = {k: v for k, v in cfg.items() if k != "database"} - app, client, source = _make_app_and_client(default_params={}) - - with patch.object(type(source), "_get_identity", return_value=self._identity): - # Connect without database pinned - 
client.post("/api/connectors/postgresql/auth/connect", json={"params": unpinned_cfg}) - - # Level 1: list databases - resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - db_names = [n["name"] for n in data["nodes"]] - self.assertIn("testdb", db_names) - for node in data["nodes"]: - self.assertEqual(node["node_type"], "namespace") - - # Level 2: list schemas in testdb - resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": ["testdb"]}) - schemas = resp.get_json()["nodes"] - schema_names = [s["name"] for s in schemas] - self.assertIn("sample", schema_names) - self.assertIn("public", schema_names) - - # Level 3: list tables in sample schema - resp = client.post("/api/connectors/postgresql/catalog/ls", json={ - "path": ["testdb", "sample"], - }) - tables = resp.get_json()["nodes"] - table_names = [t["name"] for t in tables] - self.assertIn("products", table_names) - self.assertIn("customers", table_names) - for t in tables: - self.assertEqual(t["node_type"], "table") - - # ============================================================== - # Catalog browsing — scope pinning - # ============================================================== - - def test_scope_pinning_database(self): - """Connect with database pinned → ls([]) starts at schema level.""" - cfg = get_pg_config() - app, client, source = _make_app_and_client( - default_params={"host": cfg["host"], "port": cfg["port"], "database": cfg["database"]}, - ) - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={ - "params": {"user": cfg["user"], "password": cfg["password"]}, - }) - - # Connect response should show pinned scope - resp = client.get("/api/connectors/postgresql/auth/status") - status = resp.get_json() - eff_keys = [h["key"] for h in status["effective_hierarchy"]] - self.assertNotIn("database", 
eff_keys) - self.assertIn("schema", eff_keys) - self.assertEqual(status["pinned_scope"]["database"], cfg["database"]) - - # ls([]) should show schemas, not databases - resp = client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) - nodes = resp.get_json()["nodes"] - self.assertTrue(len(nodes) > 0) - # These should be schemas (namespace) not databases - for n in nodes: - self.assertEqual(n["node_type"], "namespace") - names = [n["name"] for n in nodes] - self.assertIn("sample", names) - - # ============================================================== - # Catalog metadata - # ============================================================== - - def test_table_metadata(self): - cfg = get_pg_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/postgresql/catalog/metadata", json={ - "path": ["sample", "products"], - }) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - meta = data["metadata"] - self.assertIn("columns", meta) - col_names = [c["name"] for c in meta["columns"]] - self.assertIn("name", col_names) - self.assertIn("price", col_names) - self.assertIn("category", col_names) - - # ============================================================== - # Flat list_tables - # ============================================================== - - def test_list_tables_flat(self): - cfg = get_pg_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/postgresql/catalog/list_tables", json={}) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - names = [t["name"] for t in data["tables"]] - self.assertTrue(any("products" in n for n in 
names)) - - def test_list_tables_with_filter(self): - cfg = get_pg_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/postgresql/catalog/list_tables", json={ - "filter": "product", - }) - tables = resp.get_json()["tables"] - for t in tables: - self.assertIn("product", t["name"].lower()) - - # ============================================================== - # Data preview - # ============================================================== - - def test_preview(self): - cfg = get_pg_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/postgresql/data/preview", json={ - "source_table": "sample.products", - "size": 5, - }) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - self.assertEqual(data["status"], "success") - self.assertLessEqual(data["row_count"], 5) - col_names = {c["name"] for c in data["columns"]} - self.assertIn("name", col_names) - self.assertIn("price", col_names) - - # ============================================================== - # Data import + refresh - # ============================================================== - - def test_import_and_refresh(self): - """Import a table via connected source routes, then refresh it.""" - cfg = get_pg_config() - app, client, source = _make_app_and_client() - workspace_root = self._workspace_root_path() - - from data_formulator.datalake.workspace import Workspace - workspace = Workspace(self._identity, root_dir=workspace_root) - - with patch.object(type(source), "_get_identity", return_value=self._identity), \ - patch("data_formulator.data_connector.get_identity_id", return_value=self._identity), 
\ - patch("data_formulator.data_connector.get_workspace", return_value=workspace): - # Connect - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - - # Import - resp = client.post("/api/connectors/postgresql/data/import", json={ - "source_table": "sample.products", - "table_name": "products", - "import_options": {"size": 100}, - }) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - self.assertEqual(data["status"], "success") - self.assertEqual(data["table_name"], "products") - self.assertGreater(data["row_count"], 0) - self.assertTrue(data["refreshable"]) - - # Verify table exists in workspace - self.assertIn("products", workspace.list_tables()) - - # Refresh - resp = client.post("/api/connectors/postgresql/data/refresh", json={ - "table_name": "products", - }) - data = resp.get_json() - self.assertEqual(resp.status_code, 200) - self.assertEqual(data["status"], "success") - self.assertIn("data_changed", data) - - # ============================================================== - # ls() filter - # ============================================================== - - def test_ls_filter(self): - cfg = get_pg_config() - app, client, source = _make_app_and_client() - - with patch.object(type(source), "_get_identity", return_value=self._identity): - client.post("/api/connectors/postgresql/auth/connect", json={"params": cfg}) - resp = client.post("/api/connectors/postgresql/catalog/ls", json={ - "path": ["sample"], - "filter": "product", - }) - nodes = resp.get_json()["nodes"] - for n in nodes: - self.assertIn("product", n["name"].lower()) - - # ============================================================== - # Operations without connection return error - # ============================================================== - - def test_ls_without_connect_returns_error(self): - app, client, source = _make_app_and_client() - with patch.object(type(source), "_get_identity", return_value="nobody"): - resp = 
client.post("/api/connectors/postgresql/catalog/ls", json={"path": []}) - self.assertIn(resp.status_code, (400, 500)) - - def test_import_without_connect_returns_error(self): - app, client, source = _make_app_and_client() - with patch.object(type(source), "_get_identity", return_value="nobody"): - resp = client.post("/api/connectors/postgresql/data/import", json={ - "source_table": "sample.products", - }) - self.assertIn(resp.status_code, (400, 500)) - - -# ------------------------------------------------------------------ -# Static tests (no DB required) -# ------------------------------------------------------------------ - -class TestPostgreSQLConnectedSourceStatic(unittest.TestCase): - - def test_frontend_config(self): - from data_formulator.data_connector import DataConnector - from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader - - source = DataConnector.from_loader( - PostgreSQLDataLoader, - source_id="pg_test", - display_name="PG Test", - default_params={"host": "db.corp", "database": "prod"}, - ) - cfg = source.get_frontend_config() - - # Pinned params should show host and database - self.assertEqual(cfg["pinned_params"]["host"], "db.corp") - self.assertEqual(cfg["pinned_params"]["database"], "prod") - - # Form should not include pinned params - form_names = {f["name"] for f in cfg["params_form"]} - self.assertNotIn("host", form_names) - self.assertNotIn("database", form_names) - self.assertIn("user", form_names) - self.assertIn("password", form_names) - - # Effective hierarchy should exclude database - eff_keys = [h["key"] for h in cfg["effective_hierarchy"]] - self.assertNotIn("database", eff_keys) - self.assertIn("schema", eff_keys) - self.assertIn("table", eff_keys) - - def test_hierarchy(self): - from data_formulator.data_loader.postgresql_data_loader import PostgreSQLDataLoader - h = PostgreSQLDataLoader.catalog_hierarchy() - keys = [l["key"] for l in h] - self.assertEqual(keys, ["database", "schema", "table"]) From 
249f87765356bfd5bc308b14fdbef88eb87da992 Mon Sep 17 00:00:00 2001 From: Chenglong Wang Date: Wed, 15 Apr 2026 22:21:36 -0700 Subject: [PATCH 6/6] cleanup --- design-docs/9.3-promoted-data-source-cards.md | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/design-docs/9.3-promoted-data-source-cards.md b/design-docs/9.3-promoted-data-source-cards.md index bf1c7f9f..82fef66a 100644 --- a/design-docs/9.3-promoted-data-source-cards.md +++ b/design-docs/9.3-promoted-data-source-cards.md @@ -24,9 +24,9 @@ Parent: [9-generalized-data-source-plugins.md](9-generalized-data-source-plugins - [x] Removed legacy "Database" tab from UI - [x] `DBTableManager` uses only `serverConfig.SOURCES` (DataConnector) for source discovery -**Phase C — Cleanup (deferred to doc 9 Phase 3):** -- [ ] Unify Superset plugin into `/api/connectors/` flow (done via SupersetLoader) -- [ ] Disconnect / Delete actions on each card +**Phase C — Cleanup:** +- [x] Unify Superset plugin into `/api/connectors/` flow (done via SupersetLoader) +- [x] Disconnect / Delete actions on each card > **Note:** `dataLoaderConnectParams` stays in Redux — it manages transient form field state (partially filled connection forms). Registered connection metadata lives server-side via the connectors API. @@ -334,23 +334,23 @@ New work: - [x] `GET /api/connectors` — lists registered instances with connection status. - [x] `POST /api/connectors` — creates a `DataConnector` in `DATA_CONNECTORS` dict (no blueprint registration). Auto-connects if params provided. - [x] `DELETE /api/connectors/{id}` — tears down instance, clears vault. -- [ ] Move all per-instance route handlers to shared routes on `connectors_bp` (`/api/connectors/connect`, `/get-status`, `/get-catalog`, `/preview-data`, `/import-data`, `/refresh-data`, etc.) that accept `connector_id` in JSON body. -- [ ] Delete `create_blueprint()`, `_register_connection_routes()`, `_register_catalog_routes()`, `_register_data_routes()`. 
+- [x] Move all per-instance route handlers to shared routes on `connectors_bp` (`/api/connectors/connect`, `/get-status`, `/get-catalog`, `/preview-data`, `/import-data`, `/refresh-data`, etc.) that accept `connector_id` in JSON body. +- [x] Delete `create_blueprint()`, `_register_connection_routes()`, `_register_catalog_routes()`, `_register_data_routes()`. - [x] Make `/status` side-effect-free (move auto-reconnect logic to `/connect`). - [x] Merge `/auth/token-connect` into `/connect` with `mode` field. -- [ ] Simplify `register_data_connectors()` — register `connectors_bp` once, populate `DATA_CONNECTORS` from config (no per-instance blueprint). +- [x] Simplify `register_data_connectors()` — register `connectors_bp` once, populate `DATA_CONNECTORS` from config (no per-instance blueprint). - [x] Admin-provisioned connectors (YAML/env) auto-create instances at startup. ### Phase B — Frontend: menu page cards + generic URLs -- [ ] Replace `getConnectorUrls(id)` with static `CONNECTOR_ACTION_URLS` constants (no ID in URL path). -- [ ] Update all frontend call sites to send `connector_id` in POST body instead of URL path. +- [x] Replace `getConnectorUrls(id)` with static `CONNECTOR_ACTION_URLS` constants (no ID in URL path). +- [x] Update all frontend call sites to send `connector_id` in POST body instead of URL path. - [x] Render connected connectors as promoted cards on the Load Data menu page. - [x] "Add Connection" card → left/right layout: pick type, fill params + display name, "Add & Connect" → `POST /api/connectors`. - [x] Each card click → opens `DataLoaderForm` (browse-only when connected). -- [ ] Disconnect / Delete actions on each card. +- [x] Disconnect / Delete actions on each card. ### Phase C — Cleanup -- [ ] Unify Superset plugin (`/api/plugins/superset/`) into the `/api/connectors/` flow. +- [x] Unify Superset plugin (`/api/plugins/superset/`) into the `/api/connectors/` flow. - [x] Remove legacy "Database" tab.