diff --git a/INTEGRATIONS_CREDENTIALS.md b/INTEGRATIONS_CREDENTIALS.md
index 108e33a5c0..c68ed7244b 100644
--- a/INTEGRATIONS_CREDENTIALS.md
+++ b/INTEGRATIONS_CREDENTIALS.md
@@ -2,16 +2,42 @@
 
 ## Overview
 
-The integrations system enables Deepnote notebooks to connect to external data sources (PostgreSQL, BigQuery, etc.) by securely managing credentials and exposing them to SQL blocks. The system handles:
+The integrations system enables Deepnote notebooks to connect to external data sources (PostgreSQL, BigQuery, Snowflake, etc.) by securely managing credentials and exposing them to SQL blocks. The system handles:
 
 1. **Credential Storage**: Secure storage using VSCode's SecretStorage API
 2. **Integration Detection**: Automatic discovery of integrations used in notebooks
 3. **UI Management**: Webview-based configuration interface
 4. **Kernel Integration**: Injection of credentials into Jupyter kernel environment
 5. **Toolkit Exposure**: Making credentials available to `deepnote-toolkit` for SQL execution
+6. **Format Conversion**: Uses the `@deepnote/database-integrations` package for standardized credential formatting
 
 ## Architecture
 
+### External Dependencies
+
+#### **@deepnote/database-integrations Package** (v1.1.1)
+
+The system uses the `@deepnote/database-integrations` package as the source of truth for:
+
+- **Type Definitions**: `DatabaseIntegrationConfig`, `DatabaseIntegrationType`
+- **Metadata Schemas**: Validation schemas for each integration type (`databaseMetadataSchemasByType`)
+- **Environment Variable Generation**: `getEnvironmentVariablesForIntegrations()` function
+- **Auth Method Constants**: `BigQueryAuthMethods`, `SnowflakeAuthMethods`
+
+This ensures consistency between the VSCode extension and Deepnote's cloud platform.
+
+**Key Functions:**
+
+- `getEnvironmentVariablesForIntegrations(configs)`: Converts integration configs to environment variables
+- `databaseMetadataSchemasByType[type].safeParse(metadata)`: Validates integration metadata
+
+**Supported Integration Types:**
+
+- `'pgsql'` - PostgreSQL
+- `'big-query'` - BigQuery
+- `'snowflake'` - Snowflake
+- `'pandas-dataframe'` - DuckDB (internal)
+
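+To make the validation entry point concrete, here is a minimal sketch. The metadata values are illustrative, and the `safeParse` result shape (`success` plus `data` or `error`) is assumed from the package's zod dependency:
+
+```typescript
+import { databaseMetadataSchemasByType } from '@deepnote/database-integrations';
+
+// Illustrative metadata; field names follow the PostgreSQL example further below.
+const candidate = {
+    host: 'localhost',
+    port: '5432',
+    database: 'analytics',
+    user: 'readonly',
+    password: 'secret',
+    sslEnabled: true
+};
+
+// safeParse never throws: it reports success or a structured validation error.
+const result = databaseMetadataSchemasByType['pgsql'].safeParse(candidate);
+if (result.success) {
+    console.log('Valid pgsql metadata:', result.data);
+} else {
+    console.error('Invalid pgsql metadata:', result.error);
+}
+```
+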
 ### Core Components
 
 #### 1. **Integration Storage** (`integrationStorage.ts`)
 
@@ -25,11 +51,15 @@ Manages persistent storage of integration configurations using VSCode's encrypte
 
 - In-memory caching for performance
 - Event-driven updates via `onDidChangeIntegrations` event
 - Index-based storage for efficient retrieval
+- Automatic upgrade of legacy configurations to the new format
+- Uses the `@deepnote/database-integrations` package for type definitions and validation
 
 **Storage Format:**
 
 - Each integration config is stored as JSON under key: `deepnote-integrations.{integrationId}`
 - An index is maintained at key: `deepnote-integrations.index` containing all integration IDs
+- Configs are versioned (currently version 1) to support future migrations
+- The internal DuckDB integration (`deepnote-dataframe-sql`) is filtered out and not stored
 
 **Key Methods:**
 
@@ -39,33 +69,92 @@ Manages persistent storage of integration configurations using VSCode's encrypte
 - `save(config)`: Save or update an integration configuration
 - `delete(integrationId)`: Remove an integration configuration
 - `exists(integrationId)`: Check if an integration is configured
+- `clear()`: Remove all stored integrations
 
 **Integration Config Types:**
 
+The system uses `DatabaseIntegrationConfig` from the `@deepnote/database-integrations` package:
+
 ```typescript
-// PostgreSQL
+// PostgreSQL (type: 'pgsql')
 {
     id: string;
     name: string;
-    type: 'postgres';
-    host: string;
-    port: number;
-    database: string;
-    username: string;
-    password: string;
-    ssl?: boolean;
+    type: 'pgsql';
+    metadata: {
+        host: string;
+        port: string;
+        database: string;
+        user: string;
+        password: string;
+        sslEnabled: boolean;
+    }
 }
 
-// BigQuery
+// BigQuery (type: 'big-query')
 {
     id: string;
     name: string;
-    type: 'bigquery';
-    projectId: string;
-    credentials: string; // JSON string of service account credentials
+    type: 'big-query';
+    metadata: {
+        authMethod: 'service-account';
+        service_account: string; // Service account JSON string
+    }
+}
+
+// Snowflake (type: 'snowflake')
+{
+    id: string;
+    name: string;
+    type: 'snowflake';
+    metadata: {
+        authMethod: 'password' | 'service-account-key-pair';
+        accountName: string;
+        warehouse?: string;
+        database?: string;
+        role?: string;
+        username: string;
+        // For password auth:
+        password: string;
+        // For key-pair auth:
+        privateKey: string;
+        privateKeyPassphrase?: string;
+    }
 }
 ```
 
+**Legacy Config Upgrade:**
+
+When loading configurations from storage, the system automatically detects and upgrades legacy configs (pre-`@deepnote/database-integrations`) using `upgradeLegacyIntegrationConfig()`. Invalid or unsupported configs are filtered out during loading.
+
+#### 1a. **Legacy Integration Config Utils** (`legacyIntegrationConfigUtils.ts`)
+
+Handles migration of legacy integration configurations to the new `@deepnote/database-integrations` format.
+
+**Key Function:**
+
+- `upgradeLegacyIntegrationConfig(legacyConfig)`: Converts a legacy config to the new format
+
+**Upgrade Process:**
+
+1. Detects the legacy config format (missing `version` field)
+2. Maps legacy type names to new type names:
+    - `'postgres'` → `'pgsql'`
+    - `'bigquery'` → `'big-query'`
+    - `'snowflake'` → `'snowflake'`
+3. Restructures the config to use the `metadata` field
+4. Converts Snowflake auth methods to new constants
+5. Validates using `databaseMetadataSchemasByType`
+6. 
Returns `null` for invalid or unsupported configs + +**Unsupported Snowflake Auth Methods:** + +- `'OKTA'` - User-specific, not supported in VSCode +- `'NATIVE_SNOWFLAKE'` - User-specific, not supported in VSCode +- `'AZURE_AD'` - User-specific, not supported in VSCode +- `'KEY_PAIR'` - Legacy, replaced by `'SERVICE_ACCOUNT_KEY_PAIR'` + #### 2. **Integration Detector** (`integrationDetector.ts`) Scans Deepnote projects to discover which integrations are used in SQL blocks. @@ -75,8 +164,9 @@ Scans Deepnote projects to discover which integrations are used in SQL blocks. 1. Retrieves the Deepnote project from `IDeepnoteNotebookManager` 2. Scans all notebooks in the project 3. Examines each code block for `metadata.sql_integration_id` -4. Checks if each integration is configured (has credentials) -5. Returns a map of integration IDs to their status +4. Maps Deepnote integration types to `DatabaseIntegrationType` using the project's integration list +5. Checks if each integration is configured (has credentials) +6. Returns a map of integration IDs to their status **Integration Status:** @@ -88,6 +178,7 @@ Scans Deepnote projects to discover which integrations are used in SQL blocks. - Excludes `deepnote-dataframe-sql` (internal DuckDB integration) - Only processes code blocks with SQL integration metadata +- Uses project integration metadata to determine integration types #### 3. **Integration Manager** (`integrationManager.ts`) @@ -202,11 +293,30 @@ Type-specific forms for entering integration credentials. - Project ID - Service Account Credentials (JSON textarea) +**Snowflake Form Fields:** + +- Name (display name) +- Account Name +- Warehouse (optional) +- Database (optional) +- Role (optional) +- Username +- Authentication Method (dropdown): + - Password + - Service Account Key Pair +- For Password auth: + - Password +- For Key Pair auth: + - Private Key (textarea) + - Private Key Passphrase (optional) + **Validation:** -- All fields are required +- All required fields must be filled - BigQuery credentials must be valid JSON - Port must be a valid number +- Snowflake private key must be in PEM format +- Forms use the metadata structure from `@deepnote/database-integrations` ### Kernel Integration @@ -216,10 +326,14 @@ Provides environment variables containing integration credentials for the Jupyte **Process:** -1. Scans the notebook for SQL cells with `sql_integration_id` metadata -2. Retrieves credentials for each detected integration -3. Converts credentials to the format expected by `deepnote-toolkit` -4. Returns environment variables to be injected into the kernel process +1. Identifies the Deepnote project from the notebook resource +2. Retrieves project integrations from the notebook manager +3. Fetches configured credentials from `IIntegrationStorage` for each integration +4. Always includes the internal DuckDB integration (`deepnote-dataframe-sql`) +5. Uses `getEnvironmentVariablesForIntegrations()` from `@deepnote/database-integrations` to convert credentials +6. Returns environment variables to be injected into the kernel process + +**Note:** This provider makes credentials for ALL integrations in the Deepnote project available as environment variables. This ensures that integrations are available project-wide, matching Deepnote's behavior where integrations are project-scoped. 
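+
+A compact sketch of this flow, assuming `getEnvironmentVariablesForIntegrations()` takes the gathered config array and returns a map of variable names to serialized credentials (the exact signature lives in the package):
+
+```typescript
+import {
+    DatabaseIntegrationConfig,
+    getEnvironmentVariablesForIntegrations
+} from '@deepnote/database-integrations';
+
+// Gathered from IIntegrationStorage for every integration in the project,
+// plus the internal DuckDB integration (steps 2-4 above).
+declare const configs: DatabaseIntegrationConfig[];
+
+// One SQL_* entry per integration, e.g.
+// { SQL_MY_POSTGRES_DB: '{"url": "...", "params": {}, "param_style": "..."}' }
+const envVars = getEnvironmentVariablesForIntegrations(configs);
+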
 **Environment Variable Format:**
 
@@ -229,6 +343,8 @@ Example: Integration ID `my-postgres-db` → Environment variable `SQL_MY_POSTGR
 
 **Credential JSON Format:**
 
+The `@deepnote/database-integrations` package generates the credential JSON in the format expected by `deepnote-toolkit`:
+
 PostgreSQL:
 
 ```json
@@ -254,11 +370,45 @@ BigQuery:
 }
 ```
 
+Snowflake (password auth):
+
+```json
+{
+    "url": "snowflake://username:password@account/database?warehouse=wh&role=role&application=Deepnote",
+    "params": {},
+    "param_style": "pyformat"
+}
+```
+
+Snowflake (key-pair auth):
+
+```json
+{
+    "url": "snowflake://username@account/database?warehouse=wh&role=role&authenticator=snowflake_jwt&application=Deepnote",
+    "params": {
+        "snowflake_private_key": "base64_encoded_key",
+        "snowflake_private_key_passphrase": "passphrase"
+    },
+    "param_style": "pyformat"
+}
+```
+
+DuckDB (internal):
+
+```json
+{
+    "url": "duckdb:///:memory:",
+    "params": {},
+    "param_style": "qmark"
+}
+```
+
 **Integration Points:**
 
 - Registered as an environment variable provider in the kernel environment service
 - Called when starting a Jupyter kernel for a Deepnote notebook
 - Environment variables are passed to the kernel process at startup
+- Fires `onDidChangeEnvironmentVariables` event when integration storage changes
 
 #### 8. **SQL Integration Startup Code Provider** (`sqlIntegrationStartupCodeProvider.ts`)
 
@@ -325,7 +475,11 @@ User → IntegrationPanel (UI)
     → vscodeApi.postMessage({ type: 'save', config })
     → IntegrationWebviewProvider.onMessage()
     → IntegrationStorage.save(config)
-    → EncryptedStorage.store() [VSCode SecretStorage API]
+    → Validates config using @deepnote/database-integrations schemas
+    → Adds version field (version: 1)
+    → EncryptedStorage.store() [VSCode SecretStorage API]
+    → Updates in-memory cache
+    → Updates index
     → IntegrationStorage fires onDidChangeIntegrations event
     → SqlIntegrationEnvironmentVariablesProvider fires onDidChangeEnvironmentVariables event
 ```
 
@@ -336,9 +490,13 @@ User executes SQL cell
 → Kernel startup triggered
 → SqlIntegrationEnvironmentVariablesProvider.getEnvironmentVariables()
-    → Scans notebook for SQL cells
-    → Retrieves credentials from IntegrationStorage
-    → Converts to JSON format
+    → Identifies Deepnote project from notebook resource
+    → Retrieves project integrations from notebook manager
+    → Fetches configured credentials from IntegrationStorage
+    → Adds internal DuckDB integration
+    → Calls getEnvironmentVariablesForIntegrations() from @deepnote/database-integrations
+    → Converts configs to environment variable format
+    → Generates SQL_* environment variables
 → Returns environment variables
 → Environment variables passed to Jupyter server process
 → SqlIntegrationStartupCodeProvider.getCode()
@@ -349,66 +507,125 @@ User executes SQL cell
 → Results returned to notebook
 ```
 
+## Key Architectural Changes
+
+### Migration to @deepnote/database-integrations
+
+The system was refactored to use the `@deepnote/database-integrations` package (v1.1.1) as the source of truth for integration types and credential formatting. This provides:
+
+**Benefits:**
+
+1. **Consistency**: Same type definitions and validation as Deepnote's cloud platform
+2. **Maintainability**: Credential formatting logic is centralized in one package
+3. **Type Safety**: Strong TypeScript types from the package
+4. **Extensibility**: New integration types can be added by updating the package
+
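+To make the structural change concrete, here is a before/after sketch of the same PostgreSQL integration (values are illustrative and mirror this PR's unit tests):
+
+```typescript
+import { DatabaseIntegrationConfig } from '@deepnote/database-integrations';
+
+// Old: flat credentials at the top level, legacy type name, no version field.
+const legacyConfig = {
+    id: 'postgres-1',
+    name: 'My Postgres',
+    type: 'postgres',
+    host: 'localhost',
+    port: 5432,
+    database: 'testdb',
+    username: 'testuser',
+    password: 'testpass'
+};
+
+// New: credentials nested under `metadata`, package type name, port as a string.
+const config: DatabaseIntegrationConfig = {
+    id: 'postgres-1',
+    name: 'My Postgres',
+    type: 'pgsql',
+    metadata: {
+        host: 'localhost',
+        port: '5432',
+        database: 'testdb',
+        user: 'testuser',
+        password: 'testpass',
+        sslEnabled: false
+    }
+};
+```
+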
+**Key Changes:**
+
+1. **Type Definitions**:
+
+    - Old: `IntegrationType` enum with `'postgres'`, `'bigquery'`, `'snowflake'`
+    - New: `DatabaseIntegrationType` from the package with `'pgsql'`, `'big-query'`, `'snowflake'`, `'pandas-dataframe'`
+
+2. **Config Structure**:
+
+    - Old: Flat structure with credentials at top level
+    - New: Nested structure with `metadata` field containing credentials
+
+3. **Environment Variable Generation**:
+
+    - Old: Manual conversion logic in `sqlIntegrationEnvironmentVariablesProvider.ts`
+    - New: Delegated to `getEnvironmentVariablesForIntegrations()` from the package
+
+4. **Validation**:
+
+    - Old: Manual validation in forms
+    - New: Schema-based validation using `databaseMetadataSchemasByType`
+
+5. **Legacy Support**:
+    - Automatic upgrade of old configs via `upgradeLegacyIntegrationConfig()`
+    - Versioned storage format for future migrations
+
 ## Security Considerations
 
 1. **Encrypted Storage**: All credentials are stored using VSCode's SecretStorage API, which uses the OS keychain
 2. **No Plaintext**: Credentials are never written to disk in plaintext
 3. **Scoped Access**: Storage is scoped to the VSCode extension
-4. **Environment Isolation**: Each notebook gets only the credentials it needs
+4. **Environment Isolation**: Each project gets credentials for all configured integrations
 5. **No Logging**: Credential values are never logged; only non-sensitive metadata (key names, counts) is logged
 
 ## Adding New Integration Types
 
-To add a new integration type (e.g., MySQL, Snowflake):
+To add a new integration type (e.g., MySQL):
 
-1. **Add type to `integrationTypes.ts`**:
+1. **Add support to the `@deepnote/database-integrations` package** (if not already supported):
 
-    ```typescript
-    export enum IntegrationType {
-        Postgres = 'postgres',
-        BigQuery = 'bigquery',
-        MySQL = 'mysql' // New type
-    }
-
-    export interface MySQLIntegrationConfig extends BaseIntegrationConfig {
-        type: IntegrationType.MySQL;
-        host: string;
-        port: number;
-        database: string;
-        username: string;
-        password: string;
-    }
-
-    export type IntegrationConfig = PostgresIntegrationConfig | BigQueryIntegrationConfig | MySQLIntegrationConfig;
-    ```
+    - Add the type definition and metadata schema
+    - Add conversion logic for environment variables
+    - This is the source of truth for integration types
+
+2. **Create UI form component** (`MySQLForm.tsx`):
+
+    - Follow the pattern of `PostgresForm.tsx` or `BigQueryForm.tsx`
+    - Use the metadata structure from `@deepnote/database-integrations`
+    - Validate inputs according to the package's schema
 
-2. **Add conversion logic in `sqlIntegrationEnvironmentVariablesProvider.ts`**:
+3. **Update `ConfigurationForm.tsx`** to render the new form:
 
     ```typescript
-    case IntegrationType.MySQL: {
-        const url = `mysql://${config.username}:${config.password}@${config.host}:${config.port}/${config.database}`;
-        return JSON.stringify({ url, params: {}, param_style: 'format' });
-    }
+    case 'mysql':
+        return <MySQLForm {...props} />;
     ```
 
-3. **Create UI form component** (`MySQLForm.tsx`)
+4. **Update webview types** (`src/webviews/webview-side/integrations/types.ts`):
 
-4. **Update `ConfigurationForm.tsx`** to render the new form
+    - Import types from `@deepnote/database-integrations`
+    - Add any UI-specific types needed
 
-5. **Update webview types** (`src/webviews/webview-side/integrations/types.ts`)
+5. **Add localization strings** for the new integration type:
 
-6. **Add localization strings** for the new integration type
+    - Integration name
+    - Form field labels
+    - Error messages
 
-## Testing
+6. 
**Add tests**: + - Unit tests for the form component + - Integration tests for storage and environment variable generation -Unit tests are located in: +**Note:** The credential-to-environment-variable conversion is handled automatically by `@deepnote/database-integrations`, so no manual conversion logic is needed in the VSCode extension. -- `sqlIntegrationEnvironmentVariablesProvider.unit.test.ts` +## Testing -Tests cover: +Unit tests are located in: -- Environment variable generation for each integration type -- Multiple integrations in a single notebook -- Missing credentials handling -- Integration ID to environment variable name conversion -- JSON format validation +- `sqlIntegrationEnvironmentVariablesProvider.unit.test.ts` - Environment variable provider tests +- `integrationStorage.unit.test.ts` - Storage and persistence tests +- `legacyIntegrationConfigUtils.unit.test.ts` - Legacy config upgrade tests + +**Environment Variables Provider Tests** cover: + +- Environment variable generation for each integration type (PostgreSQL, BigQuery, Snowflake) +- Project integration retrieval and filtering +- DuckDB integration inclusion +- Integration config retrieval from storage +- Event emission when integrations change +- Real environment variable format validation + +**Integration Storage Tests** cover: + +- CRUD operations (create, read, update, delete) +- Loading from encrypted storage +- Filtering out invalid configs +- Filtering out pandas-dataframe type +- Event emission on changes +- Cache management +- Index handling (empty, missing, corrupted) + +**Legacy Config Upgrade Tests** cover: + +- PostgreSQL config upgrade +- BigQuery config upgrade +- Snowflake config upgrade (all auth methods) +- Unsupported auth method handling +- Invalid metadata handling +- Unknown integration type handling diff --git a/package-lock.json b/package-lock.json index 612c821116..a725fa6165 100644 --- a/package-lock.json +++ b/package-lock.json @@ -13,6 +13,7 @@ "@c4312/evt": "^0.1.1", "@deepnote/blocks": "^1.3.5", "@deepnote/convert": "^1.2.0", + "@deepnote/database-integrations": "^1.1.1", "@enonic/fnv-plus": "^1.3.0", "@jupyter-widgets/base": "^6.0.8", "@jupyter-widgets/controls": "^5.0.9", @@ -1417,6 +1418,24 @@ "url": "https://github.com/chalk/chalk?sponsor=1" } }, + "node_modules/@deepnote/database-integrations": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/@deepnote/database-integrations/-/database-integrations-1.1.1.tgz", + "integrity": "sha512-OTnye3O/si4anxlwrRYVY4LaQOZHDJY2Iyy8Cbigoklrxx1uKxwvSTnNzx4wIZMzGHV07lLeqyPBYoIBUeZM4w==", + "license": "Apache-2.0", + "dependencies": { + "zod": "3.25.76" + } + }, + "node_modules/@deepnote/database-integrations/node_modules/zod": { + "version": "3.25.76", + "resolved": "https://registry.npmjs.org/zod/-/zod-3.25.76.tgz", + "integrity": "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==", + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/colinhacks" + } + }, "node_modules/@enonic/fnv-plus": { "version": "1.3.0", "resolved": "https://registry.npmjs.org/@enonic/fnv-plus/-/fnv-plus-1.3.0.tgz", @@ -21610,6 +21629,21 @@ } } }, + "@deepnote/database-integrations": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/@deepnote/database-integrations/-/database-integrations-1.1.1.tgz", + "integrity": "sha512-OTnye3O/si4anxlwrRYVY4LaQOZHDJY2Iyy8Cbigoklrxx1uKxwvSTnNzx4wIZMzGHV07lLeqyPBYoIBUeZM4w==", + "requires": { + "zod": "3.25.76" + }, + "dependencies": { + "zod": 
{ + "version": "3.25.76", + "resolved": "https://registry.npmjs.org/zod/-/zod-3.25.76.tgz", + "integrity": "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==" + } + } + }, "@enonic/fnv-plus": { "version": "1.3.0", "resolved": "https://registry.npmjs.org/@enonic/fnv-plus/-/fnv-plus-1.3.0.tgz", diff --git a/package.json b/package.json index 626012ddab..d18fcc9657 100644 --- a/package.json +++ b/package.json @@ -2455,6 +2455,7 @@ "@c4312/evt": "^0.1.1", "@deepnote/blocks": "^1.3.5", "@deepnote/convert": "^1.2.0", + "@deepnote/database-integrations": "^1.1.1", "@enonic/fnv-plus": "^1.3.0", "@jupyter-widgets/base": "^6.0.8", "@jupyter-widgets/controls": "^5.0.9", diff --git a/src/messageTypes.ts b/src/messageTypes.ts index 501714e1db..6c90a08cbd 100644 --- a/src/messageTypes.ts +++ b/src/messageTypes.ts @@ -235,6 +235,7 @@ export type LocalizedMessages = { integrationsRequiredField: string; integrationsOptionalField: string; integrationsUnnamedIntegration: string; + integrationsUnsupportedIntegrationType: string; // Select input settings strings selectInputSettingsTitle: string; allowMultipleValues: string; diff --git a/src/notebooks/deepnote/integrations/integrationDetector.ts b/src/notebooks/deepnote/integrations/integrationDetector.ts index 55e627b734..a17fc2ba30 100644 --- a/src/notebooks/deepnote/integrations/integrationDetector.ts +++ b/src/notebooks/deepnote/integrations/integrationDetector.ts @@ -2,14 +2,9 @@ import { inject, injectable } from 'inversify'; import { logger } from '../../../platform/logging'; import { IDeepnoteNotebookManager } from '../../types'; -import { - DATAFRAME_SQL_INTEGRATION_ID, - DEEPNOTE_TO_INTEGRATION_TYPE, - IntegrationStatus, - IntegrationWithStatus, - RawIntegrationType -} from '../../../platform/notebooks/deepnote/integrationTypes'; +import { IntegrationStatus, IntegrationWithStatus } from '../../../platform/notebooks/deepnote/integrationTypes'; import { IIntegrationDetector, IIntegrationStorage } from './types'; +import { DatabaseIntegrationType, databaseIntegrationTypes } from '@deepnote/database-integrations'; /** * Service for detecting integrations used in Deepnote notebooks @@ -40,39 +35,25 @@ export class IntegrationDetector implements IIntegrationDetector { const integrations = new Map(); // Use the project's integrations field as the source of truth - const projectIntegrations = project.project.integrations || []; + const projectIntegrations = project.project.integrations?.slice() ?? []; logger.debug(`IntegrationDetector: Found ${projectIntegrations.length} integrations in project.integrations`); for (const projectIntegration of projectIntegrations) { const integrationId = projectIntegration.id; - - // Skip the internal DuckDB integration - if (integrationId === DATAFRAME_SQL_INTEGRATION_ID) { - continue; - } - - logger.debug(`IntegrationDetector: Found integration: ${integrationId} (${projectIntegration.type})`); - - // Map the Deepnote integration type to our IntegrationType - const integrationType = DEEPNOTE_TO_INTEGRATION_TYPE[projectIntegration.type as RawIntegrationType]; - - // Skip unknown integration types - if (!integrationType) { - logger.warn( - `IntegrationDetector: Unknown integration type '${projectIntegration.type}' for integration ID '${integrationId}'. 
Skipping.`
-                );
+            const integrationType = projectIntegration.type;
+            if (!(databaseIntegrationTypes as readonly string[]).includes(integrationType)) {
+                logger.debug(`IntegrationDetector: Skipping unsupported integration type: ${integrationType}`);
                 continue;
             }
 
             // Check if the integration is configured
             const config = await this.integrationStorage.getIntegrationConfig(integrationId);
-
             const status: IntegrationWithStatus = {
-                config: config || null,
+                config: config ?? null,
                 status: config ? IntegrationStatus.Connected : IntegrationStatus.Disconnected,
                 // Include integration metadata from project for prefilling when config is null
                 integrationName: projectIntegration.name,
-                integrationType: integrationType
+                integrationType: integrationType as DatabaseIntegrationType
             };
 
             integrations.set(integrationId, status);
diff --git a/src/notebooks/deepnote/integrations/integrationManager.ts b/src/notebooks/deepnote/integrations/integrationManager.ts
index e9dd729c47..24456c59b7 100644
--- a/src/notebooks/deepnote/integrations/integrationManager.ts
+++ b/src/notebooks/deepnote/integrations/integrationManager.ts
@@ -1,19 +1,13 @@
 import { inject, injectable } from 'inversify';
-import { commands, l10n, NotebookDocument, window, workspace } from 'vscode';
+import { commands, l10n, window, workspace } from 'vscode';
 import { IExtensionContext } from '../../../platform/common/types';
 import { Commands } from '../../../platform/common/constants';
 import { logger } from '../../../platform/logging';
 import { IIntegrationDetector, IIntegrationManager, IIntegrationStorage, IIntegrationWebviewProvider } from './types';
-import {
-    DEEPNOTE_TO_INTEGRATION_TYPE,
-    IntegrationStatus,
-    IntegrationType,
-    IntegrationWithStatus,
-    RawIntegrationType
-} from '../../../platform/notebooks/deepnote/integrationTypes';
-import { BlockWithIntegration, scanBlocksForIntegrations } from './integrationUtils';
+import { IntegrationStatus } from '../../../platform/notebooks/deepnote/integrationTypes';
 import { IDeepnoteNotebookManager } from '../../types';
+import { DatabaseIntegrationType, databaseIntegrationTypes } from '@deepnote/database-integrations';
 
 /**
  * Manages integration UI and commands for Deepnote notebooks
@@ -143,14 +137,6 @@ export class IntegrationManager implements IIntegrationManager {
 
         // First try to detect integrations from the stored project
         let integrations = await this.integrationDetector.detectIntegrations(projectId);
-
-        // If no integrations found in stored project, scan cells directly
-        // This handles the case where the notebook was already open when the extension loaded
-        if (integrations.size === 0) {
-            logger.debug(`IntegrationManager: No integrations found in stored project, scanning cells directly`);
-            integrations = await this.detectIntegrationsFromCells(activeNotebook);
-        }
-
         logger.debug(`IntegrationManager: Found ${integrations.size} integrations`);
 
         // If a specific integration was requested (e.g., from status bar click),
@@ -164,20 +150,15 @@ export class IntegrationManager implements IIntegrationManager {
             const projectIntegration = project?.project.integrations?.find((i) => i.id === selectedIntegrationId);
 
             let integrationName: string | undefined;
-            let integrationType: IntegrationType | undefined;
+            let integrationType: DatabaseIntegrationType | undefined;
 
-            if (projectIntegration) {
+            // Validate projectIntegration.type against the supported types
+            if (
+                projectIntegration &&
+                (databaseIntegrationTypes as readonly string[]).includes(projectIntegration.type)
+            ) {
                integrationName = 
projectIntegration.name; - - // Validate that projectIntegration.type exists in the mapping before lookup - if (projectIntegration.type in DEEPNOTE_TO_INTEGRATION_TYPE) { - // Map the Deepnote integration type to our IntegrationType - integrationType = DEEPNOTE_TO_INTEGRATION_TYPE[projectIntegration.type as RawIntegrationType]; - } else { - logger.warn( - `IntegrationManager: Unknown integration type '${projectIntegration.type}' for integration ID '${selectedIntegrationId}' in project '${projectId}'. Integration type will be undefined.` - ); - } + integrationType = projectIntegration.type as DatabaseIntegrationType; } integrations.set(selectedIntegrationId, { @@ -196,35 +177,4 @@ export class IntegrationManager implements IIntegrationManager { // Show the webview with optional selected integration await this.webviewProvider.show(projectId, integrations, selectedIntegrationId); } - - /** - * Detect integrations by scanning cells directly (fallback method) - * This is used when the project isn't stored in the notebook manager - */ - private async detectIntegrationsFromCells(notebook: NotebookDocument): Promise> { - // Collect all cells with SQL integration metadata - const blocksWithIntegrations: BlockWithIntegration[] = []; - - for (const cell of notebook.getCells()) { - const metadata = cell.metadata; - logger.trace(`IntegrationManager: Cell ${cell.index} metadata:`, metadata); - - // Check cell metadata for sql_integration_id - if (metadata && typeof metadata === 'object') { - const integrationId = (metadata as Record).sql_integration_id; - if (typeof integrationId === 'string') { - logger.debug(`IntegrationManager: Found integration ${integrationId} in cell ${cell.index}`); - blocksWithIntegrations.push({ - id: `cell-${cell.index}`, - sql_integration_id: integrationId - }); - } - } - } - - logger.debug(`IntegrationManager: Found ${blocksWithIntegrations.length} cells with integrations`); - - // Use the shared utility to scan blocks and build the status map - return scanBlocksForIntegrations(blocksWithIntegrations, this.integrationStorage, 'IntegrationManager'); - } } diff --git a/src/notebooks/deepnote/integrations/integrationUtils.ts b/src/notebooks/deepnote/integrations/integrationUtils.ts deleted file mode 100644 index 9e9015ca9e..0000000000 --- a/src/notebooks/deepnote/integrations/integrationUtils.ts +++ /dev/null @@ -1,68 +0,0 @@ -import { logger } from '../../../platform/logging'; -import { IIntegrationStorage } from './types'; -import { - DATAFRAME_SQL_INTEGRATION_ID, - IntegrationStatus, - IntegrationWithStatus -} from '../../../platform/notebooks/deepnote/integrationTypes'; - -/** - * Represents a block with SQL integration metadata - */ -export interface BlockWithIntegration { - id: string; - sql_integration_id: string; -} - -/** - * Scans blocks for SQL integrations and builds a status map. - * This is the core logic shared between IntegrationDetector and IntegrationManager. 
- * - * @param blocks - Iterator of blocks to scan (can be from Deepnote project or VSCode notebook cells) - * @param integrationStorage - Storage service to check configuration status - * @param logContext - Context string for logging (e.g., "IntegrationDetector", "IntegrationManager") - * @returns Map of integration IDs to their status - */ -export async function scanBlocksForIntegrations( - blocks: Iterable, - integrationStorage: IIntegrationStorage, - logContext: string -): Promise> { - const integrations = new Map(); - - for (const block of blocks) { - const integrationId = block.sql_integration_id; - - // Skip blocks without integration IDs - if (!integrationId) { - continue; - } - - // Skip excluded integrations (e.g., internal DuckDB integration) - if (integrationId === DATAFRAME_SQL_INTEGRATION_ID) { - logger.trace(`${logContext}: Skipping excluded integration: ${integrationId} in block ${block.id}`); - continue; - } - - // Skip if we've already detected this integration - if (integrations.has(integrationId)) { - continue; - } - - logger.debug(`${logContext}: Found integration: ${integrationId} in block ${block.id}`); - - // Check if the integration is configured - const config = await integrationStorage.getIntegrationConfig(integrationId); - - const status: IntegrationWithStatus = { - config: config || null, - status: config ? IntegrationStatus.Connected : IntegrationStatus.Disconnected - }; - - integrations.set(integrationId, status); - } - - logger.debug(`${logContext}: Found ${integrations.size} integrations`); - - return integrations; -} diff --git a/src/notebooks/deepnote/integrations/integrationWebview.ts b/src/notebooks/deepnote/integrations/integrationWebview.ts index c94e87ade3..d6be608a77 100644 --- a/src/notebooks/deepnote/integrations/integrationWebview.ts +++ b/src/notebooks/deepnote/integrations/integrationWebview.ts @@ -7,13 +7,8 @@ import { logger } from '../../../platform/logging'; import { LocalizedMessages, SharedMessages } from '../../../messageTypes'; import { IDeepnoteNotebookManager, ProjectIntegration } from '../../types'; import { IIntegrationStorage, IIntegrationWebviewProvider } from './types'; -import { - INTEGRATION_TYPE_TO_DEEPNOTE, - IntegrationConfig, - IntegrationStatus, - IntegrationWithStatus, - RawIntegrationType -} from '../../../platform/notebooks/deepnote/integrationTypes'; +import { IntegrationStatus, IntegrationWithStatus } from '../../../platform/notebooks/deepnote/integrationTypes'; +import { DatabaseIntegrationConfig } from '@deepnote/database-integrations'; /** * Manages the webview panel for integration configuration @@ -182,7 +177,8 @@ export class IntegrationWebviewProvider implements IIntegrationWebviewProvider { integrationsSnowflakeRolePlaceholder: localize.Integrations.snowflakeRolePlaceholder, integrationsSnowflakeWarehouseLabel: localize.Integrations.snowflakeWarehouseLabel, integrationsSnowflakeWarehousePlaceholder: localize.Integrations.snowflakeWarehousePlaceholder, - integrationsUnnamedIntegration: localize.Integrations.unnamedIntegration('{0}') + integrationsUnnamedIntegration: localize.Integrations.unnamedIntegration('{0}'), + integrationsUnsupportedIntegrationType: localize.Integrations.unsupportedIntegrationType('{0}') }; await this.currentPanel.webview.postMessage({ @@ -221,7 +217,7 @@ export class IntegrationWebviewProvider implements IIntegrationWebviewProvider { private async handleMessage(message: { type: string; integrationId?: string; - config?: IntegrationConfig; + config?: DatabaseIntegrationConfig; }): 
Promise { switch (message.type) { case 'configure': @@ -263,7 +259,7 @@ export class IntegrationWebviewProvider implements IIntegrationWebviewProvider { /** * Save the configuration for an integration */ - private async saveConfiguration(integrationId: string, config: IntegrationConfig): Promise { + private async saveConfiguration(integrationId: string, config: DatabaseIntegrationConfig): Promise { try { await this.integrationStorage.save(config); @@ -349,17 +345,16 @@ export class IntegrationWebviewProvider implements IIntegrationWebviewProvider { return null; } - // Map to Deepnote integration type - const deepnoteType: RawIntegrationType | undefined = INTEGRATION_TYPE_TO_DEEPNOTE[type]; - if (!deepnoteType) { - logger.warn(`IntegrationWebviewProvider: Cannot map type ${type} for integration ${id}, skipping`); + // Skip DuckDB integration (internal, not a real Deepnote integration) + if (type === 'pandas-dataframe') { + logger.trace(`IntegrationWebviewProvider: Skipping internal DuckDB integration ${id}`); return null; } return { id, name: integration.config?.name || integration.integrationName || id, - type: deepnoteType + type }; }) .filter((integration): integration is ProjectIntegration => integration !== null); diff --git a/src/notebooks/deepnote/sqlCellStatusBarProvider.ts b/src/notebooks/deepnote/sqlCellStatusBarProvider.ts index b0ba8897ce..20ca8f45ec 100644 --- a/src/notebooks/deepnote/sqlCellStatusBarProvider.ts +++ b/src/notebooks/deepnote/sqlCellStatusBarProvider.ts @@ -22,13 +22,9 @@ import { IExtensionSyncActivationService } from '../../platform/activation/types import { IDisposableRegistry } from '../../platform/common/types'; import { IIntegrationStorage } from './integrations/types'; import { Commands } from '../../platform/common/constants'; -import { - DATAFRAME_SQL_INTEGRATION_ID, - DEEPNOTE_TO_INTEGRATION_TYPE, - IntegrationType, - RawIntegrationType -} from '../../platform/notebooks/deepnote/integrationTypes'; +import { DATAFRAME_SQL_INTEGRATION_ID } from '../../platform/notebooks/deepnote/integrationTypes'; import { IDeepnoteNotebookManager } from '../types'; +import { DatabaseIntegrationType, databaseIntegrationTypes } from '@deepnote/database-integrations'; /** * QuickPick item with an integration ID @@ -347,8 +343,11 @@ export class SqlCellStatusBarProvider implements NotebookCellStatusBarItemProvid continue; } - const integrationType = DEEPNOTE_TO_INTEGRATION_TYPE[projectIntegration.type as RawIntegrationType]; - const typeLabel = integrationType ? this.getIntegrationTypeLabel(integrationType) : projectIntegration.type; + const integrationType = projectIntegration.type; + const typeLabel = + integrationType && (databaseIntegrationTypes as readonly string[]).includes(integrationType) + ? 
this.getIntegrationTypeLabel(integrationType as DatabaseIntegrationType) + : projectIntegration.type; const item: LocalQuickPickItem = { label: projectIntegration.name || projectIntegration.id, @@ -431,13 +430,13 @@ export class SqlCellStatusBarProvider implements NotebookCellStatusBarItemProvid this._onDidChangeCellStatusBarItems.fire(); } - private getIntegrationTypeLabel(type: IntegrationType): string { + private getIntegrationTypeLabel(type: DatabaseIntegrationType): string { switch (type) { - case IntegrationType.Postgres: + case 'pgsql': return l10n.t('PostgreSQL'); - case IntegrationType.BigQuery: + case 'big-query': return l10n.t('BigQuery'); - case IntegrationType.Snowflake: + case 'snowflake': return l10n.t('Snowflake'); default: return String(type); diff --git a/src/notebooks/deepnote/sqlCellStatusBarProvider.unit.test.ts b/src/notebooks/deepnote/sqlCellStatusBarProvider.unit.test.ts index 51289fed34..40a72119b0 100644 --- a/src/notebooks/deepnote/sqlCellStatusBarProvider.unit.test.ts +++ b/src/notebooks/deepnote/sqlCellStatusBarProvider.unit.test.ts @@ -14,7 +14,7 @@ import { import { IDisposableRegistry } from '../../platform/common/types'; import { IIntegrationStorage } from './integrations/types'; import { SqlCellStatusBarProvider } from './sqlCellStatusBarProvider'; -import { DATAFRAME_SQL_INTEGRATION_ID, IntegrationType } from '../../platform/notebooks/deepnote/integrationTypes'; +import { DATAFRAME_SQL_INTEGRATION_ID } from '../../platform/notebooks/deepnote/integrationTypes'; import { mockedVSCodeNamespaces, resetVSCodeMocks } from '../../test/vscode-mock'; import { createEventHandler } from '../../test/common'; import { Commands } from '../../platform/common/constants'; @@ -134,12 +134,15 @@ suite('SqlCellStatusBarProvider', () => { when(integrationStorage.getProjectIntegrationConfig(anything(), anything())).thenResolve({ id: integrationId, name: 'My Postgres DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'test', - username: 'user', - password: 'pass' + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'test', + user: 'user', + password: 'pass', + sslEnabled: false + } }); const result = await provider.provideCellStatusBarItems(cell, cancellationToken); @@ -282,12 +285,15 @@ suite('SqlCellStatusBarProvider', () => { when(integrationStorage.getProjectIntegrationConfig(anything(), anything())).thenResolve({ id: integrationId, name: 'My Postgres DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'test', - username: 'user', - password: 'pass' + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'test', + user: 'user', + password: 'pass', + sslEnabled: false + } }); const result = await provider.provideCellStatusBarItems(cell, cancellationToken); diff --git a/src/notebooks/serviceRegistry.node.ts b/src/notebooks/serviceRegistry.node.ts index bf9ba6186b..c316bc97ae 100644 --- a/src/notebooks/serviceRegistry.node.ts +++ b/src/notebooks/serviceRegistry.node.ts @@ -53,6 +53,10 @@ import { IIntegrationStorage, IIntegrationWebviewProvider } from './deepnote/integrations/types'; +import { + IPlatformNotebookEditorProvider, + IPlatformDeepnoteNotebookManager +} from '../platform/notebooks/deepnote/types'; import { SqlCellStatusBarProvider } from './deepnote/sqlCellStatusBarProvider'; import { IDeepnoteToolkitInstaller, @@ -78,6 +82,8 @@ export function registerTypes(serviceManager: IServiceManager, isDevMode: boolea 
serviceManager.addSingleton(INotebookCommandHandler, NotebookCommandListener); serviceManager.addBinding(INotebookCommandHandler, IExtensionSyncActivationService); serviceManager.addSingleton(INotebookEditorProvider, NotebookEditorProvider); + // Bind the platform-layer interface to the same implementation + serviceManager.addBinding(INotebookEditorProvider, IPlatformNotebookEditorProvider); serviceManager.addSingleton( IExtensionSyncActivationService, RemoteKernelControllerWatcher @@ -148,6 +154,8 @@ export function registerTypes(serviceManager: IServiceManager, isDevMode: boolea DeepnoteNotebookCommandListener ); serviceManager.addSingleton(IDeepnoteNotebookManager, DeepnoteNotebookManager); + // Bind the platform-layer interface to the same implementation + serviceManager.addBinding(IDeepnoteNotebookManager, IPlatformDeepnoteNotebookManager); serviceManager.addSingleton(IIntegrationStorage, IntegrationStorage); serviceManager.addSingleton(IIntegrationDetector, IntegrationDetector); serviceManager.addSingleton(IIntegrationWebviewProvider, IntegrationWebviewProvider); diff --git a/src/platform/common/utils/localize.ts b/src/platform/common/utils/localize.ts index 0e65f409bd..73ae5d2c09 100644 --- a/src/platform/common/utils/localize.ts +++ b/src/platform/common/utils/localize.ts @@ -831,6 +831,7 @@ export namespace Integrations { export const requiredField = l10n.t('*'); export const optionalField = l10n.t('(optional)'); export const unnamedIntegration = (id: string) => l10n.t('Unnamed Integration ({0})', id); + export const unsupportedIntegrationType = (type: string) => l10n.t('Unsupported integration type: {0}', type); // Integration type labels export const postgresTypeLabel = l10n.t('PostgreSQL'); diff --git a/src/platform/notebooks/deepnote/integrationStorage.ts b/src/platform/notebooks/deepnote/integrationStorage.ts index 6b86d52f1b..2d57fbf132 100644 --- a/src/platform/notebooks/deepnote/integrationStorage.ts +++ b/src/platform/notebooks/deepnote/integrationStorage.ts @@ -4,11 +4,28 @@ import { EventEmitter } from 'vscode'; import { IEncryptedStorage } from '../../common/application/types'; import { IAsyncDisposableRegistry } from '../../common/types'; import { logger } from '../../logging'; -import { IntegrationConfig, IntegrationType } from './integrationTypes'; import { IIntegrationStorage } from './types'; +import { upgradeLegacyIntegrationConfig } from './legacyIntegrationConfigUtils'; +import { + DatabaseIntegrationConfig, + databaseIntegrationTypes, + databaseMetadataSchemasByType +} from '@deepnote/database-integrations'; +import { DATAFRAME_SQL_INTEGRATION_ID } from './integrationTypes'; const INTEGRATION_SERVICE_NAME = 'deepnote-integrations'; +// NOTE: We need a way to upgrade existing configurations to the new format of deepnote/database-integrations. +type VersionedDatabaseIntegrationConfig = DatabaseIntegrationConfig & { version: 1 }; + +function storeEncryptedIntegrationConfig( + encryptedStorage: IEncryptedStorage, + integrationId: string, + config: VersionedDatabaseIntegrationConfig +): Promise { + return encryptedStorage.store(INTEGRATION_SERVICE_NAME, integrationId, JSON.stringify(config)); +} + /** * Storage service for integration configurations. * Uses VSCode's SecretStorage API to securely store credentials. 
@@ -16,7 +33,7 @@ const INTEGRATION_SERVICE_NAME = 'deepnote-integrations'; */ @injectable() export class IntegrationStorage implements IIntegrationStorage { - private readonly cache: Map = new Map(); + private readonly cache: Map = new Map(); private cacheLoaded = false; @@ -33,9 +50,9 @@ export class IntegrationStorage implements IIntegrationStorage { } /** - * Get all stored integration configurations + * Get all stored integration configurations. */ - async getAll(): Promise { + async getAll(): Promise { await this.ensureCacheLoaded(); return Array.from(this.cache.values()); } @@ -43,7 +60,7 @@ export class IntegrationStorage implements IIntegrationStorage { /** * Get a specific integration configuration by ID */ - async getIntegrationConfig(integrationId: string): Promise { + async getIntegrationConfig(integrationId: string): Promise { await this.ensureCacheLoaded(); return this.cache.get(integrationId); } @@ -56,28 +73,23 @@ export class IntegrationStorage implements IIntegrationStorage { async getProjectIntegrationConfig( _projectId: string, integrationId: string - ): Promise { + ): Promise { return this.getIntegrationConfig(integrationId); } - /** - * Get all integrations of a specific type - */ - async getByType(type: IntegrationType): Promise { - await this.ensureCacheLoaded(); - return Array.from(this.cache.values()).filter((config) => config.type === type); - } - /** * Save or update an integration configuration */ - async save(config: IntegrationConfig): Promise { + async save(config: DatabaseIntegrationConfig): Promise { + if (config.type === 'pandas-dataframe' || config.id === DATAFRAME_SQL_INTEGRATION_ID) { + logger.warn(`IntegrationStorage: Skipping save for internal DuckDB integration ${config.id}`); + return; + } + await this.ensureCacheLoaded(); // Store the configuration as JSON in encrypted storage - const configJson = JSON.stringify(config); - await this.encryptedStorage.store(INTEGRATION_SERVICE_NAME, config.id, configJson); - + await storeEncryptedIntegrationConfig(this.encryptedStorage, config.id, { ...config, version: 1 }); // Update cache this.cache.set(config.id, config); @@ -154,19 +166,83 @@ export class IntegrationStorage implements IIntegrationStorage { try { const integrationIds: string[] = JSON.parse(indexJson); + const idsToDelete: string[] = []; // Load each integration configuration for (const id of integrationIds) { + if (id === DATAFRAME_SQL_INTEGRATION_ID) { + continue; + } + const configJson = await this.encryptedStorage.retrieve(INTEGRATION_SERVICE_NAME, id); if (configJson) { try { - const config: IntegrationConfig = JSON.parse(configJson); - this.cache.set(id, config); + const parsedData = JSON.parse(configJson); + + // Check if this is a legacy config (missing 'version' field) + if (!('version' in parsedData)) { + logger.info(`Upgrading legacy integration config for ${id}`); + + // Attempt to upgrade the legacy config + const upgradedConfig = await upgradeLegacyIntegrationConfig(parsedData); + + if (upgradedConfig) { + if (upgradedConfig.type === 'pandas-dataframe') { + logger.warn(`IntegrationStorage: Skipping internal DuckDB integration ${id}`); + continue; + } + + // Successfully upgraded - save the new config + logger.info(`Successfully upgraded integration config for ${id}`); + await storeEncryptedIntegrationConfig(this.encryptedStorage, id, { + ...upgradedConfig, + version: 1 + }); + this.cache.set(id, upgradedConfig); + } else { + // Upgrade failed - mark for deletion + logger.warn(`Failed to upgrade integration ${id}, marking for 
deletion`); + idsToDelete.push(id); + } + } else { + // Already versioned config - validate against current schema + const { version: _version, ...rawConfig } = parsedData; + const config = + databaseIntegrationTypes.includes(rawConfig.type) && + rawConfig.type !== 'pandas-dataframe' + ? (rawConfig as DatabaseIntegrationConfig) + : null; + const validMetadata = config + ? databaseMetadataSchemasByType[config.type].safeParse(config.metadata).data + : null; + if (config && validMetadata) { + this.cache.set( + id, + // NOTE: We must cast here because there is no union-wide schema parser at the moment. + { ...config, metadata: validMetadata } as DatabaseIntegrationConfig + ); + } else { + logger.warn(`Invalid integration config for ${id}, marking for deletion`); + idsToDelete.push(id); + } + } } catch (error) { logger.error(`Failed to parse integration config for ${id}:`, error); + // Mark corrupted configs for deletion + idsToDelete.push(id); } } } + + // Delete any configs that failed to upgrade or were corrupted + if (idsToDelete.length > 0) { + logger.info(`Deleting ${idsToDelete.length} invalid integration config(s)`); + for (const id of idsToDelete) { + await this.encryptedStorage.store(INTEGRATION_SERVICE_NAME, id, undefined); + } + // Update the index to remove deleted IDs + await this.updateIndex(); + } } catch (error) { logger.error('Failed to parse integration index:', error); } diff --git a/src/platform/notebooks/deepnote/integrationStorage.unit.test.ts b/src/platform/notebooks/deepnote/integrationStorage.unit.test.ts new file mode 100644 index 0000000000..6808269ed2 --- /dev/null +++ b/src/platform/notebooks/deepnote/integrationStorage.unit.test.ts @@ -0,0 +1,689 @@ +import assert from 'assert'; +import { anything, instance, mock, when } from 'ts-mockito'; + +import { IEncryptedStorage } from '../../common/application/types'; +import { IAsyncDisposableRegistry } from '../../common/types'; +import { IntegrationStorage } from './integrationStorage'; +import { DatabaseIntegrationConfig } from '@deepnote/database-integrations'; +import { DATAFRAME_SQL_INTEGRATION_ID } from './integrationTypes'; + +suite('IntegrationStorage', () => { + let storage: IntegrationStorage; + let encryptedStorage: IEncryptedStorage; + let asyncRegistry: IAsyncDisposableRegistry; + let storageData: Map; + + setup(() => { + // Create a mock encrypted storage with in-memory data + storageData = new Map(); + encryptedStorage = mock(); + asyncRegistry = mock(); + + // Mock the store and retrieve methods to use our in-memory map + when(encryptedStorage.store(anything(), anything(), anything())).thenCall( + async (_serviceName: string, key: string, value: string | undefined) => { + if (value === undefined) { + storageData.delete(key); + } else { + storageData.set(key, value); + } + } + ); + + when(encryptedStorage.retrieve(anything(), anything())).thenCall(async (_serviceName: string, key: string) => { + return storageData.get(key); + }); + + storage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + }); + + teardown(() => { + storage.dispose(); + }); + + suite('getAll', () => { + test('Returns empty array when no integrations are stored', async () => { + const result = await storage.getAll(); + assert.deepStrictEqual(result, []); + }); + + test('Returns all stored integrations', async () => { + const config1: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 
'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + const config2: DatabaseIntegrationConfig = { + id: 'bigquery-1', + name: 'My BigQuery', + type: 'big-query', + metadata: { + authMethod: 'service-account', + service_account: '{"type":"service_account","project_id":"test"}' + } + }; + + await storage.save(config1); + await storage.save(config2); + + const result = await storage.getAll(); + assert.strictEqual(result.length, 2); + assert.ok(result.find((c) => c.id === 'postgres-1')); + assert.ok(result.find((c) => c.id === 'bigquery-1')); + }); + }); + + suite('getIntegrationConfig', () => { + test('Returns undefined when integration does not exist', async () => { + const result = await storage.getIntegrationConfig('non-existent'); + assert.strictEqual(result, undefined); + }); + + test('Returns the integration config when it exists', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + const result = await storage.getIntegrationConfig('postgres-1'); + + assert.ok(result); + assert.strictEqual(result.id, 'postgres-1'); + assert.strictEqual(result.name, 'My Postgres'); + assert.strictEqual(result.type, 'pgsql'); + }); + }); + + suite('getProjectIntegrationConfig', () => { + test('Returns the same result as getIntegrationConfig (ignores projectId)', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + const result = await storage.getProjectIntegrationConfig('any-project-id', 'postgres-1'); + + assert.ok(result); + assert.strictEqual(result.id, 'postgres-1'); + }); + }); + + suite('save', () => { + test('Saves a new integration config', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + const result = await storage.getIntegrationConfig('postgres-1'); + + assert.ok(result); + assert.strictEqual(result.id, 'postgres-1'); + }); + + test('Updates an existing integration config', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + + const updatedConfig: DatabaseIntegrationConfig = { + ...config, + name: 'Updated Postgres' + }; + + await storage.save(updatedConfig); + const result = await storage.getIntegrationConfig('postgres-1'); + + assert.ok(result); + assert.strictEqual(result.name, 'Updated Postgres'); + }); + + test('Does not save pandas-dataframe type integrations', async () => { + const config: DatabaseIntegrationConfig = { + id: 'dataframe-1', + name: 'DataFrame', + type: 'pandas-dataframe', + metadata: {} + }; + + await storage.save(config); + const result = await storage.getIntegrationConfig('dataframe-1'); + + assert.strictEqual(result, undefined); + }); + + test('Does not save 
DATAFRAME_SQL_INTEGRATION_ID', async () => { + const config: DatabaseIntegrationConfig = { + id: DATAFRAME_SQL_INTEGRATION_ID, + name: 'DuckDB', + type: 'pandas-dataframe', + metadata: {} + }; + + await storage.save(config); + const result = await storage.getIntegrationConfig(DATAFRAME_SQL_INTEGRATION_ID); + + assert.strictEqual(result, undefined); + }); + + test('Fires onDidChangeIntegrations event when saving', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + let eventCount = 0; + storage.onDidChangeIntegrations(() => { + eventCount++; + }); + + await storage.save(config); + assert.strictEqual(eventCount, 1); + }); + }); + + suite('delete', () => { + test('Deletes an existing integration', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + await storage.delete('postgres-1'); + const result = await storage.getIntegrationConfig('postgres-1'); + + assert.strictEqual(result, undefined); + }); + + test('Fires onDidChangeIntegrations event when deleting', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + let eventCount = 0; + storage.onDidChangeIntegrations(() => { + eventCount++; + }); + + await storage.save(config).then(() => storage.delete('postgres-1')); + assert.strictEqual(eventCount, 2); + }); + }); + + suite('exists', () => { + test('Returns false when integration does not exist', async () => { + const result = await storage.exists('non-existent'); + assert.strictEqual(result, false); + }); + + test('Returns true when integration exists', async () => { + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + await storage.save(config); + const result = await storage.exists('postgres-1'); + assert.strictEqual(result, true); + }); + }); + + suite('clear', () => { + test('Clears all integrations', async () => { + const config1: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + const config2: DatabaseIntegrationConfig = { + id: 'bigquery-1', + name: 'My BigQuery', + type: 'big-query', + metadata: { + authMethod: 'service-account', + service_account: '{"type":"service_account","project_id":"test"}' + } + }; + + await storage.save(config1); + await storage.save(config2); + await storage.clear(); + + const result = await storage.getAll(); + assert.deepStrictEqual(result, []); + }); + + test('Fires onDidChangeIntegrations event when clearing', async () => { + let eventCount = 0; + storage.onDidChangeIntegrations(() => { + eventCount++; + }); + + await storage.clear(); + assert.strictEqual(eventCount, 1); + }); + }); + + suite('Loading 
from encrypted storage', () => { + test('Loads valid integration configs from storage on first access', async () => { + // Manually populate the storage with valid configs + const config: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } + }; + + storageData.set('index', JSON.stringify(['postgres-1'])); + storageData.set('postgres-1', JSON.stringify({ ...config, version: 1 })); + + // Create a new storage instance to test loading + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const result = await newStorage.getIntegrationConfig('postgres-1'); + assert.ok(result); + assert.strictEqual(result.id, 'postgres-1'); + assert.strictEqual(result.name, 'My Postgres'); + + newStorage.dispose(); + }); + + test('Skips DATAFRAME_SQL_INTEGRATION_ID when loading from storage', async () => { + storageData.set('index', JSON.stringify([DATAFRAME_SQL_INTEGRATION_ID, 'postgres-1'])); + storageData.set( + DATAFRAME_SQL_INTEGRATION_ID, + JSON.stringify({ + id: DATAFRAME_SQL_INTEGRATION_ID, + name: 'DuckDB', + type: 'pandas-dataframe', + metadata: {}, + version: 1 + }) + ); + storageData.set( + 'postgres-1', + JSON.stringify({ + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }) + ); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.strictEqual(all.length, 1); + assert.strictEqual(all[0].id, 'postgres-1'); + + const duckdb = await newStorage.getIntegrationConfig(DATAFRAME_SQL_INTEGRATION_ID); + assert.strictEqual(duckdb, undefined); + + newStorage.dispose(); + }); + + test('Filters out pandas-dataframe type integrations when loading', async () => { + storageData.set('index', JSON.stringify(['dataframe-1', 'postgres-1'])); + storageData.set( + 'dataframe-1', + JSON.stringify({ + id: 'dataframe-1', + name: 'DataFrame', + type: 'pandas-dataframe', + metadata: {}, + version: 1 + }) + ); + storageData.set( + 'postgres-1', + JSON.stringify({ + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }) + ); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.strictEqual(all.length, 1); + assert.strictEqual(all[0].id, 'postgres-1'); + + newStorage.dispose(); + }); + + test('Filters out invalid integration configs when loading', async () => { + storageData.set('index', JSON.stringify(['invalid-1', 'postgres-1'])); + storageData.set( + 'invalid-1', + JSON.stringify({ + id: 'invalid-1', + name: 'Invalid', + type: 'unknown-type', + metadata: {}, + version: 1 + }) + ); + storageData.set( + 'postgres-1', + JSON.stringify({ + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }) + ); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await 
newStorage.getAll(); + assert.strictEqual(all.length, 1); + assert.strictEqual(all[0].id, 'postgres-1'); + + newStorage.dispose(); + }); + + test('Filters out configs with invalid metadata when loading', async () => { + storageData.set('index', JSON.stringify(['invalid-metadata', 'postgres-1'])); + storageData.set( + 'invalid-metadata', + JSON.stringify({ + id: 'invalid-metadata', + name: 'Invalid Metadata', + type: 'pgsql', + metadata: { + // Missing required fields like host, port, database, etc. + invalidField: 'value' + }, + version: 1 + }) + ); + storageData.set( + 'postgres-1', + JSON.stringify({ + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }) + ); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.strictEqual(all.length, 1); + assert.strictEqual(all[0].id, 'postgres-1'); + + newStorage.dispose(); + }); + + test('Filters out configs with corrupted JSON when loading', async () => { + storageData.set('index', JSON.stringify(['corrupted', 'postgres-1'])); + storageData.set('corrupted', 'invalid json {{{'); + storageData.set( + 'postgres-1', + JSON.stringify({ + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }) + ); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.strictEqual(all.length, 1); + assert.strictEqual(all[0].id, 'postgres-1'); + + newStorage.dispose(); + }); + + test('Handles empty index gracefully', async () => { + storageData.set('index', JSON.stringify([])); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.deepStrictEqual(all, []); + + newStorage.dispose(); + }); + + test('Handles missing index gracefully', async () => { + // Don't set any index + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.deepStrictEqual(all, []); + + newStorage.dispose(); + }); + + test('Handles corrupted index JSON gracefully', async () => { + storageData.set('index', 'invalid json {{{'); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.deepStrictEqual(all, []); + + newStorage.dispose(); + }); + + test('Loads multiple valid integrations of different types', async () => { + const postgresConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + }, + version: 1 + }; + + const bigqueryConfig = { + id: 'bigquery-1', + name: 'My BigQuery', + type: 'big-query', + metadata: { + authMethod: 'service-account', + service_account: '{"type":"service_account","project_id":"test"}' + }, + version: 1 + }; + + const postgres2Config = { + id: 'postgres-2', + name: 'My Second Postgres', + type: 'pgsql', + metadata: { + host: 'remote.example.com', + port: '5433', + database: 'proddb', + user: 'produser', + password: 'prodpass', + 
sslEnabled: true + }, + version: 1 + }; + + storageData.set('index', JSON.stringify(['postgres-1', 'bigquery-1', 'postgres-2'])); + storageData.set('postgres-1', JSON.stringify(postgresConfig)); + storageData.set('bigquery-1', JSON.stringify(bigqueryConfig)); + storageData.set('postgres-2', JSON.stringify(postgres2Config)); + + const newStorage = new IntegrationStorage(instance(encryptedStorage), instance(asyncRegistry)); + + const all = await newStorage.getAll(); + assert.strictEqual(all.length, 3); + + const postgres1 = all.find((c) => c.id === 'postgres-1'); + const bigquery = all.find((c) => c.id === 'bigquery-1'); + const postgres2 = all.find((c) => c.id === 'postgres-2'); + + assert.ok(postgres1); + assert.strictEqual(postgres1.type, 'pgsql'); + + assert.ok(bigquery); + assert.strictEqual(bigquery.type, 'big-query'); + + assert.ok(postgres2); + assert.strictEqual(postgres2.type, 'pgsql'); + + newStorage.dispose(); + }); + }); +});
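The loading tests above pin down the storage contract: read the index, parse each entry, validate it, and silently drop the internal DuckDB entry plus anything corrupted or unknown. A minimal sketch of that load path, assuming a zod-style `safeParse` on the package's schemas and a bare `retrieve(key)` stand-in for the encrypted-storage reads (the real keys carry the `deepnote-integrations.` prefix described earlier):

```typescript
import { DatabaseIntegrationConfig, databaseMetadataSchemasByType } from '@deepnote/database-integrations';

// Sketch only: mirrors the behaviour the tests assert, not the actual class internals.
async function loadAllConfigs(
    retrieve: (key: string) => Promise<string | undefined>
): Promise<DatabaseIntegrationConfig[]> {
    let ids: string[] = [];
    try {
        ids = JSON.parse((await retrieve('index')) ?? '[]');
    } catch {
        return []; // a corrupted index behaves like an empty one
    }

    const configs: DatabaseIntegrationConfig[] = [];
    for (const id of ids) {
        const raw = await retrieve(id);
        if (!raw) {
            continue;
        }
        try {
            const parsed = JSON.parse(raw);
            // The internal DuckDB integration is never surfaced from storage.
            if (parsed.type === 'pandas-dataframe') {
                continue;
            }
            const schema = databaseMetadataSchemasByType[parsed.type as keyof typeof databaseMetadataSchemasByType];
            // Unknown types and invalid metadata are filtered out, not thrown.
            if (schema?.safeParse(parsed.metadata).success) {
                configs.push(parsed as DatabaseIntegrationConfig);
            }
        } catch {
            // Corrupted JSON for one entry must not break the others.
        }
    }
    return configs;
}
```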
diff --git a/src/platform/notebooks/deepnote/integrationTypes.ts b/src/platform/notebooks/deepnote/integrationTypes.ts index 8c4442e8bc..19da67bcdf 100644 --- a/src/platform/notebooks/deepnote/integrationTypes.ts +++ b/src/platform/notebooks/deepnote/integrationTypes.ts @@ -7,46 +7,49 @@ export const DATAFRAME_SQL_INTEGRATION_ID = 'deepnote-dataframe-sql'; /** * Supported integration types */ -export enum IntegrationType { +export enum LegacyIntegrationType { Postgres = 'postgres', BigQuery = 'bigquery', - Snowflake = 'snowflake' + Snowflake = 'snowflake', + DuckDB = 'duckdb' } /** * Map our IntegrationType enum to Deepnote integration type strings + * Note: DuckDB is not included as it's an internal integration that doesn't exist in Deepnote */ -export const INTEGRATION_TYPE_TO_DEEPNOTE = { - [IntegrationType.Postgres]: 'pgsql', - [IntegrationType.BigQuery]: 'big-query', - [IntegrationType.Snowflake]: 'snowflake' -} as const satisfies { [type in IntegrationType]: string }; +export const LEGACY_INTEGRATION_TYPE_TO_DEEPNOTE = { + [LegacyIntegrationType.Postgres]: 'pgsql', + [LegacyIntegrationType.BigQuery]: 'big-query', + [LegacyIntegrationType.Snowflake]: 'snowflake' +} as const satisfies { [type in Exclude<LegacyIntegrationType, LegacyIntegrationType.DuckDB>]: string }; -export type RawIntegrationType = (typeof INTEGRATION_TYPE_TO_DEEPNOTE)[keyof typeof INTEGRATION_TYPE_TO_DEEPNOTE]; +export type RawLegacyIntegrationType = + (typeof LEGACY_INTEGRATION_TYPE_TO_DEEPNOTE)[keyof typeof LEGACY_INTEGRATION_TYPE_TO_DEEPNOTE]; /** * Map Deepnote integration type strings to our IntegrationType enum */ -export const DEEPNOTE_TO_INTEGRATION_TYPE: Record<RawIntegrationType, IntegrationType> = { - pgsql: IntegrationType.Postgres, - 'big-query': IntegrationType.BigQuery, - snowflake: IntegrationType.Snowflake +export const DEEPNOTE_TO_LEGACY_INTEGRATION_TYPE: Record<RawLegacyIntegrationType, LegacyIntegrationType> = { + pgsql: LegacyIntegrationType.Postgres, + 'big-query': LegacyIntegrationType.BigQuery, + snowflake: LegacyIntegrationType.Snowflake }; /** * Base interface for all integration configurations */ -export interface BaseIntegrationConfig { +export interface BaseLegacyIntegrationConfig { id: string; name: string; - type: IntegrationType; + type: LegacyIntegrationType; } /** * PostgreSQL integration configuration */ -export interface PostgresIntegrationConfig extends BaseIntegrationConfig { - type: IntegrationType.Postgres; +export interface LegacyPostgresIntegrationConfig extends BaseLegacyIntegrationConfig { + type: LegacyIntegrationType.Postgres; host: string; port: number; database: string; @@ -58,12 +61,20 @@ export interface PostgresIntegrationConfig extends BaseIntegrationConfig { /** * BigQuery integration configuration */ -export interface BigQueryIntegrationConfig extends BaseIntegrationConfig { - type: IntegrationType.BigQuery; +export interface LegacyBigQueryIntegrationConfig extends BaseLegacyIntegrationConfig { + type: LegacyIntegrationType.BigQuery; projectId: string; credentials: string; // JSON string of service account credentials } +/** + * DuckDB integration configuration (internal, always available) + */ +export interface LegacyDuckDBIntegrationConfig extends BaseLegacyIntegrationConfig { + type: LegacyIntegrationType.DuckDB; +} + +import { DatabaseIntegrationConfig, DatabaseIntegrationType } from '@deepnote/database-integrations'; // Import and re-export Snowflake auth constants from shared module import { type SnowflakeAuthMethod, @@ -81,8 +92,8 @@ export { /** * Base Snowflake configuration with common fields */ -interface BaseSnowflakeConfig extends BaseIntegrationConfig { - type: IntegrationType.Snowflake; +interface BaseLegacySnowflakeConfig extends BaseLegacyIntegrationConfig { + type: LegacyIntegrationType.Snowflake; account: string; warehouse?: string; database?: string; @@ -92,7 +103,7 @@ interface BaseSnowflakeConfig extends BaseIntegrationConfig { /** * Snowflake integration configuration (discriminated union) */ -export type SnowflakeIntegrationConfig = BaseSnowflakeConfig & +export type LegacySnowflakeIntegrationConfig = BaseLegacySnowflakeConfig & ( | { authMethod: typeof SnowflakeAuthMethods.PASSWORD | null; @@ -119,7 +130,11 @@ export type SnowflakeIntegrationConfig = BaseSnowflakeConfig & /** * Union type of all integration configurations */ -export type IntegrationConfig = PostgresIntegrationConfig | BigQueryIntegrationConfig | SnowflakeIntegrationConfig; +export type LegacyIntegrationConfig = + | LegacyPostgresIntegrationConfig + | LegacyBigQueryIntegrationConfig + | LegacySnowflakeIntegrationConfig + | LegacyDuckDBIntegrationConfig; /** * Integration connection status @@ -134,7 +149,7 @@ export enum IntegrationStatus { * Integration with its current status */ export interface IntegrationWithStatus { - config: IntegrationConfig | null; + config: DatabaseIntegrationConfig | null; status: IntegrationStatus; error?: string; /** @@ -144,5 +159,5 @@ export interface IntegrationWithStatus { /** * Type from the project's integrations list (used for prefilling when config is null) */ - integrationType?: IntegrationType; + integrationType?: DatabaseIntegrationType; } diff --git a/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.ts b/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.ts new file mode 100644 index 0000000000..aebf2448fd --- /dev/null +++ b/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.ts @@ -0,0 +1,89 @@ +import { LegacyIntegrationConfig, LegacyIntegrationType } from './integrationTypes'; +import { + BigQueryAuthMethods, + DatabaseIntegrationConfig, + databaseMetadataSchemasByType, + SnowflakeAuthMethods +} from '@deepnote/database-integrations'; +import { SnowflakeAuthMethods as LegacySnowflakeAuthMethods } from './snowflakeAuthConstants'; + +export async function upgradeLegacyIntegrationConfig( + config: LegacyIntegrationConfig +): Promise<DatabaseIntegrationConfig | null> { + switch (config.type) { + case LegacyIntegrationType.Postgres: { + const metadata = databaseMetadataSchemasByType.pgsql.safeParse({ + host: config.host, + port: config.port ? String(config.port) : undefined, + database: config.database, + user: config.username, + password: config.password, + sslEnabled: Boolean(config.ssl) + }).data; + + return metadata + ? { + id: config.id, + name: config.name, + type: 'pgsql', + metadata + } + : null; + } + case LegacyIntegrationType.BigQuery: { + const metadata = databaseMetadataSchemasByType['big-query'].safeParse({ + authMethod: BigQueryAuthMethods.ServiceAccount, + service_account: config.credentials + }).data; + + return metadata + ? { + id: config.id, + name: config.name, + type: 'big-query', + metadata + } + : null; + } + case LegacyIntegrationType.Snowflake: { + const metadata = (() => { + switch (config.authMethod) { + case LegacySnowflakeAuthMethods.PASSWORD: + return databaseMetadataSchemasByType.snowflake.safeParse({ + authMethod: SnowflakeAuthMethods.Password, + accountName: config.account, + warehouse: config.warehouse, + database: config.database, + role: config.role, + username: config.username, + password: config.password + }).data; + case LegacySnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR: + return databaseMetadataSchemasByType.snowflake.safeParse({ + authMethod: SnowflakeAuthMethods.ServiceAccountKeyPair, + accountName: config.account, + warehouse: config.warehouse, + database: config.database, + role: config.role, + username: config.username, + privateKey: config.privateKey, + privateKeyPassphrase: config.privateKeyPassphrase + }).data; + default: + return null; + } + })(); + + return metadata + ? { + id: config.id, + name: config.name, + type: 'snowflake', + metadata + } + : null; + } + default: + return null; + } +}
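Before the unit tests, a brief usage sketch. The call is asynchronous and resolves to either an upgraded config or `null`, and callers are expected to discard the `null`s; the legacy literal below is illustrative only:

```typescript
import { upgradeLegacyIntegrationConfig } from './legacyIntegrationConfigUtils';
import { LegacyIntegrationType, LegacyPostgresIntegrationConfig } from './integrationTypes';

// Illustrative input: a pre-@deepnote/database-integrations Postgres config.
const legacy: LegacyPostgresIntegrationConfig = {
    id: 'postgres-1',
    name: 'My Postgres',
    type: LegacyIntegrationType.Postgres,
    host: 'localhost',
    port: 5432,
    database: 'testdb',
    username: 'testuser',
    password: 'testpass',
    ssl: true
};

async function migrate(): Promise<void> {
    const upgraded = await upgradeLegacyIntegrationConfig(legacy);
    if (upgraded === null) {
        return; // unsupported type or auth method: discard rather than persist
    }
    // upgraded.type === 'pgsql'; the port moves into metadata as the string '5432',
    // username becomes metadata.user, and ssl becomes metadata.sslEnabled.
    console.log(upgraded.metadata);
}
```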
diff --git a/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.unit.test.ts b/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.unit.test.ts new file mode 100644 index 0000000000..7ae5607c2d --- /dev/null +++ b/src/platform/notebooks/deepnote/legacyIntegrationConfigUtils.unit.test.ts @@ -0,0 +1,363 @@ +import assert from 'assert'; + +import { upgradeLegacyIntegrationConfig } from './legacyIntegrationConfigUtils'; +import { + LegacyBigQueryIntegrationConfig, + LegacyIntegrationType, + LegacyPostgresIntegrationConfig, + LegacySnowflakeIntegrationConfig, + SnowflakeAuthMethods +} from './integrationTypes'; + +suite('upgradeLegacyIntegrationConfig', () => { + suite('PostgreSQL', () => { + test('Upgrades valid PostgreSQL config with all fields', async () => { + const legacyConfig: LegacyPostgresIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres', + type: LegacyIntegrationType.Postgres, + host: 'localhost', + port: 5432, + database: 'testdb', + username: 'testuser', + password: 'testpass', + ssl: true + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.id, 'postgres-1'); + assert.strictEqual(result.name, 'My Postgres'); + assert.strictEqual(result.type, 'pgsql'); + assert.strictEqual(result.metadata.host, 'localhost'); + assert.strictEqual(result.metadata.port, '5432'); + assert.strictEqual(result.metadata.database, 'testdb'); + assert.strictEqual(result.metadata.user, 'testuser'); + assert.strictEqual(result.metadata.password, 'testpass'); + assert.strictEqual(result.metadata.sslEnabled, true); + }); + + test('Upgrades PostgreSQL config without SSL (defaults to false)', async () => { + const legacyConfig: LegacyPostgresIntegrationConfig = { + id: 'postgres-2', + name: 'My Postgres No SSL', + type: LegacyIntegrationType.Postgres, + host: 'localhost', + port: 5432, + database: 'testdb', + 
username: 'testuser', + password: 'testpass' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.type, 'pgsql'); + assert.strictEqual(result.metadata.sslEnabled, false); + }); + + test('Converts port number to string', async () => { + const legacyConfig: LegacyPostgresIntegrationConfig = { + id: 'postgres-3', + name: 'My Postgres', + type: LegacyIntegrationType.Postgres, + host: 'localhost', + port: 5433, + database: 'testdb', + username: 'testuser', + password: 'testpass' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.type, 'pgsql'); + assert.strictEqual(result.metadata.port, '5433'); + assert.strictEqual(typeof result.metadata.port, 'string'); + }); + + test('Returns null for PostgreSQL config with missing required fields', async () => { + const legacyConfig = { + id: 'postgres-invalid', + name: 'Invalid Postgres', + type: LegacyIntegrationType.Postgres, + host: 'localhost' + // Missing port, database, username, password + } as LegacyPostgresIntegrationConfig; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); + + suite('BigQuery', () => { + test('Upgrades valid BigQuery config', async () => { + const credentials = JSON.stringify({ + type: 'service_account', + project_id: 'my-project', + private_key_id: 'key123', + private_key: '-----BEGIN PRIVATE KEY-----\ntest\n-----END PRIVATE KEY-----\n', + client_email: 'test@my-project.iam.gserviceaccount.com', + client_id: '123456789', + auth_uri: 'https://accounts.google.com/o/oauth2/auth', + token_uri: 'https://oauth2.googleapis.com/token', + auth_provider_x509_cert_url: 'https://www.googleapis.com/oauth2/v1/certs', + client_x509_cert_url: + 'https://www.googleapis.com/robot/v1/metadata/x509/test%40my-project.iam.gserviceaccount.com' + }); + + const legacyConfig: LegacyBigQueryIntegrationConfig = { + id: 'bigquery-1', + name: 'My BigQuery', + type: LegacyIntegrationType.BigQuery, + projectId: 'my-project', + credentials + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.id, 'bigquery-1'); + assert.strictEqual(result.name, 'My BigQuery'); + assert.strictEqual(result.type, 'big-query'); + assert.strictEqual(result.metadata.authMethod, 'service-account'); + assert.strictEqual(result.metadata.service_account, credentials); + }); + + test('Returns null for BigQuery config with missing credentials', async () => { + const legacyConfig = { + id: 'bigquery-invalid-2', + name: 'Invalid BigQuery', + type: LegacyIntegrationType.BigQuery, + projectId: 'my-project' + } as LegacyBigQueryIntegrationConfig; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); + + suite('Snowflake - PASSWORD auth', () => { + test('Upgrades valid Snowflake config with PASSWORD auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-1', + name: 'My Snowflake', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.PASSWORD, + account: 'myaccount', + warehouse: 'mywarehouse', + database: 'mydb', + role: 'myrole', + username: 'myuser', + password: 'mypass' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.id, 'snowflake-1'); + assert.strictEqual(result.name, 'My Snowflake'); + 
assert.strictEqual(result.type, 'snowflake'); + // The authMethod is converted to the database-integrations format (lowercase/kebab-case) + assert.strictEqual(result.metadata.authMethod, 'password'); + assert.strictEqual(result.metadata.accountName, 'myaccount'); + assert.strictEqual(result.metadata.warehouse, 'mywarehouse'); + assert.strictEqual(result.metadata.database, 'mydb'); + assert.strictEqual(result.metadata.role, 'myrole'); + assert.strictEqual(result.metadata.username, 'myuser'); + assert.strictEqual(result.metadata.password, 'mypass'); + }); + + test('Upgrades Snowflake config with PASSWORD auth and minimal fields', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-2', + name: 'My Snowflake Minimal', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.PASSWORD, + account: 'myaccount', + username: 'myuser', + password: 'mypass' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.type, 'snowflake'); + assert.strictEqual(result.metadata.authMethod, 'password'); + assert.strictEqual(result.metadata.accountName, 'myaccount'); + assert.strictEqual(result.metadata.username, 'myuser'); + assert.strictEqual(result.metadata.password, 'mypass'); + }); + + test('Returns null for Snowflake PASSWORD config with missing required fields', async () => { + const legacyConfig = { + id: 'snowflake-invalid', + name: 'Invalid Snowflake', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.PASSWORD, + account: 'myaccount' + // Missing username and password + } as LegacySnowflakeIntegrationConfig; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); + + suite('Snowflake - SERVICE_ACCOUNT_KEY_PAIR auth', () => { + test('Upgrades valid Snowflake config with SERVICE_ACCOUNT_KEY_PAIR auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-keypair-1', + name: 'My Snowflake KeyPair', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR, + account: 'myaccount', + warehouse: 'mywarehouse', + database: 'mydb', + role: 'myrole', + username: 'myuser', + privateKey: '-----BEGIN PRIVATE KEY-----\ntest\n-----END PRIVATE KEY-----\n', + privateKeyPassphrase: 'passphrase' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.id, 'snowflake-keypair-1'); + assert.strictEqual(result.name, 'My Snowflake KeyPair'); + assert.strictEqual(result.type, 'snowflake'); + assert.strictEqual(result.metadata.authMethod, 'service-account-key-pair'); + assert.strictEqual(result.metadata.accountName, 'myaccount'); + assert.strictEqual(result.metadata.username, 'myuser'); + assert.strictEqual( + result.metadata.privateKey, + '-----BEGIN PRIVATE KEY-----\ntest\n-----END PRIVATE KEY-----\n' + ); + assert.strictEqual(result.metadata.privateKeyPassphrase, 'passphrase'); + }); + + test('Upgrades Snowflake SERVICE_ACCOUNT_KEY_PAIR config without passphrase', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-keypair-2', + name: 'My Snowflake KeyPair No Passphrase', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR, + account: 'myaccount', + username: 'myuser', + privateKey: '-----BEGIN PRIVATE KEY-----\ntest\n-----END PRIVATE KEY-----\n' + }; + + const result = await 
upgradeLegacyIntegrationConfig(legacyConfig); + + assert.ok(result); + assert.strictEqual(result.type, 'snowflake'); + assert.ok(result.metadata.authMethod); + }); + + test('Returns null for Snowflake SERVICE_ACCOUNT_KEY_PAIR config with missing private key', async () => { + const legacyConfig = { + id: 'snowflake-keypair-invalid', + name: 'Invalid Snowflake KeyPair', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR, + account: 'myaccount', + username: 'myuser' + // Missing privateKey + } as LegacySnowflakeIntegrationConfig; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); + + suite('Snowflake - Unsupported auth methods', () => { + test('Returns null for Snowflake config with OKTA auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-okta', + name: 'Snowflake OKTA', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.OKTA, + account: 'myaccount', + oktaUrl: 'https://mycompany.okta.com' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + + test('Returns null for Snowflake config with NATIVE_SNOWFLAKE auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-native', + name: 'Snowflake Native', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.NATIVE_SNOWFLAKE, + account: 'myaccount' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + + test('Returns null for Snowflake config with AZURE_AD auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-azure', + name: 'Snowflake Azure AD', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.AZURE_AD, + account: 'myaccount', + tenantId: 'my-tenant-id' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + + test('Returns null for Snowflake config with KEY_PAIR auth', async () => { + const legacyConfig: LegacySnowflakeIntegrationConfig = { + id: 'snowflake-keypair', + name: 'Snowflake KeyPair', + type: LegacyIntegrationType.Snowflake, + authMethod: SnowflakeAuthMethods.KEY_PAIR, + account: 'myaccount' + }; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); + + suite('Unknown integration types', () => { + test('Returns null for unknown integration type', async () => { + const legacyConfig = { + id: 'unknown-1', + name: 'Unknown Integration', + type: 'unknown-type' + } as any; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + + test('Returns null for DuckDB integration type', async () => { + const legacyConfig = { + id: 'duckdb-1', + name: 'DuckDB', + type: LegacyIntegrationType.DuckDB + } as any; + + const result = await upgradeLegacyIntegrationConfig(legacyConfig); + + assert.strictEqual(result, null); + }); + }); +}); diff --git a/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.ts b/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.ts index cfd9df1bdd..54f2a69295 100644 --- a/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.ts +++ b/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.ts @@ -1,154 
+1,21 @@ import { inject, injectable } from 'inversify'; -import { CancellationToken, Event, EventEmitter, l10n, NotebookDocument, workspace } from 'vscode'; +import { CancellationToken, Event, EventEmitter } from 'vscode'; import { IDisposableRegistry, Resource } from '../../common/types'; import { EnvironmentVariables } from '../../common/variables/types'; -import { UnsupportedIntegrationError } from '../../errors/unsupportedIntegrationError'; import { logger } from '../../logging'; -import { IIntegrationStorage, ISqlIntegrationEnvVarsProvider } from './types'; import { - DATAFRAME_SQL_INTEGRATION_ID, - IntegrationConfig, - IntegrationType, - SnowflakeAuthMethods -} from './integrationTypes'; - -/** - * Converts an integration ID to the environment variable name format expected by SQL blocks. - * Example: 'my-postgres-db' -> 'SQL_MY_POSTGRES_DB' - */ -function convertToEnvironmentVariableName(str: string): string { - return (/^\d/.test(str) ? `_${str}` : str).toUpperCase().replace(/[^\w]/g, '_'); -} - -function getSqlEnvVarName(integrationId: string): string { - return `SQL_${integrationId}`; -} - -/** - * Converts integration configuration to the JSON format expected by the SQL execution code. - * The format must match what deepnote_toolkit expects: - * { - * "url": "sqlalchemy_connection_url", - * "params": {}, - * "param_style": "qmark" | "format" | etc. - * } - */ -function convertIntegrationConfigToJson(config: IntegrationConfig): string { - switch (config.type) { - case IntegrationType.Postgres: { - // Build PostgreSQL connection URL - // Format: postgresql://username:password@host:port/database - const encodedUsername = encodeURIComponent(config.username); - const encodedPassword = encodeURIComponent(config.password); - const encodedDatabase = encodeURIComponent(config.database); - const url = `postgresql://${encodedUsername}:${encodedPassword}@${config.host}:${config.port}/${encodedDatabase}`; - - return JSON.stringify({ - url: url, - params: config.ssl ? { sslmode: 'require' } : {}, - param_style: 'format' - }); - } - - case IntegrationType.BigQuery: { - // BigQuery uses a special URL format - return JSON.stringify({ - url: 'bigquery://?user_supplied_client=true', - params: { - project_id: config.projectId, - credentials: JSON.parse(config.credentials) - }, - param_style: 'format' - }); - } - - case IntegrationType.Snowflake: { - // Build Snowflake connection URL - // Format depends on auth method: - // Username+password: snowflake://{username}:{password}@{account}/{database}?warehouse={warehouse}&role={role}&application=YourApp - // Service account key-pair: snowflake://{username}@{account}/{database}?warehouse={warehouse}&role={role}&authenticator=snowflake_jwt&application=YourApp - const encodedAccount = encodeURIComponent(config.account); - - let url: string; - const params: Record<string, string> = {}; - - if (config.authMethod === null || config.authMethod === SnowflakeAuthMethods.PASSWORD) { - // Username+password authentication - const encodedUsername = encodeURIComponent(config.username); - const encodedPassword = encodeURIComponent(config.password); - const database = config.database ? 
`/${encodeURIComponent(config.database)}` : ''; - url = `snowflake://${encodedUsername}:${encodedPassword}@${encodedAccount}${database}`; - - const queryParams = new URLSearchParams(); - if (config.warehouse) { - queryParams.set('warehouse', config.warehouse); - } - if (config.role) { - queryParams.set('role', config.role); - } - queryParams.set('application', 'Deepnote'); - - const queryString = queryParams.toString(); - if (queryString) { - url += `?${queryString}`; - } - } else { - // Service account key-pair authentication (the only other supported method) - // TypeScript needs help narrowing the type here - if (config.authMethod !== SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR) { - // This should never happen due to the type guard above, but TypeScript needs this - throw new UnsupportedIntegrationError( - l10n.t( - "Snowflake integration with auth method '{0}' is not supported in VSCode", - config.authMethod - ) - ); - } - - const encodedUsername = encodeURIComponent(config.username); - const database = config.database ? `/${encodeURIComponent(config.database)}` : ''; - url = `snowflake://${encodedUsername}@${encodedAccount}${database}`; - - const queryParams = new URLSearchParams(); - if (config.warehouse) { - queryParams.set('warehouse', config.warehouse); - } - if (config.role) { - queryParams.set('role', config.role); - } - queryParams.set('authenticator', 'snowflake_jwt'); - queryParams.set('application', 'Deepnote'); - - const queryString = queryParams.toString(); - if (queryString) { - url += `?${queryString}`; - } - - // For key-pair auth, pass the private key and passphrase as params - params.snowflake_private_key = btoa(config.privateKey); - if (config.privateKeyPassphrase) { - params.snowflake_private_key_passphrase = config.privateKeyPassphrase; - } - } - - return JSON.stringify({ - url: url, - params: params, - param_style: 'pyformat' - }); - } - - default: - throw new UnsupportedIntegrationError( - l10n.t('Unsupported integration type: {0}', (config as IntegrationConfig).type) - ); - } -} + IIntegrationStorage, + ISqlIntegrationEnvVarsProvider, + IPlatformNotebookEditorProvider, + IPlatformDeepnoteNotebookManager +} from './types'; +import { DATAFRAME_SQL_INTEGRATION_ID } from './integrationTypes'; +import { getEnvironmentVariablesForIntegrations } from '@deepnote/database-integrations'; /** * Provides environment variables for SQL integrations. - * This service scans notebooks for SQL blocks and injects the necessary credentials + * This service provides credentials for all configured integrations in the project * as environment variables so they can be used during SQL block execution. */ @injectable() @@ -159,6 +26,9 @@ export class SqlIntegrationEnvironmentVariablesProvider implements ISqlIntegrati constructor( @inject(IIntegrationStorage) private readonly integrationStorage: IIntegrationStorage, + @inject(IPlatformNotebookEditorProvider) + private readonly notebookEditorProvider: IPlatformNotebookEditorProvider, + @inject(IPlatformDeepnoteNotebookManager) private readonly notebookManager: IPlatformDeepnoteNotebookManager, @inject(IDisposableRegistry) disposables: IDisposableRegistry ) { logger.info('SqlIntegrationEnvironmentVariablesProvider: Constructor called - provider is being instantiated'); @@ -174,121 +44,77 @@ export class SqlIntegrationEnvironmentVariablesProvider implements ISqlIntegrati } /** - * Get environment variables for SQL integrations used in the given notebook. + * Get environment variables for SQL integrations. 
+ * Provides credentials for all integrations in the Deepnote project. + * The internal DuckDB integration is always included. */ public async getEnvironmentVariables(resource: Resource, token?: CancellationToken): Promise { - const envVars: EnvironmentVariables = {}; - if (!resource) { - return envVars; + return {}; } if (token?.isCancellationRequested) { - return envVars; + return {}; } logger.trace(`SqlIntegrationEnvironmentVariablesProvider: Getting env vars for resource`); - logger.trace( - `SqlIntegrationEnvironmentVariablesProvider: Available notebooks count: ${workspace.notebookDocuments.length}` - ); - // Find the notebook document for this resource - const notebook = workspace.notebookDocuments.find((nb) => nb.uri.toString() === resource.toString()); + // Get the notebook document from the resource + const notebook = this.notebookEditorProvider.findAssociatedNotebookDocument(resource); if (!notebook) { - logger.warn(`SqlIntegrationEnvironmentVariablesProvider: No notebook found for ${resource.toString()}`); - return envVars; + logger.trace(`SqlIntegrationEnvironmentVariablesProvider: No notebook found for resource`); + return {}; } - // Always add the internal DuckDB integration - const dataframeSqlIntegrationEnvVarName = convertToEnvironmentVariableName( - getSqlEnvVarName(DATAFRAME_SQL_INTEGRATION_ID) - ); - const dataframeSqlIntegrationCredentialsJson = JSON.stringify({ - url: 'deepnote+duckdb:///:memory:', - params: {}, - param_style: 'qmark' - }); + // Get the project ID from the notebook metadata + const projectId = notebook.metadata?.deepnoteProjectId as string | undefined; + if (!projectId) { + logger.trace(`SqlIntegrationEnvironmentVariablesProvider: No project ID found in notebook metadata`); + return {}; + } - envVars[dataframeSqlIntegrationEnvVarName] = dataframeSqlIntegrationCredentialsJson; - logger.debug(`SqlIntegrationEnvironmentVariablesProvider: Added env var for dataframe SQL integration`); + logger.trace(`SqlIntegrationEnvironmentVariablesProvider: Project ID: ${projectId}`); - // Scan all cells for SQL integration IDs - const integrationIds = this.scanNotebookForIntegrations(notebook); - if (integrationIds.size === 0) { - logger.info( - `SqlIntegrationEnvironmentVariablesProvider: No SQL integrations found in ${resource.toString()}` - ); - return envVars; + // Get the project from the notebook manager + const project = this.notebookManager.getOriginalProject(projectId); + if (!project) { + logger.trace(`SqlIntegrationEnvironmentVariablesProvider: No project found for ID: ${projectId}`); + return {}; } - logger.trace(`SqlIntegrationEnvironmentVariablesProvider: Found ${integrationIds.size} SQL integrations`); - - // Get credentials for each integration and add to environment variables - for (const integrationId of integrationIds) { - if (token?.isCancellationRequested) { - break; - } + // Get the list of integrations from the project + const projectIntegrations = project.project.integrations?.slice() ?? 
[]; + logger.trace( + `SqlIntegrationEnvironmentVariablesProvider: Found ${projectIntegrations.length} integrations in project` + ); - try { - // Handle internal DuckDB integration specially - if (integrationId === DATAFRAME_SQL_INTEGRATION_ID) { - // Internal DuckDB integration is handled above - continue; - } + const projectIntegrationConfigs = ( + await Promise.all( + projectIntegrations.map((integration) => { + return this.integrationStorage.getIntegrationConfig(integration.id); + }) + ) + ).filter((config) => config != null); - const config = await this.integrationStorage.getIntegrationConfig(integrationId); - if (!config) { - logger.warn( - `SqlIntegrationEnvironmentVariablesProvider: No configuration found for integration ${integrationId}` - ); - continue; - } + // Always add the internal DuckDB integration + projectIntegrationConfigs.push({ + id: DATAFRAME_SQL_INTEGRATION_ID, + name: 'Dataframe SQL (DuckDB)', + type: 'pandas-dataframe', + metadata: {} + }); - // Convert integration config to JSON and add as environment variable - const envVarName = convertToEnvironmentVariableName(getSqlEnvVarName(integrationId)); - const credentialsJson = convertIntegrationConfigToJson(config); + const { envVars: envVarList, errors } = getEnvironmentVariablesForIntegrations(projectIntegrationConfigs, { + projectRootDirectory: '' + }); - envVars[envVarName] = credentialsJson; - logger.debug( - `SqlIntegrationEnvironmentVariablesProvider: Added env var ${envVarName} for integration ${integrationId}` - ); - } catch (error) { - logger.error( - `SqlIntegrationEnvironmentVariablesProvider: Failed to get credentials for integration ${integrationId}`, - error - ); - } - } + errors.forEach((error) => { + logger.error(`SqlIntegrationEnvironmentVariablesProvider: ${error.message}`); + }); + const envVars: EnvironmentVariables = Object.fromEntries(envVarList.map(({ name, value }) => [name, value])); logger.trace(`SqlIntegrationEnvironmentVariablesProvider: Returning ${Object.keys(envVars).length} env vars`); return envVars; } - - /** - * Scan a notebook for SQL integration IDs. 
- */ - private scanNotebookForIntegrations(notebook: NotebookDocument): Set<string> { - const integrationIds = new Set<string>(); - - for (const cell of notebook.getCells()) { - // Only check SQL cells - if (cell.document.languageId !== 'sql') { - continue; - } - - const metadata = cell.metadata; - if (metadata && typeof metadata === 'object') { - const integrationId = (metadata as Record<string, unknown>).sql_integration_id; - if (typeof integrationId === 'string') { - integrationIds.add(integrationId); - logger.trace( - `SqlIntegrationEnvironmentVariablesProvider: Found integration ${integrationId} in cell ${cell.index}` - ); - } - } - } - - return integrationIds; - } -}
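The unit tests that follow exercise this lookup chain end to end. Condensed, with logging and cancellation stripped so the data flow is visible (all names come from the implementation above; this is a sketch, not the class itself):

```typescript
import { Uri } from 'vscode';
import { DatabaseIntegrationConfig } from '@deepnote/database-integrations';
import { IIntegrationStorage, IPlatformDeepnoteNotebookManager, IPlatformNotebookEditorProvider } from './types';

// Condensed sketch of the provider's happy path: resource -> notebook ->
// deepnoteProjectId -> project -> stored configs (missing ones dropped).
async function resolveProjectIntegrationConfigs(
    resource: Uri,
    editors: IPlatformNotebookEditorProvider,
    projects: IPlatformDeepnoteNotebookManager,
    storage: IIntegrationStorage
): Promise<DatabaseIntegrationConfig[]> {
    const notebook = editors.findAssociatedNotebookDocument(resource);
    const projectId = notebook?.metadata?.deepnoteProjectId as string | undefined;
    const project = projectId ? projects.getOriginalProject(projectId) : undefined;
    if (!project) {
        return []; // every miss along the chain yields no credentials
    }
    const maybeConfigs = await Promise.all(
        (project.project.integrations ?? []).map((integration) => storage.getIntegrationConfig(integration.id))
    );
    // The always-available DuckDB config is appended after this step, then the
    // whole list is handed to getEnvironmentVariablesForIntegrations().
    return maybeConfigs.filter((config): config is DatabaseIntegrationConfig => config != null);
}
```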
diff --git a/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.unit.test.ts b/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.unit.test.ts index 92026b7638..217fe92f0f 100644 --- a/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.unit.test.ts +++ b/src/platform/notebooks/deepnote/sqlIntegrationEnvironmentVariablesProvider.unit.test.ts @@ -1,707 +1,385 @@ -import { assert } from 'chai'; +import assert from 'assert'; import { instance, mock, when } from 'ts-mockito'; -import { CancellationTokenSource, EventEmitter, NotebookCell, NotebookCellKind, NotebookDocument, Uri } from 'vscode'; +import { CancellationTokenSource, EventEmitter, NotebookDocument, Uri } from 'vscode'; import { IDisposableRegistry } from '../../common/types'; -import { IntegrationStorage } from './integrationStorage'; import { SqlIntegrationEnvironmentVariablesProvider } from './sqlIntegrationEnvironmentVariablesProvider'; -import { - IntegrationType, - PostgresIntegrationConfig, - BigQueryIntegrationConfig, - DATAFRAME_SQL_INTEGRATION_ID, - SnowflakeIntegrationConfig, - SnowflakeAuthMethods -} from './integrationTypes'; -import { mockedVSCodeNamespaces, resetVSCodeMocks } from '../../../test/vscode-mock'; - -const EXPECTED_DATAFRAME_ONLY_ENV_VARS = { - SQL_DEEPNOTE_DATAFRAME_SQL: '{"url":"deepnote+duckdb:///:memory:","params":{},"param_style":"qmark"}' -}; +import { IIntegrationStorage, IPlatformDeepnoteNotebookManager, IPlatformNotebookEditorProvider } from './types'; +import { DATAFRAME_SQL_INTEGRATION_ID } from './integrationTypes'; +import { DatabaseIntegrationConfig } from '@deepnote/database-integrations'; +import type { DeepnoteProject } from '../../deepnote/deepnoteTypes'; + +/** + * Helper function to create a minimal DeepnoteProject for testing + */ +function createMockProject( + projectId: string, + integrations: Array<{ id: string; name: string; type: string }> = [] +): DeepnoteProject { + return { + metadata: { + createdAt: '2023-01-01T00:00:00Z', + modifiedAt: '2023-01-02T00:00:00Z' + }, + project: { + id: projectId, + name: 'Test Project', + notebooks: [], + integrations + }, + version: '1.0' + }; +} suite('SqlIntegrationEnvironmentVariablesProvider', () => { let provider: SqlIntegrationEnvironmentVariablesProvider; - let integrationStorage: IntegrationStorage; + let integrationStorage: IIntegrationStorage; + let notebookEditorProvider: IPlatformNotebookEditorProvider; + let notebookManager: IPlatformDeepnoteNotebookManager; let disposables: IDisposableRegistry; + let onDidChangeIntegrationsEmitter: EventEmitter<void>; setup(() => { - resetVSCodeMocks(); + integrationStorage = mock<IIntegrationStorage>(); + notebookEditorProvider = mock<IPlatformNotebookEditorProvider>(); + notebookManager = mock<IPlatformDeepnoteNotebookManager>(); disposables = []; - integrationStorage = mock(IntegrationStorage); - when(integrationStorage.onDidChangeIntegrations).thenReturn(new EventEmitter<void>().event); - provider = new SqlIntegrationEnvironmentVariablesProvider(instance(integrationStorage), disposables); + onDidChangeIntegrationsEmitter = new EventEmitter<void>(); + when(integrationStorage.onDidChangeIntegrations).thenReturn(onDidChangeIntegrationsEmitter.event); + + provider = new SqlIntegrationEnvironmentVariablesProvider( + instance(integrationStorage), + instance(notebookEditorProvider), + instance(notebookManager), + disposables + ); }); teardown(() => { disposables.forEach((d) => d.dispose()); + onDidChangeIntegrationsEmitter.dispose(); }); - test('Returns empty object when resource is undefined', async () => { - const envVars = await provider.getEnvironmentVariables(undefined); - assert.deepStrictEqual(envVars, {}); - }); - - test('Returns empty object when notebook is not found', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([]); - - const envVars = await provider.getEnvironmentVariables(uri); - assert.deepStrictEqual(envVars, {}); - }); - - test('Returns empty object when notebook has no SQL cells', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'python', 'print("hello")'), - createMockCell(1, NotebookCellKind.Markup, 'markdown', '# Title') - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - - const envVars = await provider.getEnvironmentVariables(uri); - assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS); - }); - - test('Returns empty object when SQL cells have no integration ID', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', {}) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - - const envVars = await provider.getEnvironmentVariables(uri); - assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS); - }); - - test('Returns environment variable for internal DuckDB integration', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM df', { - sql_integration_id: DATAFRAME_SQL_INTEGRATION_ID - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); + suite('getEnvironmentVariables', () => { + test('Returns empty object when resource is undefined', async () => { + const result = await provider.getEnvironmentVariables(undefined); - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that the environment variable is set for dataframe SQL - assert.property(envVars, 'SQL_DEEPNOTE_DATAFRAME_SQL'); - const credentialsJson = JSON.parse(envVars['SQL_DEEPNOTE_DATAFRAME_SQL']!); - assert.strictEqual(credentialsJson.url, 'deepnote+duckdb:///:memory:'); - assert.deepStrictEqual(credentialsJson.params, {}); - assert.strictEqual(credentialsJson.param_style, 'qmark'); - }); - - test('Returns environment variable for PostgreSQL integration', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'my-postgres-db'; - const config: PostgresIntegrationConfig = { - id: integrationId, - name: 'My Postgres DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'mydb', - username: 'user', - password: 'pass', - ssl: true - }; - - const 
notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that the environment variable is set - assert.property(envVars, 'SQL_MY_POSTGRES_DB'); - const credentialsJson = JSON.parse(envVars['SQL_MY_POSTGRES_DB']!); - assert.strictEqual(credentialsJson.url, 'postgresql://user:pass@localhost:5432/mydb'); - assert.deepStrictEqual(credentialsJson.params, { sslmode: 'require' }); - assert.strictEqual(credentialsJson.param_style, 'format'); - }); - - test('Returns environment variable for BigQuery integration', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'my-bigquery'; - const serviceAccountJson = JSON.stringify({ type: 'service_account', project_id: 'my-project' }); - const config: BigQueryIntegrationConfig = { - id: integrationId, - name: 'My BigQuery', - type: IntegrationType.BigQuery, - projectId: 'my-project', - credentials: serviceAccountJson - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM dataset.table', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that the environment variable is set - assert.property(envVars, 'SQL_MY_BIGQUERY'); - const credentialsJson = JSON.parse(envVars['SQL_MY_BIGQUERY']!); - assert.strictEqual(credentialsJson.url, 'bigquery://?user_supplied_client=true'); - assert.deepStrictEqual(credentialsJson.params, { - project_id: 'my-project', - credentials: { type: 'service_account', project_id: 'my-project' } + assert.deepStrictEqual(result, {}); }); - assert.strictEqual(credentialsJson.param_style, 'format'); - }); - - test('Handles multiple SQL cells with same integration', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'my-postgres-db'; - const config: PostgresIntegrationConfig = { - id: integrationId, - name: 'My Postgres DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'mydb', - username: 'user', - password: 'pass' - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', { - sql_integration_id: integrationId - }), - createMockCell(1, NotebookCellKind.Code, 'sql', 'SELECT * FROM orders', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Should only have one environment variable apart from the internal DuckDB integration - assert.property(envVars, 'SQL_MY_POSTGRES_DB'); - assert.strictEqual(Object.keys(envVars).length, 2); - }); - - test('Handles multiple SQL cells with different integrations', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const postgresId = 'my-postgres-db'; - const bigqueryId = 'my-bigquery'; - - const postgresConfig: PostgresIntegrationConfig = { - id: 
postgresId, - name: 'My Postgres DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'mydb', - username: 'user', - password: 'pass' - }; - - const bigqueryConfig: BigQueryIntegrationConfig = { - id: bigqueryId, - name: 'My BigQuery', - type: IntegrationType.BigQuery, - projectId: 'my-project', - credentials: JSON.stringify({ type: 'service_account' }) - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', { - sql_integration_id: postgresId - }), - createMockCell(1, NotebookCellKind.Code, 'sql', 'SELECT * FROM dataset.table', { - sql_integration_id: bigqueryId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(postgresId)).thenResolve(postgresConfig); - when(integrationStorage.getIntegrationConfig(bigqueryId)).thenResolve(bigqueryConfig); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Should have two environment variables apart from the internal DuckDB integration - assert.property(envVars, 'SQL_MY_POSTGRES_DB'); - assert.property(envVars, 'SQL_MY_BIGQUERY'); - assert.strictEqual(Object.keys(envVars).length, 3); - }); - test('Handles missing integration configuration gracefully', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'missing-integration'; + test('Returns empty object when cancellation token is already cancelled', async () => { + const tokenSource = new CancellationTokenSource(); + tokenSource.cancel(); + const resource = Uri.file('/test/notebook.deepnote'); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', { - sql_integration_id: integrationId - }) - ]); + const result = await provider.getEnvironmentVariables(resource, tokenSource.token); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(undefined); - - const envVars = await provider.getEnvironmentVariables(uri); + assert.deepStrictEqual(result, {}); + tokenSource.dispose(); + }); - // Should return only dataframe integration when integration config is missing - assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS); - }); + test('Returns empty object when no notebook is found for resource', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(undefined); - test('Properly encodes special characters in PostgreSQL credentials', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'special-chars-db'; - const config: PostgresIntegrationConfig = { - id: integrationId, - name: 'Special Chars DB', - type: IntegrationType.Postgres, - host: 'db.example.com', - port: 5432, - database: 'my@db:name', - username: 'user@domain', - password: 'pa:ss@word!#$%', - ssl: false - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM users', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that the environment variable is set - assert.property(envVars, 
'SQL_SPECIAL_CHARS_DB'); - const credentialsJson = JSON.parse(envVars['SQL_SPECIAL_CHARS_DB']!); - - // Verify that special characters are properly URL-encoded - assert.strictEqual( - credentialsJson.url, - 'postgresql://user%40domain:pa%3Ass%40word!%23%24%25@db.example.com:5432/my%40db%3Aname' - ); - assert.deepStrictEqual(credentialsJson.params, {}); - assert.strictEqual(credentialsJson.param_style, 'format'); - }); + const result = await provider.getEnvironmentVariables(resource); - test('Normalizes integration ID with spaces and mixed case for env var name', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'My Production DB'; - const config: PostgresIntegrationConfig = { - id: integrationId, - name: 'Production Database', - type: IntegrationType.Postgres, - host: 'prod.example.com', - port: 5432, - database: 'proddb', - username: 'admin', - password: 'secret', - ssl: true - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM products', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that the environment variable name is properly normalized - // Spaces should be converted to underscores and uppercased - assert.property(envVars, 'SQL_MY_PRODUCTION_DB'); - const credentialsJson = JSON.parse(envVars['SQL_MY_PRODUCTION_DB']!); - assert.strictEqual(credentialsJson.url, 'postgresql://admin:secret@prod.example.com:5432/proddb'); - assert.deepStrictEqual(credentialsJson.params, { sslmode: 'require' }); - assert.strictEqual(credentialsJson.param_style, 'format'); - }); - - test('Normalizes integration ID with special characters for env var name', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'my-db@2024!'; - const config: PostgresIntegrationConfig = { - id: integrationId, - name: 'Test DB', - type: IntegrationType.Postgres, - host: 'localhost', - port: 5432, - database: 'testdb', - username: 'user', - password: 'pass', - ssl: false - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); - - const envVars = await provider.getEnvironmentVariables(uri); - - // Check that special characters in integration ID are normalized for env var name - // Non-alphanumeric characters should be converted to underscores - assert.property(envVars, 'SQL_MY_DB_2024_'); - const credentialsJson = JSON.parse(envVars['SQL_MY_DB_2024_']!); - assert.strictEqual(credentialsJson.url, 'postgresql://user:pass@localhost:5432/testdb'); - }); + assert.deepStrictEqual(result, {}); + }); - test('Honors CancellationToken (returns empty when cancelled early)', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'cancel-me'; - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { sql_integration_id: integrationId }) - ]); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - // Return a slow promise to ensure cancellation path is hit - 
when(integrationStorage.getIntegrationConfig(integrationId)).thenCall( - () => new Promise((resolve) => setTimeout(() => resolve(undefined), 50)) - ); - const cts = new CancellationTokenSource(); - cts.cancel(); - const envVars = await provider.getEnvironmentVariables(uri, cts.token); - assert.deepStrictEqual(envVars, {}); - }); + test('Returns empty object when notebook has no project ID in metadata', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock<NotebookDocument>(); + when(notebook.metadata).thenReturn({}); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); - suite('Snowflake Integration', () => { - test('Returns environment variable for Snowflake with PASSWORD auth', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'my-snowflake'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'My Snowflake', - type: IntegrationType.Snowflake, - account: 'myorg-myaccount', - warehouse: 'COMPUTE_WH', - database: 'MYDB', - role: 'ANALYST', - authMethod: SnowflakeAuthMethods.PASSWORD, - username: 'john.doe', - password: 'secret123' - }; + const result = await provider.getEnvironmentVariables(resource); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM customers', { - sql_integration_id: integrationId - }) - ]); + assert.deepStrictEqual(result, {}); + }); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); + test('Returns empty object when project is not found in notebook manager', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock<NotebookDocument>(); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(undefined); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + const result = await provider.getEnvironmentVariables(resource); - const envVars = await provider.getEnvironmentVariables(uri); + assert.deepStrictEqual(result, {}); + }); - assert.property(envVars, 'SQL_MY_SNOWFLAKE'); - const credentialsJson = JSON.parse(envVars['SQL_MY_SNOWFLAKE']!); - assert.strictEqual( - credentialsJson.url, - 'snowflake://john.doe:secret123@myorg-myaccount/MYDB?warehouse=COMPUTE_WH&role=ANALYST&application=Deepnote' - ); - assert.deepStrictEqual(credentialsJson.params, {}); - assert.strictEqual(credentialsJson.param_style, 'pyformat'); + test('Returns only DuckDB integration when project has no integrations', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock<NotebookDocument>(); + const project = createMockProject('project-123', []); - test('Returns environment variable for Snowflake with legacy null auth (username+password)', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'legacy-snowflake'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Legacy Snowflake', - type: IntegrationType.Snowflake, - account: 'legacy-account', - warehouse: 'WH', - database: 'DB', - authMethod: null, - username: 'user', - password: 'pass' - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) - ]); - 
when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); - const envVars = await provider.getEnvironmentVariables(uri); + const result = await provider.getEnvironmentVariables(resource); - assert.property(envVars, 'SQL_LEGACY_SNOWFLAKE'); - const credentialsJson = JSON.parse(envVars['SQL_LEGACY_SNOWFLAKE']!); - assert.strictEqual( - credentialsJson.url, - 'snowflake://user:pass@legacy-account/DB?warehouse=WH&application=Deepnote' - ); - assert.deepStrictEqual(credentialsJson.params, {}); + // Should contain DuckDB integration env vars + assert.ok(Object.keys(result).length > 0, 'Should have environment variables for DuckDB'); + // The actual env var name depends on the database-integrations library implementation + // We verify that at least one env var was generated }); - test('Returns environment variable for Snowflake with SERVICE_ACCOUNT_KEY_PAIR auth', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-keypair'; - const privateKey = - '-----BEGIN ' + 'PRIVATE KEY-----\nfakekey-MIIEvQIBADANBg...\n-----END ' + 'PRIVATE KEY-----'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake KeyPair', - type: IntegrationType.Snowflake, - account: 'keypair-account', - warehouse: 'ETL_WH', - database: 'PROD_DB', - role: 'ETL_ROLE', - authMethod: SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR, - username: 'service_account', - privateKey: privateKey, - privateKeyPassphrase: 'passphrase123' + test('Retrieves integration configs from storage for project integrations', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const postgresConfig: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres DB', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT * FROM events', { - sql_integration_id: integrationId - }) + const project = createMockProject('project-123', [ + { id: 'postgres-1', name: 'My Postgres DB', type: 'pgsql' } ]); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); + when(integrationStorage.getIntegrationConfig('postgres-1')).thenResolve(postgresConfig); - const envVars = await provider.getEnvironmentVariables(uri); + const result = await provider.getEnvironmentVariables(resource); - assert.property(envVars, 'SQL_SNOWFLAKE_KEYPAIR'); - const credentialsJson = JSON.parse(envVars['SQL_SNOWFLAKE_KEYPAIR']!); - assert.strictEqual( - credentialsJson.url, - 'snowflake://service_account@keypair-account/PROD_DB?warehouse=ETL_WH&role=ETL_ROLE&authenticator=snowflake_jwt&application=Deepnote' - ); - 
assert.deepStrictEqual(credentialsJson.params, { - snowflake_private_key: Buffer.from(privateKey).toString('base64'), - snowflake_private_key_passphrase: 'passphrase123' - }); - assert.strictEqual(credentialsJson.param_style, 'pyformat'); + // Should contain env vars for both Postgres and DuckDB + assert.ok(Object.keys(result).length > 0, 'Should have environment variables'); }); - test('Returns environment variable for Snowflake with SERVICE_ACCOUNT_KEY_PAIR auth without passphrase', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-keypair-no-pass'; - const privateKey = - '-----BEGIN ' + 'PRIVATE KEY-----\nfakekey-MIIEvQIBADANBg...\n-----END ' + 'PRIVATE KEY-----'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake KeyPair No Pass', - type: IntegrationType.Snowflake, - account: 'account123', - warehouse: 'WH', - database: 'DB', - authMethod: SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR, - username: 'svc_user', - privateKey: privateKey + test('Filters out null integration configs from storage', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const postgresConfig: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'My Postgres DB', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) + const project = createMockProject('project-123', [ + { id: 'postgres-1', name: 'My Postgres DB', type: 'pgsql' }, + { id: 'missing-integration', name: 'Missing', type: 'pgsql' } ]); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); + when(integrationStorage.getIntegrationConfig('postgres-1')).thenResolve(postgresConfig); + when(integrationStorage.getIntegrationConfig('missing-integration')).thenResolve(undefined); - const envVars = await provider.getEnvironmentVariables(uri); + const result = await provider.getEnvironmentVariables(resource); - assert.property(envVars, 'SQL_SNOWFLAKE_KEYPAIR_NO_PASS'); - const credentialsJson = JSON.parse(envVars['SQL_SNOWFLAKE_KEYPAIR_NO_PASS']!); - assert.strictEqual( - credentialsJson.url, - 'snowflake://svc_user@account123/DB?warehouse=WH&authenticator=snowflake_jwt&application=Deepnote' - ); - assert.deepStrictEqual(credentialsJson.params, { - snowflake_private_key: Buffer.from(privateKey).toString('base64') - }); + // Should only include postgres-1 and DuckDB, not the missing integration + assert.ok(Object.keys(result).length > 0, 'Should have environment variables'); }); - test('Properly encodes special characters in Snowflake credentials', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-special'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake Special', - type: IntegrationType.Snowflake, - account: 'my-org.account', - warehouse: 'WH@2024', - database: 'DB:TEST', - role: 'ROLE#1', - authMethod: 
SnowflakeAuthMethods.PASSWORD, - username: 'user@domain.com', - password: 'p@ss:word!#$%' - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) - ]); + test('Always includes DuckDB integration in the config list', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const project = createMockProject('project-123', []); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); - const envVars = await provider.getEnvironmentVariables(uri); + const result = await provider.getEnvironmentVariables(resource); - assert.property(envVars, 'SQL_SNOWFLAKE_SPECIAL'); - const credentialsJson = JSON.parse(envVars['SQL_SNOWFLAKE_SPECIAL']!); - // Verify URL encoding of special characters - assert.strictEqual( - credentialsJson.url, - 'snowflake://user%40domain.com:p%40ss%3Aword!%23%24%25@my-org.account/DB%3ATEST?warehouse=WH%402024&role=ROLE%231&application=Deepnote' - ); + // DuckDB should always be included + assert.ok(Object.keys(result).length > 0, 'Should have DuckDB environment variables'); }); - test('Handles Snowflake with minimal optional fields', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-minimal'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake Minimal', - type: IntegrationType.Snowflake, - account: 'minimal-account', - authMethod: SnowflakeAuthMethods.PASSWORD, - username: 'user', - password: 'pass' + test('Generates environment variables for multiple integrations', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const postgresConfig: DatabaseIntegrationConfig = { + id: 'postgres-1', + name: 'Postgres DB', + type: 'pgsql', + metadata: { + host: 'localhost', + port: '5432', + database: 'testdb', + user: 'testuser', + password: 'testpass', + sslEnabled: false + } }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) + const bigqueryConfig: DatabaseIntegrationConfig = { + id: 'bigquery-1', + name: 'BigQuery', + type: 'big-query', + metadata: { + authMethod: 'service-account', + service_account: '{"type":"service_account","project_id":"test"}' + } + }; + const project = createMockProject('project-123', [ + { id: 'postgres-1', name: 'Postgres DB', type: 'pgsql' }, + { id: 'bigquery-1', name: 'BigQuery', type: 'big-query' } ]); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); + when(integrationStorage.getIntegrationConfig('postgres-1')).thenResolve(postgresConfig); + when(integrationStorage.getIntegrationConfig('bigquery-1')).thenResolve(bigqueryConfig); - const envVars = await 
provider.getEnvironmentVariables(uri); + const result = await provider.getEnvironmentVariables(resource); - assert.property(envVars, 'SQL_SNOWFLAKE_MINIMAL'); - const credentialsJson = JSON.parse(envVars['SQL_SNOWFLAKE_MINIMAL']!); - // Should not include warehouse, database, or role in URL when not provided - assert.strictEqual(credentialsJson.url, 'snowflake://user:pass@minimal-account?application=Deepnote'); - assert.strictEqual(credentialsJson.param_style, 'pyformat'); + // Should have env vars for Postgres, BigQuery, and DuckDB + assert.ok(Object.keys(result).length > 0, 'Should have environment variables for all integrations'); }); - test('Skips unsupported Snowflake auth method (OKTA)', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-okta'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake OKTA', - type: IntegrationType.Snowflake, - account: 'okta-account', - authMethod: SnowflakeAuthMethods.OKTA - }; - - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) - ]); - - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + suite('Real environment variable format checks', () => { + test('PostgreSQL integration generates correct SQL_* env var format', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const postgresConfig: DatabaseIntegrationConfig = { + id: 'my-postgres', + name: 'Production DB', + type: 'pgsql', + metadata: { + host: 'db.example.com', + port: '5432', + database: 'production', + user: 'admin', + password: 'secret123', + sslEnabled: true + } + }; + const project = createMockProject('project-123', [ + { id: 'my-postgres', name: 'Production DB', type: 'pgsql' } + ]); + + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); + when(integrationStorage.getIntegrationConfig('my-postgres')).thenResolve(postgresConfig); + + const result = await provider.getEnvironmentVariables(resource); + + // The database-integrations library generates env vars with SQL_ prefix + // and the integration ID in uppercase with hyphens replaced by underscores + const expectedEnvVarName = 'SQL_MY_POSTGRES'; + assert.ok(result[expectedEnvVarName], `Should have ${expectedEnvVarName} env var`); + + // The value should be a JSON string with connection details + const envVarValue = result[expectedEnvVarName]; + assert.ok(typeof envVarValue === 'string', 'Env var value should be a string'); + assert.ok(envVarValue, 'Env var value should not be undefined'); + + // Parse and verify the structure + const parsed = JSON.parse(envVarValue!); + assert.ok(parsed.url, 'Should have url field'); + assert.ok(parsed.url.includes('postgresql://'), 'URL should be PostgreSQL connection string'); + assert.ok(parsed.url.includes('db.example.com'), 'URL should contain host'); + assert.ok(parsed.url.includes('5432'), 'URL should contain port'); + assert.ok(parsed.url.includes('production'), 'URL should contain database name'); + }); - // Should return only dataframe integration when unsupported auth method is encountered - const envVars = await provider.getEnvironmentVariables(uri); - 
assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS); - }); + test('BigQuery integration generates correct SQL_* env var format', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const serviceAccountJson = JSON.stringify({ + type: 'service_account', + project_id: 'my-gcp-project', + private_key_id: 'key123', + private_key: '-----BEGIN PRIVATE KEY-----\ntest\n-----END PRIVATE KEY-----\n', + client_email: 'test@my-gcp-project.iam.gserviceaccount.com', + client_id: '123456789', + auth_uri: 'https://accounts.google.com/o/oauth2/auth', + token_uri: 'https://oauth2.googleapis.com/token' + }); + const bigqueryConfig: DatabaseIntegrationConfig = { + id: 'my-bigquery', + name: 'Analytics BQ', + type: 'big-query', + metadata: { + authMethod: 'service-account', + service_account: serviceAccountJson + } + }; + const project = createMockProject('project-123', [ + { id: 'my-bigquery', name: 'Analytics BQ', type: 'big-query' } + ]); + + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); + when(integrationStorage.getIntegrationConfig('my-bigquery')).thenResolve(bigqueryConfig); + + const result = await provider.getEnvironmentVariables(resource); + + const expectedEnvVarName = 'SQL_MY_BIGQUERY'; + assert.ok(result[expectedEnvVarName], `Should have ${expectedEnvVarName} env var`); + + const envVarValue = result[expectedEnvVarName]; + assert.ok(typeof envVarValue === 'string', 'Env var value should be a string'); + assert.ok(envVarValue, 'Env var value should not be undefined'); + + // Parse and verify the structure + const parsed = JSON.parse(envVarValue!); + // BigQuery env vars should contain connection details + // The exact structure depends on the database-integrations library + assert.ok(parsed, 'Should have parsed BigQuery config'); + }); - test('Skips unsupported Snowflake auth method (AZURE_AD)', async () => { - const uri = Uri.file('/test/notebook.deepnote'); - const integrationId = 'snowflake-azure'; - const config: SnowflakeIntegrationConfig = { - id: integrationId, - name: 'Snowflake Azure', - type: IntegrationType.Snowflake, - account: 'azure-account', - authMethod: SnowflakeAuthMethods.AZURE_AD - }; + test('DuckDB (dataframe-sql) integration is always included', async () => { + const resource = Uri.file('/test/notebook.deepnote'); + const notebook = mock(); + const project = createMockProject('project-123', []); - const notebook = createMockNotebook(uri, [ - createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', { - sql_integration_id: integrationId - }) - ]); + when(notebook.metadata).thenReturn({ deepnoteProjectId: 'project-123' }); + when(notebookEditorProvider.findAssociatedNotebookDocument(resource)).thenReturn(instance(notebook)); + when(notebookManager.getOriginalProject('project-123')).thenReturn(project); - when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]); - when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config); + const result = await provider.getEnvironmentVariables(resource); - const envVars = await provider.getEnvironmentVariables(uri); - assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS); + // DuckDB integration should generate an env var + // The exact name depends on DATAFRAME_SQL_INTEGRATION_ID + const expectedEnvVarName = 
`SQL_${DATAFRAME_SQL_INTEGRATION_ID.toUpperCase().replace(/-/g, '_')}`;
+            assert.ok(result[expectedEnvVarName], `Should have ${expectedEnvVarName} env var for DuckDB`);
+        });
     });
+    });

-    test('Skips unsupported Snowflake auth method (KEY_PAIR)', async () => {
-        const uri = Uri.file('/test/notebook.deepnote');
-        const integrationId = 'snowflake-keypair-user';
-        const config: SnowflakeIntegrationConfig = {
-            id: integrationId,
-            name: 'Snowflake KeyPair User',
-            type: IntegrationType.Snowflake,
-            account: 'keypair-user-account',
-            authMethod: SnowflakeAuthMethods.KEY_PAIR
-        };
-
-        const notebook = createMockNotebook(uri, [
-            createMockCell(0, NotebookCellKind.Code, 'sql', 'SELECT 1', {
-                sql_integration_id: integrationId
-            })
-        ]);
+    suite('onDidChangeEnvironmentVariables event', () => {
+        test('Fires when integration storage changes', (done) => {
+            let eventFired = false;
+            provider.onDidChangeEnvironmentVariables(() => {
+                eventFired = true;
+                assert.ok(true, 'Event should fire when integrations change');
+                done();
+            });

-        when(mockedVSCodeNamespaces.workspace.notebookDocuments).thenReturn([notebook]);
-        when(integrationStorage.getIntegrationConfig(integrationId)).thenResolve(config);
+            // Trigger the integration storage change event
+            onDidChangeIntegrationsEmitter.fire();

-        const envVars = await provider.getEnvironmentVariables(uri);
-        assert.deepStrictEqual(envVars, EXPECTED_DATAFRAME_ONLY_ENV_VARS);
+            // Give it a moment to propagate
+            setTimeout(() => {
+                if (!eventFired) {
+                    done(new Error('Event did not fire'));
+                }
+            }, 100);
         });
     });
 });
-
-function createMockNotebook(uri: Uri, cells: NotebookCell[]): NotebookDocument {
-    return {
-        uri,
-        getCells: () => cells
-    } as NotebookDocument;
-}
-
-function createMockCell(
-    index: number,
-    kind: NotebookCellKind,
-    languageId: string,
-    value: string,
-    metadata?: Record<string, unknown>
-): NotebookCell {
-    return {
-        index,
-        kind,
-        document: {
-            languageId,
-            getText: () => value
-        },
-        metadata: metadata || {}
-    } as NotebookCell;
-}
diff --git a/src/platform/notebooks/deepnote/types.ts b/src/platform/notebooks/deepnote/types.ts
index ff2a1cd0fe..2632483c99 100644
--- a/src/platform/notebooks/deepnote/types.ts
+++ b/src/platform/notebooks/deepnote/types.ts
@@ -1,7 +1,8 @@
-import { CancellationToken, Event } from 'vscode';
+import { CancellationToken, Event, NotebookDocument, Uri } from 'vscode';
 import { IDisposable, Resource } from '../../common/types';
 import { EnvironmentVariables } from '../../common/variables/types';
-import { IntegrationConfig } from './integrationTypes';
+import { DeepnoteProject } from '../../deepnote/deepnoteTypes';
+import { DatabaseIntegrationConfig } from '@deepnote/database-integrations';

 /**
  * Settings for select input blocks
@@ -30,7 +31,7 @@ export interface IIntegrationStorage extends IDisposable {
      */
     readonly onDidChangeIntegrations: Event<void>;

-    getAll(): Promise<IntegrationConfig[]>;
+    getAll(): Promise<DatabaseIntegrationConfig[]>;

     /**
      * Retrieves the global (non-project-scoped) integration configuration by integration ID.
@@ -48,14 +49,17 @@ export interface IIntegrationStorage extends IDisposable {
      * - The `IntegrationConfig` object if a global configuration exists for the given ID
      * - `undefined` if no global configuration exists for the given integration ID
      */
-    getIntegrationConfig(integrationId: string): Promise<IntegrationConfig | undefined>;
+    getIntegrationConfig(integrationId: string): Promise<DatabaseIntegrationConfig | undefined>;

     /**
      * Get integration configuration for a specific project and integration
     */
-    getProjectIntegrationConfig(projectId: string, integrationId: string): Promise<IntegrationConfig | undefined>;
+    getProjectIntegrationConfig(
+        projectId: string,
+        integrationId: string
+    ): Promise<DatabaseIntegrationConfig | undefined>;

-    save(config: IntegrationConfig): Promise<void>;
+    save(config: DatabaseIntegrationConfig): Promise<void>;
     delete(integrationId: string): Promise<void>;
     exists(integrationId: string): Promise<boolean>;
     clear(): Promise<void>;
@@ -73,3 +77,23 @@ export interface ISqlIntegrationEnvVarsProvider {
      */
     getEnvironmentVariables(resource: Resource, token?: CancellationToken): Promise<EnvironmentVariables>;
 }
+
+/**
+ * Platform-layer interface for accessing notebook documents.
+ * This is a subset of the full INotebookEditorProvider interface from the notebooks layer.
+ * The implementation in the notebooks layer should be bound to this symbol as well.
+ */
+export const IPlatformNotebookEditorProvider = Symbol('IPlatformNotebookEditorProvider');
+export interface IPlatformNotebookEditorProvider {
+    findAssociatedNotebookDocument(uri: Uri): NotebookDocument | undefined;
+}
+
+/**
+ * Platform-layer interface for accessing Deepnote project data.
+ * This is a subset of the full IDeepnoteNotebookManager interface from the notebooks layer.
+ * The implementation in the notebooks layer should be bound to this symbol as well.
+ */
+export const IPlatformDeepnoteNotebookManager = Symbol('IPlatformDeepnoteNotebookManager');
+export interface IPlatformDeepnoteNotebookManager {
+    getOriginalProject(projectId: string): DeepnoteProject | undefined;
+}
diff --git a/src/webviews/webview-side/integrations/BigQueryForm.tsx b/src/webviews/webview-side/integrations/BigQueryForm.tsx
index 60710ca912..54cf987898 100644
--- a/src/webviews/webview-side/integrations/BigQueryForm.tsx
+++ b/src/webviews/webview-side/integrations/BigQueryForm.tsx
@@ -1,41 +1,67 @@
 import * as React from 'react';
 import { format, getLocString } from '../react-common/locReactSide';
-import { BigQueryIntegrationConfig } from './types';
+import { BigQueryAuthMethods, DatabaseIntegrationConfig } from '@deepnote/database-integrations';
+
+type BigQueryConfig = Extract<DatabaseIntegrationConfig, { type: 'big-query' }>;
+
+function createEmptyBigQueryConfig(params: { id: string; name?: string }): BigQueryConfig {
+    const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})');
+
+    return {
+        id: params.id,
+        name: (params.name || format(unnamedIntegration, params.id)).trim(),
+        type: 'big-query',
+        metadata: {
+            authMethod: BigQueryAuthMethods.ServiceAccount,
+            service_account: ''
+        }
+    };
+}

 export interface IBigQueryFormProps {
     integrationId: string;
-    existingConfig: BigQueryIntegrationConfig | null;
-    integrationName?: string;
-    onSave: (config: BigQueryIntegrationConfig) => void;
+    existingConfig: BigQueryConfig | null;
+    defaultName?: string;
+    onSave: (config: BigQueryConfig) => void;
     onCancel: () => void;
 }

 export const BigQueryForm: React.FC<IBigQueryFormProps> = ({
     integrationId,
     existingConfig,
-    integrationName,
+    defaultName,
     onSave,
     onCancel
 }) => {
-    const [name, setName] = React.useState(existingConfig?.name || integrationName || '');
-    const [projectId, setProjectId] = React.useState(existingConfig?.projectId ||
''); - const [credentials, setCredentials] = React.useState(existingConfig?.credentials || ''); + const [pendingConfig, setPendingConfig] = React.useState( + existingConfig && existingConfig.metadata.authMethod === BigQueryAuthMethods.ServiceAccount + ? structuredClone(existingConfig) + : createEmptyBigQueryConfig({ id: integrationId, name: defaultName }) + ); + const [credentialsError, setCredentialsError] = React.useState(null); - // Update form fields when existingConfig or integrationName changes React.useEffect(() => { - if (existingConfig) { - setName(existingConfig.name || ''); - setProjectId(existingConfig.projectId || ''); - setCredentials(existingConfig.credentials || ''); - setCredentialsError(null); - } else { - setName(integrationName || ''); - setProjectId(''); - setCredentials(''); - setCredentialsError(null); - } - }, [existingConfig, integrationName]); + setPendingConfig( + existingConfig && existingConfig.metadata.authMethod === BigQueryAuthMethods.ServiceAccount + ? structuredClone(existingConfig) + : createEmptyBigQueryConfig({ id: integrationId, name: defaultName }) + ); + setCredentialsError(null); + }, [existingConfig, integrationId, defaultName]); + + // Extract service account value with proper type narrowing + const serviceAccountValue = + pendingConfig.metadata.authMethod === BigQueryAuthMethods.ServiceAccount + ? pendingConfig.metadata.service_account + : ''; + + const handleNameChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + name: e.target.value + })); + }; const validateCredentials = (value: string): boolean => { if (!value.trim()) { @@ -57,31 +83,32 @@ export const BigQueryForm: React.FC = ({ const handleCredentialsChange = (e: React.ChangeEvent) => { const value = e.target.value; - setCredentials(value); + + setPendingConfig((prev) => { + if (prev.metadata.authMethod === BigQueryAuthMethods.ServiceAccount) { + return { + ...prev, + metadata: { + ...prev.metadata, + service_account: value + } + }; + } + return prev; + }); + validateCredentials(value); }; const handleSubmit = (e: React.FormEvent) => { e.preventDefault(); - const trimmedCredentials = credentials.trim(); - // Validate credentials before submitting - if (!validateCredentials(trimmedCredentials)) { + if (!validateCredentials(serviceAccountValue)) { return; } - const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})'); - - const config: BigQueryIntegrationConfig = { - id: integrationId, - name: (name || format(unnamedIntegration, integrationId)).trim(), - type: 'bigquery', - projectId: projectId.trim(), - credentials: trimmedCredentials - }; - - onSave(config); + onSave(pendingConfig); }; return ( @@ -91,29 +118,13 @@ export const BigQueryForm: React.FC = ({ setName(e.target.value)} + value={pendingConfig.name} + onChange={handleNameChange} placeholder={getLocString('integrationsBigQueryNamePlaceholder', 'My BigQuery Project')} autoComplete="off" /> - - - {getLocString('integrationsBigQueryProjectIdLabel', 'Project ID')}{' '} - {getLocString('integrationsRequiredField', '*')} - - setProjectId(e.target.value)} - placeholder={getLocString('integrationsBigQueryProjectIdPlaceholder', 'my-project-id')} - autoComplete="off" - required - /> - - {getLocString('integrationsBigQueryCredentialsLabel', 'Service Account Credentials (JSON)')}{' '} @@ -121,7 +132,7 @@ export const BigQueryForm: React.FC = ({ void; + existingConfig: DatabaseIntegrationConfig | null; + defaultName?: string; + integrationType: DatabaseIntegrationType; + 
onSave: (config: DatabaseIntegrationConfig) => void; onCancel: () => void; } export const ConfigurationForm: React.FC = ({ integrationId, existingConfig, - integrationName, + defaultName, integrationType, onSave, onCancel }) => { - // Determine integration type from existing config, integration metadata from project, or ID - const getIntegrationType = (): 'postgres' | 'bigquery' | 'snowflake' => { - if (existingConfig) { - return existingConfig.type; - } - // Use integration type from project if available - if (integrationType) { - return integrationType; - } - // Infer from integration ID - if (integrationId.includes('postgres')) { - return 'postgres'; - } - if (integrationId.includes('bigquery')) { - return 'bigquery'; - } - if (integrationId.includes('snowflake')) { - return 'snowflake'; - } - // Default to postgres - return 'postgres'; - }; - - const selectedIntegrationType = getIntegrationType(); - const title = getLocString('integrationsConfigureTitle', 'Configure Integration: {0}').replace( '{0}', integrationId @@ -63,31 +38,47 @@ export const ConfigurationForm: React.FC = ({ - {selectedIntegrationType === 'postgres' ? ( - - ) : selectedIntegrationType === 'bigquery' ? ( - - ) : ( - - )} + {(() => { + switch (integrationType) { + case 'pgsql': + return ( + + ); + case 'big-query': + return ( + + ); + case 'snowflake': + return ( + + ); + default: { + const unsupportedMessage = getLocString( + 'integrationsUnsupportedIntegrationType', + 'Unsupported integration type: {0}' + ); + return {format(unsupportedMessage, integrationType)}; + } + } + })()} diff --git a/src/webviews/webview-side/integrations/IntegrationItem.tsx b/src/webviews/webview-side/integrations/IntegrationItem.tsx index 73b97fd99e..83944830a6 100644 --- a/src/webviews/webview-side/integrations/IntegrationItem.tsx +++ b/src/webviews/webview-side/integrations/IntegrationItem.tsx @@ -1,6 +1,7 @@ import * as React from 'react'; import { getLocString } from '../react-common/locReactSide'; -import { IntegrationWithStatus, IntegrationType } from './types'; +import { IntegrationWithStatus } from './types'; +import { DatabaseIntegrationType } from '@deepnote/database-integrations'; export interface IIntegrationItemProps { integration: IntegrationWithStatus; @@ -8,11 +9,11 @@ export interface IIntegrationItemProps { onDelete: (integrationId: string) => void; } -const getIntegrationTypeLabel = (type: IntegrationType): string => { +const getIntegrationTypeLabel = (type: DatabaseIntegrationType): string => { switch (type) { - case 'postgres': + case 'pgsql': return getLocString('integrationsPostgresTypeLabel', 'PostgreSQL'); - case 'bigquery': + case 'big-query': return getLocString('integrationsBigQueryTypeLabel', 'BigQuery'); case 'snowflake': return getLocString('integrationsSnowflakeTypeLabel', 'Snowflake'); diff --git a/src/webviews/webview-side/integrations/IntegrationPanel.tsx b/src/webviews/webview-side/integrations/IntegrationPanel.tsx index 9d9cc88e46..fbcc43481f 100644 --- a/src/webviews/webview-side/integrations/IntegrationPanel.tsx +++ b/src/webviews/webview-side/integrations/IntegrationPanel.tsx @@ -3,7 +3,8 @@ import { IVsCodeApi } from '../react-common/postOffice'; import { getLocString, storeLocStrings } from '../react-common/locReactSide'; import { IntegrationList } from './IntegrationList'; import { ConfigurationForm } from './ConfigurationForm'; -import { IntegrationWithStatus, WebviewMessage, IntegrationConfig, IntegrationType } from './types'; +import { IntegrationWithStatus, WebviewMessage } from './types'; 
+import { DatabaseIntegrationConfig, DatabaseIntegrationType } from '@deepnote/database-integrations'; export interface IIntegrationPanelProps { baseTheme: string; @@ -13,9 +14,11 @@ export interface IIntegrationPanelProps { export const IntegrationPanel: React.FC = ({ baseTheme, vscodeApi }) => { const [integrations, setIntegrations] = React.useState([]); const [selectedIntegrationId, setSelectedIntegrationId] = React.useState(null); - const [selectedConfig, setSelectedConfig] = React.useState(null); - const [selectedIntegrationName, setSelectedIntegrationName] = React.useState(undefined); - const [selectedIntegrationType, setSelectedIntegrationType] = React.useState( + const [selectedConfig, setSelectedConfig] = React.useState(null); + const [selectedIntegrationDefaultName, setSelectedIntegrationDefaultName] = React.useState( + undefined + ); + const [selectedIntegrationType, setSelectedIntegrationType] = React.useState( undefined ); const [message, setMessage] = React.useState<{ type: 'success' | 'error'; text: string } | null>(null); @@ -55,7 +58,7 @@ export const IntegrationPanel: React.FC = ({ baseTheme, case 'showForm': setSelectedIntegrationId(msg.integrationId); setSelectedConfig(msg.config); - setSelectedIntegrationName(msg.integrationName); + setSelectedIntegrationDefaultName(msg.integrationName); setSelectedIntegrationType(msg.integrationType); break; @@ -123,7 +126,7 @@ export const IntegrationPanel: React.FC = ({ baseTheme, setConfirmDelete(null); }; - const handleSave = (config: IntegrationConfig) => { + const handleSave = (config: DatabaseIntegrationConfig) => { vscodeApi.postMessage({ type: 'save', integrationId: config.id, @@ -146,11 +149,11 @@ export const IntegrationPanel: React.FC = ({ baseTheme, - {selectedIntegrationId && ( + {selectedIntegrationId && selectedIntegrationType && ( { + const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})'); + + return { + id: params.id, + name: (params.name || format(unnamedIntegration, params.id)).trim(), + type: 'pgsql', + metadata: { + host: '', + port: '5432', + database: '', + user: '', + password: '', + sslEnabled: false + } + }; +} export interface IPostgresFormProps { integrationId: string; - existingConfig: PostgresIntegrationConfig | null; - integrationName?: string; - onSave: (config: PostgresIntegrationConfig) => void; + existingConfig: Extract | null; + defaultName?: string; + onSave: (config: Extract) => void; onCancel: () => void; } export const PostgresForm: React.FC = ({ integrationId, existingConfig, - integrationName, + defaultName, onSave, onCancel }) => { - const [name, setName] = React.useState(existingConfig?.name || integrationName || ''); - const [host, setHost] = React.useState(existingConfig?.host || ''); - const [port, setPort] = React.useState(existingConfig?.port?.toString() || '5432'); - const [database, setDatabase] = React.useState(existingConfig?.database || ''); - const [username, setUsername] = React.useState(existingConfig?.username || ''); - const [password, setPassword] = React.useState(existingConfig?.password || ''); - const [ssl, setSsl] = React.useState(existingConfig?.ssl || false); - - // Update form fields when existingConfig or integrationName changes + const [pendingConfig, setPendingConfig] = React.useState>( + existingConfig + ? 
structuredClone(existingConfig) + : createEmptyPostgresConfig({ id: integrationId, name: defaultName }) + ); + React.useEffect(() => { - if (existingConfig) { - setName(existingConfig.name || ''); - setHost(existingConfig.host || ''); - setPort(existingConfig.port?.toString() || '5432'); - setDatabase(existingConfig.database || ''); - setUsername(existingConfig.username || ''); - setPassword(existingConfig.password || ''); - setSsl(existingConfig.ssl || false); - } else { - setName(integrationName || ''); - setHost(''); - setPort('5432'); - setDatabase(''); - setUsername(''); - setPassword(''); - setSsl(false); - } - }, [existingConfig, integrationName]); + setPendingConfig( + existingConfig + ? structuredClone(existingConfig) + : createEmptyPostgresConfig({ id: integrationId, name: defaultName }) + ); + }, [existingConfig, integrationId, defaultName]); + + const handleNameChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + name: e.target.value + })); + }; + + const handleHostChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + host: e.target.value + } + })); + }; + + const handlePortChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + port: e.target.value + } + })); + }; + + const handleDatabaseChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + database: e.target.value + } + })); + }; + + const handleUsernameChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + user: e.target.value + } + })); + }; + + const handlePasswordChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + password: e.target.value + } + })); + }; + + const handleSslChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + sslEnabled: e.target.checked + } + })); + }; const handleSubmit = (e: React.FormEvent) => { e.preventDefault(); - - const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})'); - - const config: PostgresIntegrationConfig = { - id: integrationId, - name: (name || format(unnamedIntegration, integrationId)).trim(), - type: 'postgres', - host, - port: parseInt(port, 10), - database: database.trim(), - username: username.trim(), - password: password.trim(), - ssl - }; - - onSave(config); + onSave(pendingConfig); }; return ( @@ -73,8 +131,8 @@ export const PostgresForm: React.FC = ({ setName(e.target.value)} + value={pendingConfig.name} + onChange={handleNameChange} placeholder={getLocString('integrationsPostgresNamePlaceholder', 'My PostgreSQL Database')} autoComplete="off" /> @@ -88,8 +146,8 @@ export const PostgresForm: React.FC = ({ setHost(e.target.value)} + value={pendingConfig.metadata.host} + onChange={handleHostChange} placeholder={getLocString('integrationsPostgresHostPlaceholder', 'localhost')} autoComplete="off" required @@ -104,8 +162,8 @@ export const PostgresForm: React.FC = ({ setPort(e.target.value)} + value={pendingConfig.metadata.port} + onChange={handlePortChange} placeholder={getLocString('integrationsPostgresPortPlaceholder', '5432')} min={1} max={65535} @@ -123,8 +181,8 @@ export const PostgresForm: React.FC = ({ setDatabase(e.target.value)} + value={pendingConfig.metadata.database} + onChange={handleDatabaseChange} 
placeholder={getLocString('integrationsPostgresDatabasePlaceholder', 'mydb')} autoComplete="off" required @@ -139,8 +197,8 @@ export const PostgresForm: React.FC = ({ setUsername(e.target.value)} + value={pendingConfig.metadata.user} + onChange={handleUsernameChange} placeholder={getLocString('integrationsPostgresUsernamePlaceholder', 'postgres')} autoComplete="username" required @@ -155,8 +213,8 @@ export const PostgresForm: React.FC = ({ setPassword(e.target.value)} + value={pendingConfig.metadata.password} + onChange={handlePasswordChange} placeholder={getLocString('integrationsPostgresPasswordPlaceholder', '••••••••')} autoComplete="current-password" required @@ -165,7 +223,7 @@ export const PostgresForm: React.FC = ({ - setSsl(e.target.checked)} /> + {getLocString('integrationsPostgresSslLabel', 'Use SSL')} diff --git a/src/webviews/webview-side/integrations/SnowflakeForm.tsx b/src/webviews/webview-side/integrations/SnowflakeForm.tsx index 6aaf4c5cca..097d87e784 100644 --- a/src/webviews/webview-side/integrations/SnowflakeForm.tsx +++ b/src/webviews/webview-side/integrations/SnowflakeForm.tsx @@ -1,177 +1,241 @@ import * as React from 'react'; import { format, getLocString } from '../react-common/locReactSide'; -import { - SnowflakeIntegrationConfig, - SnowflakeAuthMethod, - SnowflakeAuthMethods, - isSupportedSnowflakeAuthMethod -} from './types'; +import { DatabaseIntegrationConfig, SnowflakeAuthMethods } from '@deepnote/database-integrations'; -export interface ISnowflakeFormProps { - integrationId: string; - existingConfig: SnowflakeIntegrationConfig | null; - integrationName?: string; - onSave: (config: SnowflakeIntegrationConfig) => void; - onCancel: () => void; +type SnowflakeConfig = Extract; +type SnowflakeAuthMethod = SnowflakeConfig['metadata']['authMethod']; + +const SUPPORTED_AUTH_METHODS = [SnowflakeAuthMethods.Password, SnowflakeAuthMethods.ServiceAccountKeyPair] as const; + +function isSupportedSnowflakeAuthMethod(authMethod: SnowflakeAuthMethod): boolean { + return (SUPPORTED_AUTH_METHODS as readonly SnowflakeAuthMethod[]).includes(authMethod); } -// Helper to get initial values from existing config -function getInitialValues(existingConfig: SnowflakeIntegrationConfig | null) { - if (!existingConfig) { - return { +function createEmptySnowflakeConfig(params: { id: string; name?: string }): SnowflakeConfig { + const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})'); + + return { + id: params.id, + name: (params.name || format(unnamedIntegration, params.id)).trim(), + type: 'snowflake', + metadata: { + authMethod: SnowflakeAuthMethods.Password, + accountName: '', username: '', - password: '', - privateKey: '', - privateKeyPassphrase: '' - }; - } - - // Type narrowing based on authMethod - // Note: existingConfig can have authMethod === null (legacy configs from backend) - if (existingConfig.authMethod === null || existingConfig.authMethod === SnowflakeAuthMethods.PASSWORD) { - return { - username: existingConfig.username || '', - password: existingConfig.password || '', - privateKey: '', - privateKeyPassphrase: '' - }; - } else if (existingConfig.authMethod === SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR) { - return { - username: existingConfig.username || '', - password: '', - privateKey: existingConfig.privateKey || '', - privateKeyPassphrase: existingConfig.privateKeyPassphrase || '' - }; - } else { - // Unsupported auth method - try to extract username if available - return { - username: 'username' in existingConfig ? 
String(existingConfig.username || '') : '', - password: '', - privateKey: '', - privateKeyPassphrase: '' - }; - } + password: '' + } + }; +} + +export interface ISnowflakeFormProps { + integrationId: string; + existingConfig: SnowflakeConfig | null; + defaultName?: string; + onSave: (config: SnowflakeConfig) => void; + onCancel: () => void; } export const SnowflakeForm: React.FC = ({ integrationId, existingConfig, - integrationName, + defaultName, onSave, onCancel }) => { - const isUnsupported = existingConfig ? !isSupportedSnowflakeAuthMethod(existingConfig.authMethod) : false; - const initialValues = getInitialValues(existingConfig); - - const [name, setName] = React.useState(existingConfig?.name || integrationName || ''); - const [account, setAccount] = React.useState(existingConfig?.account || ''); - const [authMethod, setAuthMethod] = React.useState( - existingConfig?.authMethod ?? SnowflakeAuthMethods.PASSWORD + const [pendingConfig, setPendingConfig] = React.useState( + existingConfig && isSupportedSnowflakeAuthMethod(existingConfig.metadata.authMethod) + ? structuredClone(existingConfig) + : createEmptySnowflakeConfig({ id: integrationId, name: defaultName }) ); - const [username, setUsername] = React.useState(initialValues.username); - const [password, setPassword] = React.useState(initialValues.password); - const [privateKey, setPrivateKey] = React.useState(initialValues.privateKey); - const [privateKeyPassphrase, setPrivateKeyPassphrase] = React.useState(initialValues.privateKeyPassphrase); - const [database, setDatabase] = React.useState(existingConfig?.database || ''); - const [warehouse, setWarehouse] = React.useState(existingConfig?.warehouse || ''); - const [role, setRole] = React.useState(existingConfig?.role || ''); - - // Update form fields when existingConfig or integrationName changes + React.useEffect(() => { - if (existingConfig) { - const values = getInitialValues(existingConfig); - setName(existingConfig.name || ''); - setAccount(existingConfig.account || ''); - setAuthMethod(existingConfig.authMethod ?? SnowflakeAuthMethods.PASSWORD); - setUsername(values.username); - setPassword(values.password); - setPrivateKey(values.privateKey); - setPrivateKeyPassphrase(values.privateKeyPassphrase); - setDatabase(existingConfig.database || ''); - setWarehouse(existingConfig.warehouse || ''); - setRole(existingConfig.role || ''); - } else { - setName(integrationName || ''); - setAccount(''); - setAuthMethod(SnowflakeAuthMethods.PASSWORD); - setUsername(''); - setPassword(''); - setPrivateKey(''); - setPrivateKeyPassphrase(''); - setDatabase(''); - setWarehouse(''); - setRole(''); - } - }, [existingConfig, integrationName]); + setPendingConfig( + existingConfig && isSupportedSnowflakeAuthMethod(existingConfig.metadata.authMethod) + ? structuredClone(existingConfig) + : createEmptySnowflakeConfig({ id: integrationId, name: defaultName }) + ); + }, [existingConfig, integrationId, defaultName]); - const handleSubmit = (e: React.FormEvent) => { - e.preventDefault(); + // Extract values for form fields with proper type narrowing + const usernameValue = + pendingConfig.metadata.authMethod === SnowflakeAuthMethods.Password || + pendingConfig.metadata.authMethod === SnowflakeAuthMethods.ServiceAccountKeyPair + ? pendingConfig.metadata.username + : ''; + + const passwordValue = + pendingConfig.metadata.authMethod === SnowflakeAuthMethods.Password ? 
pendingConfig.metadata.password : ''; + + const privateKeyValue = + pendingConfig.metadata.authMethod === SnowflakeAuthMethods.ServiceAccountKeyPair + ? pendingConfig.metadata.privateKey + : ''; + + const privateKeyPassphraseValue = + pendingConfig.metadata.authMethod === SnowflakeAuthMethods.ServiceAccountKeyPair + ? pendingConfig.metadata.privateKeyPassphrase || '' + : ''; + + const handleNameChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + name: e.target.value + })); + }; - const unnamedIntegration = getLocString('integrationsUnnamedIntegration', 'Unnamed Integration ({0})'); - - let config: SnowflakeIntegrationConfig; - - if (authMethod === SnowflakeAuthMethods.PASSWORD) { - config = { - id: integrationId, - name: (name || format(unnamedIntegration, integrationId)).trim(), - type: 'snowflake', - account: account.trim(), - authMethod: authMethod, - username: username.trim(), - password: password.trim(), - database: database.trim() || undefined, - warehouse: warehouse.trim() || undefined, - role: role.trim() || undefined - }; - } else if (authMethod === SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR) { - // Guard against empty private key - if (!privateKey.trim()) { - return; + const handleAccountChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + accountName: e.target.value } + })); + }; - config = { - id: integrationId, - name: (name || format(unnamedIntegration, integrationId)).trim(), - type: 'snowflake', - account: account.trim(), - authMethod: authMethod, - username: username.trim(), - privateKey: privateKey.trim(), - privateKeyPassphrase: privateKeyPassphrase.trim() || undefined, - database: database.trim() || undefined, - warehouse: warehouse.trim() || undefined, - role: role.trim() || undefined - }; - } else { - // This shouldn't happen as we disable the form for unsupported methods - return; - } + const handleAuthMethodChange = (e: React.ChangeEvent) => { + const newAuthMethod = e.target.value as SnowflakeAuthMethod; - onSave(config); + setPendingConfig((prev) => { + if (newAuthMethod === SnowflakeAuthMethods.Password) { + return { + ...prev, + metadata: { + authMethod: SnowflakeAuthMethods.Password, + accountName: prev.metadata.accountName, + username: '', + password: '', + warehouse: prev.metadata.warehouse, + database: prev.metadata.database, + role: prev.metadata.role + } + }; + } else { + return { + ...prev, + metadata: { + authMethod: SnowflakeAuthMethods.ServiceAccountKeyPair, + accountName: prev.metadata.accountName, + username: '', + privateKey: '', + warehouse: prev.metadata.warehouse, + database: prev.metadata.database, + role: prev.metadata.role + } + }; + } + }); + }; + + const handleUsernameChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => { + if ( + prev.metadata.authMethod === SnowflakeAuthMethods.Password || + prev.metadata.authMethod === SnowflakeAuthMethods.ServiceAccountKeyPair + ) { + return { + ...prev, + metadata: { + ...prev.metadata, + username: e.target.value + } + }; + } + return prev; + }); + }; + + const handlePasswordChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => { + if (prev.metadata.authMethod === SnowflakeAuthMethods.Password) { + return { + ...prev, + metadata: { + ...prev.metadata, + password: e.target.value + } + }; + } + return prev; + }); + }; + + const handlePrivateKeyChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => { + if (prev.metadata.authMethod === 
SnowflakeAuthMethods.ServiceAccountKeyPair) { + return { + ...prev, + metadata: { + ...prev.metadata, + privateKey: e.target.value + } + }; + } + return prev; + }); + }; + + const handlePrivateKeyPassphraseChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => { + if (prev.metadata.authMethod === SnowflakeAuthMethods.ServiceAccountKeyPair) { + return { + ...prev, + metadata: { + ...prev.metadata, + privateKeyPassphrase: e.target.value + } + }; + } + return prev; + }); + }; + + const handleDatabaseChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + database: e.target.value || undefined + } + })); + }; + + const handleWarehouseChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + warehouse: e.target.value || undefined + } + })); + }; + + const handleRoleChange = (e: React.ChangeEvent) => { + setPendingConfig((prev) => ({ + ...prev, + metadata: { + ...prev.metadata, + role: e.target.value || undefined + } + })); + }; + + const handleSubmit = (e: React.FormEvent) => { + e.preventDefault(); + onSave(pendingConfig); }; return ( - {isUnsupported && ( - - {getLocString( - 'integrationsSnowflakeUnsupportedAuthMethod', - 'This Snowflake integration uses an authentication method that is not supported in VS Code. You can view the integration details but cannot edit or use it.' - )} - - )} - - {getLocString('integrationsSnowflakeNameLabel', 'Integration name')} + {getLocString('integrationsSnowflakeNameLabel', 'Name (optional)')} setName(e.target.value)} - placeholder={getLocString('integrationsSnowflakeNamePlaceholder', '')} + value={pendingConfig.name} + onChange={handleNameChange} + placeholder={getLocString('integrationsSnowflakeNamePlaceholder', 'My Snowflake Database')} autoComplete="off" - disabled={isUnsupported} /> @@ -183,13 +247,12 @@ export const SnowflakeForm: React.FC = ({ setAccount(e.target.value)} + value={pendingConfig.metadata.accountName} + onChange={handleAccountChange} placeholder={getLocString('integrationsSnowflakeAccountPlaceholder', 'abcd.us-east-1')} autoComplete="off" required pattern=".*\S.*" - disabled={isUnsupported} /> @@ -197,138 +260,132 @@ export const SnowflakeForm: React.FC = ({ {getLocString('integrationsSnowflakeAuthMethodLabel', 'Authentication')} - setAuthMethod(e.target.value as SnowflakeAuthMethod)} - disabled={isUnsupported} - > - + + {getLocString('integrationsSnowflakeAuthMethodUsernamePassword', 'Username & password')} - + {getLocString('integrationsSnowflakeAuthMethodKeyPair', 'Key-pair (service account)')} - {!isUnsupported && - (authMethod === SnowflakeAuthMethods.PASSWORD ? 
( - <> - - - {getLocString('integrationsSnowflakeUsernameLabel', 'Username')}{' '} - {getLocString('integrationsRequiredField', '*')} - - setUsername(e.target.value)} - autoComplete="username" - required - pattern=".*\S.*" - /> - - - - - {getLocString('integrationsSnowflakePasswordLabel', 'Password')}{' '} - {getLocString('integrationsRequiredField', '*')} - - setPassword(e.target.value)} - placeholder={getLocString('integrationsSnowflakePasswordPlaceholder', '••••••••')} - autoComplete="current-password" - required - pattern=".*\S.*" - /> - - > - ) : ( - <> - - - {getLocString( - 'integrationsSnowflakeServiceAccountUsernameLabel', - 'Service Account Username' - )}{' '} - {getLocString('integrationsRequiredField', '*')} - - - {getLocString( - 'integrationsSnowflakeServiceAccountUsernameHelp', - 'The username of the service account that will be used to connect to Snowflake' - )} - - setUsername(e.target.value)} - autoComplete="username" - required - pattern=".*\S.*" - aria-describedby="username-help" - /> - - - - - {getLocString('integrationsSnowflakePrivateKeyLabel', 'Private Key')}{' '} - {getLocString('integrationsRequiredField', '*')} - - - {getLocString( - 'integrationsSnowflakePrivateKeyHelp', - 'The private key in PEM format. Make sure to include the entire key, including BEGIN and END markers.' - )} - - setPrivateKey(e.target.value)} - placeholder={getLocString( - 'integrationsSnowflakePrivateKeyPlaceholder', - "Begins with '-----BEGIN PRIVATE KEY-----'" - )} - rows={8} - autoComplete="off" - spellCheck={false} - autoCorrect="off" - autoCapitalize="off" - required - aria-describedby="privateKey-help" - /> - - - - - {getLocString( - 'integrationsSnowflakePrivateKeyPassphraseLabel', - 'Private Key Passphrase (optional)' - )} - - - {getLocString( - 'integrationsSnowflakePrivateKeyPassphraseHelp', - 'If the private key is encrypted, provide the passphrase to decrypt it' - )} - - setPrivateKeyPassphrase(e.target.value)} - autoComplete="off" - aria-describedby="privateKeyPassphrase-help" - /> - - > - ))} + {pendingConfig.metadata.authMethod === SnowflakeAuthMethods.Password ? ( + <> + + + {getLocString('integrationsSnowflakeUsernameLabel', 'Username')}{' '} + {getLocString('integrationsRequiredField', '*')} + + + + + + + {getLocString('integrationsSnowflakePasswordLabel', 'Password')}{' '} + {getLocString('integrationsRequiredField', '*')} + + + + > + ) : ( + <> + + + {getLocString( + 'integrationsSnowflakeServiceAccountUsernameLabel', + 'Service Account Username' + )}{' '} + {getLocString('integrationsRequiredField', '*')} + + + {getLocString( + 'integrationsSnowflakeServiceAccountUsernameHelp', + 'The username of the service account that will be used to connect to Snowflake' + )} + + + + + + + {getLocString('integrationsSnowflakePrivateKeyLabel', 'Private Key')}{' '} + {getLocString('integrationsRequiredField', '*')} + + + {getLocString( + 'integrationsSnowflakePrivateKeyHelp', + 'The private key in PEM format. Make sure to include the entire key, including BEGIN and END markers.' 
+ )} + + + + + + + {getLocString( + 'integrationsSnowflakePrivateKeyPassphraseLabel', + 'Private Key Passphrase (optional)' + )} + + + {getLocString( + 'integrationsSnowflakePrivateKeyPassphraseHelp', + 'If the private key is encrypted, provide the passphrase to decrypt it' + )} + + + + > + )} @@ -337,11 +394,10 @@ export const SnowflakeForm: React.FC = ({ setDatabase(e.target.value)} + value={pendingConfig.metadata.database || ''} + onChange={handleDatabaseChange} placeholder={getLocString('integrationsSnowflakeDatabasePlaceholder', '')} autoComplete="off" - disabled={isUnsupported} /> @@ -350,11 +406,10 @@ export const SnowflakeForm: React.FC = ({ setRole(e.target.value)} + value={pendingConfig.metadata.role || ''} + onChange={handleRoleChange} placeholder={getLocString('integrationsSnowflakeRolePlaceholder', '')} autoComplete="off" - disabled={isUnsupported} /> @@ -365,16 +420,15 @@ export const SnowflakeForm: React.FC = ({ setWarehouse(e.target.value)} + value={pendingConfig.metadata.warehouse || ''} + onChange={handleWarehouseChange} placeholder={getLocString('integrationsSnowflakeWarehousePlaceholder', '')} autoComplete="off" - disabled={isUnsupported} /> - + {getLocString('integrationsSave', 'Save')} diff --git a/src/webviews/webview-side/integrations/types.ts b/src/webviews/webview-side/integrations/types.ts index f5660e4b21..80ab12f2a3 100644 --- a/src/webviews/webview-side/integrations/types.ts +++ b/src/webviews/webview-side/integrations/types.ts @@ -1,96 +1,19 @@ -import { - type SnowflakeAuthMethod, - SnowflakeAuthMethods, - SUPPORTED_SNOWFLAKE_AUTH_METHODS, - isSupportedSnowflakeAuthMethod -} from '../../../platform/notebooks/deepnote/snowflakeAuthConstants'; - -export type IntegrationType = 'postgres' | 'bigquery' | 'snowflake'; +import { DatabaseIntegrationConfig, type DatabaseIntegrationType } from '@deepnote/database-integrations'; export type IntegrationStatus = 'connected' | 'disconnected' | 'error'; -// Re-export Snowflake auth constants for convenience -export { - type SnowflakeAuthMethod, - SnowflakeAuthMethods, - SUPPORTED_SNOWFLAKE_AUTH_METHODS, - isSupportedSnowflakeAuthMethod -}; - -export interface BaseIntegrationConfig { - id: string; - name: string; - type: IntegrationType; -} - -export interface PostgresIntegrationConfig extends BaseIntegrationConfig { - type: 'postgres'; - host: string; - port: number; - database: string; - username: string; - password: string; - ssl?: boolean; -} - -export interface BigQueryIntegrationConfig extends BaseIntegrationConfig { - type: 'bigquery'; - projectId: string; - credentials: string; -} - -/** - * Base Snowflake configuration with common fields - */ -interface BaseSnowflakeConfig extends BaseIntegrationConfig { - type: 'snowflake'; - account: string; - warehouse?: string; - database?: string; - role?: string; -} - -/** - * Snowflake integration configuration (discriminated union) - */ -export type SnowflakeIntegrationConfig = BaseSnowflakeConfig & - ( - | { - authMethod: typeof SnowflakeAuthMethods.PASSWORD | null; - username: string; - password: string; - } - | { - authMethod: typeof SnowflakeAuthMethods.SERVICE_ACCOUNT_KEY_PAIR; - username: string; - privateKey: string; - privateKeyPassphrase?: string; - } - | { - // Unsupported auth methods - we store them but don't allow editing - authMethod: - | typeof SnowflakeAuthMethods.OKTA - | typeof SnowflakeAuthMethods.NATIVE_SNOWFLAKE - | typeof SnowflakeAuthMethods.AZURE_AD - | typeof SnowflakeAuthMethods.KEY_PAIR; - [key: string]: unknown; // Allow any additional fields for 
unsupported methods - } - ); - -export type IntegrationConfig = PostgresIntegrationConfig | BigQueryIntegrationConfig | SnowflakeIntegrationConfig; - export interface IntegrationWithStatus { id: string; - config: IntegrationConfig | null; + config: DatabaseIntegrationConfig | null; status: IntegrationStatus; integrationName?: string; - integrationType?: IntegrationType; + integrationType?: DatabaseIntegrationType; } export interface IVsCodeMessage { type: string; integrationId?: string; - config?: IntegrationConfig; + config?: DatabaseIntegrationConfig; } export interface UpdateMessage { @@ -101,9 +24,9 @@ export interface UpdateMessage { export interface ShowFormMessage { type: 'showForm'; integrationId: string; - config: IntegrationConfig | null; + config: DatabaseIntegrationConfig | null; integrationName?: string; - integrationType?: IntegrationType; + integrationType?: DatabaseIntegrationType; } export interface StatusMessage {
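
The rewritten provider tests above assert a naming convention rather than exact payloads: credentials are injected as `SQL_` plus the integration ID uppercased with hyphens replaced by underscores, and the value is a JSON string that the new tests only guarantee carries a `url` connection string (the removed tests additionally checked `params` and `param_style`). For reference, the consumer-side lookup those assertions imply can be sketched as follows — this is a hedged illustration, not code from this diff; `readIntegrationCredentials` is a hypothetical helper, and any payload fields beyond `url` depend on the `@deepnote/database-integrations` version in use:

```typescript
// Hypothetical sketch: resolve the credentials injected for a given integration ID.
// Assumes the SQL_<ID> naming convention asserted by the tests in this diff.
function readIntegrationCredentials(integrationId: string): { url: string } | undefined {
    const envVarName = `SQL_${integrationId.toUpperCase().replace(/-/g, '_')}`;
    const raw = process.env[envVarName];
    if (!raw) {
        return undefined;
    }
    // Only `url` is guaranteed by the tests; other fields are library-defined.
    return JSON.parse(raw) as { url: string };
}
```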
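
All three rewritten forms (Postgres, BigQuery, Snowflake) share the same state pattern: a single `pendingConfig` object holding the full `DatabaseIntegrationConfig`, updated immutably per field, with discriminated-union narrowing on `metadata.authMethod` before touching variant-specific fields. A minimal sketch of that pattern, assuming the `Extract`-based aliases used in the forms above:

```typescript
import { DatabaseIntegrationConfig, SnowflakeAuthMethods } from '@deepnote/database-integrations';

// Mirrors the SnowflakeConfig alias in SnowflakeForm.tsx.
type SnowflakeConfig = Extract<DatabaseIntegrationConfig, { type: 'snowflake' }>;

// Only the password variant of the metadata union carries a `password` field,
// so narrow on authMethod before spreading; other variants are returned unchanged.
function withPassword(prev: SnowflakeConfig, password: string): SnowflakeConfig {
    if (prev.metadata.authMethod === SnowflakeAuthMethods.Password) {
        return { ...prev, metadata: { ...prev.metadata, password } };
    }
    return prev;
}
```

This is the same guard `handlePasswordChange` applies inside its `setPendingConfig` updater; an updater that spread a `password` onto the key-pair variant would not type-check against the union.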