Connect data sources to a Kavla canvas.
Receive SQL from the browser, execute it locally, and relay previews back.
Your credentials never leave your machine.
- Every Kavla canvas has its own isolated backend
- A CLI session connects to one canvas over a dedicated websocket
- When it connects, the CLI shares its configured sources with the canvas
- Only the CLI owner can execute queries against the CLI's sources
- SQL can now be sent from the canvas to the CLI
- Credentials are never sent; they stay local
- When the CLI is terminated, the connection drops instantly
- Preview results are stored on the canvas after the CLI disconnects
- Results follow the canvas access rules and must be explicitly shared
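The flow above amounts to a small message loop: the canvas sends a query over the websocket, the CLI executes it against the local source, and only a bounded preview is relayed back. A minimal sketch of that idea, assuming a hypothetical JSON message shape (`{"type": "query", "sql": ...}`) and using `sqlite3` as a stand-in for a configured source; the real kavla-cli protocol and message format may differ:

```python
import json
import sqlite3

PREVIEW_ROWS = 50  # cap relayed results to a small preview


def handle_message(raw: str, conn: sqlite3.Connection) -> str:
    """Execute a canvas query locally and return a preview payload.

    Credentials and the connection itself never leave this process;
    only column names and a bounded slice of rows are sent back.
    """
    msg = json.loads(raw)
    if msg.get("type") != "query":
        return json.dumps({"type": "error", "error": "unsupported message"})
    try:
        cur = conn.execute(msg["sql"])
        cols = [d[0] for d in cur.description]
        rows = cur.fetchmany(PREVIEW_ROWS)
        return json.dumps({"type": "preview", "columns": cols, "rows": rows})
    except sqlite3.Error as exc:
        return json.dumps({"type": "error", "error": str(exc)})


# Demo: an in-memory table standing in for a configured source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
reply = handle_message(json.dumps({"type": "query", "sql": "SELECT * FROM t"}), conn)
print(reply)
```

Because the reply carries only serialized rows, nothing sensitive (URIs, tokens, file paths) ever crosses the websocket.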
| Source | Type | Connection |
|---|---|---|
| DuckDB | duckdb | Path to an existing DuckDB file |
| Directory | directory | Path to a directory with CSV, Parquet, or JSON files |
| BigQuery | bigquery | Google Cloud project ID |
| Postgres | postgres | Postgres URI |
```sh
curl -fsSL https://raw.githubusercontent.com/aleda145/kavla-cli/main/install.sh | bash
```

```sh
kavla init
kavla config add-source
kavla login
kavla connect
```

Or connect directly:

```sh
kavla connect <room_id>
```

Config lives in `~/.kavla/config.yaml`. Tokens and source definitions stay on your machine.
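For orientation, a hypothetical shape for a source entry in `~/.kavla/config.yaml`; the actual schema is whatever `kavla config add-source` writes, and every key and value below is an assumption:

```yaml
# Illustrative only — the real config.yaml schema may differ.
sources:
  - name: analytics                      # hypothetical source name
    type: duckdb
    connection: /data/analytics.duckdb   # path to an existing DuckDB file
  - name: warehouse
    type: postgres
    connection: postgres://user:pass@localhost:5432/db
```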
