Marimo notebooks connected to BigQuery, running locally via Docker.
Prerequisites:

- Docker Desktop installed and running
- gcloud CLI installed
To start, run:

```sh
./run.sh
```

The script will:
- Prompt for your GCP project ID if not set
- Authenticate to GCP if needed (`gcloud auth application-default login`)
- Build the Docker image on first run
- Start Marimo and open it in your browser
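If you want to check your credential state before launching, here is a minimal pre-flight sketch. The ADC file path below is gcloud's default location for application-default credentials; this is an illustration of the kind of check the script can make, not the script's actual contents.

```shell
# Hypothetical pre-flight check; run.sh may implement this differently.
ADC="${HOME}/.config/gcloud/application_default_credentials.json"
if [ ! -f "$ADC" ]; then
  echo "No application-default credentials found; run:"
  echo "  gcloud auth application-default login"
fi
```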
Your notebooks are saved to the `./notebooks/` folder and persist between sessions.
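Persistence works by bind-mounting that folder into the container. A minimal `docker-compose.yml` sketch of this setup — the service name, container paths, and credential mount are assumptions, not necessarily what this repo's compose file contains:

```yaml
services:
  marimo:
    build: .
    ports:
      - "2718:2718"                 # marimo's default port
    environment:
      - GOOGLE_CLOUD_PROJECT
    volumes:
      - ./notebooks:/app/notebooks            # notebooks persist on the host
      - ~/.config/gcloud:/root/.config/gcloud:ro   # reuse host ADC credentials
```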
To stop:

```sh
docker compose down
```
Inside a notebook, you can query BigQuery like this:

```python
import os

import marimo as mo
from google.cloud import bigquery

# Client for the project configured by run.sh
client = bigquery.Client(project=os.environ["GOOGLE_CLOUD_PROJECT"])

# Run a query and load the result into a pandas DataFrame
df = client.query("SELECT * FROM `project.dataset.table` LIMIT 100").to_dataframe()

# Display the result as an interactive marimo table
mo.ui.table(df)
```

A working example is available at `notebooks/bigquery_example.py`.
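The client above fails with an opaque `KeyError` if `GOOGLE_CLOUD_PROJECT` is unset (e.g. when marimo is launched outside `run.sh`). A small guard can fail fast with a clearer message — the helper name here is hypothetical, not part of the repo:

```python
import os

def require_project() -> str:
    """Return the GCP project ID, raising a clear error if it is unset."""
    project = os.environ.get("GOOGLE_CLOUD_PROJECT")
    if not project:
        raise RuntimeError(
            "GOOGLE_CLOUD_PROJECT is not set; start via ./run.sh "
            "or export it before launching marimo."
        )
    return project
```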
To update:

```sh
git pull
docker compose build
docker compose up
```