A collection of sample Databricks Apps demonstrating different frameworks, integrations, and use cases for building applications on the Databricks platform.
Location: holiday_request_app/
Framework: Streamlit
Database: Databricks Lakebase (PostgreSQL)
A full-stack holiday request management system showcasing the integration between Databricks Lakebase and Databricks Apps. Features include:
- Employee holiday request submission and tracking
- Manager approval/decline workflow with comments
- Real-time status updates with color-coded interface (🟡 Pending, 🟢 Approved, 🔴 Declined)
- OAuth-based authentication with automatic token management
- Radio button selection interface for intuitive request management
Key Technologies: SQLAlchemy, psycopg, OAuth tokens, session state management
Notable Features: Demonstrates proper PostgreSQL connection patterns with OAuth token injection, comprehensive error handling, and enterprise-grade security through Unity Catalog integration.
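As a rough illustration of the submission flow, a Streamlit form can write a new request through SQLAlchemy. This is a minimal sketch: the table and column names are hypothetical rather than taken from the app, and it assumes an engine configured as shown in the Authentication section later in this README.
# Minimal sketch of a request-submission form; table and column names are hypothetical.
import streamlit as st
from sqlalchemy import text

with st.form("holiday_request"):
    start = st.date_input("Start date")
    end = st.date_input("End date")
    reason = st.text_input("Reason")
    if st.form_submit_button("Submit request"):
        with engine.begin() as conn:  # assumes the OAuth-configured SQLAlchemy engine shown below
            conn.execute(
                text("INSERT INTO holiday_requests (start_date, end_date, reason, status) "
                     "VALUES (:s, :e, :r, 'Pending')"),
                {"s": start, "e": end, "r": reason},
            )
        st.success("Request submitted")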
Location: data_ui_app/
Framework: Streamlit
Purpose: Data Analysis & Visualization
An intelligent data analysis application that automatically transforms uploaded data files into interactive dashboards. Features include:
- Automatic data type detection (CSV, JSON, text)
- Smart visualizations with histograms, bar charts, and correlation heatmaps
- Interactive data explorer with filtering, sorting, and pagination
- Comprehensive statistics with data profiling and quality metrics
- Entity extraction from unstructured text (emails, phones, dates, currency)
- Export functionality (CSV, JSON, text summaries)
Key Technologies: Streamlit, Plotly, Pandas, NumPy, regex pattern matching
Notable Features: Zero-configuration data analysis with automatic chart generation, intelligent column type detection, and built-in sample datasets for demonstration.
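Much of the app's value comes from inspecting column dtypes and generating charts automatically; a simplified sketch of that idea (the function name is illustrative, not taken from the app):
# Simplified sketch of automatic chart generation: every numeric column gets a histogram.
import pandas as pd
import plotly.express as px
import streamlit as st

def render_auto_charts(df: pd.DataFrame) -> None:
    numeric_cols = df.select_dtypes(include="number").columns
    for col in numeric_cols:
        st.plotly_chart(px.histogram(df, x=col, title=f"Distribution of {col}"))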
Location: chatbotcuj_app/
Framework: Gradio
Integration: Model Serving Endpoints
A conversational AI chatbot demonstrating integration with Databricks Model Serving endpoints. Features include:
- Chat interface with conversation history
- LLM endpoint integration with configurable parameters
- OpenAI-style message formatting for compatibility
- Error handling and logging for production reliability
- Environment-based configuration through app.yaml
Key Technologies: Gradio ChatInterface, Databricks Model Serving, OpenAI message format
Notable Features: Shows how to build production-ready chatbots with proper error handling, request logging, and seamless integration with Databricks serving infrastructure.
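The core of the integration is a query against a Model Serving endpoint using OpenAI-style messages; a hedged sketch using the Databricks SDK (the endpoint name is a placeholder, normally supplied through app.yaml environment variables):
# Illustrative Model Serving call; the endpoint name is a placeholder.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ChatMessage, ChatMessageRole

w = WorkspaceClient()
response = w.serving_endpoints.query(
    name="my-llm-endpoint",  # placeholder; read from an environment variable in the real app
    messages=[ChatMessage(role=ChatMessageRole.USER, content="Hello!")],
    max_tokens=256,
)
print(response.choices[0].message.content)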
Location: simple_streamlit_app/
Framework: Streamlit
Purpose: Basic demonstration
A minimal Streamlit application demonstrating basic Databricks Apps functionality. Features include:
- Simple button interactions with immediate feedback
- Session state management for persistent interactions
- Clean, centered page layout with modern UI elements
- Multiple interaction patterns (immediate and persistent buttons)
Key Technologies: Streamlit core components, session state
Notable Features: A perfect starting point for new Databricks Apps; demonstrates fundamental Streamlit patterns and session management.
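The persistent-interaction pattern boils down to Streamlit's session state; a minimal sketch:
# Minimal session-state pattern: the counter survives reruns triggered by new interactions.
import streamlit as st

if "clicks" not in st.session_state:
    st.session_state.clicks = 0
if st.button("Click me"):
    st.session_state.clicks += 1
st.write(f"Button clicked {st.session_state.clicks} times")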
Location: simple_lakebase_app/
Framework: Streamlit
Database: Databricks Lakebase (PostgreSQL)
A basic database connectivity demonstration showing how to connect Streamlit apps to Lakebase. Features include:
- Direct PostgreSQL connection using OAuth authentication
- Database introspection with connection details display
- Token presence verification for debugging
- Simple data retrieval from the holiday requests table
Key Technologies: SQLAlchemy, psycopg, OAuth tokens, PostgreSQL
Notable Features: Minimal example of Lakebase integration, useful for understanding basic database connection patterns and debugging connectivity issues.
All apps use OAuth token-based authentication through the Databricks SDK:
from databricks.sdk import WorkspaceClient
workspace_client = WorkspaceClient()
token = workspace_client.config.oauth_token().access_token
PostgreSQL connections use automatic token injection:
@event.listens_for(engine, "do_connect")
def provide_token(dialect, conn_rec, cargs, cparams):
    cparams["password"] = workspace_client.config.oauth_token().access_token
Apps use app.yaml for runtime configuration (a sample appears after the list below):
- Port specification (typically 8000 for Streamlit)
- Server address binding (0.0.0.0 for proper networking)
- Environment variables for service endpoints
- Resource declarations (database instances, model endpoints)
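A minimal, illustrative app.yaml; the command and environment values are placeholders, not copied from any app in this repository:
# Illustrative app.yaml; command and environment values are placeholders.
command: ["streamlit", "run", "app.py", "--server.port", "8000", "--server.address", "0.0.0.0"]
env:
  - name: "SERVING_ENDPOINT"
    value: "my-llm-endpoint"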
All apps in this project use uv for fast and reliable Python package management. uv provides:
- Ultra-fast dependency resolution - 10-100x faster than pip
- Deterministic installs with lock files
- Virtual environment management built-in
- Compatible with pip and existing Python workflows
# macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
# With pip (if you already have Python)
pip install uv
For more installation options, see the uv installation guide.
- Databricks workspace with Apps enabled
- Unity Catalog configured
- Lakebase instance (for database apps)
- Model Serving endpoint (for chatbot app)
- uv for dependency management
- Create a new Databricks App in your workspace
- Configure resources (database, model endpoints) as needed
- Upload application code or clone from a repository
- Deploy with one click through the Databricks interface
Each app includes development setup instructions using uv:
# Initialize project and create virtual environment
uv init
# Install dependencies from requirements.txt
uv add -r requirements.txt
# Activate virtual environment
source .venv/bin/activate
# Or run directly with uv (recommended)
uv run streamlit run app.py --server.port=8000
For more detailed uv usage, see the uv getting started guide.
- Serverless runtime - No infrastructure management required
- Built-in authentication - OAuth integration with workspace identity
- Unity Catalog integration - Seamless governance and security
- One-click deployment - Simplified CI/CD workflows
- Fully managed PostgreSQL - No database administration overhead
- Low-latency operations - Optimized for transactional workloads
- Lakehouse integration - Native connectivity to Delta Lake
- Built-in security - Automatic encryption and access controls
- Use type hints for better code maintainability
- Implement comprehensive error handling with user-friendly messages
- Follow PEP 8 style guidelines (use Black for formatting)
- Include docstrings for complex functions
- Never hardcode credentials - Use OAuth tokens and environment variables
- Validate user inputs to prevent SQL injection
- Use parameterized queries with SQLAlchemy (see the sketch after this list)
- Implement proper access controls through Unity Catalog
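For example, a parameterized lookup keeps user input out of the SQL string entirely; this is an illustrative sketch, and the table, column, and variable names are not taken from any specific app:
# Parameterized query: user input is bound as a parameter, never interpolated into the SQL text.
from sqlalchemy import text

with engine.connect() as conn:  # assumes the OAuth-configured engine from the Authentication section
    rows = conn.execute(
        text("SELECT * FROM holiday_requests WHERE employee_email = :email"),
        {"email": user_email},  # user_email is whatever value came from the UI
    ).fetchall()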
- Use connection pooling for database operations
- Implement caching for expensive operations (see the sketch after this list)
- Monitor resource usage through Databricks observability
- Optimize queries and data retrieval patterns
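In the Streamlit apps, expensive reads can be cached with st.cache_data; a minimal sketch (the query and TTL are illustrative):
# Cache an expensive query result for ten minutes so every rerun doesn't hit the database.
import pandas as pd
import streamlit as st

@st.cache_data(ttl=600)
def load_requests() -> pd.DataFrame:
    with engine.connect() as conn:  # assumes the OAuth-configured engine from the Authentication section
        return pd.read_sql("SELECT * FROM holiday_requests", conn)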
When adding new apps to this collection:
- Follow naming conventions - Use descriptive directory names
- Include comprehensive README - Document purpose, setup, and key features
- Provide sample data - Include test datasets where applicable
- Add proper error handling - Ensure production-ready code quality
- Update this main README - Add your app to the overview section
- Databricks Apps Documentation
- Databricks Lakebase Guide
- Model Serving Documentation
- Unity Catalog Security
- uv Documentation - Fast Python package and project manager
- uv Getting Started - Installation and basic usage
- uv User Guide - Comprehensive tutorials and guides
Built with ❤️ on the Databricks platform
Each app in this collection demonstrates different aspects of building modern data applications with integrated security, governance, and deployment capabilities.