thejasono/data-pipeline-airflow-api-postgres-dbt-powerbi
API → Postgres → dbt → Power BI

This stack loads raw data from the mock API into Postgres, transforms it with dbt, and exposes cleaned analytics tables (materialized in the public_analytics schema) for the bi_read user.

(Architecture diagram)

Services

  • docker – container runtime and packaging layer; provides consistent environments for Postgres, Airflow, dbt, and the mock API.
  • postgres – Postgres database shared by all components
  • mock-api – serves sample API data for extraction
  • dbt – dbt CLI container with the project mounted at /usr/app
  • airflow – orchestrates extraction and dbt transformations
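
The services above might be wired together roughly as in the following docker-compose.yml fragment. This is a minimal sketch: the service names and the /usr/app mount come from the list above, but the image tag, build contexts, port, and credentials are illustrative assumptions, not the project's actual configuration.

```yaml
services:
  postgres:
    image: postgres:16              # version is an assumption
    environment:
      POSTGRES_PASSWORD: example    # illustrative credential only
    ports:
      - "5432:5432"
  mock-api:
    build: ./mock-api               # hypothetical build context
  dbt:
    build: ./dbt                    # hypothetical build context
    volumes:
      - ./dbt:/usr/app              # project mounted at /usr/app, per the list above
    depends_on:
      - postgres
  airflow:
    build: ./airflow                # hypothetical build context
    depends_on:
      - postgres
      - mock-api
```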

Usage

Start the core services (dbt is invoked on-demand by the Airflow DAG):

docker compose up -d --build

Airflow performs its own database initialization on startup, so no separate init container is required.

The DAG will execute dbt via docker compose run --rm dbt ... after extracting raw data. You can still run dbt manually if needed:

docker compose run --rm dbt run
docker compose run --rm dbt test

Power BI usage for this project is intentionally lightweight: the goal is simply to prove that the pipeline delivers analytics tables Power BI can read. Connect with the bi_read credentials and verify that the tables in the public_analytics schema are visible. Neither a fully designed report nor synthetic visuals are required for sign-off.
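
Before pointing Power BI at the database, you can confirm the schema is reachable with a catalog query run as bi_read. This is a sketch; only the schema name and role come from this README:

```sql
-- List the analytics tables the bi_read user should be able to see
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public_analytics'
ORDER BY table_name;
```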
