Kafka

| Key | Value |
| --- | --- |
| Course Code | BBT 4106 |
| Course Name | BBT 4106: Business Intelligence I (Week 4-6) |
| Semester | April to July 2026 |
| Lecturer | Allan Omondi |
| Contact | aomondi@strathmore.edu |
| Note | The lecture contains both theory and practice. This notebook forms part of the practice. It is intended for educational purposes only. |
| Recommended Citation | BibTex |

Technology Stack

System Architecture

Repository Structure

.
├── 0_admin_instructions
│   ├── 0_instructions_for_project_setup.md
│   ├── 1_instructions_for_python_installation.md
│   └── 2_instructions_for_project_cleanup.md
├── 1_kafka_fundamentals
│   ├── consumer_order_inventory.py
│   ├── consumer_order_notification.py
│   ├── docker-compose.yaml
│   ├── instructions_for_project_setup.md
│   ├── producer_order.py
│   ├── project_cleanup.sh
│   ├── project_setup.sh
│   └── requirements.txt
├── 2_containerized_microservices
│   ├── consumer-inventory
│   │   ├── Dockerfile.consumer-inventory
│   │   ├── consumer_order_inventory.py
│   │   ├── models.py
│   │   └── requirements.txt
│   ├── consumer-notification
│   │   ├── Dockerfile.consumer-notification
│   │   ├── consumer_order_notification.py
│   │   └── requirements.txt
│   ├── database
│   │   └── init.sql
│   ├── docker-compose.yaml
│   ├── instructions_for_project_setup.md
│   ├── producer
│   │   ├── Dockerfile.producer
│   │   ├── producer_order.py
│   │   └── requirements.txt
│   ├── project_cleanup.sh
│   └── project_setup.sh
├── 3_data_engineering
│   ├── clickhouse
│   │   ├── config.d
│   │   │   └── timezone.xml
│   │   └── init.sql
│   ├── consumer-inventory
│   │   ├── Dockerfile.consumer-inventory
│   │   ├── consumer_order_inventory.py
│   │   ├── models.py
│   │   └── requirements.txt
│   ├── consumer-notification
│   │   ├── Dockerfile.consumer-notification
│   │   ├── consumer_order_notification.py
│   │   └── requirements.txt
│   ├── database
│   │   └── init.sql
│   ├── docker-compose.yaml
│   ├── instructions_for_project_setup.md
│   ├── kafka-connect
│   │   ├── connector-config.json
│   │   ├── connector-config.json_documented_version.md
│   │   └── register-connector.sh
│   ├── producer
│   │   ├── Dockerfile.producer
│   │   ├── producer_order.py
│   │   └── requirements.txt
│   ├── project_cleanup.sh
│   ├── project_setup.sh
│   └── transformer
│       ├── Dockerfile.transformer
│       ├── requirements.txt
│       └── transformer.py
├── 4_data_analytics
│   ├── connect_clickhouse_with_ODBC.R
│   ├── generate_data.py
│   ├── instructions_for_project_setup.md
│   ├── lab4_analytics_with_odbc.Rmd
│   └── lab4_analytics_with_odbc.nb.html
├── Kafka.Rproj
├── LICENSE
├── README.md
├── assets
│   └── images
│       ├── DataGrip_Output.png
│       └── SystemArchitecture.png
├── lab_submission_instructions.md
├── project_cleanup.sh
└── requirements
    ├── base.txt
    ├── colab.txt
    ├── constraints.txt
    ├── dev.inferred.txt
    ├── dev.lock.txt
    ├── dev.txt
    └── prod.txt

21 directories, 68 files
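The `producer_order.py` scripts in parts 1–3 publish order events to a Kafka topic. A minimal sketch of the event-building and serialization side of such a producer is shown below; the field names, the `orders` topic, and the use of the kafka-python client are illustrative assumptions, not taken from the repository's actual code:

```python
import json
import uuid
from datetime import datetime, timezone

def build_order_event(customer_id: str, item: str, quantity: int) -> dict:
    """Assemble an order event; field names are illustrative assumptions."""
    return {
        "order_id": str(uuid.uuid4()),
        "customer_id": customer_id,
        "item": item,
        "quantity": quantity,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }

def serialize_event(event: dict) -> bytes:
    """Kafka message values are bytes, so JSON-encode the event."""
    return json.dumps(event).encode("utf-8")

event = build_order_event("cust-42", "laptop", 2)
payload = serialize_event(event)

# Against a running broker, a kafka-python producer would send the payload:
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# producer.send("orders", value=payload)
# producer.flush()
```

Keeping event construction and serialization in plain functions like this makes them testable without a broker; only the final `send` call needs Kafka running.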

Setup Instructions

Lab Manual

Refer to the files below, in the order specified, for more details:

  1. Part 1: Kafka Fundamentals
  2. Part 2: Containerized Microservices
  3. Part 3: Data Engineering using Kafka
  4. Part 4: Data Analytics using R and ClickHouse
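Part 2's `consumer-inventory` service reacts to order events by adjusting stock. As a rough sketch of that consumption logic (the event fields and the in-memory stock mapping are illustrative assumptions; the real service persists state to PostgreSQL via `models.py`):

```python
def apply_order(stock: dict, event: dict) -> dict:
    """Decrement stock for the ordered item, refusing to go below zero.
    Returns the updated stock mapping."""
    item = event["item"]
    available = stock.get(item, 0)
    if available < event["quantity"]:
        raise ValueError(f"insufficient stock for {item}")
    stock[item] = available - event["quantity"]
    return stock

# Illustrative usage: two items in stock, one order consumed.
stock = {"laptop": 10, "phone": 5}
stock = apply_order(stock, {"item": "laptop", "quantity": 2})
```

In a real consumer loop this function would be called once per message polled from the topic, committing the offset only after the stock update succeeds.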

Lab Submission Instructions

Cleanup Instructions (to be done after submitting the lab)

About

How to create a data pipeline to move data from a relational DB (PostgreSQL) to a data warehouse (ClickHouse) using Kafka with KRaft and Debezium Change Data Capture (CDC).
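Part 3 wires this together by registering a Debezium PostgreSQL source connector with Kafka Connect (see `kafka-connect/connector-config.json` and `register-connector.sh`). A sketch of what such a configuration commonly contains is below; the hostname, credentials, database, and table names are placeholders, not the repository's actual values:

```python
import json

# Hypothetical Debezium PostgreSQL connector configuration; the real values
# live in kafka-connect/connector-config.json.
connector_config = {
    "name": "orders-connector",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "postgres",
        "database.port": "5432",
        "database.user": "postgres",
        "database.password": "example",
        "database.dbname": "orders_db",
        "topic.prefix": "cdc",
        "table.include.list": "public.orders",
        "plugin.name": "pgoutput",
    },
}

# A registration script would POST this JSON to the Kafka Connect REST API,
# e.g. curl -X POST http://localhost:8083/connectors -d @connector-config.json
payload = json.dumps(connector_config)
```

With this in place, Debezium streams row-level changes from the included tables into Kafka topics prefixed by `topic.prefix`, from where they can be transformed and loaded into ClickHouse.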
