Simple project using Kafka to trigger REST calls to FastAPI (a Python REST framework) to make changes to a database.

Leenoose/FastAPI-Kafka-SQL-Example


Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Deploying on OpenShift
  5. License

About The Project

This is a simple project demonstrating a FastAPI server acting as both a Kafka consumer and producer. The server listens for messages on a specific topic and writes them to a PostgreSQL database as they arrive. A POST endpoint lets the user publish messages to topics.
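The consumer's core job reduces to deserializing each Kafka message and turning it into a database insert. A minimal sketch of that transform (the JSON keys, table name, and function name here are illustrative assumptions, not the repo's actual code):

```python
import json

def message_to_insert(raw: bytes, table: str = "messages"):
    """Turn a raw Kafka message body into an INSERT statement plus parameters.

    Assumes the message body is JSON with "topic" and "body" keys;
    the real schema may differ.
    """
    payload = json.loads(raw.decode("utf-8"))
    sql = f"INSERT INTO {table} (topic, body) VALUES (%s, %s)"
    params = (payload["topic"], payload["body"])
    return sql, params

sql, params = message_to_insert(b'{"topic": "events", "body": "hello"}')
print(params)  # → ('events', 'hello')
```

Keeping the transform separate from the consumer loop makes it easy to unit-test without a running Kafka broker.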

(back to top)

Built With

  • Python
  • FastAPI
  • PostgreSQL

(back to top)

Getting Started

To get a local copy up and running, follow these simple steps.

Prerequisites

You will need the following installed:

  • Ensure Python is installed and its version is greater than 3.7
      python --version
      Python 3.12.3

Installation

  1. Clone the repo
    git clone https://github.com/Leenoose/FastAPI-Kafka-SQL-Example.git
    cd FastAPI-Kafka-SQL-Example/
  3. Create a virtual environment and activate it for a clean starting environment
    python -m venv .venv
    source .venv/bin/activate
  3. Install required dependencies
    pip install -r requirements.txt
  4. Ensure you have PostgreSQL running. If not, use Podman or Docker to run an image.
    podman run -d --name <db-name> -p 5432:5432 -e POSTGRES_PASSWORD=<mypassword> -v /db-init-script.sql:/docker-entrypoint-initdb.d/init.sql postgres:latest
    # The command works the same with Docker. The -v flag mounts db-init-script.sql into the container, where it is run at initialization.
  5. Ensure that you have Kafka ZooKeeper and Kafka server instances running, with a topic created.
    # On your local machine or in a Kafka container
    bin/zookeeper-server-start.sh config/zookeeper.properties
    
    # In a second terminal on your local machine, or in another Kafka container
    bin/kafka-server-start.sh config/server.properties
    
    # In a third terminal, or in another Kafka container
    bin/kafka-topics.sh --create --topic <event-name> --bootstrap-server localhost:9092
  6. Run the FastAPI server
    uvicorn main:app
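Once running, the server needs to know where Kafka and PostgreSQL live. A common pattern, and the one the environment variables in the Deploying on OpenShift section suggest, is to read them from the environment with localhost defaults. A minimal sketch, assuming the variable names from that section (the default values here are assumptions):

```python
import os

# Read connection settings from the environment, falling back to localhost.
# Variable names come from the Deploying on OpenShift section; the
# defaults are illustrative assumptions.
def load_settings(env=os.environ):
    return {
        "kafka_bootstrap": env.get("KAFKA_HOSTNAME", "localhost:9092"),
        "db_host": env.get("DB_HOSTNAME", "localhost"),
        "db_user": env.get("DB_USER", "postgres"),
        "db_password": env.get("DB_PASSWORD", ""),
        "db_name": env.get("DB_NAME", "postgres"),
    }

print(load_settings({})["kafka_bootstrap"])  # → localhost:9092
```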

(back to top)

Usage

Access the Swagger UI via localhost:8000/docs (or whichever port you are using for this project) and run the /producer/{topicname} POST request.

After doing so, check your database. The message sent in the POST request should appear as a new entry in the database.
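The same request can be issued outside the Swagger UI. A sketch using the standard library (the topic name and payload shape are assumptions about the endpoint's request body; the request is built but not sent here, since sending requires the server to be up):

```python
import json
import urllib.request

topic = "events"  # illustrative topic name
req = urllib.request.Request(
    f"http://localhost:8000/producer/{topic}",
    data=json.dumps({"message": "hello"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Sending would be: urllib.request.urlopen(req) -- requires the server running.
print(req.get_method(), req.full_url)  # → POST http://localhost:8000/producer/events
```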

(back to top)

Deploying on OpenShift

To deploy on Red Hat OpenShift, follow these steps.

  1. Provision Kafka

To provision a Kafka cluster, install the AMQ Streams operator. Once it is installed, use its Provided APIs to create a Kafka cluster; name the cluster however you wish.

  2. Provision a PostgreSQL instance

Add a Database via the Developer Catalog and select PostgreSQL. When doing so, be sure to set a PostgreSQL Connection Username, a PostgreSQL Connection Password, and a PostgreSQL Database Name; these will be used in the environment variables below.

  3. Clone the project

Add the project to your OpenShift console using the git repo url (https://github.com/Leenoose/FastAPI-Kafka-SQL-Example.git). Use the default import strategy (Dockerfile) for this.

Note that the build for the project will fail at first. This is because the environment variables are not set yet, so the default values (localhost) are still being used, even though they do not apply here. To remedy this, go to Builds, find the build config named after this repo (if you did not change the name when adding the project), and edit it. There is a section to add environment variables; set the values as follows

Name            Value
KAFKA_HOSTNAME  <kafka-cluster-url>
DB_HOSTNAME     <postgres-cluster-url>
DB_USER         PostgreSQL Connection Username
DB_PASSWORD     PostgreSQL Connection Password
DB_NAME         PostgreSQL Database Name

For the HOSTNAME variables, refer to the hostname you have under Administrator > Networking > Services. Click into the individual services and copy the value under Service routing.
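The same environment variables can also be set from the command line with the oc client. A sketch, assuming the build config kept the repo's name (all values are placeholders; substitute your own service hostnames and credentials):

```shell
# Hypothetical build config name and placeholder values
oc set env bc/fastapi-kafka-sql-example \
  KAFKA_HOSTNAME=<kafka-cluster-url> \
  DB_HOSTNAME=<postgres-cluster-url> \
  DB_USER=<db-user> \
  DB_PASSWORD=<db-password> \
  DB_NAME=<db-name>

# Trigger a new build after updating the build config
oc start-build fastapi-kafka-sql-example
```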

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)
