Asynchronous event handling using microservices and Kafka with Go

bilalislam/order-fulfillment

Overview of Order Fulfillment Implementation

Techniques employed

  • Build a microservice.
  • Build an event publisher.
  • Build multiple event consumers.

Project outline

  1. Kafka Basics Using the Command Line
  2. Build a Basic Microservice and Kafka Event Publisher
  3. Build an Order Service and Publish a First Event
  4. Build an Inventory Consumer That Handles Order Received Events
  5. Build Notification, Warehouse, and Shipper Consumers
  6. Define and Use KPIs To Evaluate System Performance

  1. Two KPIs were defined and published to two new Kafka topics:
    1. A metric that tracks each time an order is placed and how many products were ordered. It exposes information that could be used to answer questions such as:
      • how the system is performing (e.g. 0 orders in an hour might indicate a larger problem)
      • what time of day customers tend to shop
      • how many orders arrive per second/minute/hour, etc.
      • the average number of products per order
    2. A metric that tracks the total time it takes to process an order, from when it was received by the warehouse to when it was picked and packed. It can be used to answer questions such as:
      • the average time it takes to pick and pack an order
      • what times of day are slower than others

How to Test?

I tested all of the code created in this milestone on my local machine, and the instructions below assume you are doing the same. I implemented this on a Mac, so command-line references use a UNIX shell.

  1. Kafka and ZooKeeper need to be running
    1. Start ZooKeeper, then start Kafka
    2. The OrderReceived topic should be created
    3. The OrderPickedAndPacked topic should be created
    4. The Notification topic should be created
    5. The DeadLetterQueue topic should be created
    6. A new topic OrderCount should be created
      $ $KAFKA_HOME/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic OrderCount
    7. A new topic OrderTime should be created
      $ $KAFKA_HOME/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic OrderTime
  2. The PostgreSQL database needs to be running
    1. The right database, schema, and table need to be created
  3. The Order service needs to be running (assumes you are in the /code folder)
    1. If this is the first time you are running this code, you will need to set up Go modules
      1. In your ~/.bash_profile, make sure you have the following ENV var set: export GO111MODULE=on, and make sure the file is sourced.
      2. Initialize Go modules
        $ go mod init
        $ go mod tidy
    2. Run the Order service
      $ go run order/main.go
  4. The Inventory consumer needs to be running (assumes you are in the /code folder)
    1. If this is the first time you are running this code, set up Go modules as described in step 3
    2. Run the Inventory consumer service
      $ go run inventory/main.go
  5. The Warehouse consumer needs to be running (assumes you are in the /code folder)
    1. If this is the first time you are running this code, set up Go modules as described in step 3
    2. Run the Warehouse consumer service
      $ go run warehouse/main.go
  6. The Notification consumer needs to be running (assumes you are in the /code folder)
    1. If this is the first time you are running this code, set up Go modules as described in step 3
    2. Run the Notification consumer service
      $ go run notification/main.go
  7. The Shipper consumer needs to be running (assumes you are in the /code folder)
    1. If this is the first time you are running this code, set up Go modules as described in step 3
    2. Run the Shipper consumer service
      $ go run shipper/main.go
  8. Send an HTTP request to the order service:
    $ curl -v -H "Content-Type: application/json" -d '{"id":"6e042f29-350b-4d51-8849-5e36456dfa48","products":[{"productCode":"12345","quantity":2}],"customer":{"firstName":"Tom","lastName":"Hardy","emailAddress":"tom.hardy@email.com","shippingAddress":{"line1":"123 Anywhere St","city":"Anytown","state":"AL","postalCode":"12345"}}}' http://localhost:8080/orders
  9. You should see output in the console of the order service, and no errors. You can also check the contents of the OrderReceived topic in Kafka.
    $ $KAFKA_HOME/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic OrderReceived --from-beginning
  10. You can also check the contents of the Notification topic in Kafka.
    $ $KAFKA_HOME/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic Notification --from-beginning
  11. You should see output in the console of the inventory consumer, and no errors.
  12. You should see output in the console of the warehouse consumer, and no errors.
  13. You should see output in the console of the notification consumer, and no errors.
  14. To check the shipping consumer, you will need to manually publish a message to the OrderPickedAndPacked topic in Kafka.
    $ $KAFKA_HOME/bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic OrderPickedAndPacked
    Then you can paste a payload like the following (it should correlate to the same order tested in the previous steps):
    {"EventBase":{"EventID":"4a651ef8-a851-4d77-a58b-3d8af748a570","EventTimestamp":"2020-08-16T16:03:05.258542-04:00"},"EventBody":{"id":"c6b37316-b4da-4b25-94c8-14c08bad95e6","products":[{"productCode":"12345","quantity":2}],"customer":{"firstName":"Tom","lastName":"Hardy","emailAddress":"tom.hardy@email.com","shippingAddress":{"line1":"123 Anywhere St","city":"Anytown","state":"AL","postalCode":"12345"}}}}
  15. You should see output in the console of the shipper consumer, and no errors.
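A shipper consumer receiving the message from step 14 can decode the envelope with the standard library. The struct fields below are inferred from the sample payload above, with the customer fields omitted for brevity; they are a sketch, not necessarily the project's actual types.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// EventBase mirrors the outer metadata of the sample payload above.
type EventBase struct {
	EventID        string    `json:"EventID"`
	EventTimestamp time.Time `json:"EventTimestamp"`
}

// Product and Order mirror the EventBody fields (customer omitted for brevity;
// unknown JSON fields are simply ignored by encoding/json).
type Product struct {
	ProductCode string `json:"productCode"`
	Quantity    int    `json:"quantity"`
}

type Order struct {
	ID       string    `json:"id"`
	Products []Product `json:"products"`
}

// OrderPickedAndPackedEvent is the full envelope the consumer decodes.
type OrderPickedAndPackedEvent struct {
	EventBase EventBase `json:"EventBase"`
	EventBody Order     `json:"EventBody"`
}

// decodeEvent unmarshals one raw message from the OrderPickedAndPacked topic.
func decodeEvent(raw []byte) (OrderPickedAndPackedEvent, error) {
	var evt OrderPickedAndPackedEvent
	err := json.Unmarshal(raw, &evt)
	return evt, err
}

func main() {
	raw := []byte(`{"EventBase":{"EventID":"4a651ef8-a851-4d77-a58b-3d8af748a570","EventTimestamp":"2020-08-16T16:03:05.258542-04:00"},"EventBody":{"id":"c6b37316-b4da-4b25-94c8-14c08bad95e6","products":[{"productCode":"12345","quantity":2}]}}`)
	evt, err := decodeEvent(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(evt.EventBase.EventID, evt.EventBody.ID, evt.EventBody.Products[0].Quantity)
}
```

Wrapping every event body in a shared EventBase (ID plus timestamp) is what lets consumers deduplicate and order events without caring about the body's type.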

Project Conclusions

After all milestones are complete, you will have a collection of microservices and consumers that can independently scale and evolve over time, are loosely coupled, and are highly cohesive. This outcome is made possible by the introduction of a distributed messaging system like Kafka, which adds fault tolerance and durable storage for the events the system relies on. Used together, these components create an asynchronous, event-driven architecture that makes your system adaptable, scalable, and resilient over time. You should be able to take the lessons learned here and apply the concepts to future projects that participate within a distributed system.

Review what you have learned and accomplished:

  1. Interacting with Kafka by using provided command-line tools
  2. Creating and altering Kafka topics
  3. Publishing events to, and consuming events from, Kafka topics through the command line and programmatically through Go
  4. Using asynchronous communication patterns across independent services and consumers
  5. Handling situations when events can’t be processed
  6. Evaluating performance of the system through the lens of the business by defining and publishing KPIs
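Point 5 corresponds to the DeadLetterQueue topic created earlier. The sketch below shows the general pattern with plain Go functions, leaving the Kafka client out entirely: a consumer that cannot process a message routes it to a dead-letter sink instead of crashing or silently dropping it. The function names and the failure condition are illustrative assumptions.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// handleMessage tries to process one consumed message; on failure it routes
// the raw message to the dead-letter sink (in the real project, a producer
// writing to the DeadLetterQueue topic) so the consumer can keep going.
func handleMessage(raw []byte, process func([]byte) error, deadLetter func([]byte)) {
	if err := process(raw); err != nil {
		deadLetter(raw)
	}
}

// processOrder is a stand-in processor: it fails on malformed JSON.
func processOrder(raw []byte) error {
	var body map[string]any
	return json.Unmarshal(raw, &body)
}

func main() {
	var dlq [][]byte
	sink := func(raw []byte) { dlq = append(dlq, raw) }

	handleMessage([]byte(`{"id":"ok"}`), processOrder, sink)
	handleMessage([]byte(`not json`), processOrder, sink)

	fmt.Println("dead-lettered messages:", len(dlq)) // only the malformed one
}
```

Parking failures on a separate topic preserves them for inspection and replay while keeping the main consumer loop moving.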

Useful Links

  1. https://nordicapis.com/using-api-analytics-to-empower-the-platform/
  2. https://api-umbrella.readthedocs.io/en/latest/getting-started.html
  3. https://github.com/ddosify/alaz?ea-publisher=readthedocs
  4. https://microservices.io/patterns/communication-style/idempotent-consumer.html
  5. https://kafka.apache.org/documentation/#producerconfigs_enable.idempotence
  6. https://lankydan.dev/running-kafka-locally-with-docker
