MiniCRM: Local Deployment Guide
This repository contains the technical prototype of a lightweight CRM system built on a microservices architecture. It demonstrates data ingestion, audience segmentation, campaign execution, Kafka-based asynchronous processing, and AI-assisted features.
The system is composed of the following services:
- Data Ingestion Faker: Generates artificial `Customer` and `Order` data.
- MiniCRM Backend: Core backend handling REST APIs, campaigns, and Kafka producers/consumers.
- Vendor Faker: Simulates external delivery vendors with randomized outcomes.
- Frontend: User interface for campaign management, segmentation, and history tracking.
The MiniCRM project follows a microservices-driven event-based architecture, where services communicate asynchronously using Kafka as the backbone.
- MiniCRM Backend (`minicrm_backend`)
  - Exposes all REST endpoints (Customers, Orders, Campaigns, Users, Logs, Receipts, AI).
  - Only service directly integrated with Kafka.
  - Responsible for campaign orchestration and async updates.
- Data Ingestion Faker (`data_injestion_faker`)
  - Simulates customer and order data streams.
  - Publishes mock records to Kafka (`customer-topic`, `order-topic`).
- Vendor Backend (`vendor_faker`)
  - Emulates a third-party vendor delivering campaign communications.
  - Randomizes `success`/`fail` outcomes (10% failure chance).
  - Reports results back via the Receipts API.
- PostgreSQL Database
  - Central data store for Customers, Orders, Campaigns, Communication Logs, and Users.
  - Accessed through Prisma ORM.
- Kafka Topics
  - Backbone for inter-service async communication:
    - `customer-topic`: Customer ingestion.
    - `order-topic`: Order ingestion.
    - `campaign-deliveries`: Campaign execution batches.
    - `deliveries-receipt`: Vendor delivery outcomes.
The end-to-end data flows through the system as follows:
[Data Ingestion Faker] → (customer-topic / order-topic) → [MiniCRM Backend] → [Postgres DB]
- Customers & orders are published into Kafka.
- Backend consumers persist data into Postgres.
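The bulk-persist step can be sketched with a simple chunking helper that splits consumed records into fixed-size batches before insertion. The function name and the 500-record batch size are illustrative assumptions, not values from the codebase:

```javascript
// Split a list of consumed Kafka records into fixed-size chunks so each
// chunk can be written with one bulk insert (e.g. Prisma's createMany).
// The chunkSize default of 500 is illustrative.
function chunkRecords(records, chunkSize = 500) {
  const chunks = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    chunks.push(records.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: 1200 customer records become batches of 500, 500, and 200.
const batches = chunkRecords(new Array(1200).fill({ name: 'x' }));
// → batches.map(b => b.length) is [500, 500, 200]
```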
[Frontend] → (POST /campaigns) → [MiniCRM Backend]
- Request body includes `ruleGroup` (segmentation criteria).
- Backend:
  - Converts `ruleGroup` into a Prisma `where` clause.
  - AI module generates campaign intent.
  - Shortlisted customers are published to `campaign-deliveries`.
  - Campaign record created in DB.
  - API response returns campaign ticket.
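The `ruleGroup` conversion can be sketched as a small translator into a Prisma-style `where` object. The `{ logic, rules }` request shape and the operator names below are assumptions for illustration; the real request schema may differ:

```javascript
// Map rule operators to Prisma filter fragments (assumed operator set).
const OPS = {
  gt: (v) => ({ gt: v }),
  lt: (v) => ({ lt: v }),
  eq: (v) => v,
  contains: (v) => ({ contains: v }),
};

// Translate a segmentation ruleGroup into a Prisma `where` object.
// Prisma combines conditions via AND / OR arrays of clause objects.
function ruleGroupToWhere(ruleGroup) {
  const clauses = ruleGroup.rules.map((r) => ({ [r.field]: OPS[r.op](r.value) }));
  return ruleGroup.logic === 'OR' ? { OR: clauses } : { AND: clauses };
}

// e.g. customers who spent over 5000 or have visited fewer than 3 times:
const where = ruleGroupToWhere({
  logic: 'OR',
  rules: [
    { field: 'total_spend', op: 'gt', value: 5000 },
    { field: 'visit', op: 'lt', value: 3 },
  ],
});
// → { OR: [ { total_spend: { gt: 5000 } }, { visit: { lt: 3 } } ] }
```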
[campaign-deliveries Topic] → [Vendor Backend] → External Vendor (simulated)
- Vendor Faker consumes batches of campaign customers.
- For each message:
  - Simulates delivery (`success` or `fail`).
  - Sends POST `/receipts` to the MiniCRM Backend.
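The randomized outcome can be sketched as follows; the injectable `rng` parameter and the receipt field names are illustrative assumptions, chosen to keep the logic testable:

```javascript
// Simulate a delivery attempt: ~90% success, 10% fail, matching the
// failure chance described above. rng is injectable for deterministic tests.
function simulateDelivery(rng = Math.random) {
  return rng() < 0.9 ? 'success' : 'fail';
}

// Build the receipt payload the vendor would POST back to /receipts.
// Field names here are illustrative, not the project's actual schema.
function buildReceipt(logId, rng = Math.random) {
  return {
    logId,
    status: simulateDelivery(rng),
    delivered_at: new Date().toISOString(),
  };
}
```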
[Vendor Backend] → (POST /receipts) → [MiniCRM Backend] → (deliveries-receipt Topic)
- Receipts are quickly enqueued into Kafka (`deliveries-receipt`).
- An async consumer processes them:
  - Updates `Communication_log`.
  - Adjusts `sent_count`/`failed_count` in the Campaign.
  - On the last receipt, triggers the AI summarizer to finalize campaign insights.
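The counter bookkeeping can be sketched as a pure fold over a batch of receipts. The field names mirror the `Campaign` model; the exact consumer logic and completion check are assumptions based on the description above:

```javascript
// Fold a batch of receipts into campaign counter updates and report whether
// the campaign has finished (pending_count reaches zero), which is when the
// backend would trigger the AI summarizer.
function applyReceipts(campaign, receipts) {
  const updated = { ...campaign };
  for (const r of receipts) {
    if (r.status === 'success') updated.sent_count += 1;
    else updated.failed_count += 1;
    updated.pending_count -= 1;
  }
  return { campaign: updated, completed: updated.pending_count === 0 };
}
```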
- Async-first: Kafka buffers ensure resilient, decoupled services.
- Batch Processing: Consumers process records in chunks for efficiency.
- Fail-Safe Vendor Simulation: Vendor failures are randomized for realism.
- DB Consistency via Prisma: All campaign & communication logs are updated transactionally.
- AI Integration: Provides campaign intent detection & summary generation.
To run the system, 4 terminals are required.
```bash
cd data_injestion_faker
npm install
node index.js
```
- Comment/uncomment the appropriate start functions in `index.js` to switch between:
  - Customer Generator
  - Order Generator
```bash
cd vendor_faker
npm install
node index.js
```
- Runs on port 3500.
- Simulates vendor delivery attempts and reports results back to the backend via receipts.
Two terminals are needed:
```bash
cd minicrm_backend
npm install
node index.js
```

- Runs on port 3000.

```bash
node consumer.js
```

- Handles background processing for campaign deliveries and receipts.
Kafka is the backbone of this system, enabling asynchronous communication between microservices. The topics are initialized in `config/kafkaInit.js`.
- `customer-topic`: Carries validated customer data from the Data Ingestion Faker before bulk insertion into the database.
- `order-topic`: Carries validated order data from the Data Ingestion Faker before bulk insertion into the database.
- `campaign-deliveries`: Holds campaign-triggered customer batches that are queued for delivery via the Vendor Faker.
- `deliveries-receipt`: Receives vendor delivery outcomes (`success`/`fail`), which are then processed into the Communication Log and campaign status updates.
The system uses PostgreSQL as the primary database. Prisma ORM is used for schema management and queries.
The schema models represent the CRM domain: customers, orders, campaigns, communications, and users.
Represents an individual customer in the CRM.
```prisma
model Customer {
  id                String              @id @default(uuid())
  name              String
  email             String              @unique
  phone             String?             @unique
  city              String?
  total_spend       Float               @default(0)
  visit             Int                 @default(0)
  last_order_date   DateTime?
  createdAt         DateTime            @default(now())
  orders            Order[]
  communication_log Communication_log[]
}
```
Represents a customer order.
```prisma
model Order {
  id         String      @id @default(uuid())
  customer   Customer    @relation(fields: [customerId], references: [id], onDelete: Cascade)
  customerId String
  order_date DateTime    @default(now())
  amount     Float
  status     OrderStatus @default(PENDING)
  items      OrderItem[]
}
```
Represents a marketing campaign targeting specific customer segments.
```prisma
model Campaign {
  id                String              @id @default(uuid())
  name              String
  userId            String
  intent            String
  segment           Json
  summary           String?
  audience_size     Int
  message           String
  status            CampaignStatus      @default(RUNNING)
  pending_count     Int                 @default(0)
  sent_count        Int                 @default(0)
  failed_count      Int                 @default(0)
  createdAt         DateTime            @default(now())
  communication_log Communication_log[]
}
```
Tracks message delivery status for each customer in a campaign.
```prisma
model Communication_log {
  id               String         @id @default(uuid())
  campaign         Campaign       @relation(fields: [campaignId], references: [id], onDelete: Cascade)
  campaignId       String
  customer         Customer       @relation(fields: [customerId], references: [id], onDelete: Cascade)
  customerId       String
  customer_name    String
  email            String
  phone            String?
  personalized_msg String
  status           DeliveryStatus @default(PENDING)
  delivered_at     DateTime?
  createdAt        DateTime       @default(now())
}
```
Represents a CRM user authenticated via Google OAuth.
```prisma
model User {
  id        String   @id @default(dbgenerated("concat('user_', replace(cast(gen_random_uuid() as text), '-', ''))"))
  googleId  String?  @unique
  email     String?  @unique
  name      String?
  avatar    String?
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
}
```
- Kafka & Zookeeper must be running locally before starting any services.
- PostgreSQL must be installed and properly configured as the primary database.
- All inter-service communication related to customers, orders, and campaign execution happens through Kafka topics.
- Consumers are designed for batch processing and asynchronous updates, ensuring high throughput.
- Use `consumer.js` in the backend to listen for and process messages from the above topics.