Real-time mainframe event processing with Confluent Kafka and Google Gemini AI.
## Prerequisites

- Node.js 16+
- Confluent Cloud account (or local Kafka)
- Google Gemini API key
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/mainframe-pulse-rt.git
  cd mainframe-pulse-rt
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Configure environment variables:

  ```bash
  cp .env.example .env
  ```

  Edit `.env` and add your credentials:

  ```env
  # Gemini AI Configuration (REQUIRED)
  GEMINI_API_KEY=your_gemini_api_key_here

  # Confluent Cloud Kafka Configuration (REQUIRED)
  KAFKA_BOOTSTRAP=your-kafka-bootstrap-server:9092
  KAFKA_TOPIC=mainframe-events
  KAFKA_GROUP_ID=mainframe-pulse-consumer
  KAFKA_API_KEY=your_confluent_api_key
  KAFKA_API_SECRET=your_confluent_api_secret

  # Server Configuration (OPTIONAL)
  PORT=3000
  ```

- Start the server:

  ```bash
  npm start
  ```

- Open the dashboard: navigate to `http://localhost:3000/dashboard`
## Getting a Gemini API Key

- Go to Google AI Studio
- Create a new API key
- Copy the key to your `.env` file
## Setting Up Confluent Cloud

- Sign up at Confluent Cloud
- Create a cluster and topic
- Generate API key and secret
- Copy credentials to your `.env` file (the consumer connects with them as sketched below)
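For reference, here is a minimal sketch of connecting a KafkaJS consumer to Confluent Cloud with these credentials. The variable names follow the `.env` template above; the project's own consumer wiring may differ.

```js
// consumer-check.js - hypothetical connectivity check, not part of the project
const { Kafka, logLevel } = require("kafkajs");

const kafka = new Kafka({
  clientId: "mainframe-pulse-rt",
  brokers: [process.env.KAFKA_BOOTSTRAP],
  ssl: true, // Confluent Cloud requires TLS
  sasl: {
    mechanism: "plain", // Confluent Cloud API keys authenticate via SASL/PLAIN
    username: process.env.KAFKA_API_KEY,
    password: process.env.KAFKA_API_SECRET,
  },
  logLevel: logLevel.ERROR,
});

async function run() {
  const consumer = kafka.consumer({ groupId: process.env.KAFKA_GROUP_ID });
  await consumer.connect();
  await consumer.subscribe({ topic: process.env.KAFKA_TOPIC, fromBeginning: false });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}]`, message.value.toString());
    },
  });
}

run().catch(console.error);
```

Confluent Cloud API keys map to SASL/PLAIN over TLS, which is why `ssl: true` and `mechanism: "plain"` appear together.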
## API Endpoints

- `GET /` - Landing page
- `GET /dashboard` - Real-time dashboard
- `GET /health` - Health check
- `GET /events/latest` - Get latest processed events with AI insights
- `POST /events/simulate` - Generate sample events for testing
- `POST /events/startStream` - Start continuous event generation
- `POST /events/stopStream` - Stop event generation
- `GET /events/stats` - Get processing statistics
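A quick way to exercise these endpoints is a small Node.js smoke test. This sketch assumes Node 18+ for the built-in `fetch` and a server running locally on port 3000; the logged response shapes depend on the server.

```js
// Hypothetical smoke test against a locally running instance.
const BASE = "http://localhost:3000";

async function main() {
  // Generate a batch of sample events, then read back the processed results.
  await fetch(`${BASE}/events/simulate`, { method: "POST" });

  const health = await (await fetch(`${BASE}/health`)).json();
  console.log("health:", health);

  const latest = await (await fetch(`${BASE}/events/latest`)).json();
  console.log("latest events with AI insights:", JSON.stringify(latest, null, 2));

  const stats = await (await fetch(`${BASE}/events/stats`)).json();
  console.log("processing stats:", stats);
}

main().catch(console.error);
```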
## Features

- Real-time Event Processing: Live mainframe event analysis
- AI-Powered Insights: Google Gemini analyzes events for ABEND codes, causes, and fixes (see the sketch after this list)
- Interactive Dashboard: Real-time charts, tables, and event details
- Kafka Integration: Confluent Cloud streaming with SSL/SASL authentication
- Event Simulation: Generate realistic mainframe events for testing
- Time Filtering: View events by time windows (30s, 5m, all)
- Event Details: Click any event for comprehensive analysis
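As an illustration of the AI analysis step, the sketch below sends one event to Gemini with the official `@google/generative-ai` SDK. The prompt wording and the `gemini-pro` model name are assumptions, not necessarily what the project uses.

```js
const { GoogleGenerativeAI } = require("@google/generative-ai");

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
const model = genAI.getGenerativeModel({ model: "gemini-pro" });

// Ask Gemini to explain an ABEND and suggest a fix for a single event.
async function analyzeEvent(event) {
  const prompt =
    `Analyze this mainframe event and explain the ABEND code, ` +
    `its likely cause, and a suggested fix:\n${JSON.stringify(event, null, 2)}`;

  const result = await model.generateContent(prompt);
  return result.response.text(); // plain-text insight to attach to the event
}

analyzeEvent({
  type: "SMF30",
  job_name: "PAYROLL0123",
  abend_code: "S0C4",
  severity: "high",
}).then(console.log).catch(console.error);
```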
## Tech Stack

- Backend: Node.js + Express
- Streaming: Confluent Kafka (KafkaJS)
- AI Analysis: Google Gemini Pro
- Frontend: Vanilla JavaScript with real-time updates
- Storage: In-memory (last 200 events; see the sketch below)
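The in-memory cap can be as simple as a bounded array. A sketch of that idea (illustrative only, not the project's actual storage code):

```js
// Simple bounded buffer: keeps only the most recent MAX_EVENTS entries.
const MAX_EVENTS = 200;
const events = [];

function addEvent(event) {
  events.push(event);
  if (events.length > MAX_EVENTS) {
    events.shift(); // drop the oldest event once the cap is exceeded
  }
}

function latestEvents(limit = 50) {
  // Newest first, without mutating the underlying buffer.
  return events.slice(-limit).reverse();
}

module.exports = { addEvent, latestEvents };
```

At a cap of 200 entries, `shift()` is cheap; a ring buffer would only matter at much larger caps.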
## Testing

Test event simulation:

```bash
curl -X POST http://localhost:3000/events/simulate \
  -H "Content-Type: application/json"
```

Example event:

```json
{
  "id": "evt_1703331234567_abc123def",
  "type": "SMF30",
  "job_name": "PAYROLL0123",
  "abend_code": "S0C4",
  "severity": "high",
  "system": "PROD-SYS1",
  "cpu_time": 1250,
  "timestamp": "2024-01-15T10:30:00Z",
  "message": "SMF30 event in PAYROLL0123 - ABEND S0C4 detected"
}
```

## Security

- Never commit `.env` files to version control
- Use environment variables for all sensitive data (see the startup check sketched below)
- Rotate API keys regularly
- Use HTTPS in production
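One way to back this up in code is to fail fast at startup when a required variable is missing. A minimal sketch using the variable names from the `.env` template above (the project may handle validation differently):

```js
// Verify required configuration before the server starts.
const REQUIRED_VARS = [
  "GEMINI_API_KEY",
  "KAFKA_BOOTSTRAP",
  "KAFKA_API_KEY",
  "KAFKA_API_SECRET",
];

const missing = REQUIRED_VARS.filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(", ")}`);
  process.exit(1);
}
```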
## Deployment

Set these environment variables in your deployment platform:

```env
GEMINI_API_KEY=your_production_gemini_key
KAFKA_BOOTSTRAP=your_production_kafka_bootstrap
KAFKA_API_KEY=your_production_kafka_key
KAFKA_API_SECRET=your_production_kafka_secret
KAFKA_TOPIC=mainframe-events
PORT=3000
```

## License

MIT License - see the LICENSE file for details.
## Contributing

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
## Support

For issues and questions:
- Create an issue on GitHub
- Check the documentation
- Review the console logs for debugging
MainframePulse RT - Real-time mainframe intelligence powered by AI 🚀