A modern, full-stack Support Ticket System built with the MERN Stack (MongoDB, Express, React, Node.js) and Docker. This application features an intelligent LLM Integration that automatically categorizes tickets and suggests priority levels based on user descriptions.
- 🤖 AI Auto-Classification: Uses OpenAI (GPT-3.5) to analyze ticket descriptions and suggest accurate Categories and Priority levels in real-time.
- 📊 Real-time Dashboard: Visualizes key metrics like Total Tickets, Open Count, Average Tickets/Day, and distributions using efficient MongoDB Aggregations.
- 📱 Fully Responsive Design: Built with Tailwind CSS, ensuring a seamless experience on both Desktop and Mobile devices.
- ⚡ Optimistic UI: Instant feedback on status updates (Open -> In Progress -> Resolved) for a snappy user experience.
- 🔍 Advanced Filtering: Search by keyword and filter by Status, Priority, and Category simultaneously.
- 🐳 Dockerized: Standardized development environment using `docker-compose`.
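The optimistic status flow (Open -> In Progress -> Resolved) can be sketched as a small helper. This is an illustrative sketch, not code from the repo: the state is updated locally first so React re-renders before the server round-trip completes.

```javascript
// Sketch of the optimistic status transition (names are illustrative,
// not copied from the repository).
const STATUS_FLOW = ['Open', 'In Progress', 'Resolved'];

function nextStatus(current) {
  const i = STATUS_FLOW.indexOf(current);
  // Stay on the final state once a ticket is Resolved (or status is unknown).
  if (i === -1 || i === STATUS_FLOW.length - 1) return current;
  return STATUS_FLOW[i + 1];
}

function optimisticUpdate(tickets, id) {
  // Return a new array with the target ticket advanced one step,
  // so the UI updates immediately; the PATCH request is sent afterwards.
  return tickets.map((t) =>
    t._id === id ? { ...t, status: nextStatus(t.status) } : t
  );
}
```

If the server request later fails, the previous array can simply be restored.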
- Frontend: React (Vite), Tailwind CSS, Axios
- Backend: Node.js, Express.js
- Database: MongoDB (Mongoose)
- AI/LLM: OpenAI API (`gpt-3.5-turbo`)
- DevOps: Docker, Docker Compose
- Docker Desktop (Running)
- OpenAI API Key
- Clone the repository:
  ```bash
  git clone <repository_url>
  cd SupportTicketSystem
  ```
- Configure Environment:
  - The project comes with a root `.env` file (or `backend/.env`). Ensure it contains your API key and MongoDB connection string:
    ```env
    OPENAI_API_KEY=your_openai_api_key_here
    PORT=5000
    MONGO_URI=mongodb+srv://...
    ```
  - The frontend connects to `http://localhost:5000` by default.
- Start the Application:
  ```bash
  docker-compose up --build
  ```
- Access the App:
- Frontend: http://localhost:3000
- Backend API: http://localhost:5000
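For orientation, a `docker-compose.yml` wired to the ports above might look roughly like this. The service names and build contexts are assumptions matching the project layout, not copied from the repository:

```yaml
# Illustrative shape only — service names are assumptions.
version: "3.8"
services:
  backend:
    build: ./backend
    ports:
      - "5000:5000"
    env_file:
      - .env
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - backend
```

With a hosted MongoDB connection string (`mongodb+srv://...`), no local database container is needed.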
Backend:
- Navigate to `backend/`.
- Create a `.env` file:
  ```env
  PORT=5000
  MONGO_URI=your_mongodb_connection_string
  OPENAI_API_KEY=your_openai_api_key
  ```
- Install & Run:
  ```bash
  npm install
  npm start
  ```
Frontend:
- Navigate to `frontend/`.
- Install & Run:
  ```bash
  npm install
  npm run dev
  ```
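When running manually, the frontend must be able to find the backend. A minimal sketch of how the base URL might be resolved — the `VITE_API_URL` variable name is a hypothetical example, not necessarily what the repo's `api.js` uses:

```javascript
// Hypothetical helper: resolve the backend base URL, falling back to the
// default the frontend uses (http://localhost:5000).
function apiBaseUrl(env = {}) {
  // Vite only exposes variables prefixed with VITE_ to client code.
  return env.VITE_API_URL || 'http://localhost:5000';
}
```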
We chose OpenAI's GPT-3.5-turbo for the classification engine because:
- Speed & Cost: It balances response time with cost-effectiveness, which is crucial for a real-time "on-blur" UI interaction.
- Accuracy: Zero-shot classification of short support descriptions into a fixed set of category and priority labels works well without any fine-tuning.
- Reliability: High availability ensures users aren't blocked from submitting tickets.
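The classification call reduces to building a constrained prompt and validating the model's answer. The sketch below is illustrative: the label sets and prompt wording are assumptions, not the repo's actual values, and the validation step is what keeps a bad or missing response from blocking ticket creation.

```javascript
// Illustrative sketch — label sets and prompt wording are assumptions.
const CATEGORIES = ['General', 'Billing', 'Technical', 'Account'];
const PRIORITIES = ['Low', 'Medium', 'High'];
const FALLBACK = { category: 'General', priority: 'Medium' };

function buildMessages(description) {
  // Constrain gpt-3.5-turbo to a fixed label set and a JSON reply.
  return [
    {
      role: 'system',
      content:
        `Classify the support ticket. Reply as JSON with "category" ` +
        `(one of: ${CATEGORIES.join(', ')}) and "priority" ` +
        `(one of: ${PRIORITIES.join(', ')}).`,
    },
    { role: 'user', content: description },
  ];
}

function normalize(raw) {
  // Accept the model's answer only if it maps onto known labels; otherwise
  // fall back to safe defaults so ticket creation never fails.
  if (!raw) return { ...FALLBACK };
  return {
    category: CATEGORIES.includes(raw.category) ? raw.category : FALLBACK.category,
    priority: PRIORITIES.includes(raw.priority) ? raw.priority : FALLBACK.priority,
  };
}
```

`buildMessages` would be sent to the chat completions endpoint; `normalize` wraps whatever comes back.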
- MongoDB Aggregation: Instead of calculating stats (like "Average Tickets Per Day") in JavaScript, we utilize MongoDB's native aggregation pipeline for performance scalability.
- Separation of Concerns: The LLM interaction is isolated in a `llmService.js` module. This allows for easy swapping of models (e.g., to Anthropic or a local Llama model) without rewriting controller logic.
- Graceful Degradation: If the LLM service fails or the API key is missing, the system defaults to the "General" category and "Medium" priority, ensuring the core functionality (creating a ticket) never breaks.
- Component-Based UI: The React frontend is broken down into reusable components (`TicketList`, `TicketForm`, `StatsDashboard`) for maintainability.
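As an example of the aggregation approach, a pipeline for "Average Tickets Per Day" could look like the sketch below. Field names such as `createdAt` are assumptions about the Mongoose schema; the pipeline would be passed to `Ticket.aggregate(...)`.

```javascript
// Sketch of an "Average Tickets Per Day" pipeline (field names assumed).
const pipeline = [
  {
    // Bucket tickets by the calendar day they were created.
    $group: {
      _id: { $dateToString: { format: '%Y-%m-%d', date: '$createdAt' } },
      count: { $sum: 1 },
    },
  },
  {
    // Average the per-day counts inside MongoDB, not in JavaScript.
    $group: { _id: null, avgPerDay: { $avg: '$count' } },
  },
];
```

Because the averaging happens server-side, the payload returned to the dashboard stays constant no matter how many tickets exist.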
SupportTicketSystem/
├── backend/ # Node.js/Express Server
│ ├── config/ # DB Connection logic
│ ├── controllers/ # Request handlers (Tickets, Stats)
│ ├── models/ # Mongoose Schemas (Ticket.js)
│ ├── routes/ # API Route definitions
│ └── services/ # External services (OpenAI)
├── frontend/ # React Client
│ ├── src/
│ │ ├── components/ # UI Components (Form, List, Dashboard)
│ │ └── api.js # Axios configuration
│ └── tailwind.config.js # Styling configuration
├── docker-compose.yml # Container orchestration
└── .env # Environment variables (API Keys)
The repository includes a comprehensive set of functional features that can be tested manually:
- Submit a Ticket: Type a description and watch the "Category" and "Priority" dropdowns auto-update.
- Check Stats: Verify the dashboard counters increment immediately.
- Filter: Use the search bar or dropdowns to slice the data.
For detailed instructions on how to deploy this application to a production server (VPS/Cloud) using Docker, please refer to the Deployment Guide.
Issue: "Connection Refused" when accessing API
- Verify the Docker containers are running: `docker ps`
- Check whether port 5000 is occupied by another process.
Issue: LLM not updating Category/Priority
- Check your `OPENAI_API_KEY` in `.env`.
- Ensure you have internet access for external API calls.