A conversational AI assistant designed to manage university enrollments, students, and courses. This project was developed as an assignment for the Systems Integration course within the Master's in Informatics Engineering at the University of Coimbra (FCTUC).
The solution implements a modern, decoupled, three-tier architecture:
- **Frontend (UI):** A lightweight, vanilla HTML/CSS/JS single-page application that provides a chat interface for users.
- **LangChain Agent (Middleware):** A FastAPI-based service that handles memory, processes user prompts via a Large Language Model (LLM), and decides which backend tools to invoke.
- **MCP Server (Backend):** A Model Context Protocol (MCP) server that manages the core business logic and database interactions for the main entities: `Student`, `Course`, and `Enrollment`.
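The three tiers above can be pictured as a single request path: the UI hands a prompt to the agent, the agent decides which tool to call, and only the backend touches the data. A minimal, dependency-free sketch of that flow (function names are illustrative; the real components communicate over HTTP between frontend and agent, and over MCP stdio between agent and server):

```python
# Toy sketch of the three-tier message flow. Stubs stand in for the
# real HTTP/MCP transports so the decoupling is visible at a glance.

def mcp_tool_list_students() -> list[str]:
    """Backend tier: would query the database; stubbed here."""
    return ["Ana Silva"]

def agent_handle(prompt: str) -> str:
    """Middleware tier: an LLM picks the tool in reality; a rule stands in."""
    p = prompt.lower()
    if "list" in p and "student" in p:
        return ", ".join(mcp_tool_list_students())
    return "Sorry, I don't understand."

def frontend_send(prompt: str) -> str:
    """UI tier: only ever talks to the agent, never to the MCP server."""
    return agent_handle(prompt)
```

The key property the sketch shows is that `frontend_send` has no reference to the backend tool at all; everything flows through the agent.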
- **Natural Language Interface:** Manage database records through plain-English commands.
- **Contextual Memory:** The LangChain agent remembers the current session, allowing for conversational follow-ups.
- **Separation of Concerns:** The frontend communicates strictly with the LangChain Agent, ensuring the core MCP database server is completely abstracted from the client.
- **Entity Management:**
  - Create, read, and list Students (Name, Age, Email).
  - Create, read, and list Courses (Title, Acronym).
  - Manage Enrollments (linking students to courses, with grading).
- Python 3.10+
- `uv` or `pip` for Python dependency management.
- A modern web browser.
The MCP Server acts as the core database handler and tool provider.
```shell
# Navigate to the mcp-server directory
cd src/mcp-server

# Install dependencies and start the server
uv sync
uv run main.py
```

By default, the server runs on the port specified in your `config.conf` or environment variables.
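The environment-variable fallback mentioned above amounts to a small lookup at startup. A minimal sketch of that pattern, assuming a hypothetical `MCP_PORT` variable name and `8000` default (the project's actual key and default may differ):

```python
import os

def resolve_port(default: int = 8000) -> int:
    """Resolve the listening port: environment variable wins, else the
    default. MCP_PORT and the 8000 fallback are illustrative names,
    not the project's actual configuration keys."""
    return int(os.environ.get("MCP_PORT", default))
```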
The LangChain agent handles the AI logic and acts as a bridge between the frontend and the MCP server. It automatically starts the MCP server in stdio transport mode.
```shell
# Open a new terminal and navigate to the langchain-agent directory
cd src/langchain-agent

# Install dependencies and start the FastAPI application
uv sync
uv run main.py
```

Ensure the agent is configured to point to the running MCP Server.
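The stdio transport mentioned above means the agent spawns the MCP server as a child process and exchanges messages over its stdin/stdout. A self-contained sketch of that pattern (the child here is an inline echo script, not the real server, and the message format is illustrative rather than actual MCP framing):

```python
import json
import subprocess
import sys

# Illustrative: spawn a child process and exchange one JSON message
# over its stdin/stdout -- the same pattern as MCP's stdio transport.
child_code = (
    "import sys, json; "
    "msg = json.loads(sys.stdin.readline()); "
    "print(json.dumps({'echo': msg}))"
)
proc = subprocess.run(
    [sys.executable, "-c", child_code],
    input=json.dumps({"tool": "list_students"}) + "\n",
    capture_output=True,
    text=True,
)
reply = json.loads(proc.stdout)
```

Because the agent owns the child process, there is no separate port to configure for the agent-to-server hop; only the frontend-to-agent HTTP port matters.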
No build step or dedicated web server is strictly necessary for the frontend.
- Navigate to the `src/frontend` folder.
- Open `index.html` directly in your web browser, or use a local development server (like VS Code Live Server) for a better experience.
- Ensure the `API_BASE_URL` in `script.js` matches the port where your LangChain Agent is running.
Once the system is running, you can use the chat interface to execute commands such as:
- "Create a student named Ana Silva with email ana@uc.pt"
- "Create a course called Systems Integration with 6 credits"
- "Enroll student 1 in course 1 with grade 18"
- "List all students"
- "Show all enrollments for Ana Silva"
```
├── docs/                # Assignment documentation and templates
└── src/
    ├── frontend/        # Client-side web application (HTML/CSS/JS)
    ├── langchain-agent/ # FastAPI LLM Agent middleware
    └── mcp-server/      # Core backend and database logic
```