Damna is a full-stack Java application designed to eliminate the technical barrier between natural language and relational databases. Originally developed as a capstone for the Object-Oriented Design and Programming (5CS019) module, it serves as a robust proof-of-concept for AI-integrated software that respects data sovereignty.
By implementing CRUD by proxy, Damna allows users to interact with a database using plain English, converting conversational intent into valid, executable SQL.
Built with a focus on professional software engineering, Damna strictly adheres to the Open/Closed Principle: the system is open for extension but closed for modification.
- Polymorphic Core: The system interacts with a `DatabaseService` interface. This dynamic binding lets you swap between providers (MySQL, Postgres, etc.) with zero changes to the core logic.
- Engine-Agnostic AI: Using the jlama library, Damna hosts a TinyLlama model locally. Your data and schemas never leave your server, maintaining total privacy.
- Semantic Filtering: A middleware layer prevents the AI from becoming overwhelmed by large metadata sets, keeping the reasoning sharp and focused.
- Pure CLI Interface: Optimized for power users and remote servers where a GUI is dead weight.
- RESTful API: A lean Spring Boot layer featuring two primary endpoints: `/api/status` (health) and `/api/ask` (query).
- Auto-Sanitization: A custom `sanitizeSql` method strips LLM artifacts and markdown before they hit your database driver.
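The polymorphic bridge might look something like the following minimal sketch. The names here (`DatabaseService`, `InMemoryService`, `DamnaCore`) are illustrative assumptions, not the project's actual contract:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: the real DatabaseService contract in Damna may differ.
interface DatabaseService {
    void execute(String sql);
}

// A MySQL or Postgres provider would delegate to JDBC; an in-memory fake is
// enough to show that the core never knows which provider it is talking to.
class InMemoryService implements DatabaseService {
    final List<String> log = new ArrayList<>();
    public void execute(String sql) { log.add(sql); }
}

class DamnaCore {
    private final DatabaseService db; // core depends only on the abstraction

    DamnaCore(DatabaseService db) { this.db = db; }

    void run(String sql) { db.execute(sql); } // dynamic binding picks the provider
}
```

Adding a new backend means adding a new `DatabaseService` implementation, not editing the core: extension without modification.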
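A sanitizer along these lines could strip the usual LLM output artifacts; this is a sketch of the idea, and the real `sanitizeSql` method may handle a different set of cases:

```java
// Hypothetical sanitizer: drops markdown fences and trailing commentary so
// only a bare SQL statement reaches the database driver.
class SqlSanitizer {
    static String sanitizeSql(String raw) {
        String sql = raw.strip();
        // remove markdown code fences such as ```sql ... ```
        sql = sql.replaceAll("```(?:sql)?", "");
        // keep only the first statement; models often append explanations
        int semi = sql.indexOf(';');
        if (semi >= 0) sql = sql.substring(0, semi + 1);
        return sql.strip();
    }
}
```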
Damna wasn't just "coded"; it was engineered using a strict test-first (TDD) methodology. Every logical flow is validated through a 6-Step Lifecycle:
- Create (Generate Table)
- Check (Verify Existence)
- Insert (Write Data)
- Verify (Confirm Write)
- Delete (Clear Data)
- Drop (Remove Table)
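The six steps above can be walked end to end as one check. This is a minimal sketch against an in-memory stand-in; the project's real tests run against a live database driver:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical in-memory stand-in used only to illustrate the lifecycle.
class FakeDb {
    final Map<String, List<String>> tables = new HashMap<>();

    void createTable(String name)        { tables.put(name, new ArrayList<>()); }
    boolean exists(String name)          { return tables.containsKey(name); }
    void insert(String name, String row) { tables.get(name).add(row); }
    int rowCount(String name)            { return tables.get(name).size(); }
    void deleteAll(String name)          { tables.get(name).clear(); }
    void dropTable(String name)          { tables.remove(name); }
}

class LifecycleTest {
    static void run() {
        FakeDb db = new FakeDb();
        db.createTable("orders");                 // 1. Create
        assert db.exists("orders");               // 2. Check
        db.insert("orders", "id=1,total=9.99");   // 3. Insert
        assert db.rowCount("orders") == 1;        // 4. Verify
        db.deleteAll("orders");                   // 5. Delete
        assert db.rowCount("orders") == 0;
        db.dropTable("orders");                   // 6. Drop
        assert !db.exists("orders");
    }
}
```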
- Vector Performance: For optimal LLM performance in Java, the `jdk.incubator.vector` module is manually added to the unit-test JVM arguments.
- Hallucination Control: To keep the small language model on the rails, the logic uses structured templates to guide the AI toward valid SQL output.
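Structured templating of this kind can be as simple as pinning the schema and an explicit output contract into every prompt. A sketch of the idea follows; the actual prompt wording and builder used by Damna are not shown in this README:

```java
// Hypothetical prompt builder: constraining the model with the schema and a
// strict "SQL only" contract is a common way to curb hallucinated tables.
class PromptTemplate {
    static String build(String schema, String question) {
        return String.join("\n",
            "You are a SQL generator. Reply with ONE valid SQL statement and nothing else.",
            "Do not invent tables or columns that are not in the schema below.",
            "Schema:",
            schema,
            "Question: " + question,
            "SQL:");
    }
}
```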
While initially validated on a 1 MB e-commerce dataset, the polymorphic bridge is architected for:
- Graph Evolution: Expanding into Neo4j and MongoDB.
- Enterprise Scaling: Transitioning to more powerful Text-to-SQL models via the modular bridge.
Developed by Matthew Beddoes (2121729)