This project implements an end-to-end AI-powered chatbot built with modern Machine Learning, Generative AI, Prompt Engineering, and a RAG architecture. The solution includes a Streamlit UI, a Python FastAPI backend, and integrations with HuggingFace embedding models, all running on Windows with containerized deployment options.
- Develop a functional AI/ML-powered chatbot that takes a mathematical concept name from the user and generates questions for it.
- The UI also displays sample questions the user can ask.
- Integrate embeddings, vector search, and LLM inference pipelines.
- Build a clean UI using Streamlit.
- Implement a backend API using Python.
- Containerize using Docker.
- Maintain high-quality documentation and testing standards (optional).
- Streamlit-based user interface for accepting user input
- Backend API (FastAPI or ASP.NET Core)
- Vector database (FAISS / ChromaDB)
- HuggingFace embeddings
- LLM inference (OpenAI/Llama/Mistral/Other)
- Logging and monitoring support (optional)
- Modular and scalable architecture
- Container-ready (Docker/Podman)
- Average response time < 10 seconds (target to be finalized based on model pricing and token budget)
- Secure API key management using `.env` (see the sketch after this list)
- Reliable error handling and retry logic
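A minimal sketch of the `.env`-based key management and retry behaviour described above, assuming `python-dotenv` (listed in the backend dependencies) and a hypothetical `call_llm` helper; the actual backend may structure this differently.

```python
import os
import time

from dotenv import load_dotenv  # python-dotenv

load_dotenv()  # read OPENAI_API_KEY (and other secrets) from a local .env file
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")


def with_retries(func, max_attempts: int = 3, backoff_seconds: float = 1.0):
    """Call `func` with simple exponential backoff, re-raising after the last attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return func()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(backoff_seconds * 2 ** (attempt - 1))


# Usage (call_llm is a hypothetical helper that performs the actual LLM request):
# answer = with_retries(lambda: call_llm(prompt, api_key=OPENAI_API_KEY))
```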
| Deliverable | Description |
|---|---|
| Source Code | Complete implementation of UI, backend, and ML components |
| README | Setup instructions, architecture, usage guide |
| Architecture Diagram | Visual system overview |
| ML Notebooks | Embeddings & evaluation notebooks |
| Dockerfile | Container build configuration |
| Test Cases | Unit and integration tests |
- User interacts with Streamlit UI
- UI sends request to backend API
- API performs tokenization & embeddings
- Vector DB performs similarity search
- LLM generates response
- UI displays final result
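A condensed sketch of this request flow, assuming the dependencies listed below (`fastapi`, `sentence-transformers`, `chromadb`) and an OpenAI `gpt-4o-mini` call for generation; the `/ask` route and response shape are illustrative, not the project's confirmed API contract.

```python
from fastapi import FastAPI
from pydantic import BaseModel

import chromadb
from openai import OpenAI
from sentence_transformers import SentenceTransformer

app = FastAPI()
embedder = SentenceTransformer("all-MiniLM-L6-v2")       # embedding model from the tech stack
chroma = chromadb.Client()                                # in-memory vector store for the sketch
collection = chroma.get_or_create_collection("concepts")
llm = OpenAI()                                            # reads OPENAI_API_KEY from the environment


class Query(BaseModel):
    question: str


@app.post("/ask")  # illustrative route; the Streamlit UI would POST user input here
def ask(query: Query):
    # 1) Tokenize/embed the user input
    query_embedding = embedder.encode(query.question).tolist()

    # 2) Similarity search in the vector database
    hits = collection.query(query_embeddings=[query_embedding], n_results=3)
    context = "\n".join(hits["documents"][0]) if hits["documents"] else ""

    # 3) LLM generates a grounded response
    completion = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query.question}"},
        ],
    )

    # 4) The UI displays this result
    return {"answer": completion.choices[0].message.content}
```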
TBD
| Category | Tools / Technologies |
|---|---|
| Frontend | Streamlit |
| Backend | FastAPI |
| AI / ML Frameworks | HuggingFace Transformers |
| Embeddings Models | all-MiniLM-L6-v2, InstructorXL, local HF models |
| LLMs | GPT-4o-mini |
| Vector Databases | ChromaDB |
| Containerization | Docker Desktop |
| OS / Environment | Windows 10/11 (Dev environment) |
| Package Mgmt | pip |
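A brief illustration of the embedding model named in the table above; the example sentences are arbitrary.

```python
from sentence_transformers import SentenceTransformer, util

# Load the default embedding model from the tech stack
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = ["What is the Pythagorean theorem?", "Explain right triangles."]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity between the two sentence embeddings
score = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"similarity: {score:.3f}")
```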
- Python: `pytest`
- .NET: MSTest / xUnit
- API endpoint tests (see the pytest sketch after this list)
- UI-to-backend workflow tests
- Locust or JMeter
- Guardrail validation
- Hallucination checks
- Prompt consistency testing
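A minimal sketch of the API endpoint tests mentioned above, using `pytest` with FastAPI's `TestClient` (built on `httpx`); the `app.main` module path, the `/ask` route, and the response shape are assumptions carried over from the pipeline sketch.

```python
from fastapi.testclient import TestClient

from app.main import app  # assumed module path for the FastAPI application

client = TestClient(app)


def test_ask_returns_answer():
    """The /ask endpoint should accept a question and return a non-empty answer."""
    response = client.post("/ask", json={"question": "What is a prime number?"})
    assert response.status_code == 200
    assert response.json()["answer"]


def test_ask_rejects_missing_question():
    """Requests without the required field should fail validation (422 from FastAPI)."""
    response = client.post("/ask", json={})
    assert response.status_code == 422
```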
- Clone repository
- Install dependencies
- Create and configure a `.env` file
- Start the backend API
- Start Streamlit UI
```bash
docker build -t ai-project .
docker run -p 8080:8080 ai-project
```
This project uses a combination of Python, AI/ML libraries, and containerization tools on Windows OS.
Below is the complete list of dependencies required for seamless development and deployment.
- Python 3.10+
- pip (Package Manager)
- virtualenv / conda (Optional for environment isolation)
- `fastapi`, `uvicorn`, `pydantic`, `python-dotenv`
- `streamlit`
- `transformers` (HuggingFace models), `sentence-transformers` (embedding models), `chromadb`
- `pandas`, `numpy`, `requests`
- `pytest`, `httpx`
- Windows 10/11 (Primary development OS)
- VS Code with:
- Python Extension
- Docker Extension
- Markdown Preview
- Git
- Docker Desktop for Windows
Future improvements planned for this project:
- Add RAG evaluation using RAGAS
- Implement role-based authentication
- Add analytics dashboard (Streamlit/Plotly)
- Generate questions with images
This project is licensed under the MIT License.
You are free to use, modify, and distribute the code.
Contributions are welcome!
- Fork the repository
- Create a feature branch
- Submit a pull request
- HuggingFace Transformers for open-source models
- Streamlit for rapid UI development
- OpenAI for GPT-4o-mini
- Docker for containerization tools