A Powerful Retrieval-Augmented Generation (RAG) API using LLama3, LangChain, Ollama, and ChromaDB
Doc_Query_Genie is a cutting-edge RAG-based Python solution that enhances information retrieval and generation through a Flask API. Powered by LLama3, LangChain, Ollama, and ChromaDB, this tool provides a seamless experience for querying both general knowledge and custom document uploads.
- AI-Powered Chat: Use it like OpenAI's ChatGPT to ask any question.
- PDF Intelligence: Upload a PDF and ask context-specific questions.
- Source Referencing: Get precise answers with citations from the document (paragraph/line references).
- Fast & Efficient: Optimized for quick and reliable response generation.
- Easy Integration: Simple API setup to integrate with other applications.
This project brings the best of AI-driven retrieval and context-aware generation, making it a versatile tool for researchers, students, and professionals.
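The project's actual pipeline lives in `app.py`, but as a rough sketch, a RAG chain built with LangChain, Ollama, and ChromaDB is typically assembled as below. The import paths, the `llama3` model tag, the file name, and the chunking parameters are assumptions and may differ from this repository's code.

```python
# Minimal RAG sketch: load a PDF, index it in ChromaDB, and answer questions
# with a LLama3 model served by Ollama. Import paths vary by LangChain version.
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_community.llms import Ollama
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains import RetrievalQA

# 1. Load and chunk the uploaded document.
docs = PyPDFLoader("uploaded.pdf").load()  # assumed file name
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# 2. Embed the chunks and store them in a local Chroma collection.
embeddings = OllamaEmbeddings(model="llama3")  # assumed model tag
store = Chroma.from_documents(chunks, embeddings, persist_directory="chroma_db")

# 3. Build a retrieval chain that also returns the source chunks it used.
qa = RetrievalQA.from_chain_type(
    llm=Ollama(model="llama3"),
    retriever=store.as_retriever(search_kwargs={"k": 4}),
    return_source_documents=True,  # enables source referencing
)

result = qa.invoke({"query": "What does the document say about evaluation?"})
print(result["result"])
for doc in result["source_documents"]:
    print(doc.metadata)  # page/source info usable for citations
```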
- Clone the repository
```bash
git clone https://github.com/yourusername/Doc_Query_Genie.git
cd Doc_Query_Genie
```
- Install dependencies
```bash
pip install -r requirements.txt
```
- Run the application
```bash
python app.py
```
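Since the stack relies on a local Ollama server to run LLama3, you may also need the model pulled and the server running before starting the app (the `llama3` tag is an assumption; use whichever model the project is configured for):

```bash
ollama pull llama3   # download the model weights locally
ollama serve         # start the Ollama server if it is not already running
```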
- Use the API
- Access it at `http://127.0.0.1:5000/`
- Upload PDFs and start querying (see the example requests below)
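The exact routes are defined in `app.py`. As an illustration only, a client interaction might look like the sketch below, where the `/upload` and `/query` endpoint names and the request fields are hypothetical placeholders:

```python
import requests

BASE_URL = "http://127.0.0.1:5000"  # default Flask address from the step above

# Upload a PDF for indexing (endpoint name is a placeholder -- check app.py).
with open("report.pdf", "rb") as f:
    requests.post(f"{BASE_URL}/upload", files={"file": f})

# Ask a context-specific question about the uploaded document.
resp = requests.post(
    f"{BASE_URL}/query",
    json={"question": "What are the key findings in section 2?"},
)
print(resp.json())  # expected to include the answer and source references
```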
We welcome contributions! Feel free to submit issues or pull requests.
If you find this project helpful, please consider giving it a star!