Mentoria is a RAG (Retrieval-Augmented Generation) application that lets you chat with your data from different sources: documents (`.txt`, `.pdf`, `.doc`, and `.docx`) or a URL pointing to an article, a blog post, or a YouTube video. Under the hood, it uses the Gemini API to serve the user. Hopefully I'll soon add more providers, such as OpenAI and Cohere, and let users choose whichever they prefer, along with more data-source options like `.log` files.
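Since Mentoria accepts both local documents and URLs, it has to decide how to load each source. A minimal sketch of how such dispatch might look — the function name, categories, and extension set are illustrative, not Mentoria's actual code:

```python
from urllib.parse import urlparse

# Extensions the README says are supported (illustrative constant).
SUPPORTED_EXTENSIONS = {"txt", "pdf", "doc", "docx"}


def classify_source(src: str) -> str:
    """Decide how a user-supplied source should be loaded.

    Hypothetical helper: Mentoria's real implementation may differ.
    """
    parsed = urlparse(src)
    if parsed.scheme in ("http", "https"):
        host = parsed.netloc.lower()
        # YouTube links need a transcript loader rather than an HTML scraper.
        if "youtube.com" in host or "youtu.be" in host:
            return "youtube"
        return "web"
    # Otherwise treat the source as a local file and dispatch on extension.
    ext = src.rsplit(".", 1)[-1].lower()
    if ext in SUPPORTED_EXTENSIONS:
        return ext
    raise ValueError(f"Unsupported source: {src!r}")
```

Each category would then map to a different loader (e.g. a PDF parser for `pdf`, a transcript fetcher for `youtube`) before the text enters the RAG pipeline.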
- LangChain
- Streamlit
- FAISS
- Gemini
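In a stack like this, LangChain splits each document into overlapping chunks, FAISS indexes their embeddings, and Gemini answers questions over the retrieved chunks. A minimal, dependency-free sketch of the chunking step — the same idea LangChain's text splitters implement, with illustrative parameter values rather than Mentoria's actual settings:

```python
def split_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks that overlap, so context
    spanning a chunk boundary is not lost at retrieval time."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        # Advance by chunk_size minus overlap so consecutive chunks share text.
        start += chunk_size - overlap
    return chunks
```

Each chunk would then be embedded and stored in the FAISS index; at query time the top-matching chunks are stuffed into the Gemini prompt.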
To try Mentoria locally, follow these steps:
- Clone the repo:

  ```bash
  git clone git@github.com:mohamedhassan218/mentoria-demo.git
  ```
- Create a virtual environment:

  ```bash
  python -m venv .venv
  ```
- Activate your virtual environment:

  - On Windows:

    ```bash
    .venv\Scripts\activate
    ```

  - On Unix or macOS:

    ```bash
    source .venv/bin/activate
    ```

- Install the dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Set up environment variables: create a `.env` file in the project root and add the following:

  ```
  GOOGLE_API_KEY=""
  ```

  Get your free API key from here.
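At startup, the app presumably reads this `.env` file into the process environment (commonly done with `python-dotenv`'s `load_dotenv()`). A stdlib-only sketch of what that loading amounts to, for readers curious how the key reaches the code:

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> None:
    """Read KEY=value lines from a .env file into os.environ.

    Minimal stand-in for python-dotenv's load_dotenv(); Mentoria's
    actual loading mechanism may differ.
    """
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        # Skip blanks, comments, and malformed lines.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault: real environment variables take precedence over the file.
        os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

After this runs, the Gemini client can pick the key up via `os.getenv("GOOGLE_API_KEY")`.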
- Run the project:

  ```bash
  streamlit run main.py
  ```
Contributions are highly appreciated! If you have additional features, ideas, or suggestions for improvement, please don't hesitate to submit a pull request. Also, feel free to take the code, customize it, and try out different ideas.