REALM is a personal knowledge management system that connects your personal notes to a local vector store and lets you query them with a Large Language Model (LLM). It supports automatic syncing, local embedding storage with ChromaDB, and querying with Gemini via LangChain.
- Syncs only new or updated Markdown files from your personal notes vault
- Stores embeddings locally using ChromaDB
- Queries your notes using Gemini (Google Generative AI)
- Command-line interface for seamless access
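The incremental sync ("only new or updated files") can be sketched with a modification-time manifest. This is an illustrative approach, not the project's actual implementation; the manifest file and function name below are hypothetical.

```python
import json
from pathlib import Path

def changed_markdown_files(vault_path, manifest):
    """Return .md files under vault_path that are new or modified since
    the last sync, then update the manifest of recorded mtimes."""
    vault_path, manifest = Path(vault_path), Path(manifest)
    seen = json.loads(manifest.read_text()) if manifest.exists() else {}
    changed = []
    for md in sorted(vault_path.rglob("*.md")):
        mtime = md.stat().st_mtime
        if seen.get(str(md)) != mtime:  # new file, or a newer timestamp
            changed.append(md)
            seen[str(md)] = mtime
    manifest.write_text(json.dumps(seen))
    return changed
```

Running this twice in a row returns an empty list the second time, so only changed notes get re-embedded.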
```
myBrain/
│
├── embeddings/                   # Stores Chroma vector database and metadata
├── scripts/
│   ├── create_embeddings.py      # Sync and embed vault notes
│   └── langchain_integration.py  # LLM querying logic
├── .env                          # Stores environment variables (DO NOT COMMIT)
├── brain.sh                      # CLI entrypoint script
├── requirements.txt              # Python dependencies
└── README.md
```
Clone the repository, create a virtual environment, and install the dependencies:

```bash
git clone https://github.com/yourusername/myBrain.git
cd myBrain
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
Create a `.env` file in the project root with:

```
VAULT_PATH=/absolute/path/to/your/obsidian/vault
GEMINI_API_KEY=your_google_gemini_api_key
```
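The scripts presumably read these two variables at startup. A minimal, fail-fast sketch using only the standard library (the function name is illustrative, and a real setup would typically load `.env` via python-dotenv first):

```python
import os

def load_config():
    """Read the required settings from the environment; raise if any are missing."""
    vault = os.environ.get("VAULT_PATH")
    key = os.environ.get("GEMINI_API_KEY")
    missing = [name for name, value in
               [("VAULT_PATH", vault), ("GEMINI_API_KEY", key)] if not value]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {"vault_path": vault, "gemini_api_key": key}
```

Failing fast here gives a clear error message instead of an obscure API failure deep inside the embedding or query code.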
Now you can run the script at any time:

```bash
./brain.sh
```
This will:
- Activate the virtual environment
- Sync and embed updated notes from your vault
- Start the LLM-powered QA system
- Prompt you to ask questions about your knowledge base
- Type `exit` to quit
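The question loop with its `exit` command can be sketched as below; `answer_fn` stands in for the actual Gemini/LangChain call, which this README does not show, and the prompt text is illustrative.

```python
def qa_loop(answer_fn, ask=input, say=print):
    """Prompt for questions until the user types 'exit'."""
    while True:
        question = ask("Ask your notes (type 'exit' to quit): ").strip()
        if question.lower() == "exit":
            say("Goodbye!")
            break
        if question:  # ignore blank lines
            say(answer_fn(question))
```

Injecting `ask` and `say` keeps the loop testable without a terminal.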