A free, open-source AI platform that lets you run local LLMs, connect cloud AI providers, teach your AI with your own data, and share your AI instance globally, all with full privacy and unlimited usage.
- Overview
- Features
- Tech Stack
- Installation Guide
- Environment Variables
- Folder Structure
- API Documentation
- Usage Workflow
- Contributing
- License
- Author
## Overview

LocalMind is a free, open-source, self-hosted AI platform built for students, developers, researchers, and creators who want powerful AI capabilities without subscriptions, limits, or privacy concerns.
With LocalMind, you can:
- Run local LLMs like LLaMA, Mistral, Phi, and Gemma, 100% free and offline
- Connect cloud AI models like Gemini, OpenAI, Groq, RouterAI
- Train your AI with Excel/CSV files or Q&A datasets
- Expose your local AI to the world via LocalTunnel or Ngrok
- Build apps using a developer-friendly API layer
- Test multiple models using an integrated AI playground
LocalMind gives you freedom, privacy, flexibility, and unlimited usage, all for free.
## Features

### Multi-Model Support

Supports both local and cloud AI engines:

**Local (via Ollama):**
- LLaMA
- Mistral
- Phi
- Gemma
- Any Ollama-compatible model

**Cloud:**
- Google Gemini
- OpenAI GPT
- Groq
- RouterAI
- (More coming soon!)
### Train with Your Own Data

Teach your AI with your own files:
- Upload Excel (.xlsx / .csv)
- Upload Q&A datasets
- Automatically builds a private vector database
- Fully local: your files never leave your machine, and nothing is stored in the cloud
Perfect for students, researchers, startups, and internal tools.
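As an illustration, a Q&A dataset could be as simple as a two-column spreadsheet. The column names below are hypothetical, not a documented LocalMind schema; check the upload endpoint for the exact format it expects:

```csv
question,answer
What is LocalMind?,A free self-hosted AI platform.
Which local models are supported?,"LLaMA, Mistral, Phi, Gemma, and other Ollama-compatible models."
```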
### Share Your AI Globally

Expose your local instance so anyone on the internet can use your AI:
- LocalTunnel
- Ngrok
Great for demos, clients, teammates, or beta testing.
### Flexible Model Routing

You can run:
- One model at a time, or
- Multiple models (local + cloud) simultaneously
LocalMind handles routing internally.
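As a minimal sketch of what per-model routing can look like, the helper below picks a local or cloud engine from a model name. The model lists, prefixes, and function name are illustrative assumptions, not LocalMind's actual internals:

```typescript
// Hypothetical illustration of routing between a local Ollama engine
// and cloud providers. All names here are illustrative only.
type Provider = "local" | "cloud";

// Models assumed to be served by a local Ollama instance.
const LOCAL_MODELS = new Set(["llama3", "mistral", "phi3", "gemma"]);

// Model-name prefixes assumed to map to cloud providers.
const CLOUD_PREFIXES = ["gemini", "gpt", "groq"];

// Decide which engine should answer a request for a given model name.
function pickProvider(model: string): Provider {
  if (LOCAL_MODELS.has(model)) return "local";
  if (CLOUD_PREFIXES.some((prefix) => model.startsWith(prefix))) return "cloud";
  // Default to local so unknown Ollama models still work offline.
  return "local";
}
```

The point of routing at this layer is that callers only ever name a model; whether the answer comes from Ollama or a cloud API stays an internal detail.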
### Privacy First

Your data is yours, always.
- API keys stay on your device
- No analytics or tracking
- Fully open-source
- No external storage
- No vendor lock-in
## Tech Stack

| Layer | Technology |
|---|---|
| Frontend | React, TypeScript, Vite |
| Backend | Node.js, Express, TypeScript |
| AI Layer | Ollama + Cloud Providers |
## Installation Guide

**1. Clone the repository**

```bash
git clone https://github.com/your-username/LocalMind.git
cd LocalMind
```

**2. Start the backend**

```bash
cd server
npm install
npm run dev   # http://localhost:3000
```

**3. Start the frontend** (in a second terminal)

```bash
cd ../client
npm install
npm run dev   # http://localhost:5173
```

## Environment Variables

Create a `.env` file inside `server/`:
| Variable | Description |
|---|---|
| `API_KEY` | Your cloud AI key (Gemini/OpenAI/etc.) |
| `ENVIRONMENT` | `development` / `production` |
| `LOCALMIND_SECRET` | JWT / API key generator secret |
⚠️ Do NOT commit `.env` files to GitHub.
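For example, a `server/.env` could look like this (all values are placeholders):

```env
API_KEY=your-cloud-provider-key
ENVIRONMENT=development
LOCALMIND_SECRET=change-me-to-a-long-random-string
```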
## Folder Structure

```
LocalMind/
├── server/
│   └── src/
│       ├── routes/
│       ├── controllers/
│       └── models/
└── client/
    └── src/
        ├── components/
        ├── pages/
        └── hooks/
```
## API Documentation

### Authentication

| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/user/register` | Register a user |
| POST | `/api/v1/user/login` | Log in |
| GET | `/api/v1/user/profile` | Get the user profile |
### API Keys & Configuration

| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/user/local-mind-api-key-generator` | Generate an API key |
| GET | `/api/v1/user/local-mind-api-keys` | Fetch API keys |
| GET | `/api/v1/user/ai-config` | Get AI configuration |
### Chat

| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/chat/send-message` | Send a message to the AI |
| GET | `/api/v1/chat/history` | Get chat history |
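As a sketch, a client call to the chat endpoint might be built like this. The payload fields (`message`, `model`) and the `Authorization` header scheme are assumptions for illustration; check the actual controller for the real request contract:

```typescript
// Hypothetical client helper for POST /api/v1/chat/send-message.
// The payload shape and auth header are illustrative assumptions.
interface ChatRequest {
  url: string;
  options: { method: string; headers: Record<string, string>; body: string };
}

function buildChatRequest(apiKey: string, message: string, model = "llama3"): ChatRequest {
  return {
    url: "http://localhost:3000/api/v1/chat/send-message",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // assumed auth scheme
      },
      body: JSON.stringify({ message, model }),
    },
  };
}

// Usage (requires a running LocalMind server):
// const { url, options } = buildChatRequest("my-key", "Hello!");
// const res = await fetch(url, options);
```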
### Training & Uploads

| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/upload/excel` | Upload an Excel/CSV file |
| POST | `/api/v1/upload/dataSet` | Upload a Q&A dataset |
| POST | `/api/v1/train/upload` | Upload training data |
### Exposure

| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/expose/localtunnel` | Get a public URL via LocalTunnel |
| POST | `/api/v1/expose/ngrok` | Get a public URL via Ngrok |
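For reference, both tunneling tools can also be run by hand against the backend port, independently of these endpoints (assuming the backend listens on port 3000):

```shell
# Expose port 3000 via LocalTunnel (prints a public URL)
npx localtunnel --port 3000

# Or via ngrok (requires an ngrok account and auth token)
ngrok http 3000
```

Remember that a tunnel makes your instance reachable by anyone with the URL, so keep API keys and auth enabled while it is up.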
## Usage Workflow

1. Start the backend and frontend
2. Register or log in
3. Connect a cloud AI provider or select a local model
4. Upload training files (optional)
5. Chat with your AI
6. Expose your AI globally (optional)
## Contributing

We welcome all contributions!

Fork → Create Branch → Commit → Push → Pull Request

Use meaningful commit messages and follow the project's TypeScript conventions.
## License

Licensed under the MIT License.
## Author

**NexGenStudioDev**
LocalMind: Free, Private, Limitless AI for Everyone.