
LocalMind – AI Without Limits

A free, open-source AI platform that lets you run local LLMs, connect cloud AI providers, teach your AI with your own data, and share your AI instance globally – all with full privacy and unlimited usage.


MIT License · TypeScript · React

📖 Table of Contents

  • Overview
  • Features
  • Tech Stack
  • Installation Guide
  • Environment Variables
  • Folder Structure
  • API Documentation
  • Usage Workflow
  • Contributing
  • License
  • Author

🔥 Overview

LocalMind is a free, open-source, self-hosted AI platform built for students, developers, researchers, and creators who want powerful AI capabilities without subscriptions, limits, or privacy concerns.

With LocalMind, you can:

  • Run local LLMs like LLaMA, Mistral, Phi, Gemma – 100% free & offline
  • Connect cloud AI models like Gemini, OpenAI, Groq, RouterAI
  • Train your AI with Excel/CSV files or Q&A datasets
  • Expose your local AI to the world via LocalTunnel or Ngrok
  • Build apps using a developer-friendly API layer
  • Test multiple models using an integrated AI playground

LocalMind gives you freedom, privacy, flexibility, and unlimited usage – all for free.


✨ Features

🧠 AI Model Support

Supports both local and cloud AI engines:

Local Models (via Ollama)

  • LLaMA
  • Mistral
  • Phi
  • Gemma
  • Any Ollama-compatible model

Cloud Models

  • Google Gemini
  • OpenAI GPT
  • Groq
  • RouterAI
  • (More coming soon!)

📚 Train with Your Own Data (RAG)

Teach your AI with your own files:

  • Upload Excel (.xlsx / .csv)
  • Upload Q&A datasets
  • Automatically builds a private vector database
  • Fully local – no external uploads, no cloud storage

Perfect for students, researchers, startups, and internal tools.


🌐 Share Your AI Globally

Expose your local instance so anyone on the internet can use your AI:

  • LocalTunnel
  • Ngrok

Great for demos, clients, teammates, or beta testing.


🤖 Single or Multiple Models

You can run:

  • One model at a time, OR
  • Multiple models (local + cloud) simultaneously

LocalMind handles routing internally.


🔒 Privacy & Security

Your data is yours – always.

  • API keys stay on your device
  • No analytics or tracking
  • Fully open-source
  • No external storage
  • No vendor lock-in

🛠️ Tech Stack

| Layer    | Technology                   |
|----------|------------------------------|
| Frontend | React, TypeScript, Vite      |
| Backend  | Node.js, Express, TypeScript |
| AI Layer | Ollama + Cloud Providers     |

📦 Installation Guide

1. Clone the Repo

git clone https://github.com/your-username/LocalMind.git
cd LocalMind

2. ⚙️ Backend Setup

cd server
npm install
npm run dev   # http://localhost:3000

3. 🎨 Frontend Setup

cd ../client
npm install
npm run dev   # http://localhost:5173

⚙️ Environment Variables

Create a .env file inside server/:

| Variable         | Description                            |
|------------------|----------------------------------------|
| API_KEY          | Your cloud AI key (Gemini/OpenAI/etc.) |
| ENVIRONMENT      | development / production               |
| LOCALMIND_SECRET | JWT/API generator secret               |
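
Example server/.env based on the table above (all values are placeholders – substitute your own keys and secrets):

```env
API_KEY=your-cloud-provider-api-key
ENVIRONMENT=development
LOCALMIND_SECRET=a-long-random-secret-string
```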

⚠️ Do NOT commit .env files to GitHub.


📁 Folder Structure

LocalMind/
│
├── server/
│   ├── src/
│   ├── routes/
│   ├── controllers/
│   └── models/
│
└── client/
    ├── src/
    ├── components/
    ├── pages/
    └── hooks/

🧩 API Documentation

🔐 Auth

| Method | Endpoint              | Description   |
|--------|-----------------------|---------------|
| POST   | /api/v1/user/register | Register user |
| POST   | /api/v1/user/login    | Login         |
| GET    | /api/v1/user/profile  | User profile  |
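
A minimal TypeScript sketch of the auth flow against a backend on http://localhost:3000 (Node 18+, global fetch). The request body fields (name, email, password) and the shape of the login response are assumptions – check the route controllers for the exact schema.

```ts
// auth-example.ts – register and log in (Node 18+, global fetch)
const BASE = "http://localhost:3000/api/v1/user";

async function registerAndLogin() {
  // Field names below are assumptions; verify against the register controller.
  await fetch(`${BASE}/register`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: "Alice", email: "alice@example.com", password: "s3cret-pass" }),
  });

  const loginRes = await fetch(`${BASE}/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email: "alice@example.com", password: "s3cret-pass" }),
  });
  const session = await loginRes.json();
  console.log("login response:", session); // expected to contain a JWT or session token
}

registerAndLogin().catch(console.error);
```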

⚙️ AI Config & Keys

| Method | Endpoint                                  | Description          |
|--------|-------------------------------------------|----------------------|
| POST   | /api/v1/user/local-mind-api-key-generator | Generate API key     |
| GET    | /api/v1/user/local-mind-api-keys          | Fetch keys           |
| GET    | /api/v1/user/ai-config                    | Get AI configuration |
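
A hedged sketch of generating a LocalMind API key once you have a login token. The Bearer-token header and the response shape are assumptions:

```ts
// Assumes a JWT `token` obtained from the login endpoint above.
async function generateApiKey(token: string) {
  const res = await fetch(
    "http://localhost:3000/api/v1/user/local-mind-api-key-generator",
    {
      method: "POST",
      headers: { Authorization: `Bearer ${token}` }, // auth header format is assumed
    },
  );
  console.log("generated key:", await res.json());
}
```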

💬 Chat & Messages

| Method | Endpoint                  | Description        |
|--------|---------------------------|--------------------|
| POST   | /api/v1/chat/send-message | Send message to AI |
| GET    | /api/v1/chat/history      | Get chat history   |
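
A hedged sketch of sending a prompt to the chat endpoint. The body fields (message, model) and the Bearer-token header are illustrative assumptions, not a documented schema:

```ts
async function sendMessage(token: string, message: string) {
  const res = await fetch("http://localhost:3000/api/v1/chat/send-message", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`, // auth header format is assumed
    },
    // Body fields are illustrative: `model` might name a local Ollama model or a cloud model.
    body: JSON.stringify({ message, model: "llama3" }),
  });
  return res.json(); // assumed to contain the AI's reply
}
```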

📚 Upload & Training

| Method | Endpoint               | Description          |
|--------|------------------------|----------------------|
| POST   | /api/v1/upload/excel   | Upload Excel/CSV     |
| POST   | /api/v1/upload/dataSet | Upload Q&A dataset   |
| POST   | /api/v1/train/upload   | Upload training data |
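
A hedged sketch of uploading a CSV for training, using the FormData, Blob, and fetch globals shipped with Node 18+. The multipart field name (file) and the auth header are assumptions:

```ts
import { readFile } from "node:fs/promises";

async function uploadCsv(token: string, path: string) {
  const form = new FormData();
  // The multipart field name ("file") is assumed; check the upload route's middleware.
  form.append("file", new Blob([await readFile(path)], { type: "text/csv" }), "training-data.csv");

  const res = await fetch("http://localhost:3000/api/v1/upload/excel", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` }, // do not set Content-Type; fetch adds the multipart boundary
    body: form,
  });
  console.log("upload result:", await res.json());
}
```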

🌐 Port Exposure

| Method | Endpoint                   | Description                 |
|--------|----------------------------|-----------------------------|
| POST   | /api/v1/expose/localtunnel | Public URL via LocalTunnel  |
| POST   | /api/v1/expose/ngrok       | Public URL via Ngrok        |
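
A hedged sketch of requesting a public URL through the LocalTunnel endpoint; the auth header and the url field in the response are assumptions:

```ts
async function exposeViaLocalTunnel(token: string) {
  const res = await fetch("http://localhost:3000/api/v1/expose/localtunnel", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` }, // auth header format is assumed
  });
  const data = (await res.json()) as { url?: string };
  // The field holding the public URL is assumed to be `url`; inspect the actual response.
  console.log("public URL:", data.url ?? data);
}
```

Ngrok works the same way via the /api/v1/expose/ngrok endpoint.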

🚀 Usage Workflow

  1. Start the backend and frontend
  2. Register or log in
  3. Connect a cloud AI provider or select a local model
  4. Upload files (optional)
  5. Chat with your AI
  6. Expose your AI globally (optional)

🤝 Contributing

We welcome all contributions!

Fork → Create Branch → Commit → Push → Pull Request

Use meaningful commit messages & follow TypeScript conventions.


🪪 License

Licensed under the MIT License.


👤 Author

NexGenStudioDev

🚀 LocalMind – Free, Private, Limitless AI for Everyone.
