LocalDom turns your local LLM engines into secure, authenticated API services. It allows you to generate professional API credentials for your local AI (Ollama, LM Studio, etc.), making it seamless to use your private models anywhere—from mobile apps to external web services—with End-to-End Encryption (E2EE) and Persistent Memory.
LocalDom uses a "Blind Relay" pattern. The relay server routes traffic without seeing your prompts, which are encrypted locally by your agent.
```mermaid
graph LR
    UserApp[External App] -- Encrypted Prompt --> Relay[LocalDom Relay]
    Relay -- Encrypted Payload --> Agent[LocalDom Agent]
    Agent -- Decrypts & Proxies --> LocalLLM[Ollama / LM Studio]
    LocalLLM -- Response --> Agent
    Agent -- Encrypts Response --> Relay
    Relay -- Encrypted Result --> UserApp
```
Build the dashboard and start the relay on port 9090:

```bash
npm start
```

Note: Copy the `ld_...` API key printed in your terminal.
Start the discovery agent on your local machine:

```bash
npm run agent
```

LocalDom now manages conversation context locally. Just provide the `X-LocalDom-Session` header, and the agent will automatically "stitch" your conversation history into the prompt.
- Privacy: History is stored on your machine, never in the cloud.
- Persistence: Memories survive restarts in `agent/memory.json`.
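Conceptually, the stitching works like this: the agent keys stored history by session ID and prepends it to each new request before forwarding to the model. The sketch below is a hypothetical in-memory version; the real agent persists to `agent/memory.json`, and the function names here are illustrative only.

```javascript
// Hypothetical sketch of per-session history stitching.
const memory = {}; // the real agent persists this to agent/memory.json

function stitch(sessionId, newMessages) {
  const history = memory[sessionId] || [];
  const full = [...history, ...newMessages];
  memory[sessionId] = full; // remember for the next turn
  return full; // what actually gets forwarded to the LLM
}

stitch('chat-123', [{ role: 'user', content: 'My name is Ada.' }]);
const turn2 = stitch('chat-123', [{ role: 'user', content: 'What is my name?' }]);
console.log(turn2.length); // 2 — both turns are sent to the model
```

The model therefore sees the full conversation even though the caller only sent the latest message.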
Tired of stale lists? The dashboard now features a Live Rescan button. Trigger a system-wide search for new LLM runners without restarting your agent.
- Rate Limiting: Protects your machine from DDoS and brute-force key guessing.
- Header Sanitization: Automatically strips sensitive metadata (`Cookie`, `Host`) before proxying.
- AES-256-GCM: Industry-standard encryption for all tunneled traffic.
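Header sanitization can be as simple as a blocklist applied before the proxy forwards a request. The sketch below is a minimal version under that assumption; the real middleware's blocklist and hook point may differ.

```javascript
// Minimal header-sanitization sketch: drop sensitive headers before proxying.
// Blocklist contents here are an assumption based on the headers named above.
const BLOCKED = new Set(['cookie', 'host']);

function sanitizeHeaders(headers) {
  return Object.fromEntries(
    Object.entries(headers).filter(([name]) => !BLOCKED.has(name.toLowerCase()))
  );
}

const clean = sanitizeHeaders({
  'Content-Type': 'application/json',
  Cookie: 'session=abc',
  Host: 'relay.example.com',
});
console.log(clean); // { 'Content-Type': 'application/json' }
```

Matching on the lowercased name matters because HTTP header names are case-insensitive, so `COOKIE` must be stripped just like `Cookie`.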
LocalDom is designed to be a drop-in replacement for any OpenAI-compatible client.
```javascript
fetch('http://localhost:9090/api/ollama/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-LocalDom-Key': 'your_ld_key',
    'X-LocalDom-Session': 'chat-123' // Optional memory session
  },
  body: JSON.stringify({
    model: 'llama3',
    messages: [{ role: 'user', content: 'Tell me a joke.' }]
  })
}).then(res => res.json()).then(console.log);
```

```bash
curl http://localhost:9090/api/ollama/v1/chat/completions \
  -H "X-LocalDom-Key: your_ld_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Ping!"}]
  }'
```

- API Reference — Headers, Routes, and Status codes.
- Architecture — Technical deep-dive.
- Security — How our E2EE works.
- User Guide — Multi-agent setup.
This project is licensed under the MIT License.
While the license is permissive, the creator of LocalDom maintains a strict Non-Commercial Philosophy:
- 100% Free Use: This software is intended to be free for everyone.
- No Reselling: You may not reverse-engineer or repackage this project in order to sell it for profit.
- Community First: Keep the gateway open and private.
MIT © 2026 LocalDom Team