Give your AI a brain, not just hands.
Persistent memory and grounded knowledge for any LLM.
Works with Claude, GPT, Codex, Ollama, and OpenAI-compatible endpoints.
Quick Start • What It Does • Bring Your Own AI • Dream Engine • Safe by Design • Enterprise
Your AI forgets everything between sessions. It hallucinates about your domain. And if you give it the ability to act, it might get your accounts banned.
Solari fixes all three.
Solari is persistent memory and grounded knowledge for any AI you already use. It's not a replacement for Claude, GPT, or Codex — it's what makes them actually useful.
```bash
# Install from PyPI
pip install solari-ai

# Or from source
git clone https://github.com/SolariResearch/solari.git
cd solari && pip install -e .
```

```bash
# Feed it your knowledge
solari ingest --pdf research_paper.pdf --mind physics
solari ingest --url "https://docs.your-project.com" --mind my_project
solari ingest --youtube "https://youtube.com/watch?v=..." --mind lectures
```
```bash
# Ask questions grounded in YOUR knowledge
solari query "explain the key findings" --minds physics
```

Your AI now has expert-level knowledge in whatever you fed it. It persists across sessions. Responses are grounded in your verified data instead of the model's training set. Works with any LLM that has an API.
Try it right now with the included starter minds (1,767 entries across programming, biology, and physics):
```bash
solari query "how does a hash table handle collisions" --minds-dir starter-minds
solari query "how does the immune system fight infection" --minds-dir starter-minds
solari query "what is entropy in thermodynamics" --minds-dir starter-minds
```

Solari is not another AI provider. It extends the one you already pay for.
```bash
# Works with your Claude subscription
solari agent --provider anthropic --model claude-sonnet-4-20250514

# Works with OpenAI / Codex
solari agent --provider openai --model gpt-4o

# Works with local models (Ollama)
solari agent --provider ollama --model qwen2.5:7b

# Works with any OpenAI-compatible endpoint
solari agent --provider custom --base-url http://localhost:8080/v1
```

Set your API key once:
```bash
export ANTHROPIC_API_KEY="your-key-here"
# or
export OPENAI_API_KEY="your-key-here"
```

Solari handles the memory. Your provider handles the intelligence. Together they're unstoppable.
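Any server that speaks the standard chat-completions request shape counts as "OpenAI-compatible". As a sketch of what that shape looks like (the payload format is the standard one; how Solari builds its requests internally is an assumption):

```python
import json

def build_chat_request(model, user_message, system=None):
    """Build the standard OpenAI-style payload that a compatible endpoint
    (Ollama, vLLM, llama.cpp server, ...) accepts at POST {base_url}/chat/completions."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "qwen2.5:7b",
    "explain entropy",
    system="Answer only from the provided Solari context.",
)
print(json.dumps(payload, indent=2))
```

Because every supported provider accepts this shape (or a close variant), swapping providers is a flag change, not a rewrite.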
```bash
pip install solari-ai
```

Or from source:

```bash
git clone https://github.com/SolariResearch/solari.git
cd solari && pip install -e .
```

```bash
# Ingest a Wikipedia article
solari ingest --wikipedia "Machine learning" --mind ml

# Or a PDF
solari ingest --pdf paper.pdf --mind research

# Or a YouTube lecture
solari ingest --youtube "https://youtube.com/watch?v=..." --mind lectures
```

```bash
solari query "how do neural networks learn" --mind ml
```

Grounded, sourced answers from the knowledge you ingested. No hallucination.
```bash
solari dream --minds ml,research --cycles 3
```

Watch Solari find connections between your knowledge domains that you didn't know existed.
| Source | Command |
|---|---|
| Web page | `solari ingest --url URL --mind NAME` |
| PDF | `solari ingest --pdf PATH --mind NAME` |
| YouTube | `solari ingest --youtube URL --mind NAME` |
| arXiv paper | `solari ingest --arxiv 2301.00001 --mind NAME` |
| Wikipedia | `solari ingest --wikipedia "Topic" --mind NAME` |
| Local file | `solari ingest --file notes.txt --mind NAME` |
| Batch URLs | `solari ingest --batch urls.txt --mind NAME` |
| Directory | `solari ingest --dir ./docs/ --mind NAME` |
Each "mind" is a local vector index. Stack them. Query across them. They persist forever.
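To make "vector index" concrete, here is a toy retrieval sketch in plain Python (Solari's real minds use FAISS and learned embeddings; the vectors and scoring below are illustrative only):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def top_k(mind, query_vec, k=3):
    """Rank a mind's entries by similarity to the query embedding."""
    scores = sorted(((cosine(vec, query_vec), i) for i, vec in enumerate(mind)), reverse=True)
    return [i for _, i in scores[:k]]

# Toy "mind": four entries in a 3-d embedding space.
mind = [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
hits = top_k(mind, [1.0, 0.05, 0.0], k=2)
```

Querying multiple minds is just running this ranking over several indices and merging the results.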
```bash
# Search all minds
solari query "your question"

# Search specific minds
solari query "your question" --minds physics,chemistry

# JSON output for pipelines
solari query "your question" --json --top 10

# List available minds
solari minds
```

Grounding your LLM with relevant knowledge consistently improves answer quality on domain-specific questions compared to ungrounded responses.
Minds stay sharp automatically. When you ingest better information on a topic, outdated entries get replaced.
```bash
# Ingest with confidence — replaces similar entries with lower confidence
solari ingest --url "https://new-research.com" --mind physics --confidence 0.9

# See mind health
solari minds

# Manual cleanup: remove low-quality entries
solari prune --mind physics --below 0.3 --dry-run
solari prune --mind physics --below 0.3
```

No stale knowledge accumulating forever. Higher-confidence entries automatically supersede weaker ones on the same topic (cosine similarity > 0.85). Your minds get better over time, not just bigger.
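Assuming unit-normalized embeddings (so cosine similarity reduces to a dot product), the supersede rule can be sketched in a few lines (entry structure and names are hypothetical; Solari's real implementation may differ):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def supersede(entries, new_entry, threshold=0.85):
    """Drop existing entries on the same topic (similarity > threshold)
    that have LOWER confidence than the incoming entry, then append it."""
    kept = [e for e in entries
            if dot(e["vec"], new_entry["vec"]) <= threshold
            or e["confidence"] >= new_entry["confidence"]]
    return kept + [new_entry]

old   = {"vec": [1.0, 0.0], "confidence": 0.4}  # same topic, weaker
other = {"vec": [0.0, 1.0], "confidence": 0.4}  # unrelated topic, untouched
new   = {"vec": [1.0, 0.0], "confidence": 0.9}
result = supersede([old, other], new)
```

The unrelated entry survives; only the weaker entry on the same topic is replaced.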
This is what nobody else has.
The Dream Engine takes separate knowledge bases and finds cross-domain connections that no single expert would see.
```bash
solari dream --minds physics,economics,biology --cycles 5
```

How it works:
- NREM phase — probes pairs of knowledge bases with shared questions, finds hidden structural bridges
- REM phase — feeds bridges into your LLM to generate novel hypotheses
- Parliament mode — expert viewpoints debate, dissent is measured, synthesis emerges
The Dream Engine can surface surprising connections — like structural parallels between immune system antibody selection and genetic algorithms, or thermodynamic entropy and information theory.
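A toy sketch of the NREM bridging step under the same unit-vector assumption (an illustration of the idea, not Solari's actual algorithm): probe two minds with shared questions and keep the probes that retrieve strongly from both.

```python
def best_score(mind, query_vec):
    """Strongest retrieval score for a probe in one mind (dot product, unit vectors)."""
    return max(sum(x * y for x, y in zip(vec, query_vec)) for vec in mind)

def find_bridges(mind_a, mind_b, probes, floor=0.7):
    """A probe 'bridges' two domains when both minds answer it strongly --
    those probes are what the REM phase hands to the LLM for hypothesis generation."""
    return [p for p in probes
            if best_score(mind_a, p) > floor and best_score(mind_b, p) > floor]

physics = [[1.0, 0.0], [0.8, 0.6]]
biology = [[0.9, 0.436], [0.0, 1.0]]
bridges = find_bridges(physics, biology, probes=[[1.0, 0.0], [0.0, 1.0]])
```

Only the probe that both toy minds retrieve strongly survives as a bridge candidate.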
A production implementation of Global Workspace Theory for building cognitive agents.
```python
from solari.workspace import GlobalWorkspace, Processor, WorkspaceItem

class ThreatDetector(Processor):
    name = "threat"

    def bid(self, context):
        return [WorkspaceItem(
            source=self.name,
            content="Anomalous login pattern detected",
            item_type="threat",
            urgency=0.9,
            novelty=0.8,
        )]

gw = GlobalWorkspace(capacity=7)
gw.register_processor(ThreatDetector())
result = gw.tick()
```

Attention competition, coherence scoring, narrative threading, meta-cognition, phenomenal state. A production-grade cognitive architecture for building intelligent agents.
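A rough picture of the attention competition inside `tick()` (a deliberate simplification; this scoring is an assumption, and the real workspace also weighs factors like coherence):

```python
def admit(bids, capacity=7):
    """Rank competing items by a simple salience score (urgency * novelty)
    and admit at most `capacity` of them into the workspace."""
    ranked = sorted(bids, key=lambda b: b["urgency"] * b["novelty"], reverse=True)
    return ranked[:capacity]

bids = [
    {"source": "threat",  "urgency": 0.9, "novelty": 0.8},
    {"source": "routine", "urgency": 0.2, "novelty": 0.1},
    {"source": "goal",    "urgency": 0.6, "novelty": 0.5},
]
winners = admit(bids, capacity=2)
```

The capacity limit (7 by default) is what forces processors to compete rather than flood the workspace.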
Other agent tools give AI the ability to act first and understand later. That's how people get their accounts banned and their credentials leaked.
Solari takes a different approach:
- Knowledge first — the agent queries your minds before responding, so it has real context instead of guessing
- No plugin marketplace — no third-party skills to install, no supply chain risk
- Your data stays local — knowledge is stored as files on your machine, not sent to third-party servers
- No background processes — Solari runs when you invoke it, not autonomously
- You control the AI provider — bring your own API key, use your own subscription, switch anytime
Solari doesn't execute system commands or automate workflows (yet). It makes your AI smarter about your domain — that's the foundation everything else should be built on.
```
┌──────────────────┐
│  solari ingest   │ ← PDFs, URLs, YouTube, arXiv
└────────┬─────────┘
         │
┌────────▼─────────┐
│      Minds       │ ← Vector indices, <50ms lookup
│ (local storage)  │
└──┬──────────┬────┘
   │          │
┌────────▼──┐  ┌───▼────────┐
│solari query│  │solari dream│
│ (retrieve) │  │(synthesize)│
└─────┬──────┘  └──────┬─────┘
      │                │
┌─────▼────────────────▼─────┐
│      Your AI Provider      │
│   (Claude / GPT / Ollama)  │
└────────────────────────────┘
```
| Spec | Detail |
|---|---|
| Starter minds included | 1,767 entries (programming, biology, physics) |
| Storage per mind | ~4MB per 1,000 entries |
| Providers supported | Claude, GPT, Codex, Ollama, OpenAI-compatible endpoints |
| Dependencies | FAISS, sentence-transformers, numpy, requests, bs4 |
| License | AGPL-3.0 (commercial licensing available) |
| Problem | Solari |
|---|---|
| AI forgets between sessions | Minds persist on disk — knowledge carries over |
| AI hallucinates on your domain | Responses grounded in YOUR verified knowledge |
| Locked into one AI provider | Works with Claude, GPT, Ollama — bring your own key |
| RAG needs infrastructure | No Docker, no database, no server — just pip install |
| Knowledge stays siloed | Dream Engine finds cross-domain connections |
See the `examples/` directory:
- `quickstart.py` — Ingest and query in 30 lines
- `dream_demo.py` — Cross-domain synthesis in action
- `workspace_demo.py` — Build a cognitive architecture
Solari is built by Solari Systems, extracted from a production autonomous intelligence system with 15 months of R&D behind it.
These are real tools that run in production every day — not prototypes, not demos. Each module works independently. Together they form something greater.
If Solari saves you time, consider supporting development:
For enterprise deployments with managed hosting, team features, and priority support, visit solarisystems.net.
AGPL-3.0 — Free for open-source use. Commercial licensing available.
Built by Solari Systems
Give your AI a brain, not just hands.
