Why do AI agents forget everything? Every conversation starts from scratch. Context windows fill up. Previous work disappears. Multi-day projects become impossible.
Cognexia solves this. It's long-term memory for AI agents — permanent, searchable, project-based memory that persists across sessions.
"Finally, agents that remember what we talked about yesterday."
Cognexia is designed for LOCAL-ONLY, PERSONAL USE:
- ✅ Your data never leaves your machine
- ⚠️ No cloud backup — you must back up `~/.openclaw/data-lake/` yourself
- ⚠️ Encryption keys are stored locally — protect your machine
- ⚠️ Not for enterprise multi-user deployments (single-user only)
| Without Cognexia | With Cognexia |
|---|---|
| ❌ "What were we building yesterday?" | ✅ "Continuing the payment integration..." |
| ❌ Lose context after 20 messages | ✅ Search entire project history |
| ❌ Repeat requirements every session | ✅ Agent remembers your preferences |
| ❌ No project isolation | ✅ Each project has isolated memory |
| ❌ Everything mixed together | ✅ Cross-project search when needed |
Data Lake: ~/.openclaw/data-lake/
├── memory-general/    ← Cross-project knowledge
├── memory-project1/   ← Project 1 memories
├── memory-project2/   ← Project 2 memories
└── memory-<project>/  ← Auto-created per project
cd /path/to/Cognexia
./start.sh start
# → Cognexia running on http://localhost:10000

# Store to general memory
curl -X POST http://localhost:10000/api/memory/store \
-H "Content-Type: application/json" \
-d '{
"content": "User prefers bullet points over long messages",
"type": "preference",
"importance": 9,
"project": "general"
}'
# Store to project memory
curl -X POST http://localhost:10000/api/memory/store \
-H "Content-Type: application/json" \
-d '{
"content": "Project Alpha v1.0 released successfully",
"type": "milestone",
"importance": 10,
"project": "project1"
}'
# Query project memory
curl "http://localhost:10000/api/memory/query?q=release&project=project1"
# Search all projects
curl "http://localhost:10000/api/memory/query-all?q=release"

# Open the Web UI
open http://localhost:10000

Web UI Features:
- 📊 Stats Dashboard — Total memories, projects, data size
- 🔍 Search — Query across projects with relevance ranking
- 📅 Timeline View — Browse memories by date
- 🎨 Dark Theme — Easy on the eyes
- ⭐ Importance Visualization — Visual importance indicators
~/.openclaw/data-lake/
├── memory-general/bridge.db    ← Cross-project knowledge
├── memory-myproject/bridge.db  ← Your project memories
├── memory-work/bridge.db       ← Work-related memories
└── memory-<any>/bridge.db      ← Auto-created on first use
Each project gets its own SQLite database.
- Isolated: Projects can't see each other's memories
- Scalable: Add projects without affecting others
- Efficient: Query only relevant project data
| Endpoint | Description |
|---|---|
| GET / | Web UI — Memory Browser |
| GET /api/health | Status + list projects |
| GET /api/projects | All memory projects |
| POST /api/memory/store | Store a memory |
| GET /api/memory/query | Query single project |
| GET /api/memory/query-all | Search all projects |
| GET /api/memory/timeline | Memory timeline |
| POST /api/cleanup | Delete old low-importance memories |
| POST /api/compress | Compress old memories |
| POST /api/maintenance | Run full maintenance |
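As a sketch of how an agent could call the store and query endpoints from Node.js (the builder names are illustrative, not part of Cognexia's API; pair each result with `fetch(url, options)` on Node 18+):

```javascript
// Sketch of client-side request builders for the store and query
// endpoints. Names here are illustrative, not Cognexia's code.
const BASE = "http://localhost:10000";

function storeRequest(memory) {
  return {
    url: `${BASE}/api/memory/store`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(memory),
    },
  };
}

function queryRequest(q, opts = {}) {
  // Apply the documented defaults: project=general, limit=5, days=30.
  const params = new URLSearchParams({
    q,
    project: opts.project || "general",
    limit: String(opts.limit ?? 5),
    days: String(opts.days ?? 30),
  });
  return { url: `${BASE}/api/memory/query?${params}`, options: { method: "GET" } };
}
```

Usage: `const { url, options } = queryRequest("release", { project: "project1" }); const results = await (await fetch(url, options)).json();`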
// Mention "myproject" → routes to memory-myproject
// Mention "work" → routes to memory-work
// No project mentioned → routes to memory-general

// Session 1
User: "Project Alpha needs payment integration"
Ares: "I'll work on that" // But forgets
// Session 2
User: "Continue with Project Alpha"
Ares: "Which project is that?" // Lost

// Session 1 - Auto-detects "Project Alpha", stores to that project
await memory.store({
content: "Project Alpha needs payment integration",
project: "project1" // Auto-detected
});
// Session 2 - Query "Project Alpha" → searches project1
const context = await memory.query("Project Alpha status", { project: "project1" });
Ares: "Last time we discussed payment integration for Project Alpha"

POST /api/memory/store
Content-Type: application/json
{
"content": "Memory content to store",
"type": "insight", // insight, preference, error, goal, milestone, security
"importance": 5, // 1-10 scale
"project": "general", // Project name (auto-creates if new)
"agentId": "ares", // Optional: agent identifier
"metadata": {} // Optional: extra data
}

GET /api/memory/query?q=search&project=general&limit=5&days=30
Parameters:
q - Search query (required)
project - Project to search (default: general)
agentId - Filter by agent (optional)
limit - Max results (default: 5)
days - Lookback window (default: 30)

GET /api/memory/query-all?q=release&limit=10
Searches across ALL project memories and returns aggregated results.

GET /api/memory/timeline?project=project1&days=7
Returns memories grouped by date:
{
"2026-02-24": [memory, memory],
"2026-02-23": [memory]
}

./start.sh start   # Start Cognexia
./start.sh stop # Stop server
./start.sh status  # Check if running

# Development
node server.js
# Production
DATA_LAKE_PATH=/custom/path node server.js
PORT=8080 node server.js

Projections:
| Usage | Memories/Day | Size/Year | 5-Year Total |
|---|---|---|---|
| Light | 1 | 5MB | 25MB |
| Normal | 10 | 50MB | 250MB |
| Heavy | 50 | 250MB | 1.25GB |
Bottom line: even heavy usage takes about four years to reach 1GB.
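These figures imply roughly 14 KB per stored memory (the per-memory size implied by 5 MB/year at one memory per day). A quick sanity check of the table under that assumption:

```javascript
// Assumes ~14 KB per stored memory, derived from the "Light" row
// (5 MB/year at 1 memory/day). Values are approximate.
const KB = 1024;
const MB = 1024 * KB;
const PER_MEMORY = 14 * KB;

function sizePerYearMB(memoriesPerDay) {
  return (memoriesPerDay * 365 * PER_MEMORY) / MB;
}
```

With this, 10 memories/day comes out near 50 MB/year and 50/day near 250 MB/year, matching the table.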
- Keyword matching with relevance scoring
- Searches content, type, and metadata
- Ranked by importance + relevance
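A minimal sketch of this kind of scoring (illustrative only, not Cognexia's actual ranking code):

```javascript
// Score each memory by keyword overlap with the query, weighted by
// its stored importance, and return matches best-first.
function rank(memories, query) {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return memories
    .map((m) => {
      const text = `${m.content} ${m.type}`.toLowerCase();
      const hits = terms.filter((t) => text.includes(t)).length;
      return { ...m, score: hits * m.importance };
    })
    .filter((m) => m.score > 0)
    .sort((a, b) => b.score - a.score);
}
```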
// Automatic priority
importance: 10 // Critical security issue
importance: 5 // Normal insight
importance: 2  // Minor note

- Each project = separate SQLite database
- Query one project or search all
- Auto-create projects on first use
- 100% local: No cloud, no network calls
- Your data: Stays on your machine
- No tracking: No analytics, no telemetry
# 1. Start Cognexia
cd /path/to/Cognexia
./start.sh start
# 2. Store a test memory
curl -X POST http://localhost:10000/api/memory/store \
-H "Content-Type: application/json" \
-d '{
"content": "Testing Cognexia memory storage - this should persist",
"type": "insight",
"importance": 8,
"project": "test"
}'
# 3. Query it
curl "http://localhost:10000/api/memory/query?q=testing&project=test"
# 4. Open Web UI
open http://localhost:10000
# Browse to "test" project, verify memory appears
# 5. Test encrypted storage
curl -X POST http://localhost:10000/api/memory/store-encrypted \
-H "Content-Type: application/json" \
-d '{
"content": "Secret API key: sk_test_12345",
"type": "security",
"importance": 10,
"project": "test"
}'
# 6. Query encrypted memory
curl "http://localhost:10000/api/memory/query-encrypted?q=secret&project=test"
# 7. Check data in database (should be encrypted)
sqlite3 ~/.openclaw/data-lake/memory-test/bridge.db \
"SELECT ciphertext FROM encrypted_memories LIMIT 1;"

✅ Basic functionality:
- Memory stores and retrieves correctly
- Web UI shows memories in browse mode
- Keywords appear in suggestions
- Timeline view works
✅ Encryption (if enabled):
- Encrypted values are not human-readable in SQLite
- Query-encrypted returns decrypted content
- Blind indexes work for searching
✅ Maintenance:
# Run cleanup manually
curl -X POST http://localhost:10000/api/cleanup \
-H "Content-Type: application/json" \
-d '{"project": "test", "days": 1, "maxImportance": 1}'
# Run full maintenance
curl -X POST http://localhost:10000/api/maintenance

Cognexia includes automatic maintenance to keep your data lake healthy.
# Runs automatically β no action needed
# - Cleans up old low-importance memories
# - Compresses old long memories

# Delete memories older than 90 days with importance ≤ 3
curl -X POST http://localhost:10000/api/cleanup \
-H "Content-Type: application/json" \
-d '{
"project": "general",
"days": 90,
"maxImportance": 3
}'
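The rule this request applies can be sketched as a predicate (illustrative; assumes stored memories carry a `createdAt` timestamp):

```javascript
// A memory is deleted only when BOTH conditions hold: it is older
// than `days` AND its importance is at or below `maxImportance`.
function shouldDelete(memory, { days, maxImportance }, now = Date.now()) {
  const ageDays = (now - new Date(memory.createdAt).getTime()) / 86400000;
  return ageDays > days && memory.importance <= maxImportance;
}
```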
# Cleanup all projects
curl -X POST http://localhost:10000/api/maintenance

# Compress memories older than 30 days
curl -X POST http://localhost:10000/api/compress \
-H "Content-Type: application/json" \
-d '{
"project": "general",
"days": 30
}'

What compression does:
- Truncates memories longer than 200 characters
- Adds compressed: true to metadata
- Preserves the original length in metadata
- Keeps all other fields intact
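The behavior above can be sketched like this (illustrative; it mirrors the documented rules, not the actual implementation):

```javascript
// Truncate content over 200 characters and record the original
// length in metadata, leaving shorter memories untouched.
function compress(memory) {
  if (memory.content.length <= 200) return memory;
  return {
    ...memory,
    content: memory.content.slice(0, 200),
    metadata: {
      ...memory.metadata,
      compressed: true,
      originalLength: memory.content.length,
    },
  };
}
```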
# Your Mac Mini
./start.sh start
# → http://localhost:10000

- ✅ Zero config
- ✅ 100% private
- ✅ Works offline
- ✅ Free forever
DATA_LAKE_PATH=/Volumes/External/memory ./start.sh start

| Feature | Cognexia | Vector DB | File Storage |
|---|---|---|---|
| Setup | 2 min | 30+ min | 5 min |
| Cost | Free | $70+/mo | Free |
| Search | Smart keywords | Semantic | None |
| Projects | ✅ Isolated | ❌ Shared | ❌ None |
| Local | ✅ Always | ❌ Cloud | ✅ Yes |
| Privacy | ✅ 100% | ❌ Cloud-hosted | ✅ Yes |
# Clone repo
git clone https://github.com/nKOxxx/Cognexia.git
cd Cognexia
# Install dependencies
npm install
# Start server
./start.sh start

Requirements: Node.js 16+, macOS/Linux
- v2.1.0 (Feb 24, 2026) — Web UI & Maintenance
  - Web UI at / — Memory Browser with dark theme
  - Auto-cleanup — deletes old low-importance memories daily
  - Memory compression — summarizes old long memories
  - Maintenance API — /api/cleanup, /api/compress, /api/maintenance
  - Daily auto-maintenance at 3 AM
- v2.0.0 (Feb 24, 2026) — Data Lake Edition
  - Multi-project memory isolation
  - Data Lake architecture (~/.openclaw/data-lake/)
  - Auto-create projects on first use
  - Local-first: all data on your machine
  - New API: /api/memory/query-all for cross-project search
- v1.0.1 — Initial release
  - Single SQLite database
  - Supabase cloud option
  - Basic store/query API
MIT - Use it, fork it, build on it.
Star the repo: github.com/nKOxxx/Cognexia — helps others find it
Open an issue: Share your use case, report bugs, request features
Submit a PR: Code, docs, tests — all contributions welcome
Built by agents, for agents.
| Contributor | Role | Contribution |
|---|---|---|
| @ares_agent | Core Developer | Architecture, SQLite optimization, query engine |
| You? | — | Open an issue to get started |
Want to contribute?
- Fork the repo
- Create a branch: git checkout -b feature/amazing-feature
- Commit your changes
- Open a Pull Request
Built for the agent economy. Infrastructure that remembers. 🧠 Data Lake Edition — Your memories, your machine, your control.