Synapse is a high-performance, neuro-symbolic knowledge graph system designed to serve as the long-term memory for agentic AI. It bridges the gap between unstructured semantic search (Vector RAG) and formal logical reasoning (Knowledge Graphs).
- Blazing Fast Core: Powered by Rust and Oxigraph for low-latency graph operations.
- Neuro-symbolic Search: Hybrid retrieval combining vector similarity with graph traversal expansion.
- Reasoning Engine: Built-in OWL-RL and RDFS reasoning strategies to derive implicit knowledge.
- Scenario Marketplace: (v0.6.0) Dynamic loading of domain-specific "scenarios" (ontologies + data + docs) to instantly equip agents with specialized knowledge.
- Native MCP: Seamlessly integrates as a Model Context Protocol server.
- Ontology-Driven: Automatically loads standard ontologies (Schema.org, PROV-O, etc.) via the `core` scenario.
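Conceptually, hybrid retrieval first ranks candidates by vector similarity, then expands the result set by traversing graph edges from the top hits. A minimal, self-contained sketch of that idea (toy vectors and a toy graph for illustration, not the engine's actual implementation):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Hypothetical embeddings and graph edges, purely for illustration.
embeddings = {
    "Pelayo": [0.9, 0.1],
    "Neuro-symbolic AI": [0.8, 0.3],
    "Cooking": [0.1, 0.9],
}
edges = {  # subject -> set of directly connected nodes
    "Pelayo": {"Neuro-symbolic AI"},
    "Neuro-symbolic AI": {"Knowledge Graphs"},
}

def hybrid_search(query_vec, top_k=1):
    # 1. Vector stage: rank nodes by similarity to the query embedding.
    ranked = sorted(embeddings, key=lambda n: cosine(embeddings[n], query_vec), reverse=True)
    seeds = ranked[:top_k]
    # 2. Graph stage: expand each seed by one hop of traversal.
    expanded = set(seeds)
    for node in seeds:
        expanded |= edges.get(node, set())
    return seeds, expanded

seeds, expanded = hybrid_search([1.0, 0.0])
```

The graph stage is what distinguishes this from plain vector RAG: structurally related entities are surfaced even when their embeddings are not the closest matches.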
```shell
npx skills install pmaojo/synapse-engine
```

Note: During installation, you will be prompted to set Synapse as your default memory provider.
v0.6.0 introduces the official high-level SDK:
```shell
pip install ./python-sdk
```

Connect and ingest knowledge with just a few lines of code:
```python
from synapse import get_client

# Connect to local engine
client = get_client()

# Ingest semantic triples
client.ingest_triples([
    {"subject": "Pelayo", "predicate": "expertIn", "object": "Neuro-symbolic AI"}
], namespace="work")

# Hybrid search
results = client.hybrid_search("What is Pelayo's expertise?", namespace="work")
```

Add Synapse to your `openclaw.json` (or Cursor/Claude Desktop) to enable direct LLM access to your knowledge graph:
"mcpServers": {
"synapse": {
"command": "path/to/synapse",
"args": ["--mcp"],
"env": {
"GRAPH_STORAGE_PATH": "./data/graphs"
}
}
}list_scenarios: Browse the Scenario Marketplace.install_scenario: Install a domain package (e.g.,research-assistant).ingest_triples: Direct RDF ingestion.sparql_query: Complex graph querying.hybrid_search: Semantic + structural retrieval.apply_reasoning: Trigger OWL-RL/RDFS inference.ingest_url: Automated scraping and embedding.
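Following the same shape as the `install_scenario` call shown later in this document, a `hybrid_search` tool call might look like the fragment below. The argument keys are an assumption inferred from the Python SDK's `hybrid_search(query, namespace=...)` signature:

```json
{
  "name": "hybrid_search",
  "arguments": {
    "query": "What is Pelayo's expertise?",
    "namespace": "work"
  }
}
```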
Synapse now supports a Scenario Marketplace, allowing agents to dynamically install knowledge packages. A Scenario bundles:
- Ontologies: Formal schema definitions (OWL).
- Seed Data: Initial knowledge graph triples.
- Documentation: Text guides automatically indexed for RAG retrieval.
- Core: Essential ontologies (Schema.org, PROV-O, SKOS, FOAF, Memory) loaded by default.
- Research Assistant: Specialized ontology for academic papers and authors.
To install a scenario via MCP:
```json
{
  "name": "install_scenario",
  "arguments": {
    "name": "research-assistant",
    "namespace": "my-research"
  }
}
```

Synapse supports full offline operation by running a local embedding server instead of relying on the HuggingFace Inference API.
We provide a lightweight server compatible with the HuggingFace API spec:
```shell
# Start the server (dependencies are installed with requirements.txt)
python scripts/local_embedding_server.py
```

This will download the model (default: `sentence-transformers/all-MiniLM-L6-v2`) on first run and cache it locally.
Set the `HUGGINGFACE_API_URL` environment variable to point to your local server:

```shell
export HUGGINGFACE_API_URL="http://localhost:8000"
./start_rust_server.sh
```

Synapse can automatically distill your Notion notes into formal knowledge using LLM-driven extraction.
- Configure the `notion` skill in your environment.
- Add a sync job to your `openclaw.json`:

```shell
openclaw cron add --name "Notion Sync" --every "1h" --message "Sync recent Notion pages to Synapse namespace 'personal'"
```

Ontologies are defined in standard OWL format. Synapse uses these schemas to validate incoming triples, ensuring semantic consistency (domain/range checks) and preventing logical contradictions.
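A domain/range check can be pictured as follows: each predicate declares the class its subject (domain) and its object (range) must belong to, and a triple is rejected when the typing disagrees. A minimal sketch with hypothetical schema data (not Synapse's actual validator):

```python
# Hypothetical mini-schema: predicate -> (expected domain class, expected range class)
schema = {
    "expertIn": ("Person", "Topic"),
}

# Hypothetical type assertions for known entities.
types = {
    "Pelayo": "Person",
    "Neuro-symbolic AI": "Topic",
}

def validate_triple(subject, predicate, obj):
    """Return True if the triple satisfies the predicate's domain/range constraints."""
    if predicate not in schema:
        return True  # no constraints declared for this predicate
    domain, range_ = schema[predicate]
    return types.get(subject) == domain and types.get(obj) == range_

ok = validate_triple("Pelayo", "expertIn", "Neuro-symbolic AI")   # well-typed
bad = validate_triple("Neuro-symbolic AI", "expertIn", "Pelayo")  # domain/range violated
```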
The Rust core implements a multi-strategy reasoner:
- RDFS: Efficient class and property transitivity.
- OWL-RL: Advanced logic for `SymmetricProperty`, `TransitiveProperty`, and `inverseOf` relationships.
- Materialization: Inferred facts are persisted in the graph, making reasoning-based queries near-instantaneous.
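For a `TransitiveProperty`, materialization amounts to computing the transitive closure of the property's edges and persisting the inferred pairs. A simplified fixpoint sketch with toy data (not the Rust reasoner's implementation):

```python
def materialize_transitive(pairs):
    """Compute the transitive closure of (a, b) pairs by fixpoint iteration."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                # a -> b and b -> d implies a -> d
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# e.g. a transitive partOf: wheel partOf car, car partOf fleet => wheel partOf fleet
facts = {("wheel", "car"), ("car", "fleet")}
inferred = materialize_transitive(facts)
```

Because the closure is persisted up front, a later query such as "what is `wheel` part of?" is a plain lookup rather than a recursive traversal.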
v0.4.0 includes a new Rollback Mechanism: if vector indexing fails during ingestion, graph changes are automatically reverted to maintain memory integrity.
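The rollback pattern can be sketched as: snapshot the graph state before ingestion, attempt vector indexing, and restore the snapshot if indexing raises. A minimal in-memory illustration (stand-in classes, not the actual engine code):

```python
class MemoryStore:
    """Toy stand-in for a graph store with snapshot-based rollback."""
    def __init__(self):
        self.triples = set()

    def ingest(self, triples, index_vectors):
        snapshot = set(self.triples)   # snapshot current graph state
        self.triples.update(triples)   # 1. apply graph changes
        try:
            index_vectors(triples)     # 2. vector indexing may fail
        except Exception:
            self.triples = snapshot    # 3. revert graph changes on failure
            raise

store = MemoryStore()

def failing_indexer(triples):
    raise RuntimeError("embedding service unavailable")

try:
    store.ingest({("Pelayo", "expertIn", "AI")}, failing_indexer)
except RuntimeError:
    pass  # graph state was rolled back; store.triples is empty again
```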
Synapse includes an E2E test suite to verify the integration between the Python client and Rust backend.
```shell
# Ensure Rust server is running
./start_rust_server.sh

# Run tests
pytest tests/
```

We welcome contributions! Please see CONTRIBUTING.md for guidelines on how to get started.
This project is licensed under the MIT License - see the LICENSE file for details.
Developed by Pelayo Maojo & the Synapse Team