
Synapse 🧠⛓️

Synapse is a high-performance, neuro-symbolic knowledge graph system designed to serve as the long-term memory for agentic AI. It bridges the gap between unstructured semantic search (Vector RAG) and formal logical reasoning (Knowledge Graphs).

🚀 Key Capabilities

  • Blazing Fast Core: Powered by Rust and Oxigraph for low-latency graph operations.
  • Neuro-symbolic Search: Hybrid retrieval combining vector similarity with graph traversal expansion.
  • Reasoning Engine: Built-in OWL-RL and RDFS reasoning strategies to derive implicit knowledge.
  • Scenario Marketplace: (v0.6.0) Dynamic loading of domain-specific "scenarios" (ontologies + data + docs) to instantly equip agents with specialized knowledge.
  • Native MCP: Seamlessly integrates as a Model Context Protocol server.
  • Ontology-Driven: Automatically loads standard ontologies (Schema.org, PROV-O, etc.) via the core scenario.

📦 Installation & Setup

One-Click for OpenClaw

npx skills install pmaojo/synapse-engine

Note: During installation, you will be prompted to set Synapse as your default memory provider.

Python SDK

v0.6.0 introduces the official high-level SDK:

pip install ./python-sdk

🛠️ Usage

Python SDK (Recommended)

Connect and ingest knowledge with just a few lines of code:

from synapse import get_client

# Connect to local engine
client = get_client()

# Ingest semantic triples
client.ingest_triples([
    {"subject": "Pelayo", "predicate": "expertIn", "object": "Neuro-symbolic AI"}
], namespace="work")

# Hybrid Search
results = client.hybrid_search("What is Pelayo's expertise?", namespace="work")
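Conceptually, hybrid search combines two stages: a vector stage that ranks nodes by embedding similarity, and a symbolic stage that expands the top hits with their graph neighbours. The following is a minimal, self-contained sketch of that idea — all data, vectors, and function names here are illustrative, not the Synapse API or internals:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Toy corpus: node -> embedding (in practice these come from an embedding model)
embeddings = {
    "Pelayo": [1.0, 0.0],
    "Neuro-symbolic AI": [0.9, 0.1],
    "Cooking": [0.0, 1.0],
}
# Toy graph edges: (subject, predicate, object)
edges = [("Pelayo", "expertIn", "Neuro-symbolic AI")]

def hybrid_search(query_vec, k=1):
    # 1. Vector stage: rank nodes by similarity to the query embedding.
    seeds = sorted(embeddings,
                   key=lambda n: cosine(embeddings[n], query_vec),
                   reverse=True)[:k]
    # 2. Symbolic stage: expand each seed with its 1-hop graph neighbours.
    expanded = set(seeds)
    for s, _, o in edges:
        if s in expanded or o in expanded:
            expanded.update({s, o})
    return expanded

print(hybrid_search([1.0, 0.05]))
```

The graph expansion is what lets the query surface "Neuro-symbolic AI" even though only "Pelayo" was the nearest vector match.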

MCP Integration

Add Synapse to your openclaw.json (or Cursor/Claude Desktop) to enable direct LLM access to your knowledge graph:

"mcpServers": {
  "synapse": {
    "command": "path/to/synapse",
    "args": ["--mcp"],
    "env": { 
      "GRAPH_STORAGE_PATH": "./data/graphs" 
    }
  }
}

Available Tools:

  • list_scenarios: Browse the Scenario Marketplace.
  • install_scenario: Install a domain package (e.g., research-assistant).
  • ingest_triples: Direct RDF ingestion.
  • sparql_query: Complex graph querying.
  • hybrid_search: Semantic + structural retrieval.
  • apply_reasoning: Trigger OWL-RL/RDFS inference.
  • ingest_url: Automated scraping and embedding.
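As one example, an agent could issue a graph query through the sparql_query tool using the standard MCP tools/call payload shape. The argument names below (query, namespace) are illustrative — check the server's tool schema for the exact parameters:

```json
{
  "name": "sparql_query",
  "arguments": {
    "query": "SELECT ?topic WHERE { ?person <expertIn> ?topic }",
    "namespace": "work"
  }
}
```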

📚 Scenario Marketplace (New in v0.6.0)

Synapse now supports a Scenario Marketplace, allowing agents to dynamically install knowledge packages. A Scenario bundles:

  1. Ontologies: Formal schema definitions (OWL).
  2. Seed Data: Initial knowledge graph triples.
  3. Documentation: Text guides automatically indexed for RAG retrieval.
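On disk, such a package might be laid out roughly as follows. This tree is purely hypothetical for illustration — consult the repository's built-in scenarios for the actual file structure and names:

```
research-assistant/
├── ontology.owl    # formal schema definitions (OWL)
├── seed.ttl        # initial knowledge graph triples
└── docs/
    └── guide.md    # text guide, indexed for RAG retrieval
```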

Built-in Scenarios:

  • Core: Essential ontologies (Schema.org, PROV-O, SKOS, FOAF, Memory) loaded by default.
  • Research Assistant: Specialized ontology for academic papers and authors.

To install a scenario via MCP:

{
  "name": "install_scenario",
  "arguments": {
    "name": "research-assistant",
    "namespace": "my-research"
  }
}

🔌 Offline Mode

Synapse supports full offline operation by running a local embedding server instead of relying on the HuggingFace Inference API.

1. Start the Local Embedding Server

We provide a lightweight server compatible with the HuggingFace API spec:

# Install dependencies from requirements.txt first, then start the server
python scripts/local_embedding_server.py

This will download the model (default: sentence-transformers/all-MiniLM-L6-v2) on first run and cache it locally.

2. Configure Synapse

Set the HUGGINGFACE_API_URL environment variable to point to your local server:

export HUGGINGFACE_API_URL="http://localhost:8000"
./start_rust_server.sh

🌐 Notion Sync: Automated Memory

Synapse can automatically distill your Notion notes into formal knowledge using LLM-driven extraction.

  1. Configure the notion skill in your environment.
  2. Schedule the sync job with the openclaw CLI:
openclaw cron add --name "Notion Sync" --every "1h" --message "Sync recent Notion pages to Synapse namespace 'personal'"

πŸ—οΈ Technical Architecture

1. Ontology-Driven Validation

Ontologies are defined in standard OWL format. Synapse uses these schemas to validate incoming triples, ensuring semantic consistency (domain/range checks) and preventing logical contradictions.
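A domain/range check can be sketched as follows. This is a simplified, illustrative model — Synapse performs the real validation against OWL schemas in the Rust core:

```python
# Minimal domain/range validation sketch (illustrative, not Synapse internals).
# The ontology maps each predicate to the class its subject (domain) and
# its object (range) must belong to.
ontology = {
    "expertIn": {"domain": "Person", "range": "Topic"},
}
types = {"Pelayo": "Person", "Neuro-symbolic AI": "Topic"}

def validate(subject, predicate, obj):
    """Reject a triple whose subject/object types contradict the schema."""
    rule = ontology.get(predicate)
    if rule is None:
        return True  # unknown predicate: nothing to check against
    return (types.get(subject) == rule["domain"]
            and types.get(obj) == rule["range"])

assert validate("Pelayo", "expertIn", "Neuro-symbolic AI")
assert not validate("Neuro-symbolic AI", "expertIn", "Pelayo")  # swapped: invalid
```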

2. The Synapse Reasoner

The Rust core implements a multi-strategy reasoner:

  • RDFS: Efficient class and property transitivity.
  • OWL-RL: Advanced logic for SymmetricProperty, TransitiveProperty, and inverseOf relationships.
  • Materialization: Inferred facts are persisted in the graph, making reasoning-based queries near-instantaneous.
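The materialization idea — compute inferred facts once and persist them so queries never re-run inference — can be sketched for a TransitiveProperty. This is an illustrative fixed-point loop, not the Rust reasoner:

```python
def materialize_transitive(edges):
    """Compute the transitive closure of (subject, object) pairs, i.e.
    persist every fact a TransitiveProperty implies."""
    closure = set(edges)
    changed = True
    while changed:  # iterate to a fixed point
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))  # inferred fact, now stored
                    changed = True
    return closure

facts = {("a", "b"), ("b", "c"), ("c", "d")}
inferred = materialize_transitive(facts)
# ("a", "d") is now in the graph, so querying it needs no reasoning at read time.
```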

3. Robust Ingestion

Since v0.4.0, ingestion includes a Rollback Mechanism: if vector indexing fails during ingestion, graph changes are automatically reverted to maintain memory integrity.
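The rollback pattern can be sketched as a snapshot-and-revert around the two-phase write. Everything below is illustrative — Synapse implements this inside the Rust ingestion path, not in Python:

```python
def ingest(graph, index, triples, embed):
    """Insert triples into the graph, then index them; revert the graph
    if vector indexing fails so the two stores never diverge."""
    snapshot = set(graph)           # cheap copy for rollback
    graph.update(triples)
    try:
        for t in triples:
            index.append(embed(t))  # vector indexing may fail (e.g. embedder down)
    except Exception:
        graph.clear()
        graph.update(snapshot)      # revert graph changes: memory stays consistent
        raise

graph, index = set(), []

def bad_embed(t):
    raise RuntimeError("embedding service unavailable")

try:
    ingest(graph, index, [("Pelayo", "expertIn", "AI")], bad_embed)
except RuntimeError:
    pass
assert graph == set()  # the failed ingestion left no partial state behind
```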

🧪 Testing

Synapse includes an E2E test suite to verify the integration between the Python client and Rust backend.

# Ensure Rust server is running
./start_rust_server.sh

# Run tests
pytest tests/

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines on how to get started.

⚖️ License

This project is licensed under the MIT License - see the LICENSE file for details.


Developed by Pelayo Maojo & the Synapse Team
