A comprehensive AI helper and knowledge base system built with LangChain.js, featuring multi-model support, Neo4j graph database, PostgreSQL with pgvector for RAG, and Model Context Protocol (MCP) server.
- 🤖 Multi-Agent System: Router, RAG, Knowledge Graph, and Data Processing agents
- 🧠 Multi-Model Support: Google Gemini, OpenAI, Anthropic Claude
- 📊 Hybrid Storage: Neo4j knowledge graph + PostgreSQL vector store
- 🔒 Permission System: Admin and user chat types with different access levels
- 💬 Multi-Chat Support: Separate chat sessions with persistent history
- 👤 User Profiles: Store and utilize user information (name, email, address, etc.)
- 🔌 MCP Server: Expose tools to Claude Desktop and other MCP clients
- 📁 File System Access: Read/write files with permission controls
- 🔍 Advanced Search: Semantic similarity + graph context
Storage Layer
- PostgreSQL: User profiles, chat history, documents, vector embeddings
- Neo4j: Knowledge graph with entities and relationships
Agent System
- Router Agent: Analyzes requests and routes to appropriate agents
- RAG Agent: Document retrieval and question answering
- Knowledge Graph Agent: Graph operations and entity relationships
- Data Processing Agent: Programmatic and LLM-based data processing
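The routing step can be pictured as a dispatcher that maps a request to one of the agents above. A minimal dependency-free sketch — the real Router Agent is LLM-based, and these keyword rules and agent identifiers are illustrative only:

```javascript
// Illustrative router-style dispatch. The actual Router Agent analyzes
// requests with an LLM; these keyword heuristics are a stand-in.
function routeRequest(prompt) {
  const p = prompt.toLowerCase();
  if (/\b(relationship|entity|graph)\b/.test(p)) return 'knowledge-graph';
  if (/\b(average|sum|transform|calculate)\b/.test(p)) return 'data-processing';
  return 'rag'; // default to retrieval-augmented answering
}

console.log(routeRequest('Calculate the average of these numbers')); // → 'data-processing'
```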
Services
- AgentService: Manages agent execution
- ChatService: Chat session management
- UserProfileService: User profile CRUD
- PermissionService: Access control
- EmbeddingService: Vector embeddings
- IngestionService: Document processing and storage
Tools
- FileSystemTool: File operations
- GraphQueryTool: Neo4j queries
- VectorSearchTool: RAG operations
- DataTransformTool: Data processing
Servers
- Express API: HTTP REST endpoints
- MCP Server: Model Context Protocol for tool exposure
- Node.js 18+
- Docker and Docker Compose (for database services) OR
- PostgreSQL 14+ with pgvector extension and Neo4j 5+ (for manual setup)
- Clone and install dependencies:

```bash
cd knowledgeBase
npm install
```

- Start database services with Docker:

```bash
# Using npm script (recommended)
npm run docker:up

# Or using docker-compose directly
docker-compose up -d
```

This will start:

- PostgreSQL with pgvector extension on port 5432
- Neo4j on ports 7474 (HTTP) and 7687 (Bolt)
- Configure environment variables:

```bash
# Create .env file
cat > .env << 'EOF'
# Model API Keys
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key

# Databases (Docker Compose defaults)
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=password
POSTGRES_URI=postgresql://user:password@localhost:5432/knowledgebase

# Server
PORT=3000
MCP_PORT=3001
NODE_ENV=development

# Default Settings
DEFAULT_MODEL=gemini-pro
DEFAULT_EMBEDDING_MODEL=text-embedding-004
EOF
```

Edit .env and add your API keys.
- Verify services are running:

```bash
# Using npm script
npm run docker:ps

# Or using docker-compose directly
docker-compose ps
```

For detailed Docker setup instructions, see DOCKER_SETUP.md.

Useful Docker commands:

- `npm run docker:up` - Start services
- `npm run docker:down` - Stop services
- `npm run docker:logs` - View logs
- `npm run docker:ps` - Check status
- `npm run docker:restart` - Restart services
- Clone and install dependencies:

```bash
cd knowledgeBase
npm install
```

- Set up PostgreSQL with pgvector:

```sql
CREATE DATABASE knowledgebase;
\c knowledgebase
CREATE EXTENSION vector;
```

- Set up Neo4j:
  - Install Neo4j Desktop or use Neo4j AuraDB
  - Create a new database
  - Note the connection URI, username, and password
- Configure environment variables:

```bash
# Create .env file
cat > .env << 'EOF'
# Model API Keys
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key

# Databases
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=your_password
POSTGRES_URI=postgresql://user:password@localhost:5432/knowledgebase

# Server
PORT=3000
MCP_PORT=3001
NODE_ENV=development

# Default Settings
DEFAULT_MODEL=gemini-pro
DEFAULT_EMBEDDING_MODEL=text-embedding-004
EOF
```

Edit .env and add your API keys and database credentials.
```bash
npm start

# or for development
npm run dev
```

The server will be available at http://localhost:3000/api

```bash
npm run mcp:start

# or for development
npm run mcp:dev
```

First, install frontend dependencies (if not already done):

```bash
npm run frontend:install
```

Then start the frontend development server:

```bash
npm run frontend:dev
```

The frontend will be available at http://localhost:5173

Other frontend commands:

- `npm run frontend:build` - Build the frontend for production
- `npm run frontend:preview` - Preview the production build
```
// Create admin chat (full access)
POST /api/chat/create
{
  "chatType": "admin",
  "userId": "user123",
  "metadata": {
    "description": "Admin session"
  }
}

// Create user chat (read-only)
POST /api/chat/create
{
  "chatType": "user",
  "userId": "user123"
}
```

```
POST /api/profile
{
  "userId": "user123",
  "username": "John Doe",
  "email": "john@example.com",
  "phone": "+1234567890",
  "address": "123 Main St, City, State",
  "preferences": {
    "theme": "dark",
    "notifications": true
  },
  "customFields": {
    "department": "Engineering"
  }
}
```

```
POST /api/query
{
  "chatId": "chat-uuid",
  "prompt": "What are the latest updates?",
  "options": {
    "model": "gemini-pro",
    "useRAG": true,
    "useGraph": true
  }
}
```

```
POST /api/query
{
  "chatId": "chat-uuid",
  "prompt": "Calculate the average of these numbers",
  "data": [10, 20, 30, 40, 50],
  "options": {
    "processData": true
  }
}
```

```
POST /api/ingest
{
  "content": "Your document content here...",
  "metadata": {
    "title": "Important Document",
    "source": "https://example.com/doc",
    "author": "John Doe"
  }
}
```

`GET /api/chat/{chatId}/history?limit=20&offset=0`

```javascript
import { APIClient } from './src/utils/APIClient.js';

const client = new APIClient('http://localhost:3000/api');

// Create chat
const { chat } = await client.createChat('admin', 'user123');

// Query with chat context
const result = await client.query(
  chat.chat_id,
  'Find documents about machine learning',
  null,
  { useRAG: true }
);

console.log(result.response);

// Update user profile
await client.updateUserProfile('user123', {
  username: 'Jane Doe',
  email: 'jane@example.com'
});
```

- Full CRUD access to all resources
- Can modify knowledge graph
- Can edit/delete files
- Can update user profiles
- Can manage all chats
- Read-only access to resources
- Can query knowledge graph (no modifications)
- Can read files (no write/delete)
- Can search documents
- Can view own user profile
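These rules amount to a capability check keyed on chat type. A minimal sketch — the table and the `canPerform` function are assumptions for illustration, not the actual PermissionService API:

```javascript
// Illustrative capability table; the real PermissionService may differ.
const CAPABILITIES = {
  admin: ['read', 'write', 'delete', 'graph:write', 'profile:write'],
  user: ['read'],
};

// Returns true if the given chat type is allowed to perform the action.
function canPerform(chatType, action) {
  return (CAPABILITIES[chatType] || []).includes(action);
}

console.log(canPerform('admin', 'delete')); // → true
console.log(canPerform('user', 'write'));   // → false
```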
All messages are automatically stored with:
- Message content
- Role (user/assistant/system)
- Timestamp
- Metadata (agents used, tools used)
Chat history is automatically injected into agent context for continuity.
User profiles are automatically loaded and injected into the agent context when:
- A chat is associated with a userId
- The profile contains relevant information
- The query might benefit from personalization
The system formats profile data as:

```
=== User Profile ===
User: John Doe
Email: john@example.com
Phone: +1234567890
Address: 123 Main St
Preferences: {"theme":"dark"}
==================
```
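Producing that block is plain string assembly, skipping fields the profile does not set. A sketch of such a formatter — the function name is hypothetical, though the field names mirror the profile API above:

```javascript
// Hypothetical formatter producing the profile block shown above.
// Fields absent from the profile are simply omitted.
function formatUserProfile(profile) {
  const lines = ['=== User Profile ==='];
  if (profile.username) lines.push(`User: ${profile.username}`);
  if (profile.email) lines.push(`Email: ${profile.email}`);
  if (profile.phone) lines.push(`Phone: ${profile.phone}`);
  if (profile.address) lines.push(`Address: ${profile.address}`);
  if (profile.preferences) lines.push(`Preferences: ${JSON.stringify(profile.preferences)}`);
  lines.push('==================');
  return lines.join('\n');
}

console.log(formatUserProfile({
  username: 'John Doe',
  email: 'john@example.com',
  preferences: { theme: 'dark' },
}));
```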
The MCP server exposes tools for use with Claude Desktop and other MCP clients.
- semantic_search: Search knowledge base using semantic similarity
- graph_search: Search entities in knowledge graph
- get_entity: Get entity details with relationships
- ingest_document: Add documents to knowledge base (admin only)
- query_with_model: Query using specific AI model
- knowledge://graph/stats: Knowledge graph statistics
- knowledge://models/available: List of available AI models
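On the wire, MCP clients invoke these tools with a JSON-RPC 2.0 `tools/call` request. A sketch of what a `semantic_search` call might look like — the `arguments` keys (`query`, `limit`) are assumptions, not confirmed parameter names:

```javascript
// Shape of a JSON-RPC 2.0 tools/call request, as used by MCP clients.
// The argument names for semantic_search are assumed for illustration.
const request = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'semantic_search',
    arguments: { query: 'machine learning', limit: 5 },
  },
};

console.log(JSON.stringify(request, null, 2));
```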
Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "knowledge-base": {
      "command": "node",
      "args": ["/path/to/knowledgeBase/mcp-server.js"]
    }
  }
}
```

```python
import requests

# Query the knowledge base
response = requests.post('http://localhost:3000/api/query', json={
    'chatId': 'your-chat-id',
    'prompt': 'Your question here',
    'options': {
        'model': 'gemini-pro',
        'useRAG': True
    }
})

result = response.json()
print(result['response'])
```

```javascript
import { APIClient } from '../knowledgeBase/src/utils/APIClient.js';

const client = new APIClient();

// Use in extension
const result = await client.queryDirect(
  'What is the user\'s email?',
  null,
  { chatType: 'user' }
);
```

- `POST /api/chat/create` - Create chat session
- `GET /api/chat/:chatId` - Get chat details
- `GET /api/chat/:chatId/history` - Get chat history
- `POST /api/query` - Query with chat context
- `POST /api/query/direct` - Direct query without chat
- `POST /api/ingest` - Ingest document
- `GET /api/knowledge/:id` - Get entity with relationships
- `GET /api/stats/graph` - Graph statistics
- `POST /api/profile` - Create/update profile
- `GET /api/profile/:userId` - Get profile
- `GET /api/health` - Health check
- `GET /api/models` - List available models
```
knowledgeBase/
├── src/
│   ├── agents/      # Agent implementations
│   ├── models/      # Model factory
│   ├── tools/       # LangChain tools
│   ├── services/    # Business logic
│   ├── storage/     # Database services
│   ├── mcp/         # MCP server
│   └── utils/       # Utilities
├── config/          # Configuration
├── server.js        # API server
├── mcp-server.js    # MCP server
└── package.json
```
- Create agent in `src/agents/YourAgent.js`
- Add to `AgentOrchestrator.js`
- Update router logic to include new agent

- Create tool in `src/tools/YourTool.js`
- Implement using `DynamicStructuredTool` with Zod schema
- Add permission checks
- Register in appropriate agent
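The tool pattern from the steps above can be sketched without any dependencies. Note this is not LangChain's `DynamicStructuredTool` API — a real tool would wrap its handler and Zod schema in that class; this plain-object version only illustrates wiring a permission check into a tool's entry point:

```javascript
// Dependency-free sketch of a tool with a built-in permission check.
// The real implementation would use DynamicStructuredTool with a Zod schema.
function makeTool({ name, description, requiresWrite, run }) {
  return {
    name,
    description,
    invoke(input, context) {
      // Only admin chats may use write-capable tools (see Permission System).
      if (requiresWrite && context.chatType !== 'admin') {
        throw new Error(`Permission denied for tool "${name}"`);
      }
      return run(input);
    },
  };
}

// Hypothetical example tool, for illustration only.
const echoTool = makeTool({
  name: 'echo',
  description: 'Echoes its input',
  requiresWrite: true,
  run: (input) => input,
});

console.log(echoTool.invoke('hello', { chatType: 'admin' })); // → 'hello'
```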
```bash
# PostgreSQL
psql -U postgres
\l                 # List databases
\c knowledgebase   # Connect
\dx                # List extensions (should show vector)

# Neo4j
# Check connection in Neo4j Browser: http://localhost:7474
```

```bash
# Verify API keys are set
node -e "require('dotenv').config(); console.log(process.env.GOOGLE_API_KEY ? 'Set' : 'Not set')"
```

- Admin chats: use for write operations
- User chats: limited to read operations
- Check the chat type in API responses
MIT
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request