GhostFlow: the Open-Source AI Orchestration Platform. The DeepSeek to n8n's OpenAI: local-first, developer-focused, Rust-powered.
GhostFlow is a local-first AI orchestration platform that lets you build, deploy, and manage AI-powered workflows. Think n8n meets LangChain, but faster, type-safe, and fully under your control.
- Rust Performance - Blazing fast execution with minimal resource usage
- Local-First - Run entirely on your hardware, no cloud required
- AI Native - Built-in Ollama, LiteLLM, and Jarvis integration
- Extensible - Easy node development with full type safety
- Docker Ready - One-command deployment with docker-compose
- Visual Editor - Leptos-powered web UI (100% Rust)
- Secure - Air-gapped friendly, zero-trust architecture
# Clone the repository
git clone https://github.com/ghostkellz/ghostflow
cd ghostflow
# Start GhostFlow in development mode
./scripts/start.sh dev
# Access the platform
# UI: http://localhost:8080
# API: http://localhost:3000
# Prerequisites: Rust 1.75+, PostgreSQL, Ollama (optional)
# Build the project
cargo build --release
# Run migrations
sqlx migrate run --database-url postgresql://ghostflow:ghostflow@localhost/ghostflow
# Start the server
cargo run --bin ghostflow-server
# In another terminal, start the UI
cargo run --bin ghostflow-ui
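Once both processes are up, you can sanity-check the API from another terminal (this assumes the default API port shown in the Quick Start):

# List the available nodes; a JSON response confirms the server is healthy
curl http://localhost:3000/api/nodes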
ghostflow/
├── crates/
│   ├── ghostflow-core/      # Core traits and types
│   ├── ghostflow-schema/    # Flow schemas and models
│   ├── ghostflow-engine/    # Execution engine
│   ├── ghostflow-nodes/     # Built-in nodes
│   ├── ghostflow-api/       # REST/WebSocket API
│   ├── ghostflow-ui/        # Leptos web UI
│   ├── ghostflow-jarvis/    # Jarvis CLI integration
│   ├── ghostflow-server/    # Main server binary
│   └── ghostflow-cli/       # gflow CLI tool
├── migrations/              # PostgreSQL migrations
├── docker-compose.yml       # Docker orchestration
└── Dockerfile               # Multi-stage build
- HTTP Request - Make API calls with full request control
- Webhook - Receive incoming HTTP requests
- Template - Process templates with variable substitution
- If/Else - Conditional flow control
- Delay - Time-based flow control
- Ollama Generate - Local LLM text generation
- Ollama Embeddings - Generate vector embeddings
- Jarvis Command - Execute Rust CLI automation
- Database Query (PostgreSQL, MySQL, SQLite)
- Vector Database (Qdrant, Weaviate)
- Email (SMTP/IMAP)
- Slack/Discord
- OpenAI/Anthropic
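These nodes compose into flows: a flow is a document describing nodes and the connections between them. The sketch below is illustrative only; the authoritative schema lives in the ghostflow-schema crate, and every field name here is a placeholder, not the real format.

# Illustrative flow chaining Webhook -> Ollama Generate -> HTTP Request
# (hypothetical schema; consult ghostflow-schema for the real one)
cat > hello-flow.json <<'EOF'
{
  "name": "hello-ollama",
  "nodes": [
    { "id": "trigger",  "type": "webhook" },
    { "id": "generate", "type": "ollama_generate",
      "params": { "model": "llama3", "prompt": "Summarize: {{ trigger.body }}" } },
    { "id": "notify",   "type": "http_request",
      "params": { "method": "POST", "url": "https://example.com/hook", "body": "{{ generate.text }}" } }
  ],
  "edges": [
    { "from": "trigger",  "to": "generate" },
    { "from": "generate", "to": "notify" }
  ]
}
EOF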
use async_trait::async_trait;
// NodeDefinition, ExecutionContext, and Value are assumed to be exported by
// ghostflow_core alongside Node and Result.
use ghostflow_core::{ExecutionContext, Node, NodeDefinition, Result, Value};

pub struct MyCustomNode;

#[async_trait]
impl Node for MyCustomNode {
    fn definition(&self) -> NodeDefinition {
        // Describe the node's inputs, outputs, and parameters
        todo!()
    }

    async fn execute(&self, context: ExecutionContext) -> Result<Value> {
        // Your node logic here: read inputs from `context` and return a value
        todo!()
    }
}
# Run all tests
cargo test
# Run specific crate tests
cargo test -p ghostflow-engine
# Run with logging
RUST_LOG=debug cargo test
The docker-compose setup includes:
- PostgreSQL - Flow and execution storage
- MinIO - S3-compatible artifact storage
- Ollama - Local LLM runtime
- GhostFlow - Main application
- Adminer - Database UI (dev only)
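A typical way to drive that stack, assuming the service names above match those in docker-compose.yml (use docker-compose instead of docker compose on older installations):

# Start the full stack in the background
docker compose up -d
# Tail the main application's logs (service name assumed to be "ghostflow")
docker compose logs -f ghostflow
# Show running services and their ports
docker compose ps
# Tear everything down; add -v to also remove the data volumes
docker compose down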
GET /api/flows # List all flows
POST /api/flows # Create new flow
GET /api/flows/:id # Get flow details
PUT /api/flows/:id # Update flow
DELETE /api/flows/:id # Delete flow
POST /api/flows/:id/execute # Execute flow
GET /api/executions # List executions
GET /api/executions/:id # Get execution details
GET /api/nodes # List available nodes
Connect to /ws for real-time execution updates.
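A rough end-to-end example with curl, assuming the API listens on the default port from the Quick Start and reusing the illustrative hello-flow.json from the node overview (the flow id in the second call is a placeholder for whatever the create call returns):

# Create a flow from the illustrative JSON above
curl -X POST http://localhost:3000/api/flows \
  -H 'Content-Type: application/json' \
  -d @hello-flow.json
# Execute it, substituting the id returned by the previous call
curl -X POST http://localhost:3000/api/flows/<flow-id>/execute
# Inspect recent executions
curl http://localhost:3000/api/executions

For live progress, point any WebSocket client at /ws on the same host and watch execution events stream in.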
We welcome contributions! Please see CONTRIBUTING.md for guidelines. Areas where help is especially welcome:
- Additional node implementations
- UI/UX improvements
- Documentation
- Testing
- Performance optimization
See TODO.md for detailed roadmap and progress.
- Core execution engine (done)
- Basic nodes (done)
- Docker setup (done)
- Web UI improvements (in progress)
- More AI integrations (in progress)
- Authentication & RBAC (planned)
- Distributed execution (planned)
Built with amazing open-source projects:
- Rust - Systems programming language
- Leptos - Full-stack Rust web framework
- Axum - Web application framework
- Ollama - Local LLM runtime
- PostgreSQL - Database
MIT License - see LICENSE for details.
Built with ❤️ by the GhostFlow Community
Fast. Flexible. Fully yours.