
Agentic

    █████╗  ██████╗ ███████╗███╗   ██╗████████╗██╗ ██████╗
   ██╔══██╗██╔════╝ ██╔════╝████╗  ██║╚══██╔══╝██║██╔════╝
   ███████║██║  ███╗█████╗  ██╔██╗ ██║   ██║   ██║██║     
   ██╔══██║██║   ██║██╔══╝  ██║╚██╗██║   ██║   ██║██║     
   ██║  ██║╚██████╔╝███████╗██║ ╚████║   ██║   ██║╚██████╗
   ╚═╝  ╚═╝ ╚═════╝ ╚══════╝╚═╝  ╚═══╝   ╚═╝   ╚═╝ ╚═════╝


Agentic :: The agent you work WITH
AI Model Orchestrator & Agent Framework

Demo

Core Philosophy

Agentic transforms the typical command-response dynamic into true collaboration. Instead of barking orders at an AI, you work together through thoughtful query refinement and synthesis. Our "Karesansui" design philosophy creates a zen garden of computational thought: clean, minimalist, and purposeful. Every interaction flows like carefully placed stones in sand, building toward profound understanding rather than quick answers.

Key Features

Collaborative Query Refinement via a Local AI Orchestrator
Multi-Provider Support - Ollama, LM Studio, or any OpenAI-compatible API
Seamless Integration with Powerful Cloud Models (via OpenRouter)
Minimalist, Keyboard-Driven "Zen Garden" TUI
Creates Structured, "Atomic Notes" (Markdown + YAML) for your Knowledge Base
Built in Rust 🦀 for a Fast, Native Experience

Installation

Install via Cargo (Recommended)

cargo install ruixen
# Then run with:
ruixen

Build from Source

git clone https://github.com/gitcoder89431/agentic.git
cd agentic
cargo build --release
./target/release/ruixen

Configuration

⚠️ Important: Agentic requires BOTH a local AI model (for query orchestration) and a cloud model (for synthesis) to function. The local model privately refines your questions, then the cloud model creates the final insights.

Complete Setup Guide

Follow these steps in order - you need both components:

Step 1: Local AI Setup (Required)

Choose either Ollama OR LM Studio:

Option A: Ollama (Recommended for beginners)

  1. Install Ollama (Free, runs on your computer)

    # macOS
    brew install ollama
    
    # Or download from: https://ollama.ai
  2. Download a Local Model

    # Start Ollama service
    ollama serve
    
    # In another terminal, pull a model (this may take a few minutes)
    ollama pull llama3.2:3b    # Good balance of speed/quality
    # or
    ollama pull qwen2.5:7b     # Higher quality, needs more RAM
  3. Configure in Agentic

    • In Settings, set "Local Endpoint" to localhost:11434
    • Agentic auto-detects Ollama and loads your models
    • Select your downloaded model from the list
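
If Agentic doesn't detect your models, you can confirm the Ollama service is reachable from another terminal. This assumes Ollama's standard REST API, which lists installed models at /api/tags:

```shell
# Ask the Ollama server which models are installed.
# A JSON response listing your pulled models confirms the
# endpoint Agentic needs (localhost:11434) is live.
curl -s http://localhost:11434/api/tags
```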

Option B: LM Studio (For power users)

  1. Install LM Studio from https://lmstudio.ai
  2. Download and load a model in LM Studio
  3. Start the local server (usually localhost:1234)
  4. Configure in Agentic
    • In Settings, set "Local Endpoint" to localhost:1234
    • Agentic auto-detects LM Studio and loads your models
    • Select your model from the list
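
LM Studio's local server is OpenAI-compatible, so you can sanity-check it the same way (assuming the default port):

```shell
# List the models LM Studio is currently serving.
# An empty "data" array means no model is loaded yet.
curl -s http://localhost:1234/v1/models
```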

Step 2: Cloud Setup (Required)

  1. Get an OpenRouter Account

    • Visit openrouter.ai and sign up (takes 2 minutes)
    • Generate an API key from your dashboard
    • Add $5-10 credit OR use free models (see guide below)
  2. Configure in Agentic

    • Run ruixen in your terminal
    • Press s to open Settings
    • Navigate to "Cloud API Key" and paste your OpenRouter key
    • Browse available models and select one (see model selection guide below)
    • Press s to save
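
One way to check the key outside Agentic, assuming OpenRouter's documented key-inspection endpoint is still current (export your key as OPENROUTER_API_KEY first):

```shell
# Returns usage/limit info for the key on success,
# or an error object if the key is invalid.
curl -s https://openrouter.ai/api/v1/auth/key \
  -H "Authorization: Bearer $OPENROUTER_API_KEY"
```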

🎯 Model Selection Guide

When choosing a cloud model in Agentic's settings, look for these indicators:

💰 Cost Structure:

  • Models with :free suffix = Completely free (perfect for learning)
  • Models with pricing = Pay per token (~$0.50-10 per 1M tokens)
  • Check the "pricing" column to see prompt/completion costs

🧠 Model Types:

  • Look for :instruct or :chat in the name = Good for conversations (what you want)
  • Avoid :base models = Raw models without instruction training
  • Avoid :embed models = For embeddings only, not chat

📏 Context Length:

  • 4k-8k tokens = Good for short conversations
  • 32k-128k tokens = Better for longer discussions
  • 1M+ tokens = Can handle very long contexts (costs more)

🏷️ Model Families:

  • anthropic/claude-* = Excellent reasoning and safety
  • openai/gpt-* = Well-rounded performance
  • meta-llama/* = Open source, good quality
  • google/gemini-* = Strong at analysis and coding
  • deepseek/* = Often have free versions available

💡 Beginner Tips:

  • Start with any :free model to test the system
  • If you have credits, try anthropic/claude-3.5-sonnet for quality
  • Higher context length = more expensive but can handle longer discussions
  • The model list updates frequently - newer models often perform better
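
Since the :free suffix is part of the model ID, it's easy to filter for. As a sketch, here is a made-up sample in the shape of OpenRouter's model list (not real model data) and a quick filter over it:

```shell
# Write a tiny made-up sample of an OpenRouter-style model list.
cat > /tmp/models-sample.json <<'EOF'
{"data":[{"id":"deepseek/deepseek-chat:free"},{"id":"openai/gpt-4o"}]}
EOF

# Pull out only the IDs carrying the :free suffix.
# (For real payloads a JSON-aware tool like jq is more robust.)
grep -o '"[^"]*:free"' /tmp/models-sample.json
```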

Step 3: Ready to Collaborate!

  • Type your question naturally
  • Watch the local model orchestrate thoughtful proposals
  • Choose a proposal for the cloud model to synthesize
  • Save the resulting "atomic note" to your knowledge base
  • Files are automatically saved to ~/Documents/ruixen/ as Markdown with YAML metadata
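
The on-disk note format isn't spelled out here, but "Markdown + YAML" typically means YAML front matter followed by the note body. A hypothetical example (the field names below are illustrative, not Agentic's actual schema):

```markdown
---
title: "Hybrid local/cloud orchestration"
tags: [ai, orchestration]
source: "agentic"
---

The local model refines the query privately; the cloud model
synthesizes the final answer from the chosen proposal.
```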

Why Both Models?

The local model (Ollama) handles query orchestration privately on your machine, while the cloud model (OpenRouter) provides powerful synthesis capabilities. This hybrid approach gives you both privacy and cutting-edge AI performance!

Troubleshooting

"Local endpoint not accessible"

  • Make sure Ollama is running: ollama serve
  • Check the endpoint in settings: localhost:11434

"OpenRouter API key invalid"

  • Verify your key starts with sk-or-v1-
  • Check you have credits or selected a free model

"Model not found"

  • For local: ensure model is downloaded with ollama list
  • For cloud: verify model name exactly matches OpenRouter's list

Usage

Navigation

  • Tab/Shift+Tab - Navigate between UI elements
  • ↑/↓ or j/k - Move through lists and proposals
  • Enter - Select/Confirm actions
  • Esc - Return to previous screen
  • q - Quit application

Slash Commands

  • /settings - Open configuration modal
  • /about - View application information
  • /quit - Exit the application

Key Bindings

  • s - Quick access to Settings
  • a - Quick access to About
  • Left/Right - Scroll through About page content

Architecture

Agentic follows the RuixenOS workspace architecture:

agentic/
├── crates/
│   ├── agentic-core/     # The "motor" - reusable AI logic
│   ├── agentic-tui/      # The "drill" - terminal interface  
│   └── starlit-gui/      # Future graphical interface
└── Cargo.toml            # Workspace configuration

This modular design allows the same AI capabilities to power multiple interfaces while maintaining clean separation between logic and presentation.
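
Given that layout, the workspace manifest would typically look something like this sketch (member names taken from the tree above; the repository's actual Cargo.toml may differ):

```toml
[workspace]
members = [
    "crates/agentic-core",   # reusable AI logic
    "crates/agentic-tui",    # terminal interface
    "crates/starlit-gui",    # future graphical interface
]
resolver = "2"
```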

License

MIT License - see LICENSE for details.

Credits

UI Theme: Everforest by sainnhe - a comfortable and pleasant green forest palette.

Contributing

Built with constitutional Rust patterns and love. Issues and PRs welcome!


The curiosity machine doesn't just process queries - it awakens wonder.