OctaAI is an autonomous AI agent that can design, implement, test, deploy, and manage software projects end-to-end with minimal human intervention.
- Autonomous Planning & Execution: Takes high-level prompts and breaks them into actionable tasks
- Multi-LLM Support: Works with OpenAI, Claude, and Ollama (local models)
- Code Generation & Self-Repair: Generates code, runs tests, and fixes errors automatically
- Remote Server Management: SSH into servers, install packages, configure services
- Deployment Automation: Deploy applications, configure Nginx, set up SSL certificates
- Safety First: Configurable allowed paths and command filtering
```
octa-agentd (Daemon - Agent Runtime)
|
+-- LLM Provider Layer (OpenAI/Claude/Ollama)
+-- Tool Registry
    +-- Filesystem Tools
    +-- Code Execution Tools
    +-- Git Tools
    +-- SSH Tools
    +-- HTTP Tools
```
- Go 1.21+
- Ollama (optional, for local models)
```bash
# Clone the repository
git clone https://github.com/mparvin/octaai.git
cd octaai

# Build the project
make build

# Or install directly
make install
```

Create a configuration file at `~/.config/octaai/config.yaml`:
```yaml
projects_root: "/home/user/Projects"

llm:
  provider: "ollama"
  model: "qwen2.5:32b"
  base_url: "http://localhost:11434"
  temperature: 0.3

safety:
  allow_paths:
    - "/home/user/Projects"
  deny_commands:
    - "rm -rf /"
```

Start the daemon:

```bash
octa-agentd
```

Give the agent a goal:

```bash
octa-agent goal "Write a python script that gives weather data about all countries in Europe, store in redis, create a Flask site, name it LocalWeather"
```

Check the agent's status:

```bash
octa-agent status
```

Manage a remote server:

```bash
octa-agent goal "IP: 1.2.3.4, Username: root, Password: 123456. Setup nginx, deploy github.com/mparvin/ip9.git, configure SSL for ip9.com"
```

Project layout:

```
octaai/
├── cmd/
│   ├── octa-agentd/     # Agent daemon
│   └── octa-agent/      # CLI client
├── pkg/
│   ├── agent/           # Core agent logic
│   ├── config/          # Configuration
│   ├── llm/             # LLM provider abstraction
│   ├── storage/         # State persistence
│   └── tools/           # Tool implementations
├── examples/            # Example workflows
└── docs/                # Documentation
```
- Phase 1: Skeleton & LLM Provider
- Phase 2: Filesystem & Code Runner Tools
- Phase 3: Error Loop & Self-Repair
- Phase 4: SSH Tool
- Phase 5: Workflow Integration
- Phase 6: Memory & Vector Store
MIT License - See LICENSE file for details
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.