LLMOS

Natural-language Linux orchestrator: describe what you want → an LLM generates commands → they are validated for safety → executed in Docker containers.

Architecture

Next.js Frontend → FastAPI Backend → Rust CLI (llmos-exec) → Docker
  • Frontend (frontend/): Next.js app with xterm.js terminal, WebSocket live streaming
  • Backend (backend/): FastAPI with SQLite, job queue, LLM integration (OpenAI)
  • Rust CLI (llmos-exec/): Streams NDJSON from Docker containers (stdout/stderr + exit code)
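Since llmos-exec emits NDJSON (one JSON event per line), a consumer can parse it line by line. A minimal Python sketch; the event field names (`stream`, `data`, `code`) are illustrative assumptions, not confirmed by the repo:

```python
import json

def parse_ndjson(stream):
    """Yield one event dict per NDJSON line, skipping blank lines."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# Hypothetical event stream, as a stand-in for llmos-exec's stdout:
sample = [
    '{"stream": "stdout", "data": "hello\\n"}',
    '{"stream": "exit", "code": 0}',
]
events = list(parse_ndjson(sample))
```

Parsing line-at-a-time is the point of NDJSON: events can be forwarded to the WebSocket as they arrive, without waiting for the container to finish.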

Supported Distros

Ubuntu, Debian, Fedora, Arch, Alpine, CentOS/Rocky, NixOS

Setup

Prerequisites

  • Docker, Node.js 18+, Python 3.11+, Rust toolchain

Backend

cd backend
cp .env.example .env  # Add your OPENAI_API_KEY
pip install -r requirements.txt
python start_uvicorn.py

Frontend

cd frontend
npm install
npm run dev

Rust CLI

cd llmos-exec
cargo build --release

Environment Variables

Variable              Description                  Default
OPENAI_API_KEY        OpenAI API key (required)
OPENAI_MODEL          Model to use                 gpt-4o-mini
LLM_PROVIDER          LLM provider                 openai
DATABASE_URL          SQLite path                  sqlite:///./llmos.db
NEXT_PUBLIC_API_URL   Backend URL for frontend     http://localhost:8000
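A sketch of reading these settings with the documented defaults; how the backend actually loads them is not shown in this README, so this only mirrors the table:

```python
import os

def load_config(env=os.environ):
    """Collect LLMOS settings, falling back to the documented defaults."""
    return {
        "OPENAI_API_KEY": env.get("OPENAI_API_KEY"),  # required, no default
        "OPENAI_MODEL": env.get("OPENAI_MODEL", "gpt-4o-mini"),
        "LLM_PROVIDER": env.get("LLM_PROVIDER", "openai"),
        "DATABASE_URL": env.get("DATABASE_URL", "sqlite:///./llmos.db"),
        "NEXT_PUBLIC_API_URL": env.get("NEXT_PUBLIC_API_URL", "http://localhost:8000"),
    }

cfg = load_config(env={})  # empty env -> all defaults, no API key
```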

API Endpoints

  • POST /generate - Generate commands from natural language
  • POST /validate - Validate commands for safety
  • POST /execute - Execute commands (SSE streaming)
  • POST /enqueue - Queue execution for async processing
  • GET /history - Execution history
  • GET /executions/{id}/status - Job status
  • WS /ws/execute-live - WebSocket live terminal
  • WS /ws/execute/{id} - WebSocket stream for existing job
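A typical client chains /generate → /validate → /execute. The sketch below only assembles the request URLs and JSON bodies (no live server needed); the body field names `prompt`, `distro`, and `commands` are assumptions, not confirmed by the README:

```python
import json

API = "http://localhost:8000"  # NEXT_PUBLIC_API_URL default

def generate_request(prompt, distro="ubuntu"):
    """Build the URL and JSON body for POST /generate (field names assumed)."""
    return API + "/generate", json.dumps({"prompt": prompt, "distro": distro})

def execute_request(commands):
    """Build the URL and JSON body for POST /execute; the response streams over SSE."""
    return API + "/execute", json.dumps({"commands": commands})

url, body = generate_request("install nginx")
```

The same bodies apply to /enqueue, which returns a job id that can be polled via GET /executions/{id}/status or streamed via WS /ws/execute/{id}.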

Execution Options

The Rust CLI accepts resource limits via the JSON payload:

{
  "distro": "ubuntu",
  "commands": ["echo hello"],
  "memory_limit": "512m",
  "cpu_limit": "1.0",
  "container_name": "my-session",
  "keep_alive": true
}
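A payload like the one above can be assembled and sanity-checked before it is handed to llmos-exec. A minimal sketch; the distro check here is an assumption based on the "Supported Distros" list, not the CLI's actual validation:

```python
import json

# Assumed from the "Supported Distros" section above.
SUPPORTED_DISTROS = {"ubuntu", "debian", "fedora", "arch",
                     "alpine", "centos", "rocky", "nixos"}

def build_payload(distro, commands, memory_limit="512m", cpu_limit="1.0",
                  container_name=None, keep_alive=False):
    """Assemble the JSON payload documented above, rejecting unknown distros."""
    if distro not in SUPPORTED_DISTROS:
        raise ValueError(f"unsupported distro: {distro}")
    payload = {
        "distro": distro,
        "commands": list(commands),
        "memory_limit": memory_limit,
        "cpu_limit": cpu_limit,
        "keep_alive": keep_alive,
    }
    if container_name:
        payload["container_name"] = container_name
    return json.dumps(payload)

p = json.loads(build_payload("ubuntu", ["echo hello"],
                             container_name="my-session", keep_alive=True))
```

With keep_alive set, a named container can be reused across requests instead of being torn down after each run.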
