A bash CLI utility that orchestrates file-to-context compaction workflows using cllm.
Point it at a directory of files and a prompt, and it will build conversation context
from your files and execute your compaction prompt in one command.
Install the cllm CLI (v0.1.6 or newer) from o3-cloud/cllm:
```bash
uv tool install git+https://github.com/o3-cloud/cllm.git
cllm --version  # Verify installation
```

Add the cllm-compactor script to your PATH:
```bash
# Option 1: Copy to a directory in your PATH
cp cllm-compactor /usr/local/bin/
chmod +x /usr/local/bin/cllm-compactor

# Option 2: Add this repo to your PATH
export PATH="/path/to/cllm-compactor:$PATH"
```

```bash
# Summarize all markdown files in a directory
cllm-compactor --directory docs/decisions --prompt "Summarize all decisions"

# With verbose output to see what's happening
cllm-compactor -d docs/ -p "Create overview" -v

# Save output to a file
cllm-compactor -d src/ -p "List all TODO comments" -o summary.txt
```

Usage:

```bash
cllm-compactor --directory <dir> --prompt <prompt> [options]
```

- `-d, --directory DIR` - Directory containing files to compact
- `-p, --prompt PROMPT` - Final prompt to execute after loading context
- `-r, --recursive` - Recursively process subdirectories (default: false)
- `--pattern PATTERN` - Glob pattern for file filtering (default: `*.md`)
- `-o, --output FILE` - Write output to a file instead of stdout
- `--cllm-path PATH` - Custom cllm configuration directory
- `--conversation-id ID` - Reuse an existing conversation for iterative refinement
- `--no-cleanup` - Don't clean up the conversation after completion
- `-v, --verbose` - Show verbose output, including file processing details
- `--version` - Show version information
- `-h, --help` - Show help message
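A mix of short and long flags like the ones above is typically parsed with a `while`/`case` loop in bash. The sketch below is illustrative only — the variable and function names are ours, not the tool's internals:

```shell
#!/usr/bin/env bash
# Illustrative flag-parsing sketch (not cllm-compactor's actual code).
set -euo pipefail

directory="" prompt="" pattern="*.md" recursive=false

parse_args() {
  while [ $# -gt 0 ]; do
    case "$1" in
      -d|--directory) directory="$2"; shift 2 ;;
      -p|--prompt)    prompt="$2";    shift 2 ;;
      -r|--recursive) recursive=true; shift   ;;
      --pattern)      pattern="$2";   shift 2 ;;
      *) echo "Unknown option: $1" >&2; return 1 ;;
    esac
  done
}

parse_args -d docs/decisions -r -p "Summarize all decisions"
echo "dir=$directory pattern=$pattern recursive=$recursive"
```

Defaults are set before parsing, so options like `--pattern` only need to appear when you want to override them.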
Summarize architectural decisions:

```bash
cllm-compactor -d docs/decisions -p "Summarize all architectural decisions"
```

Recursive search with a custom pattern:

```bash
cllm-compactor -d src/ -r --pattern "*.py" -p "List all TODO comments"
```

Iterative refinement using the same conversation:

```bash
# First pass
cllm-compactor -d docs/ -p "Create initial summary" \
  --conversation-id abc-123 --no-cleanup

# Add more details to the same conversation
cllm-compactor -d docs/ -p "Now add risk analysis" \
  --conversation-id abc-123
```

Pipe output to another tool:

```bash
cllm-compactor -d . -p "List key files" | grep "important"
```

- File Discovery - Scans the specified directory for files matching the pattern
- Context Building - Feeds each file's content into a `cllm` conversation
- Prompt Execution - Executes your final prompt with all file context loaded
- Output - Returns the compacted result to stdout or a file
The tool wraps cllm to handle the orchestration of file looping and context
building, so you can focus on crafting effective compaction prompts.
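The file-discovery step can be sketched in plain bash. This is a minimal sketch under our own assumptions (the real script's internals may differ); it builds a throwaway directory so the behavior of the pattern and recursion settings is visible:

```shell
#!/usr/bin/env bash
# Sketch of the discovery step: collect files honoring the
# --pattern and --recursive settings before feeding them to cllm.
set -euo pipefail

dir="$(mktemp -d)"   # stand-in for --directory
pattern="*.md"       # stand-in for --pattern
recursive=false      # stand-in for --recursive

mkdir -p "$dir/sub"
echo "alpha" > "$dir/one.md"
echo "beta"  > "$dir/sub/two.md"
echo "gamma" > "$dir/notes.txt"   # filtered out by the pattern

if [ "$recursive" = true ]; then
  files="$(find "$dir" -type f -name "$pattern" | sort)"
else
  files="$(find "$dir" -maxdepth 1 -type f -name "$pattern" | sort)"
fi

printf '%s\n' "$files"
```

With `recursive=false`, only `one.md` is matched; flipping it to `true` would also pick up `sub/two.md`.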
Exit codes:

- `0` - Success
- `1` - Invalid arguments or usage
- `2` - Missing dependencies (cllm not found)
- `3` - File or directory errors
- `4` - LLM execution failure
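Wrapper scripts can branch on these codes. A small sketch (the `status` value here is hard-coded for illustration; in practice it would come from `cllm-compactor ...; status=$?`):

```shell
#!/usr/bin/env bash
# Map cllm-compactor exit codes to human-readable messages.
# `status` is hard-coded here; a real wrapper would capture $?.
status=3

case "$status" in
  0) msg="Success" ;;
  1) msg="Invalid arguments or usage" ;;
  2) msg="Missing dependencies (cllm not found)" ;;
  3) msg="File or directory errors" ;;
  4) msg="LLM execution failure" ;;
  *) msg="Unknown exit code: $status" ;;
esac

echo "$msg"
```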
You can customize cllm behavior by providing a custom configuration directory:
```bash
cllm-compactor -d docs/ -p "Summarize" --cllm-path ./my-config/
```

Your config directory should contain a Cllmfile.yml with settings like model,
temperature, and system messages. See context/compaction/Cllmfile.yml for an
example compaction configuration.
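A hypothetical sketch of what such a file might contain — the field names below are assumptions based on the settings mentioned above (model, temperature, system message), not the verified schema; check context/compaction/Cllmfile.yml for the actual format:

```yaml
# Hypothetical Cllmfile.yml sketch -- field names are assumptions;
# see context/compaction/Cllmfile.yml for the real schema.
model: gpt-4o
temperature: 0.2
system_message: |
  You are a compaction assistant. Summarize the provided files
  concisely, preserving key decisions and action items.
```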
This project follows the Vibe ADR methodology for capturing architectural decisions. All major design choices are documented in docs/decisions/.
```
cllm-compactor/
├── cllm-compactor          # Main CLI utility (bash)
├── context/
│   ├── compaction.sh       # Original prototype script
│   ├── verify_cllm.sh      # CLLM installation smoke test
│   └── compaction/         # Example CLLM configuration
│       └── Cllmfile.yml    # Model and prompt settings
├── docs/
│   └── decisions/          # Architecture Decision Records
└── llms.txt                # AI agent onboarding instructions
```
All architectural decisions are documented as ADRs in docs/decisions/:
- **ADR-0001: Adopt Vibe ADR for Decision Records**
  - Status: ✅ Accepted
  - Standardizes decision documentation with vibe-oriented language
  - Ensures consistent structure for both humans and AI agents
- **ADR-0002: Define llms.txt Bootstrap Instructions**
  - Status: ✅ Accepted
  - Publishes `llms.txt` for zero-friction AI agent onboarding
  - Follows community standards at llmstxt.org
- **ADR-0003: Adopt CLLM for Context Compaction Workflows**
  - Status: ✅ Accepted
  - Uses CLLM for bash-native LLM orchestration
  - Enables conversation management and multi-provider support
- **ADR-0004: Build cllm-compactor CLI Utility**
  - Status: ✅ Accepted
  - Implements a standalone CLI for file-to-context compaction
  - Abstracts directory traversal and prompt-execution patterns
Prototype scripts that inspired this tool:
- `context/compaction.sh` - Original bash prototype showing the core workflow
- `context/compaction/Cllmfile.yml` - Example configuration for summarization tasks
- `context/compaction/output/compacted_summary.txt` - Sample output showing the target quality
- Architecture Decision Records - All major design decisions
- llms.txt - AI agent onboarding instructions
- Templates - Vibe ADR templates for new decisions
We welcome contributions! Areas of interest:
- Additional file filtering options (size limits, binary detection)
- Support for piping file lists via stdin
- Template prompt library for common compaction tasks
- Enhanced conversation management and cleanup
- Integration examples for git hooks and CI workflows
Before making significant changes, please:
- Review existing Architecture Decision Records
- Create a new ADR for major architectural changes
- Follow the Vibe ADR template
Share your ideas, pain points, and cllm command patterns that work well for you.