An experimental orchestration language for AI models and system commands
Mairex is an orchestration language that allows you to coordinate AI models, shell commands, and data flows using a JSON-based syntax with specialized operators. It's designed for developers who want to prototype AI workflows and automation scripts.
Current limitations:
- Sequential execution only (parallel execution syntax exists but runs sequentially)
- Basic error handling
- Limited debugging capabilities
- Many planned features not yet implemented
- Documentation may be ahead of implementation in some areas
Mairex lets you write scripts that combine shell commands and AI model calls in a declarative way. For example, you can download websites, process them with AI models, and save the outputs to files, all coordinated through a single `.jsom` file.
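As an illustrative sketch only — the step names, the `PAGE&V`/`OUT&V` variables, the URL, and in particular the chaining of several operators in one instruction are hypothetical and may not match the implementation; see the syntax reference below for the individual operators — such a pipeline might look like:

```json
{
  "summarize_page": {
    "download": [
      "~| PAGE&V <&#- |>curl -s https://example.com<| |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Summarize the downloaded page' |~"
    ],
    "save_output": [
      "~| A&O -$S> |>echo '<$>'<| -&#> OUT&V -€S> summary.txt |~"
    ]
  }
}
```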
- 🤖 Native AI Integration - Call Ollama, OpenAI, Anthropic, Gemini, and XAI models directly
- 🔄 Parallel Execution - Syntax for running multiple shells and AI models concurrently (currently executed sequentially; see limitations above)
- 🔗 Chainable Operations - Flow data between commands, files, and AI models
- 📦 Variable Scoping - Custom and AI-specific variable management
- 🎯 JSON-Based Syntax - Familiar structure with powerful extensions
- 🛠️ Shell Integration - Execute any terminal command with persistent shell sessions
Install with pip:

```
pip install mairex
```

Create a file `hello.jsom`:

```json
{
  "greeting": {
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}
```

Run it:

```
mairex hello.jsom
```

What this does:
- Sets AI input to "World"
- Sets AI prompt to "Say hello to the input"
- Calls the AI model and echoes the response
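Under the same assumed syntax, the model can presumably be pinned with the `A&M` variable (shown later in the configuration section) before the call; the model name `llama3` here is illustrative:

```json
{
  "greeting": {
    "set_model": [
      "~| A&M <&¤S- 'llama3' |~"
    ],
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}
```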
Mairex scripts use .jsom files (JSON + Mairex). They follow standard JSON syntax with one rule:
All leaf nodes must be arrays:

```json
{
  "step": {
    "action": ["value"]
  }
}
```

NOT:

```json
{
  "step": {
    "action": "value"
  }
}
```

Instructions are declared between `~|` and `|~` specifiers:
```
["~| |>echo 'Hello'<| |~"]
```

Shell commands go between `|>` and `<|`:

```
["~| |>ls -la<| |~"]
```

Custom Variables (shared across shells, scoped to function):

```
["~| VAR&V <&¤S- 'my value' |~"]
```

AI Variables (shell-specific, persistent across tree levels):

```
["~| A&I <&¤S- 'AI input' |~"]
```

Data can flow in either direction. Left to right:

```
["~| |>echo 'output'<| -&#> FILE&V -€S> result.txt |~"]
```

Right to left:

```
["~| FILE&V <&€- result.txt <&#- |>cat file.txt<| |~"]
```

Separate shells (parallel):
```json
{
  "parallel_tasks": [
    "~| |>echo 'Shell 1'<| |~",
    "~| |>echo 'Shell 2'<| |~",
    "~| |>echo 'Shell 3'<| |~"
  ]
}
```

Each array element runs in its own independent shell session. (Note: in the current alpha, these run sequentially despite the parallel syntax.)
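Since each element is an independent instruction, a variation could mix different commands per shell; this sketch is hypothetical (the URLs and output filenames are placeholders), using only the shell-command syntax shown above:

```json
{
  "parallel_fetch": [
    "~| |>curl -s https://example.com -o page1.html<| |~",
    "~| |>curl -s https://example.org -o page2.html<| |~",
    "~| |>ls -la<| |~"
  ]
}
```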
- Syntax Reference - Complete language specification
- Tutorial - Step-by-step guide
- Examples - Real-world use cases
- Python 3.8+
- Dependencies (auto-installed):
  - `ollama` - Local AI model support
  - `litellm` - Multi-provider AI API support
  - `lizard` - Code analysis for function extraction
  - `whats_that_code` - Programming language detection
- Install Ollama: https://ollama.ai
- Pull a model:

  ```
  ollama pull llama3
  ```

- No API keys needed - works out of the box!
Create `API_keys.json` in your working directory:

```json
{
  "openai": "sk-your-key-here",
  "anthropic": "sk-ant-your-key-here",
  "gemini": "your-gemini-key",
  "xai": "your-xai-key"
}
```

Set the provider and model in your JSOM file:

```
["~| A&S <&¤S- 'openai' |~"]
["~| A&M <&¤S- 'gpt-4o' |~"]
```

MIT License
This is an early alpha release. The project is not currently accepting outside contributions. Bug reports and feedback are welcome via GitHub issues.
An experimental tool for AI orchestration