A terminal chatbot that speaks to modern LLMs — written in a language older than the moon landing.
🤓 Be a nerd — get to the bottom of it first. :P
cobold-cli is a fully functional AI chat agent implemented in COBOL 85. It talks to any model on OpenRouter, remembers the entire conversation, and — because a chatbot isn't a chatbot without tool use — it runs a proper agent loop: the model can call a weather tool, get real data back, and reason about it.
No JSON library. No HTTP library. No dependencies beyond curl and a COBOL compiler. Every byte of JSON is sliced character by character by hand-written COBOL paragraphs.
Built as a submission for the ai_devs4 course. What started as "can I even compile hello world in COBOL" escalated rapidly.
The base cobold-cli is the minimal viable agent. The real fun begins in the extended build, where more tools, tighter context management, and richer reasoning logic are wired in to actually solve tasks from the ai_devs4 course — the one run by Mateusz Chrobok on YouTube (Polish, sorry international friends 🇵🇱).
```
      ┌────────────────────┐
      │   you @> prompt    │
      └─────────┬──────────┘
                │
                ▼
┌───────────────────────────────┐         ┌──────────────────┐
│           main.cob            │────────▶│    env-reader    │
│   (REPL + banner + counter)   │         │  prompt-loader   │
└───────────────┬───────────────┘         └──────────────────┘
                │
                ▼
┌───────────────────────────────┐
│        context-mgr.cob        │◀──┐
│   (grows the messages JSON)   │   │
└───────────────┬───────────────┘   │
                │                   │ append reply
                ▼                   │
┌───────────────────────────────┐   │     ┌────────────────────┐
│         ai-caller.cob         │───┼────▶│   OpenRouter API   │
│   curl · parse · agent loop   │◀──┘     │  (any LLM model)   │
└───────────────┬───────────────┘         └────────────────────┘
                │ tool_calls?
                ▼
┌───────────────────────────────┐         ┌────────────────────┐
│       weather-tool.cob        │────────▶│      wttr.in       │
│     get_weather(location)     │◀────────│    (plain text)    │
└───────────────────────────────┘         └────────────────────┘
```
| File | Program | What it does |
|---|---|---|
| 🎛️ `src/main.cob` | COBOLD-CLI | REPL loop, ANSI-coloured banner, context counter |
| 🧮 `src/context-mgr.cob` | CONTEXT-MGR | Escapes & appends each turn into one growing JSON array |
| 🌐 `src/ai-caller.cob` | AI-CALLER | Builds the payload, shells out to curl, parses the response, drives the tool-call loop |
| ⛅ `src/weather-tool.cob` | WEATHER-TOOL | Looks up live weather on wttr.in |
| 🔐 `src/env-reader.cob` | ENV-READER | Parses the `.env` sitting next to the binary |
| 📜 `src/prompt-loader.cob` | PROMPT-LOADER | Loads the system prompt from `prompts/system-prompt.txt` |
When the LLM responds with a `tool_calls` block instead of plain text, AI-CALLER:

- Detects the `"tool_calls"` marker by scanning the raw response byte by byte
- Extracts the function name, arguments JSON, and call ID
- Dispatches to the matching COBOL tool program (currently just `get_weather`)
- Appends both the assistant tool-call message and the tool result to the context
- Re-sends the full conversation to the API
- Repeats until the model finally returns a plain-text reply
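The dispatch step can be sketched as a tiny self-contained program. Everything here is hypothetical: `WS-TOOL-NAME` and `WS-TOOL-RESULT` are illustrative names, not the actual working-storage items in `ai-caller.cob`, and the real code would `CALL 'WEATHER-TOOL'` where this sketch stubs in a `MOVE`:

```cobol
      * Hypothetical sketch of the tool dispatch: route the parsed
      * function name to the matching tool with an EVALUATE.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DISPATCH-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-TOOL-NAME    PIC X(32)  VALUE 'get_weather'.
       01  WS-TOOL-RESULT  PIC X(100) VALUE SPACES.
       PROCEDURE DIVISION.
           EVALUATE WS-TOOL-NAME
               WHEN 'get_weather'
      * the real AI-CALLER would CALL 'WEATHER-TOOL' here
                   MOVE 'Sunny +21C' TO WS-TOOL-RESULT
               WHEN OTHER
                   MOVE '{"error":"unknown tool"}' TO WS-TOOL-RESULT
           END-EVALUATE
           DISPLAY 'tool result: ' FUNCTION TRIM(WS-TOOL-RESULT)
           STOP RUN.
```

The `WHEN OTHER` branch matters in practice: models occasionally hallucinate tool names, and feeding an error string back as the tool result lets the loop recover instead of crashing.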
Everything lives in a single `PIC X(60000)` buffer. Messages are appended with `STRING … INTO CM-JSON WITH POINTER WS-PTR`, overwriting the closing `]`. Escaping (`"` → `\"`, `\n`, `\t`, `\r`, `\\`) and the reverse unescape pass are done one character at a time. It is exactly as fun as it sounds.
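A toy sketch of the pointer trick (the buffer size and program name here are made up; the real buffer is the 60 000-char `CM-JSON`): each `STRING` advances the pointer, and stepping it back one character overwrites the closing `]` so the next message lands inside the array.

```cobol
      * Toy version of the append-with-pointer trick on a small
      * buffer. SUBTRACT backs the pointer up over the closing ]
      * before the next message (plus a fresh ]) is written.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. APPEND-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  CM-JSON  PIC X(200)  VALUE SPACES.
       01  WS-PTR   PIC 9(4)    VALUE 1.
       PROCEDURE DIVISION.
           STRING '[{"role":"user","content":"hi"}]'
               DELIMITED BY SIZE
               INTO CM-JSON WITH POINTER WS-PTR
      * back up over the closing ] before appending the reply
           SUBTRACT 1 FROM WS-PTR
           STRING ',{"role":"assistant","content":"hello"}]'
               DELIMITED BY SIZE
               INTO CM-JSON WITH POINTER WS-PTR
           DISPLAY CM-JSON(1:WS-PTR - 1)
           STOP RUN.
```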
| | Tool | Why |
|---|---|---|
| 🏛️ | GnuCOBOL (`cobc`) | Compiles the sources to a native binary |
| 📡 | curl | The HTTP layer — both for OpenRouter and wttr.in |
| 🔑 | OpenRouter API key | Unlocks any supported LLM |
On macOS:

```sh
brew install gnu-cobol
```

```sh
# 1. Clone
git clone <repo-url> && cd cobold-cli

# 2. Configure
cp .env.example .env
# OPENROUTER_API_KEY=sk-or-...
# OPENROUTER_MODEL=openai/gpt-4o-mini

# 3. Build
make

# 4. Run
./dist/cobold
```

Type your message, hit Enter. Type `/q` to quit.
💡 The binary locates `.env` and `prompts/system-prompt.txt` relative to its own path, so `dist/` is fully self-contained — copy it anywhere.
`.env` (sits next to the binary):

```
OPENROUTER_API_KEY=sk-or-...
OPENROUTER_MODEL=openai/gpt-4o-mini
```

`prompts/system-prompt.txt` — the persona and instructions loaded as the first system message. Edit freely.
After every turn the footer prints:

```
context @> 00832/60000 chars used
```

That's literally the byte length of the in-memory JSON buffer. When it fills up, `STRING … WITH POINTER` simply stops writing — so treat 60 000 chars as your hard limit and expect older history to get silently clipped near the edge.
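Incidentally, `STRING` can report the clipping if asked: an `ON OVERFLOW` phrase fires once the pointer runs past the end of the target field, which would be the natural hook for a sliding-window trimmer. A hypothetical toy demonstration:

```cobol
      * Hypothetical sketch: ON OVERFLOW detects that STRING ran
      * out of room instead of letting the clipping pass silently.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. OVERFLOW-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-BUF  PIC X(10)  VALUE SPACES.
       01  WS-PTR  PIC 9(4)   VALUE 1.
       PROCEDURE DIVISION.
      * 12 characters into a 10-char buffer triggers the phrase
           STRING 'twelve chars' DELIMITED BY SIZE
               INTO WS-BUF WITH POINTER WS-PTR
               ON OVERFLOW
                   DISPLAY 'buffer full, pointer at ' WS-PTR
           END-STRING
           STOP RUN.
```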
- More tools (web search, file read, shell)
- Streaming responses instead of one big blocking call
- Sliding-window context trimming when the buffer fills
- Maybe — maybe — a markdown renderer. In COBOL. Pray for me.
☕ My wife — for feeding and caffeinating me through the three days it took to write and test this thing. None of the COBOL would have compiled without her.
🍷 The Italians — for the wine I drank to release the frustration of parsing JSON by hand. (Sorry France, the local shop didn't stock any Grand Vin de Bordeaux.)
🤘 Bartosz — for believing I was the only sick bastard who could actually pull this off. YOLO! 😈
⚓ Grace Hopper — for laying the foundations of COBOL, the Python of the '70s. o7
🎨 Logo based on artwork by Christopher Burdett for Wizards of the Coast christopherburdett.com
📚 Built for the ai_devs4 course
Made with PIC X(2000) and questionable decisions.

