The low-overhead native app lives in native/. It is menu-bar-only: type a prompt, press Run, and it dispatches a Codex agent. Running agents appear in the popover and can be quit individually.
Build and launch:
```
bash native/build-app.sh
open "native/build/CUA Codex Menu.app"
```

Builds are ad-hoc signed by default. Use a stable local signing identity if you want macOS permission grants to persist across rebuilds:

```
SIGN_IDENTITY="Apple Development: Your Name (TEAMID)" bash native/build-app.sh
```

The app itself does not need microphone or speech permissions.
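Before passing a signing identity to the build script, it can help to check that the identity actually exists. The helper below is illustrative only (the function name and fallback behavior are assumptions, not part of the app); on macOS the available-identity list would come from `security find-identity -v -p codesigning`:

```shell
#!/bin/bash
# Illustrative helper: pick the build invocation depending on whether the
# wanted signing identity appears in the caller-supplied identity list.
# On macOS, populate the list with: security find-identity -v -p codesigning
choose_build_cmd() {
  local wanted="$1" available="$2"
  if printf '%s\n' "$available" | grep -qF "$wanted"; then
    # Stable identity found: sign with it so permission grants persist.
    printf 'SIGN_IDENTITY="%s" bash native/build-app.sh\n' "$wanted"
  else
    # Identity missing: fall back to the default ad-hoc signing.
    printf 'bash native/build-app.sh\n'
  fi
}

choose_build_cmd "Apple Development: Your Name (TEAMID)" \
  "Apple Development: Your Name (TEAMID)"
```

Ad-hoc builds still work; they just lose their Accessibility/Screen Recording grants on each rebuild, which is why a stable identity is worth the check.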
Codex must have CUA registered as MCP:
```
codex mcp add cua-driver -- /usr/local/bin/cua-driver mcp
```

Requirements:

- macOS 14 or newer.
- CUA Driver installed and granted Accessibility and Screen Recording permissions.
- Codex CLI authenticated with your ChatGPT/Codex subscription or an API-authenticated Codex-compatible account.
Install CUA Driver:
```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/trycua/cua/main/libs/cua-driver/scripts/install.sh)"
```

Verify CUA Driver:
```
cua-driver launch_app '{"bundle_id":"com.apple.calculator"}'
```

The app launches Codex through /usr/bin/env with:
- Command: `/usr/bin/env`
- Args: `codex exec --model {model} --dangerously-bypass-approvals-and-sandbox -`
- Model: `gpt-5.3-codex-spark`
- CUA Driver: `/usr/local/bin/cua-driver`

The {model} token is replaced with the model field before spawning the process. The final prompt is piped to stdin, so the CLI args must include whatever mode reads task input from stdin.
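The substitution step can be sketched in plain shell (variable names here are illustrative; the app performs the equivalent in Swift before spawning the process):

```shell
#!/bin/bash
# Sketch of the {model} token substitution performed before launch.
MODEL="gpt-5.3-codex-spark"
ARGS_TEMPLATE='codex exec --model {model} --dangerously-bypass-approvals-and-sandbox -'

# Replace every {model} occurrence in the arg template with the model field.
ARGS="${ARGS_TEMPLATE//\{model\}/$MODEL}"
echo "$ARGS"
# -> codex exec --model gpt-5.3-codex-spark --dangerously-bypass-approvals-and-sandbox -

# The app then effectively runs (prompt piped to stdin; not executed here):
#   printf '%s' "$PROMPT" | /usr/bin/env $ARGS
```

The trailing `-` in the args is what tells `codex exec` to read the task from stdin, which is why it must stay in any customized arg template.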
The default uses Codex's full-access mode because CUA controls real macOS apps outside the project workspace. Use this app only for tasks you trust the local Codex session to perform.
If your Codex subscription uses a different local command or flags, edit native/Sources/CUACodexMenu/AppConfig.swift.
This app does not bypass authentication or scrape ChatGPT. It invokes your local Codex tooling. If gpt-5.3-codex-spark is only available inside the ChatGPT product and not exposed to a local CLI/API in your account, the CLI will fail until OpenAI exposes that access path or you configure an available model.