# codex-cli-sdk

Strongly typed, async-first Rust SDK for building agents on the OpenAI Codex CLI.
## Features

- One-shot queries — `query()` sends a prompt and returns a collected `Turn`
- Streaming — `query_stream()` / `thread.run_streamed()` yield events as they arrive
- Multi-turn threads — `start_thread()` / `resume_thread()` for persistent conversations
- Item lifecycle events — structured `ThreadEvent` stream (`ItemStarted` → `ItemUpdated` → `ItemCompleted`)
- Approval callbacks — `ApprovalCallback` for programmatic per-command and per-patch approval
- Sandbox control — `SandboxPolicy` from `Restricted` to `DangerFullAccess`
- Reasoning effort — `ReasoningEffort` from `Minimal` to `XHigh`
- Web search — `WebSearchMode` (`Disabled` / `Cached` / `Live`)
- Structured output — `OutputSchema` (inline JSON or file path) for typed responses
- Local providers — point at lmstudio, ollama, or other local models
- Multimodal input — attach image files to any prompt
- Config overrides — pass flat or nested JSON overrides to the CLI via `-c key=val`
- Stderr callback — capture CLI debug output for logging/diagnostics
- Testing framework — `MockTransport`, event builders, and fixtures for unit tests without a live CLI
- Cross-platform — macOS, Linux, and Windows
## Installation

```toml
[dependencies]
codex-cli-sdk = "0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```

## Quick start

```rust
use codex_cli_sdk::{query, CodexConfig, ThreadOptions};

#[tokio::main]
async fn main() -> codex_cli_sdk::Result<()> {
    let turn = query(
        "List the files in this directory",
        CodexConfig::default(),
        ThreadOptions::default(),
    ).await?;
    println!("{}", turn.final_response);
    Ok(())
}
```

## Architecture

```text
                      CodexConfig
              (cli_path, env, overrides)
                           │
                           ▼
┌─────────────┐   ┌─────────────────┐   ┌─────────────────┐   ┌────────────┐
│  Your Code  │──▶│      Codex      │──▶│  CliTransport   │──▶│  Codex CLI │
│             │   │                 │   │    (or Mock)    │◀──│            │
│   query()   │   │ start_thread()  │   │  stdin/stdout   │   │   JSONL    │
│   query_    │   │ resume_thread() │   └─────────────────┘   │   stdio    │
│   stream()  │   └────────┬────────┘                         └────────────┘
└─────────────┘            │
                           ▼
              Thread::run() → Turn
              Thread::run_streamed() → StreamedTurn (yields ThreadEvents)
```
## Examples

All examples require a working Codex CLI installation (`npm install -g @openai/codex`).

| Example | Feature | Run |
|---|---|---|
| `01_basic_query` | One-shot query + token usage | `cargo run --example 01_basic_query` |
| `02_streaming` | Real-time event stream + item lifecycle | `cargo run --example 02_streaming` |
| `03_multi_turn` | Thread resumption across turns | `cargo run --example 03_multi_turn` |
| `04_approval` | Exec approval callbacks + all four decisions | `cargo run --example 04_approval` |
| `05_sandbox` | Sandbox policies, extra dirs, network access | `cargo run --example 05_sandbox` |
| `06_structured_output` | JSON Schema output + typed deserialization | `cargo run --example 06_structured_output` |
| `07_local_provider` | lmstudio / ollama + reasoning effort | `cargo run --example 07_local_provider` |
| `08_cancellation` | `CancellationToken` mid-stream abort | `cargo run --example 08_cancellation` |
### One-shot query

```rust
use codex_cli_sdk::{query, CodexConfig, ThreadOptions};

let turn = query(
    "Explain the main.rs file",
    CodexConfig::default(),
    ThreadOptions::default(),
).await?;

println!("{}", turn.final_response);
println!("Tokens: in={} out={}",
    turn.usage.as_ref().map_or(0, |u| u.input_tokens),
    turn.usage.as_ref().map_or(0, |u| u.output_tokens),
);
```

### Streaming

```rust
use codex_cli_sdk::{query_stream, CodexConfig, ThreadOptions, ThreadEvent, ThreadItem};
use tokio_stream::StreamExt;

let mut stream = query_stream(
    "Refactor the error handling in src/",
    CodexConfig::default(),
    ThreadOptions::default(),
).await?;

while let Some(event) = stream.next().await {
    match event? {
        ThreadEvent::ItemUpdated { item: ThreadItem::AgentMessage { text, .. } } => {
            print!("{text}");
        }
        ThreadEvent::ItemStarted { item: ThreadItem::CommandExecution { command, .. } } => {
            println!("\n> Running: {command}");
        }
        ThreadEvent::ItemCompleted { item: ThreadItem::FileChange { changes, .. } } => {
            for change in &changes {
                println!("  {} {:?}", change.path, change.kind);
            }
        }
        ThreadEvent::TurnCompleted { usage } => {
            println!("\n--- {}/{} tokens ---", usage.input_tokens, usage.output_tokens);
        }
        _ => {}
    }
}
```

### Multi-turn threads

```rust
use codex_cli_sdk::{Codex, CodexConfig, ThreadOptions};

let codex = Codex::new(CodexConfig::default())?;

// First turn
let mut thread = codex.start_thread(ThreadOptions::default());
let turn = thread.run("Create a hello.py file", Default::default()).await?;
let thread_id = thread.id().unwrap();

// Resume in a later session
let mut thread = codex.resume_thread(&thread_id, ThreadOptions::default());
let turn = thread.run("Now add error handling to hello.py", Default::default()).await?;
```

### Approval callbacks

```rust
use codex_cli_sdk::{Codex, CodexConfig, ThreadOptions};
use codex_cli_sdk::config::ApprovalPolicy;
use codex_cli_sdk::permissions::{ApprovalCallback, ApprovalContext, ApprovalDecision};
use std::sync::Arc;

let options = ThreadOptions::builder()
    .approval(ApprovalPolicy::OnRequest)
    .build();

let callback: ApprovalCallback = Arc::new(|ctx: ApprovalContext| {
    Box::pin(async move {
        println!("Approve command: {}", ctx.request.command);
        if ctx.request.command.starts_with("rm") {
            ApprovalDecision::Denied
        } else {
            ApprovalDecision::Approved
        }
    })
});

let codex = Codex::new(CodexConfig::default())?;
let mut thread = codex
    .start_thread(options)
    .with_approval_callback(callback);
let turn = thread.run("Clean up the build artifacts", Default::default()).await?;
```

### Event lifecycle

`run_streamed()` yields a sequence of `ThreadEvent`s following this lifecycle:
```text
ThreadStarted
└─ TurnStarted
     ├─ ItemStarted     { item: AgentMessage | CommandExecution | Reasoning | ... }
     │    ItemUpdated   (streaming text deltas)
     │    ItemCompleted
     │
     ├─ ApprovalRequest ←── responded to via ApprovalCallback
     └─ TurnCompleted   { usage }
```
### `ThreadItem` variants

| Variant | Description |
|---|---|
| `AgentMessage` | Text output from the agent (streams via `ItemUpdated`) |
| `Reasoning` | Extended reasoning/thinking block |
| `CommandExecution` | Shell command run by the agent, with output and exit code |
| `FileChange` | File creation, modification, or deletion (patch) |
| `McpToolCall` | MCP tool invocation with server, tool name, args, and result |
| `WebSearch` | Web search query |
| `TodoList` | Agent-managed task list |
| `Error` | Error item from the CLI |
### Sandbox control

```rust
use codex_cli_sdk::config::SandboxPolicy;

let options = ThreadOptions::builder()
    .sandbox(SandboxPolicy::Restricted)          // read-only
    // .sandbox(SandboxPolicy::WorkspaceWrite)   // default — workspace writable
    // .sandbox(SandboxPolicy::DangerFullAccess) // no restrictions
    .build();
```

### Reasoning effort

```rust
use codex_cli_sdk::config::ReasoningEffort;

let options = ThreadOptions::builder()
    .reasoning_effort(ReasoningEffort::High)
    .build();
```

Available levels: `Minimal`, `Low`, `Medium`, `High`, `XHigh`.
### Structured output

```rust
use codex_cli_sdk::config::OutputSchema;

let options = ThreadOptions::builder()
    .output_schema(OutputSchema::Inline(serde_json::json!({
        "type": "object",
        "properties": {
            "summary": { "type": "string" },
            "files_changed": { "type": "array", "items": { "type": "string" } }
        },
        "required": ["summary", "files_changed"]
    })))
    .build();
```

An inline schema is automatically written to a temp file and cleaned up after the turn.
### Web search

```rust
use codex_cli_sdk::config::WebSearchMode;

let options = ThreadOptions::builder()
    .web_search(WebSearchMode::Live)
    .build();
```

### Local providers

```rust
let options = ThreadOptions::builder()
    .local_provider("lmstudio") // or "ollama"
    .model("qwen2.5-coder-7b")
    .build();
```

### Config overrides

```rust
use codex_cli_sdk::config::ConfigOverrides;

// Nested JSON — auto-flattened to dot-notation `-c key=val` pairs
let config = CodexConfig::builder()
    .config_overrides(ConfigOverrides::Json(serde_json::json!({
        "sandbox_workspace_write": { "network_access": true }
    })))
    .build();

// Or flat key-value pairs
let config = CodexConfig::builder()
    .config_overrides(ConfigOverrides::Flat(vec![
        ("model".into(), "o4-mini".into()),
    ]))
    .build();
```

### Multimodal input

```rust
use std::path::PathBuf;

let options = ThreadOptions::builder()
    .images(vec![PathBuf::from("screenshot.png")])
    .build();

let turn = thread.run(
    "Describe what you see in this screenshot",
    Default::default(),
).await?;
```

### Stderr callback

```rust
use std::sync::Arc;

let config = CodexConfig::builder()
    .stderr_callback(Arc::new(|line: &str| {
        eprintln!("[codex] {line}");
    }))
    .build();
```

### Cancellation

```rust
use tokio_util::sync::CancellationToken;
use codex_cli_sdk::config::TurnOptions;

let cancel = CancellationToken::new();
let turn_opts = TurnOptions { cancel: Some(cancel.clone()), ..Default::default() };

// Cancel from another task:
cancel.cancel();

let result = thread.run_streamed("Long task", turn_opts).await;
```

### `CodexConfig` fields

| Field | Type | Default | Description |
|---|---|---|---|
| `cli_path` | `Option<PathBuf>` | `None` | Path to the `codex` binary; auto-discovered if `None` |
| `env` | `HashMap<String, String>` | `{}` | Extra environment variables for the subprocess |
| `config_overrides` | `ConfigOverrides` | `None` | Flat or nested JSON overrides (`-c key=val`) |
| `profile` | `Option<String>` | `None` | Config profile name (`--profile`) |
| `connect_timeout` | `Option<Duration>` | `30s` | Deadline for process spawn and init |
| `close_timeout` | `Option<Duration>` | `10s` | Deadline for graceful shutdown |
| `version_check_timeout` | `Option<Duration>` | `5s` | Deadline for the `codex --version` check |
| `stderr_callback` | `Option<StderrCallback>` | `None` | Invoked with each line of CLI stderr |

Set any `Option<Duration>` to `None` to wait indefinitely.
### `ThreadOptions` fields

| Field | Type | Default | Description |
|---|---|---|---|
| `working_directory` | `Option<PathBuf>` | `None` | Working directory (`--cd`) |
| `model` | `Option<String>` | `None` | Model name, e.g. `"o4-mini"`, `"gpt-5-codex"` |
| `sandbox` | `SandboxPolicy` | `WorkspaceWrite` | Sandbox isolation level |
| `approval` | `ApprovalPolicy` | `Never` | When to request approval for actions |
| `additional_directories` | `Vec<PathBuf>` | `[]` | Extra writable directories (`--add-dir`) |
| `skip_git_repo_check` | `bool` | `false` | Skip git repository validation |
| `reasoning_effort` | `Option<ReasoningEffort>` | `None` | Reasoning effort level |
| `network_access` | `Option<bool>` | `None` | Enable network access inside the sandbox |
| `web_search` | `Option<WebSearchMode>` | `None` | Web search mode |
| `output_schema` | `Option<OutputSchema>` | `None` | JSON Schema for structured output |
| `ephemeral` | `bool` | `false` | Don't persist the session to disk |
| `images` | `Vec<PathBuf>` | `[]` | Image files to include with the prompt |
| `local_provider` | `Option<String>` | `None` | Local provider name (`lmstudio`, `ollama`) |
## Testing

Enable the `testing` feature for unit tests without a live CLI:

```toml
[dev-dependencies]
codex-cli-sdk = { version = "0.1", features = ["testing"] }
```

```rust
use codex_cli_sdk::testing::{MockTransport, fixtures, builders};

// Use a pre-built fixture
let transport = fixtures::simple_text_response();

// Or assemble events manually
use builders::{thread_started, agent_message_completed, turn_completed};

let mut transport = MockTransport::new();
transport.enqueue_events(vec![
    thread_started("thread-1"),
    agent_message_completed("msg-1", "Hello!"),
    turn_completed(10, 5),
]);
```

Available fixtures: `simple_text_response`, `tool_call_session`, `approval_session`, `error_session`, `streaming_session`, `reasoning_session`.
## Troubleshooting

| Problem | Cause | Fix |
|---|---|---|
| `CliNotFound` error | Codex CLI not on `PATH` | Install: `npm install -g @openai/codex` |
| Timeout on first run | CLI slow to start | Increase `connect_timeout` or check CLI health |
| Approval request not handled | `ApprovalCallback` not set | Call `.with_approval_callback()` or use `ApprovalPolicy::Never` |
| `VersionCheck` error | CLI version check failed or its output couldn't be parsed | Update: `npm update -g @openai/codex` or check CLI health |
| Noisy stderr output | CLI debug output | Set `stderr_callback` to capture or silence it |
## Cargo features

| Feature | Description |
|---|---|
| `testing` | `MockTransport`, event builders, and fixtures for unit tests |
| `integration` | Integration test helpers (requires a live CLI) |
Supported platforms: macOS, Linux, and Windows.
This is an unofficial, community-developed SDK and is not affiliated with, endorsed by, or sponsored by OpenAI. "Codex" is a trademark of OpenAI. This crate interacts with the Codex CLI but does not contain any OpenAI proprietary code.
Licensed under either of Apache License, Version 2.0 or MIT License at your option.
Maintained by the POM team.