
Codex exits sandbox, finds another local LLM via ollama, and starts delegating its work to it #14580

@dkoding

Description


What version of Codex CLI is running?

codex-cli 0.111.0

What subscription do you have?

Pro

Which model were you using?

gpt-5.4 xhigh

What platform is your computer?

Windows 10 x64

What terminal emulator and version are you using (if applicable)?

CLI on Windows

What issue are you seeing?

Codex exits its sandbox and starts calling another local LLM via ollama instead of doing the work itself.
I had asked it to translate text.
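For context, "calling another LLM using ollama" typically means sending a request to the local Ollama HTTP API (by default at `http://localhost:11434/api/generate`). A minimal sketch of the kind of delegated translation call the report describes — the model name and prompt wording here are illustrative assumptions, not taken from the uploaded thread:

```python
import json
import urllib.request

# Default local Ollama endpoint; assumes an Ollama server is running.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_translation_request(text, model="llama3"):
    """Build the JSON payload for a delegated translation call.

    The model name and prompt wording are illustrative, not taken
    from the uploaded thread.
    """
    payload = {
        "model": model,
        "prompt": f"Translate the following text to English:\n{text}",
        "stream": False,  # ask for a single JSON response, not a stream
    }
    return json.dumps(payload).encode("utf-8")


def call_ollama(text, model="llama3"):
    """Send the translation request to the local Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_translation_request(text, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Whether such an outbound localhost request should be reachable from inside the sandbox at all is presumably the crux of the report.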

What steps can reproduce the bug?

Uploaded thread: 019ce6b7-aa21-7b91-a707-f0a6c7b085f4

What is the expected behavior?

No response

Additional information

No response

Metadata


Assignees

No one assigned

Labels

bug (Something isn't working), sandbox (Issues related to permissions or sandboxing)
