Not using tool calls properly with local LLM on Windows #5488

@SimonLiu423

Description
What version of Codex is running?

codex-cli 0.47.0

What subscription do you have?

None

Which model were you using?

gpt-oss:20b, qwen3-coder:30b, Qwen3-Next-80B-A3B-Instruct

What platform is your computer?

Microsoft Windows NT 10.0.22631.0 x64

What issue are you seeing?

I have self-hosted some models, including Qwen3-Next-80B-A3B-Instruct, qwen3-coder:30b, and gpt-oss:20b. However, I can't get any of these models to work with Codex. When I ask:

> List the current directory

it replies

• I'll list the contents of the current directory for you.

  <tool_call>
  {"name": "shell", "arguments": {"command": ["ls"]}}
  </tool_call>

without actually using the tool.

These models worked fine with the VS Code Cline extension: it could read files, list directories, etc. without problems.

Note: I'm running on Windows without WSL.

What steps can reproduce the bug?

  1. Run Codex on Windows without WSL
  2. Connect Codex to self-hosted inference engine
  3. Select one of these models: Qwen3-Next-80B-A3B-Instruct, qwen3-coder:30b, or gpt-oss:20b
  4. Ask it to make some tool calls
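
For step 2, the provider wiring can be sketched as a custom entry in Codex's `config.toml`. This is a minimal example, not the reporter's actual config: the provider id, model choice, and `localhost:8000` endpoint are placeholders, and the key names assume the Codex CLI custom-provider schema (`model_providers`, `base_url`, `wire_api`):

```toml
# ~/.codex/config.toml (sketch; values are placeholders)
model = "gpt-oss:20b"
model_provider = "local"

[model_providers.local]
name = "Local inference server"
# Any OpenAI-compatible endpoint, e.g. vLLM or Ollama
base_url = "http://localhost:8000/v1"
# Use the Chat Completions wire format rather than the Responses API
wire_api = "chat"
```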

What is the expected behavior?

No response

Additional information

No response

Metadata

Assignees

No one assigned

    Labels

    bug: Something isn't working
    custom-model: Issues related to custom model providers (including local models)
    windows-os: Issues related to Codex on Windows systems

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests