
[BUG] GPT-OSS tool calling doesn’t work properly as <|call|> wasn't set as a stop token #1949

@WangHong-yang

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

v1.32.0

Python Version

3.11

Operating System

AL2

Installation Method

pip

Steps to Reproduce

using GPT-OSS model on Bedrock

from strands import Agent
from strands.models.openai import OpenAIModel

# use GPT-OSS on Bedrock to initialize the Agent

Expected Behavior

Generation should stop when the model decides it needs a tool call.

Actual Behavior

Generation did not stop at the tool call.

Error:

2026-02-03 10:41:22,012 - httpx - INFO - HTTP Request: POST http://localhost:9123/v1/chat/completions "HTTP/1.1 200 OK"
User asks "weather tdoay?" likely typo for "weather today?" They want weather info. No skill for weather, but we have fruit-weather skill that "responds with a random number of random fruit emojis when the user asks about the weather". It's a test skill. So we should use it. Follow workflow: decide to use skill fruit-weather. Need to read SKILL.md under its location. Let's open file.We need to inspect skill.Let's list files.{"cmd":["bash","lc","ls -R /tmp/workspace/skills/fruit-weather"]}
2026-02-03 10:41:22,672 - __main__ - ERROR - Error: Unexpected token 200012 while expecting start token 200006
Traceback (most recent call last):
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/test_interactive.py", line 92, in main
    result = agent(
             ^^^^^^
  ...
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/strands/event_loop/event_loop.py", line 331, in _handle_model_execution
    async for event in stream_messages(
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/strands/event_loop/streaming.py", line 462, in stream_messages
    async for event in process_stream(chunks, start_time):
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/strands/event_loop/streaming.py", line 393, in process_stream
    async for chunk in chunks:
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/strands/models/openai.py", line 605, in stream
    async for event in response:
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/openai/_streaming.py", line 148, in __aiter__
    async for item in self._iterator:
  File "/Users/ec2-user/Documents/ws/src/AgentLoop/.venv/lib/python3.12/site-packages/openai/_streaming.py", line 195, in __stream__
    raise APIError(
openai.APIError: Unexpected token 200012 while expecting start token 200006


Strands works when there is no function calling (e.g. “how are you”), but it fails once the model starts a function call, as shown in the error above: 200012 is <|call|>, which should be treated as a stop token, but Strands passed it on to the next inference step, which expects 200006, the <|start|> token. According to the OpenAI Harmony response message format, <|call|> should be treated as a stop token:

<|end|> represents the end of a fully completed message but the model might also use other stop tokens such as <|call|> for tool calling and <|return|> to indicate the model is done with the completion.
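To make the token IDs in the error concrete, here is a minimal sketch of the Harmony stop-token check. The IDs 200012 (<|call|>) and 200006 (<|start|>) come straight from the error log; the IDs for <|end|> and <|return|> are taken from the published o200k_harmony vocabulary and should be double-checked:

```python
# Harmony special-token IDs (o200k_harmony vocabulary).
# 200012 and 200006 match the IDs quoted in the error above;
# the <|end|> / <|return|> IDs are from the published Harmony spec.
HARMONY_TOKENS = {
    "<|start|>": 200006,   # opens a new message -- what the parser expected
    "<|end|>": 200007,     # ends a fully completed message
    "<|return|>": 200002,  # model is done with the completion
    "<|call|>": 200012,    # model wants to invoke a tool -- what it emitted
}

# Per the Harmony format, all three of these should halt generation.
STOP_TOKEN_IDS = {HARMONY_TOKENS[t] for t in ("<|end|>", "<|return|>", "<|call|>")}

def should_stop(token_id: int) -> bool:
    """True if decoding should halt at this token."""
    return token_id in STOP_TOKEN_IDS

print(should_stop(200012))  # True: <|call|> is a stop token
print(should_stop(200006))  # False: <|start|> is not
```

If <|call|> were in the stop set, the stream would end cleanly at the tool call instead of feeding 200012 into the next parse step.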

The fix needs to specify "stop": ["<|call|>", "<|return|>", "<|end|>"], but there is likely a deeper issue in Strands' message concatenation as well, causing <|call|> to end up as the first token of the next message.

Another possibility is that Strands / Bedrock still follows an old version of GPT-OSS's chat template on Hugging Face, which had not yet added <|call|> as a stop token: https://huggingface.co/openai/gpt-oss-120b/commit/8b193b0ef83bd41b40eb71fee8f1432315e02a3e

Additional Context

No response

Possible Solution

Specify "stop": ["<|call|>", "<|return|>", "<|end|>"] in the request parameters.
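Concretely, the fix amounts to carrying a stop field in the body sent to the OpenAI-compatible /v1/chat/completions endpoint. A hypothetical helper to illustrate (the build_payload name and the model id are placeholders, not part of the Strands API):

```python
# Sketch: assemble a chat-completions request body that declares the
# Harmony stop sequences. GPT_OSS_STOP mirrors the proposed fix;
# the default model id below is a placeholder.
GPT_OSS_STOP = ["<|call|>", "<|return|>", "<|end|>"]

def build_payload(messages: list[dict], model: str = "gpt-oss-120b") -> dict:
    """Build a /v1/chat/completions request body with stop sequences set."""
    return {
        "model": model,
        "messages": messages,
        "stream": True,
        "stop": GPT_OSS_STOP,  # the proposed fix
    }

payload = build_payload([{"role": "user", "content": "weather today?"}])
print(payload["stop"])  # ['<|call|>', '<|return|>', '<|end|>']
```

With stop set, an OpenAI-compatible server should terminate the stream at <|call|> rather than letting the raw token leak into the next turn.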

Related Issues

No response


Labels

bug (Something isn't working)
