
top_p: null still sent in API requests after #385 fix, causing errors on Bedrock and other providers #414

@toslali-ibm

Description

PR #385 (closing #378) made top_p optional and changed its default to None in LLMConfig. However, the fix only addressed the config layer — the API call layer in openevolve/llm/openai.py still unconditionally includes top_p in the request params:

# openevolve/llm/openai.py ~L157
params = {
    "model": self.model,
    "messages": formatted_messages,
    "temperature": kwargs.get("temperature", self.temperature),
    "top_p": kwargs.get("top_p", self.top_p),   # <-- sends None when top_p is unset
    "max_tokens": kwargs.get("max_tokens", self.max_tokens),
}

After #385, self.top_p defaults to None, so this puts "top_p": None into the params dict. Whether this causes an error depends on the downstream API client:

  • The official OpenAI Python SDK may strip None values internally (it uses NOT_GIVEN as its sentinel for omitted parameters)
  • AWS Bedrock, LiteLLM, and other OpenAI-compatible wrappers may serialize it as "top_p": null in the JSON body, triggering a 400 error
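The second failure mode is easy to see with plain JSON serialization, which is roughly what a thin OpenAI-compatible wrapper does (the model name below is a placeholder):

```python
import json

# A params dict built the way openai.py currently builds it, with top_p unset.
# Plain json.dumps keeps None-valued keys as null instead of dropping them.
params = {
    "model": "some-model",  # placeholder, not a real model id
    "temperature": 0.7,
    "top_p": None,          # defaults to None after #385
}

body = json.dumps(params)
print(body)  # the body contains "top_p": null
```

Providers that validate the raw body see `top_p` as explicitly set, which is what trips the "cannot both be specified" check.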

Reproduction

  1. Configure OpenEvolve with an AWS Bedrock endpoint (or any OpenAI-compatible provider that does not strip null values)
  2. Do not set top_p in the config (relying on the None default introduced by #385)
  3. Run an evolution — the API call includes "top_p": null, resulting in:
Error code: 400 - {'error': {'code': 'invalid_request_error', 'message': '`temperature` and `top_p` cannot both be specified for this model.'}}

Expected behavior

When top_p is None, the key should be omitted from the params dict entirely, rather than being sent as null.

Suggested fix

-            params = {
-                "model": self.model,
-                "messages": formatted_messages,
-                "temperature": kwargs.get("temperature", self.temperature),
-                "top_p": kwargs.get("top_p", self.top_p),
-                "max_tokens": kwargs.get("max_tokens", self.max_tokens),
-            }
+            params = {
+                "model": self.model,
+                "messages": formatted_messages,
+                "temperature": kwargs.get("temperature", self.temperature),
+                "max_tokens": kwargs.get("max_tokens", self.max_tokens),
+            }
+            top_p = kwargs.get("top_p", self.top_p)
+            if top_p is not None:
+                params["top_p"] = top_p
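The same pattern generalizes to any optional sampling parameter. A hypothetical helper (not in the codebase, just a sketch of the idea) would filter every None-valued key in one place:

```python
def build_params(**kwargs):
    """Sketch: drop None-valued keys so unset optional params
    are omitted from the request body instead of sent as null."""
    return {k: v for k, v in kwargs.items() if v is not None}

# top_p=None is silently dropped; explicitly set params pass through
params = build_params(
    model="some-model",  # placeholder model id
    temperature=0.7,
    top_p=None,
    max_tokens=256,
)
# params == {"model": "some-model", "temperature": 0.7, "max_tokens": 256}
```

This keeps the omit-when-None behavior consistent if other optional params (e.g. a future penalty setting) get the same None default.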
