PR #385 (closing #378) made `top_p` optional and changed its default to `None` in `LLMConfig`. However, the fix only addressed the config layer; the API call layer in `openevolve/llm/openai.py` still unconditionally includes `top_p` in the request params:
```python
# openevolve/llm/openai.py ~L157
params = {
    "model": self.model,
    "messages": formatted_messages,
    "temperature": kwargs.get("temperature", self.temperature),
    "top_p": kwargs.get("top_p", self.top_p),  # <-- sends None when top_p is unset
    "max_tokens": kwargs.get("max_tokens", self.max_tokens),
}
```

After #385, `self.top_p` defaults to `None`, so this puts `"top_p": None` into the params dict. Whether this causes an error depends on the downstream API client:
- The standard OpenAI SDK may strip `None` values internally (it uses `NOT_GIVEN` as its sentinel)
- AWS Bedrock, LiteLLM, and other OpenAI-compatible wrappers may serialize it as `"top_p": null` in the JSON body, triggering a 400 error
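The divergence comes down to JSON serialization: Python's standard `json` module, like many OpenAI-compatible wrappers, serializes `None` as `null` rather than dropping the key. A minimal sketch (standalone, not tied to any SDK; the dict values are placeholders):

```python
import json

# A request body built the way openai.py currently builds it,
# with top_p left at its None default
params = {"model": "demo-model", "temperature": 0.7, "top_p": None}

# Naive serialization keeps the key as null -- this is what providers
# that do not strip null values will receive
body = json.dumps(params)

# Dropping None-valued keys before serialization avoids the problem
clean = {k: v for k, v in params.items() if v is not None}
```

Here `body` contains `"top_p": null`, while `clean` omits the key entirely.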
Reproduction
1. Configure OpenEvolve with an AWS Bedrock endpoint (or any OpenAI-compatible provider that does not strip `null` values)
2. Do not set `top_p` in the config (relying on the `None` default from #385, "Fix Anthropic models error when both temperature and top_p are passed")
3. Run an evolution. The API call includes `"top_p": null`, resulting in:

```
Error code: 400 - {'error': {'code': 'invalid_request_error', 'message': '`temperature` and `top_p` cannot both be specified for this model.'}}
```
Expected behavior
When `top_p` is `None`, the key should be omitted from the params dict entirely, rather than being sent as `null`.
Suggested fix
```diff
- params = {
-     "model": self.model,
-     "messages": formatted_messages,
-     "temperature": kwargs.get("temperature", self.temperature),
-     "top_p": kwargs.get("top_p", self.top_p),
-     "max_tokens": kwargs.get("max_tokens", self.max_tokens),
- }
+ params = {
+     "model": self.model,
+     "messages": formatted_messages,
+     "temperature": kwargs.get("temperature", self.temperature),
+     "max_tokens": kwargs.get("max_tokens", self.max_tokens),
+ }
+ top_p = kwargs.get("top_p", self.top_p)
+ if top_p is not None:
+     params["top_p"] = top_p
```
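The same logic can be checked in isolation. The sketch below is a hypothetical standalone helper mirroring the suggested fix (the `self.*` attributes are replaced by plain parameters; it is not the actual method):

```python
def build_params(model, messages, temperature, max_tokens, top_p=None):
    """Build request params, omitting top_p when it is unset (None)."""
    params = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    # Only include top_p when a value was actually configured
    if top_p is not None:
        params["top_p"] = top_p
    return params

# With top_p unset, the key is absent from the request entirely
assert "top_p" not in build_params("m", [], 0.7, 1024)
# An explicitly configured value still passes through
assert build_params("m", [], 0.7, 1024, top_p=0.9)["top_p"] == 0.9
```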