`handle_get_suggestion_queries` in `src/memos/api/handlers/suggestion_handler.py` builds the LLM request as a single system message. OpenAI accepts this shape. Several other LLM backends do not, including ones MemOS already documents/configures (`minimax`, plus Anthropic-compatible endpoints such as the MiniMax Anthropic gateway):
| Backend | Endpoint | Response |
| --- | --- | --- |
| MiniMax Text API | `https://api.minimax.io/v1/chat/completions` | HTTP 400 `invalid params, chat content is empty (2013)` |
| MiniMax Anthropic-compat | `https://api.minimax.io/anthropic/v1/messages` | HTTP 400 `invalid params, messages must not be empty (2013)` |
| Anthropic native | `https://api.anthropic.com/v1/messages` | (same; `system` is a top-level field, `messages[]` must have ≥1 user turn) |
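The constraint in the last two rows can be sketched offline. The helper below is hypothetical (not MemOS or SDK code); it mimics how Anthropic-style endpoints hoist system text into a top-level `system` field, which leaves a system-only request with an empty `messages` list and triggers the 400:

```python
def split_system(message_list):
    """Hoist system messages to a top-level field, Anthropic-style.

    Anthropic-compatible endpoints take system text as a top-level
    `system` parameter; the `messages` array must then hold at least
    one non-system turn, or the request is rejected.
    """
    system = "\n".join(m["content"] for m in message_list if m["role"] == "system")
    messages = [m for m in message_list if m["role"] != "system"]
    return {"system": system, "messages": messages}


# The shape suggestion_handler currently sends: one system message, nothing else.
payload = split_system([{"role": "system", "content": "Generate suggestion queries."}])
print(payload["messages"])  # [] -> "messages must not be empty (2013)"
```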
Per MiniMax's own minimal example in the official docs, even system-only requests are rejected; the minimal accepted request pairs the system message with a user turn (see Reproduction below).
## Proposed fix

Split the prompt into a short system persona and a user message carrying the actual instruction. OpenAI accepts this shape; MiniMax/Anthropic require it:
```python
message_list = [
    {
        "role": "system",
        "content": "You generate suggestion queries based on the user's recent memories.",
    },
    {
        "role": "user",
        "content": suggestion_prompt.format(memories=memories),
    },
]
```
This is a strict widening — every backend that accepts the original system-only shape also accepts the split shape. No behaviour change for existing OpenAI/Azure deployments.
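The widening claim can be checked mechanically. The two validators below are hypothetical approximations of the acceptance rules in the table above, not real SDK calls:

```python
def openai_ok(msgs):
    # OpenAI-compatible endpoints accept any non-empty message list,
    # including a list containing only a system message.
    return len(msgs) > 0


def anthropic_ok(msgs):
    # Anthropic-compatible endpoints additionally require at least one
    # non-system turn once system content is hoisted out of `messages`.
    return any(m["role"] != "system" for m in msgs)


system_only = [{"role": "system", "content": "persona + instruction"}]
split = [
    {"role": "system", "content": "persona"},
    {"role": "user", "content": "instruction"},
]

print(openai_ok(system_only), anthropic_ok(system_only))  # True False
print(openai_ok(split), anthropic_ok(split))              # True True
```

Every backend that accepts `system_only` also accepts `split`, but not vice versa, which is what makes the change safe for existing deployments.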
## Related

Other handlers in `src/memos/api/handlers/` should be audited for the same `[{"role": "system", "content": ...}]` pattern; I only fixed `suggestion_handler` in this PR but am happy to extend.
## Reproduction

Per MiniMax's own minimal example in the official docs, even system-only is rejected; the minimal request pairs the system message with a user turn:

```json
{
  "model": "MiniMax-M2.7",
  "messages": [
    { "role": "system", "name": "MiniMax AI" },
    { "role": "user", "name": "user", "content": "hello" }
  ]
}
```

1. Configure `MOS_CHAT_MODEL=MiniMax-M2.7`, `OPENAI_API_BASE=https://api.minimax.io/v1`, and a valid `OPENAI_API_KEY`.
2. `POST /product/suggestions` with `mem_cube_id=<existing cube>` and `language=en`.
3. The call fails with `openai.BadRequestError: Error code: 400 ... 'chat content is empty (2013)'`. Without: `AttributeError: 'NoneType' object has no attribute 'replace'`.

PR follows.