Is there an existing issue for this?
Current Behavior
The ai-proxy plugin currently uses Spring AI's ChatClient API to call upstream LLM providers. This causes the gateway to return Spring AI's internal ChatResponse object structure instead of the standard OpenAI Chat Completions API format.
Clients expecting an OpenAI-compatible API (e.g., ChatCompletion / ChatCompletionChunk) receive a non-standard response and cannot parse it correctly.
Additionally, Spring AI's createRequest() reconstructs ChatCompletionMessage from AssistantMessage, which loses fields such as reasoning_content, refusal, and annotations from the original request.
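For reference, clients integrating against an OpenAI-compatible endpoint expect the standard Chat Completions response shape rather than Spring AI's internal ChatResponse structure. A non-streaming sketch (field values are illustrative only):

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello!"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 10,
    "completion_tokens": 2,
    "total_tokens": 12
  }
}
```

Streaming responses instead emit a sequence of `chat.completion.chunk` objects whose choices carry a `delta` field, which is what `ChatCompletionChunk` parsers expect.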
Expected Behavior
No response
Steps To Reproduce
No response
Environment
Debug logs
No response
Anything else?
By the way, does ShenYu need to support other AI protocols, such as Anthropic? And does ShenYu need to convert between protocols, for example when the client speaks the OpenAI Chat Completions protocol but the provider is Anthropic?