[BUG] AI Proxy plugin returns Spring AI ChatResponse instead of OpenAI Chat Completions format #6340

@eye-gu

Description

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

The ai-proxy plugin currently uses Spring AI's ChatClient API to call upstream LLM providers. This causes the gateway to return Spring AI's internal ChatResponse object structure instead of the standard OpenAI Chat Completions API format.

Clients expecting an OpenAI-compatible API (e.g., ChatCompletion / ChatCompletionChunk) receive a non-standard response and cannot parse it correctly.
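For reference, an OpenAI-compatible client expects a `chat.completion` response shaped like the following (values are illustrative):

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1719000000,
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hello!" },
      "finish_reason": "stop"
    }
  ],
  "usage": { "prompt_tokens": 9, "completion_tokens": 3, "total_tokens": 12 }
}
```

Spring AI's serialized `ChatResponse` does not follow this top-level shape (no `object`, `choices`, or `finish_reason` keys), which is why standard OpenAI SDKs fail to parse it.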

Additionally, Spring AI's createRequest() reconstructs ChatCompletionMessage from AssistantMessage, which loses fields such as reasoning_content, refusal, and annotations from the original request.
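The field loss can be sketched with simplified, hypothetical record types (these are not Spring AI's actual classes; the field names mirror the OpenAI Chat Completions schema). The internal assistant representation only keeps role and content, so any round-trip through it cannot recover the extended fields:

```java
public class LossyConversionSketch {
    // Incoming OpenAI-style message carrying extended fields.
    record ChatCompletionMessage(String role, String content,
                                 String reasoningContent, String refusal) {}

    // Simplified stand-in for the internal AssistantMessage:
    // only the content survives the conversion.
    record AssistantMessage(String content) {}

    static AssistantMessage toInternal(ChatCompletionMessage m) {
        // reasoningContent and refusal are dropped here
        return new AssistantMessage(m.content());
    }

    static ChatCompletionMessage toRequest(AssistantMessage m) {
        // createRequest()-style reconstruction: the dropped fields
        // can only come back as null
        return new ChatCompletionMessage("assistant", m.content(), null, null);
    }

    public static void main(String[] args) {
        var original = new ChatCompletionMessage(
                "assistant", "hi", "step-by-step reasoning...", null);
        var roundTripped = toRequest(toInternal(original));
        // prints: null (the reasoning content did not survive the round-trip)
        System.out.println(roundTripped.reasoningContent());
    }
}
```

A fix would either pass the original request body through untouched or carry the extra fields in message metadata so the outbound request can be rebuilt losslessly.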

Expected Behavior

No response

Steps To Reproduce

No response

Environment

ShenYu version(s):

Debug logs

No response

Anything else?

By the way, does ShenYu need to support other AI protocols, such as Anthropic? And should ShenYu convert between protocols, for example when the client speaks the OpenAI Chat Completions protocol but the provider is Anthropic?
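To illustrate why conversion would be needed: the Anthropic Messages API differs structurally from OpenAI Chat Completions. For example, an Anthropic request looks roughly like this (model name is illustrative):

```json
{
  "model": "claude-3-5-sonnet-latest",
  "max_tokens": 1024,
  "system": "You are a helpful assistant.",
  "messages": [
    { "role": "user", "content": "Hello" }
  ]
}
```

Key differences: the system prompt is a top-level `system` field rather than a `system`-role message, `max_tokens` is required, and the response puts text inside a `content` block array with `stop_reason` instead of `choices[].finish_reason`. A gateway-level translation layer would have to map all of these in both directions.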

Metadata

Assignees

No one assigned

    Labels

    type: bug (Something isn't working)
