[google-adk-spring-ai] MessageConverter.toLlmResponse does not map ChatResponse Usage to LlmResponse #1159

@caliu1

Description

contrib/spring-ai/src/main/java/com/google/adk/models/springai/MessageConverter.java

🔴 Required Information

Describe the Bug:
com.google.adk.models.springai.MessageConverter.toLlmResponse(ChatResponse, boolean) drops Spring AI ChatResponse usage metadata.
Even when chatResponse.getMetadata().getUsage() is present and non-zero, the returned LlmResponse has empty usageMetadata.
This breaks token accounting in downstream ADK plugins/callbacks.

Steps to Reproduce:

  1. Use google-adk-spring-ai in a Java project with streaming chat.
  2. Enable streaming usage on Spring AI model options (e.g. OpenAiChatOptions.builder().streamUsage(true)).
  3. Get a ChatResponse where chatResponse.getMetadata().getUsage() is non-null and has non-zero values.
  4. Convert it via MessageConverter.toLlmResponse(chatResponse, true).
  5. Check llmResponse.usageMetadata() -> it is empty / missing.

Expected Behavior:
LlmResponse.usageMetadata should be populated from ChatResponse.getMetadata().getUsage()
(e.g. prompt/completion/total token counts), so ADK callbacks/plugins can read usage reliably.

Observed Behavior:
LlmResponse contains content/partial/turnComplete, but no usageMetadata.
As a result, token metrics remain 0 in usage-dependent logic.

Environment Details:

  • ADK Library Version (see maven dependency): google-adk-spring-ai:0.9.0 (also google-adk:0.9.0)
  • OS: Windows
  • TS Version (tsc --version): N/A (Java project)

Model Information:

  • Which model is being used: qwen-plus (OpenAI-compatible endpoint via DashScope)

🟡 Optional Information

Logs:

ChatResponse metadata usage is present and non-zero before conversion.
After MessageConverter.toLlmResponse(...), LlmResponse.usageMetadata is empty.
Downstream token counters therefore record 0.

Additional Context:
In the current implementation, toLlmResponse(ChatResponse, boolean) builds LlmResponse with content, partial,
and turnComplete, but does not map usage from the Spring AI response metadata.
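A possible fix would copy the token counts into the LlmResponse when building it. The sketch below shows the intended shape of that mapping only; `SimpleUsage` and `SimpleUsageMetadata` are hypothetical local stand-ins for Spring AI's Usage and ADK's usage-metadata type, not the real classes, so this compiles without either dependency:

```java
import java.util.Optional;

public class UsageMappingSketch {

    // Stand-in for Spring AI's Usage (prompt/completion/total token counts).
    public record SimpleUsage(Integer promptTokens, Integer completionTokens, Integer totalTokens) {}

    // Stand-in for the ADK usage metadata attached to LlmResponse.
    public record SimpleUsageMetadata(int promptTokenCount, int candidatesTokenCount, int totalTokenCount) {}

    // Mirrors what MessageConverter.toLlmResponse could do with
    // chatResponse.getMetadata().getUsage(): map counts when present,
    // otherwise leave usageMetadata empty.
    public static Optional<SimpleUsageMetadata> toUsageMetadata(SimpleUsage usage) {
        if (usage == null || usage.totalTokens() == null || usage.totalTokens() == 0) {
            return Optional.empty();
        }
        return Optional.of(new SimpleUsageMetadata(
                usage.promptTokens() == null ? 0 : usage.promptTokens(),
                usage.completionTokens() == null ? 0 : usage.completionTokens(),
                usage.totalTokens()));
    }

    public static void main(String[] args) {
        Optional<SimpleUsageMetadata> mapped = toUsageMetadata(new SimpleUsage(12, 8, 20));
        System.out.println(mapped.isPresent());              // true
        System.out.println(mapped.get().totalTokenCount());  // 20
    }
}
```

With a mapping like this in place, the repro test below would pass, since `usageMetadata()` would be populated whenever the Spring AI metadata carries non-zero usage.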

Minimal Reproduction Code:

package repro;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.adk.models.LlmResponse;
import com.google.adk.models.springai.MessageConverter;
import org.junit.jupiter.api.Test;
import org.springframework.ai.chat.messages.AssistantMessage;
import org.springframework.ai.chat.metadata.ChatResponseMetadata;
import org.springframework.ai.chat.metadata.DefaultUsage;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.model.Generation;

import java.util.List;

import static org.junit.jupiter.api.Assertions.*;

class MessageConverterUsageReproTest {

    @Test
    void usageShouldBeMappedFromSpringAiChatResponseToAdkLlmResponse() {
        MessageConverter converter = new MessageConverter(new ObjectMapper());

        AssistantMessage assistantMessage = new AssistantMessage("hello");
        Generation generation = new Generation(assistantMessage);

        DefaultUsage usage = new DefaultUsage(12, 8, 20);
        ChatResponseMetadata metadata = ChatResponseMetadata.builder()
                .id("resp-1")
                .model("dummy-model")
                .usage(usage)
                .build();

        ChatResponse chatResponse = new ChatResponse(List.of(generation), metadata);

        // Precondition: Spring AI usage exists and is non-zero
        assertNotNull(chatResponse.getMetadata());
        assertNotNull(chatResponse.getMetadata().getUsage());
        assertEquals(20, chatResponse.getMetadata().getUsage().getTotalTokens());

        LlmResponse llmResponse = converter.toLlmResponse(chatResponse, true);

        // Expected: should be present, but currently missing in google-adk-spring-ai
        assertTrue(llmResponse.usageMetadata().isPresent(),
                "Expected usageMetadata to be mapped from ChatResponse metadata usage");
    }
}

How often has this issue occurred?:

  • Always (100%)

Labels

waiting on reporter: Waiting for reaction by reporter. Failing that, maintainers will eventually close it as stale.
