contrib/spring-ai/src/main/java/com/google/adk/models/springai/MessageConverter.java
🔴 Required Information
Describe the Bug:
com.google.adk.models.springai.MessageConverter.toLlmResponse(ChatResponse, boolean) drops Spring AI ChatResponse usage metadata.
Even when chatResponse.getMetadata().getUsage() is present and non-zero, the returned LlmResponse has empty usageMetadata.
This breaks token accounting in downstream ADK plugins/callbacks.
Steps to Reproduce:
- Use google-adk-spring-ai in a Java project with streaming chat.
- Enable streaming usage on the Spring AI model options (e.g. OpenAiChatOptions.builder().streamUsage(true)).
- Obtain a ChatResponse where chatResponse.getMetadata().getUsage() is non-null and has non-zero values.
- Convert it via MessageConverter.toLlmResponse(chatResponse, true).
- Check llmResponse.usageMetadata() -> it is empty / missing.
Expected Behavior:
LlmResponse.usageMetadata should be populated from ChatResponse.getMetadata().getUsage()
(e.g. prompt/completion/total token counts), so ADK callbacks/plugins can read usage reliably.
Observed Behavior:
LlmResponse contains content/partial/turnComplete, but no usageMetadata.
As a result, token metrics remain 0 in usage-dependent logic.
Environment Details:
- ADK Library Version (see maven dependency): google-adk-spring-ai:0.9.0 (also google-adk:0.9.0)
- OS: Windows
- TS Version (tsc --version): N/A (Java project)
Model Information:
- Which model is being used: qwen-plus (OpenAI-compatible endpoint via DashScope)
🟡 Optional Information
Logs:
ChatResponse metadata usage is present and non-zero before conversion.
After MessageConverter.toLlmResponse(...), LlmResponse.usageMetadata is empty.
Downstream token counters therefore record 0.
Additional Context:
In the current implementation, toLlmResponse(ChatResponse, boolean) builds LlmResponse with content, partial,
and turnComplete, but does not map usage from the Spring AI response metadata.
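The missing step can be illustrated with a small, self-contained sketch of the null-safe mapping that toLlmResponse would need. Note that Usage and UsageMetadata below are hypothetical stand-ins for Spring AI's org.springframework.ai.chat.metadata.Usage and ADK's usage metadata type; only the shape of the mapping is the point, not the exact API.

```java
import java.util.Optional;

// Hypothetical stand-ins for Spring AI's Usage and ADK's usage metadata type.
record Usage(Integer promptTokens, Integer completionTokens, Integer totalTokens) {}
record UsageMetadata(int promptTokenCount, int candidatesTokenCount, int totalTokenCount) {}

class UsageMapper {
    // Null-safe mapping: returns empty when the response carries no usage,
    // mirroring what toLlmResponse(ChatResponse, boolean) would do with
    // chatResponse.getMetadata().getUsage() before setting usageMetadata
    // on the LlmResponse builder.
    static Optional<UsageMetadata> mapUsage(Usage usage) {
        if (usage == null) {
            return Optional.empty();
        }
        return Optional.of(new UsageMetadata(
                usage.promptTokens() == null ? 0 : usage.promptTokens(),
                usage.completionTokens() == null ? 0 : usage.completionTokens(),
                usage.totalTokens() == null ? 0 : usage.totalTokens()));
    }
}
```

With this shape in place, the repro test below would pass, since a non-null usage always yields a present usageMetadata.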
Minimal Reproduction Code:
package repro;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.adk.models.LlmResponse;
import com.google.adk.models.springai.MessageConverter;
import org.junit.jupiter.api.Test;
import org.springframework.ai.chat.messages.AssistantMessage;
import org.springframework.ai.chat.metadata.ChatResponseMetadata;
import org.springframework.ai.chat.metadata.DefaultUsage;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.model.Generation;

import java.util.List;

import static org.junit.jupiter.api.Assertions.*;

class MessageConverterUsageReproTest {

  @Test
  void usageShouldBeMappedFromSpringAiChatResponseToAdkLlmResponse() {
    MessageConverter converter = new MessageConverter(new ObjectMapper());

    AssistantMessage assistantMessage = new AssistantMessage("hello");
    Generation generation = new Generation(assistantMessage);
    DefaultUsage usage = new DefaultUsage(12, 8, 20);
    ChatResponseMetadata metadata = ChatResponseMetadata.builder()
        .id("resp-1")
        .model("dummy-model")
        .usage(usage)
        .build();
    ChatResponse chatResponse = new ChatResponse(List.of(generation), metadata);

    // Precondition: Spring AI usage exists and is non-zero
    assertNotNull(chatResponse.getMetadata());
    assertNotNull(chatResponse.getMetadata().getUsage());
    assertEquals(20, chatResponse.getMetadata().getUsage().getTotalTokens());

    LlmResponse llmResponse = converter.toLlmResponse(chatResponse, true);

    // Expected: should be present, but currently missing in google-adk-spring-ai
    assertTrue(llmResponse.usageMetadata().isPresent(),
        "Expected usageMetadata to be mapped from ChatResponse metadata usage");
  }
}
How often has this issue occurred?: