
Conversation

@sandangel commented Nov 15, 2025

Linked Issue

Closes: #2014
Resolves the merge conflicts in #2698

Summary

This PR introduces progressive streaming support for non-live runs (run_async).

* Tools can now emit **intermediate progress updates** as partial `function_response` events.

* **Only the final `function_response` is injected back into the model.**

* Intermediate results are surfaced to the **user/runner only**, not used by the model for reasoning.

* **Live runner behavior remains unchanged** (live mode continues to inject intermediates into the model loop).

* **Existing non-progressive tools remain unaffected.**

Rationale

Users need progress visibility for long-running tools during run_async, without switching to the live APIs. This feature mirrors live runner ergonomics while remaining opt-in and ensuring model reasoning is unchanged (only final outputs affect the model).

New API

* `google.adk.tools.progressive_function_tool.ProgressiveFunctionTool` (abstract base)

* `google.adk.tools.progressive_tool.ProgressiveTool` (convenience wrapper)

Supported patterns:

* **Async generator:** each `yield` → partial; last `yield` → final

* **Async coroutine with optional `progress` or `progress_callback`:** injected reporter emits partials; return value → final
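
For illustration, a minimal sketch of the two patterns. The wrapping style `ProgressiveTool(export_report)` matches the E2E script below; the callback reporter being awaitable is an assumption, so treat this as a sketch rather than the canonical API:

```python
from google.adk.tools import ProgressiveTool

# Pattern 1: async generator -- every yield is surfaced as a partial
# function_response; the last yield becomes the final result.
async def export_report(country: str):
  yield {"status": "started", "country": country}
  for percent in (20, 40, 60, 80, 100):
    yield {"status": "progress", "percent": percent}
  yield {"status": "completed", "url": f"https://example.com/{country.lower()}.pdf"}

export_tool = ProgressiveTool(export_report)

# Pattern 2: coroutine with an injected reporter -- partials come from the
# `progress` callback, and the returned dict becomes the final result.
# (An awaitable callback is assumed here for illustration only.)
async def export_report_with_callback(country: str, progress=None):
  if progress:
    await progress({"status": "started", "country": country})
  return {"status": "completed", "url": f"https://example.com/{country.lower()}.pdf"}

export_tool_with_callback = ProgressiveTool(export_report_with_callback)
```

Either wrapper can then go in an agent's `tools=[...]` list like any regular tool; the `progress` parameter is excluded from the generated function schema, so the model never sees it.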

Changes

* **`src/google/adk/flows/llm_flows/base_llm_flow.py`**
  
  * `_postprocess_handle_function_calls_async` updated to handle progressive tools (stream partials + final).

* **`src/google/adk/flows/llm_flows/functions.py`**
  
  * Added `iter_progressive_function_calls_async(...)` to stream partials and yield a final `function_response`.

* **`src/google/adk/tools/progressive_function_tool.py`**
  
  * New abstract base class for progressive tools.

* **`src/google/adk/tools/progressive_tool.py`**
  
  * Wrapper to support async generators or injected progress callbacks.
  * Excludes progress params from function schema (`_ignore_params`).

* **`src/google/adk/tools/__init__.py`**
  
  * Export `ProgressiveTool`.
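
Since only the final `function_response` reaches the model, callers of `run_async` are the ones who observe the partials. A hedged sketch of how a caller might tell them apart, assuming partial events follow the existing `Event.partial` / `part.function_response` conventions (the PR diff is the source of truth for how partials are actually marked):

```python
from google.adk.runners import Runner
from google.genai import types

async def print_tool_progress(runner: Runner, user_id: str, session_id: str,
                              message: types.Content) -> None:
  """Prints partial tool progress separately from the final tool result."""
  async for event in runner.run_async(
      user_id=user_id, session_id=session_id, new_message=message):
    if not event.content or not event.content.parts:
      continue
    for part in event.content.parts:
      if part.function_response is None:
        continue
      if event.partial:
        # Intermediate update: surfaced to the user/runner only.
        print("progress:", part.function_response.response)
      else:
        # Final result: the only function_response injected back into the model.
        print("final:", part.function_response.response)
```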

Backward Compatibility

* No behavior changes for existing tools or live runs.

* Non-progressive tools continue to return only a single final result.

* Progressive streaming is **opt-in**.

Testing Plan

Unit Tests (added)

* **Tools:** `tests/unittests/tools/test_progressive_tool.py`
  
  * Streams partial + final results
  * Final equals last yield
  * Error → converted to final
  * Multiple progressive tools in one turn
  * Non-progressive tool unaffected
  * Callback style (`progress`, `progress_callback`)
  * Direct `run_async` returns last yield

* **Flows:** `tests/unittests/flows/llm_flows/test_progressive_flow.py`
  
  * Flow emits partial then final with `ProgressiveTool`
  * Subclass of `ProgressiveFunctionTool` supported
  * Fallback path works for non-progressive tools

* **Functions:** `tests/unittests/flows/llm_flows/test_functions_progressive_unit.py`
  
  * Progressive iteration streams + final
  * Error handling converts to final
  * Parallel merge of function response events
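
For orientation, a stripped-down version of the "direct `run_async` returns last yield" case might look roughly like this; the `run_async` keyword arguments follow the existing `BaseTool` convention and the mocked `ToolContext` is a placeholder, so the added tests in this PR remain authoritative:

```python
from unittest import mock

import pytest

from google.adk.tools import ProgressiveTool


async def count_to_two():
  yield {"step": 1}
  yield {"step": 2}


@pytest.mark.asyncio
async def test_run_async_returns_last_yield():
  tool = ProgressiveTool(count_to_two)
  # A real test would likely build a proper ToolContext; a mock keeps the
  # sketch self-contained.
  result = await tool.run_async(args={}, tool_context=mock.Mock())
  assert result == {"step": 2}
```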

Commands:

```shell
# Run only progressive tool/flow tests
pytest -q tests/unittests/tools/test_progressive_tool.py \
  tests/unittests/flows/llm_flows/test_progressive_flow.py \
  tests/unittests/flows/llm_flows/test_functions_progressive_unit.py
```

Manual End-to-End (E2E)

Minimal script with `InMemoryRunner` + `ProgressiveTool(export_report)` (async generator).
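
The rough shape of that script is sketched below. The agent name, instruction, and model choice are placeholders rather than the exact script, and `export_report` is a shortened version of the async generator from the New API section:

```python
import asyncio

from google.adk.agents import Agent
from google.adk.runners import InMemoryRunner
from google.adk.tools import ProgressiveTool
from google.genai import types


# Shortened version of the async generator tool from the New API section.
async def export_report(country: str):
  yield {"status": "started", "country": country}
  yield {"status": "completed", "url": f"https://example.com/{country.lower()}.pdf"}


agent = Agent(
    name="report_agent",
    model="gemini-2.0-flash",  # model choice here is arbitrary
    instruction="Use export_report to export country reports when asked.",
    tools=[ProgressiveTool(export_report)],
)
runner = InMemoryRunner(agent=agent, app_name="progressive_demo")


async def main():
  # create_session is async in recent ADK releases.
  session = await runner.session_service.create_session(
      app_name="progressive_demo", user_id="user")
  message = types.Content(
      role="user", parts=[types.Part(text="Export the report for Germany")])
  async for event in runner.run_async(
      user_id="user", session_id=session.id, new_message=message):
    print(event)


asyncio.run(main())
```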

Observed output (truncated):

Partial events:

{"status": "started", "country": "Germany"}
{"status": "progress", "percent": 20}
...
{"status": "progress", "percent": 100}
{"status": "completed", "url": "https://example.com/germany.pdf"}

Model reply:

The report for Germany has been exported and can be accessed at https://example.com/germany.pdf

Documentation

No user-facing docs changed in this PR. A follow-up PR to adk-docs will add a short usage guide for ProgressiveTool.

@gemini-code-assist (Contributor)

Summary of Changes

Hello @sandangel, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the user experience by enabling tools to provide progressive updates during long-running operations. By introducing ProgressiveTool and integrating its streaming capabilities into the LLM flow, the system can now emit partial events before a final result, making interactions more dynamic and informative. This change allows for better real-time feedback and more flexible tool execution within the agent framework.

Highlights

  • Progressive Tool Streaming: Introduced the ability for tools to stream partial progress updates during their execution, providing real-time feedback before a final result is returned.
  • New Tool Classes: Added ProgressiveFunctionTool as a base class for tools that support progress streaming, and ProgressiveTool as a wrapper to easily convert existing async functions (async generators or functions with progress/progress_callback parameters) into progressive tools.
  • LLM Flow Integration: Modified the core LLM flow (_postprocess_handle_function_calls_async) to detect and prioritize progressive tools, yielding partial events as they occur, and then the final result. Non-progressive tools continue to use the existing handling mechanism.
  • Enhanced Error Handling: The progressive tool streaming mechanism includes robust error handling, allowing plugins to intercept and convert tool errors into a final function response event.
  • Comprehensive Testing: New unit tests have been added for the ProgressiveTool and ProgressiveFunctionTool classes, as well as for the LLM flow's integration, covering various scenarios including streaming, final results, and error handling.

@adk-bot (Collaborator) commented Nov 15, 2025

Response from ADK Triaging Agent

Hello @sandangel, thank you for creating this PR!

To help reviewers better understand and review your contribution, could you please fill out the PR description template? Specifically, please provide:

  • A link to the associated GitHub issue. If one doesn't exist, please create one or describe the bug or feature in the PR description.
  • A testing plan section detailing how you have tested these changes.

This information will help us to review your PR more efficiently. Thanks!

@adk-bot added the core [Component] label on Nov 15, 2025
@gemini-code-assist (Contributor) left a comment

Code Review

This pull request introduces a significant feature: the ProgressiveTool for streaming partial results from asynchronous tools. The changes are well-structured, adding new tool base classes, a wrapper tool, and updating the core LLM flow to handle progressive streaming. The new functionality is supported by a comprehensive set of unit and integration tests. My review identifies a critical issue in the handling of mixed (progressive and non-progressive) tool calls that could lead to incorrect behavior. I have also included a couple of medium-severity suggestions to improve the maintainability and correctness of the new ProgressiveTool implementation.

@ryanaiagent self-assigned this on Nov 18, 2025
sandangel and others added 5 commits on November 18, 2025
@sandangel (Author)

Fixed all failing tests and linting/formatting errors.
