
Add simple streaming tool in FastMCP during tools usage #114

@stex2005

Description

I’ve been experimenting with a simple streaming tool in FastMCP:

import asyncio

from fastmcp import FastMCP, Context

mcp = FastMCP("demo-server")  # server name is illustrative

@mcp.tool()
async def process_data_with_updates(data_uri: str, ctx: Context) -> dict:
    """Process data in steps, emitting log and progress notifications."""
    await ctx.info(f"Starting to process data from {data_uri}...")
    await ctx.report_progress(progress=0, total=100)

    for i in range(5):
        await asyncio.sleep(2)  # simulate a slow processing step
        await ctx.info(f"Step {i + 1}/5 complete")
        await ctx.report_progress(progress=(i + 1) * 20, total=100)

    await ctx.info("Done!")
    return {"status": "success"}

Expected behavior

While the tool runs, I should see intermediate ctx.info(...) log lines and/or ctx.report_progress(...) updates appear in the LLM chat interface.
After completion, the final return value should be displayed.

Actual behavior

The tool call appears to hang until it finishes.
Only the final return value is shown; no progress updates or info logs appear inline.
I can't tell whether the client receives the intermediate updates internally without rendering them.

Notes

The chat interface is not displaying these real-time updates. This could be due to:

  • The client not supporting progress tokens (per the MCP spec, progress notifications are only sent when the request carries a progressToken; see the client sketch after this list)
  • The streaming updates being sent but not rendered in the UI
  • The updates being processed but not shown to the user
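
To check whether the notifications are actually emitted, I can bypass the chat UI and call the tool with a plain FastMCP client. A minimal sketch, assuming the fastmcp 2.x client API as I understand it from the docs (a log_handler on the Client, a per-call progress_handler); the server URL and data_uri value are placeholders:

import asyncio

from fastmcp import Client
from fastmcp.client.logging import LogMessage

async def on_log(message: LogMessage) -> None:
    # Fires once per ctx.info(...) the server emits while the tool runs.
    print(f"[server log] {message.level}: {message.data}")

async def on_progress(progress: float, total: float | None, message: str | None) -> None:
    # Fires once per ctx.report_progress(...). Passing this handler is what
    # makes the client attach a progressToken to the request, so the server
    # actually sends progress notifications.
    print(f"[progress] {progress}/{total} {message or ''}")

async def main() -> None:
    async with Client("http://127.0.0.1:8000/mcp", log_handler=on_log) as client:
        result = await client.call_tool(
            "process_data_with_updates",
            {"data_uri": "file:///tmp/example.csv"},
            progress_handler=on_progress,
        )
        print(result)

asyncio.run(main())

If both handlers fire here, the server side is working and the missing updates are a rendering limitation of the chat client.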

It would be great if LLM clients supported showing live updates in the conversation window.

Environment

  • fastmcp version: 2.12.3
  • Python version: 3.12
  • Transport: streamable-http

Labels

  • bug: Something isn't working
  • enhancement: New feature or request
  • question: Further information is requested
