
Custom MCP with Streamable HTTP is completely broken #352

@emi-dm

Description


Describe the bug

When I try to connect my MCP server to cagent with a DMR model (gpt-oss), the CLI stops responding.

[Screenshot attached]

Version affected

Latest version of cagent released for macOS

How To Reproduce

Detailed steps to reproduce the behavior:

  1. Create an agent.yml file:
version: "2"

agents:
  root:
    model: local-oai
    description: A helpful assistant for scientific research
    instruction: |
      You are a friendly assistant that helps with scientific research.
    toolsets:
      - type: mcp
        ref: docker:duckduckgo # stdio transport
      - type: mcp # Model Context Protocol
        remote:
          url: "https://google-scholar-mcp-main.onrender.com/mcp" # Base URL to connect to
          transport_type: "streamable" # Type of MCP transport (sse or streamable)

models:
  local-oai:
    provider: dmr
    model: ai/gpt-oss:latest
    max_tokens: 64000

  2. Run cagent agent.yml and type "Hola"
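As a quick sanity check that the config itself is well-formed before blaming the transport, the remote streamable-HTTP toolset can be located programmatically. This is a minimal sketch, assuming PyYAML is installed; the helper name remote_streamable_urls is hypothetical and not part of cagent:

```python
import yaml  # PyYAML

AGENT_YML = """\
version: "2"
agents:
  root:
    model: local-oai
    toolsets:
      - type: mcp
        ref: docker:duckduckgo
      - type: mcp
        remote:
          url: "https://google-scholar-mcp-main.onrender.com/mcp"
          transport_type: "streamable"
models:
  local-oai:
    provider: dmr
    model: ai/gpt-oss:latest
    max_tokens: 64000
"""

def remote_streamable_urls(config_text: str) -> list[str]:
    """Collect the URLs of remote MCP toolsets that use streamable HTTP."""
    cfg = yaml.safe_load(config_text)
    urls = []
    for agent in cfg.get("agents", {}).values():
        for toolset in agent.get("toolsets", []):
            remote = toolset.get("remote") or {}
            if toolset.get("type") == "mcp" and remote.get("transport_type") == "streamable":
                urls.append(remote["url"])
    return urls

print(remote_streamable_urls(AGENT_YML))
# prints ['https://google-scholar-mcp-main.onrender.com/mcp']
```

If this prints the expected URL, the YAML is parseable and the streamable toolset is declared where cagent should see it, which narrows the bug to the transport handling rather than the config.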

Expectation

The model calls the MCP functions.

OS and Terminal type

macOS


Labels

    area/docker-model-runner: For features/issues/fixes related to the usage of Docker Model Runner (DMR)
    area/tools: For features/issues/fixes related to the usage of built-in and MCP tools
    kind/bug: Something isn't working
