[Python SDK] Add MediaRouter for prefix-based provider dispatch #463

@santoshkumarradha

Description

Summary

Replace the scattered if model.startswith("fal-ai/") ... elif model.startswith("openrouter/") ... else litellm chains in agent_ai.py with a single MediaRouter class that maps model prefixes to MediaProvider instances based on declared capabilities.

Problem

Today, every media method in agent_ai.py has its own if/elif routing logic:

  • ai_with_vision() (lines 1402-1433): 3-way branch — fal / openrouter / litellm
  • ai_with_audio() (lines 1121-1160): 4-way branch — fal / openai_direct / tts / chat-modalities
  • ai_generate_video() (lines 1692-1704): fal-only guard + raise ValueError

Adding a new provider (e.g., Replicate, ElevenLabs) means editing every method. Adding a new modality (e.g., music) means another if/elif in a new method.

Solution

A MediaRouter class (~40 lines) that:

  1. Accepts register(prefix, provider) calls at init time
  2. Resolves (model_string, capability) to a MediaProvider instance
  3. Tries registered prefixes longest-first, so the most specific prefix wins (ties fall back to registration order)
class MediaRouter:
    def __init__(self):
        self._providers: list[tuple[str, MediaProvider]] = []
    
    def register(self, prefix: str, provider: MediaProvider):
        self._providers.append((prefix, provider))
        # Sort by prefix length descending for longest-match-first
        self._providers.sort(key=lambda x: len(x[0]), reverse=True)
    
    def resolve(self, model: str, capability: str) -> MediaProvider:
        for prefix, provider in self._providers:
            if model.startswith(prefix) and capability in provider.supported_modalities:
                return provider
        raise ValueError(
            f"No provider for model '{model}' with '{capability}' capability."
        )
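
The router only needs each provider to expose a supported_modalities set plus the relevant generate methods. A minimal sketch of that contract is below — the real interface lives in media_providers.py, and the exact method names and return types here are assumptions, not the SDK's actual definitions:

```python
from typing import Protocol


class MediaProvider(Protocol):
    """Sketch of the contract the router relies on (illustrative only)."""

    # Capabilities the router matches against, e.g. {"image", "video"}
    supported_modalities: set[str]

    # One async generate method per modality; signature is an assumption
    async def generate_image(self, prompt: str, model: str, **kwargs) -> bytes:
        ...
```

Because it is a Protocol, existing provider classes satisfy it structurally — no inheritance changes are needed in media_providers.py.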

Files

  • sdk/python/agentfield/media_router.py: NEW, the MediaRouter class
  • sdk/python/agentfield/agent_ai.py: replace if/elif chains in ai_with_vision(), ai_with_audio(), ai_generate_video() with self._media_router.resolve(model, cap).method(...)
  • sdk/python/agentfield/agent_ai.py: add lazy _media_router property that registers providers on first access
  • sdk/python/agentfield/media_providers.py: no changes (existing providers already fit the interface)
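
The lazy _media_router property could be shaped roughly as follows. This is a standalone sketch: MediaRouter is inlined so it runs on its own, and _StubProvider stands in for the real FalProvider / OpenRouterProvider / LiteLLMProvider classes, whose constructor names and arguments are not shown in this issue:

```python
from functools import cached_property


class MediaRouter:
    """Inlined copy of the router above so this sketch runs standalone."""

    def __init__(self):
        self._providers = []

    def register(self, prefix, provider):
        self._providers.append((prefix, provider))
        # Longest-match-first
        self._providers.sort(key=lambda x: len(x[0]), reverse=True)

    def resolve(self, model, capability):
        for prefix, provider in self._providers:
            if model.startswith(prefix) and capability in provider.supported_modalities:
                return provider
        raise ValueError(f"No provider for model '{model}' with '{capability}' capability.")


class _StubProvider:
    """Stand-in for the real provider classes in media_providers.py."""

    def __init__(self, modalities):
        self.supported_modalities = modalities


class AgentAI:
    @cached_property
    def _media_router(self):
        # Built on first access, so agents that never call a media method
        # pay no provider-construction cost; cached_property memoizes it.
        router = MediaRouter()
        router.register("fal-ai/", _StubProvider({"image", "video", "audio"}))
        router.register("fal/", _StubProvider({"image", "video", "audio"}))
        router.register("openrouter/", _StubProvider({"image", "video"}))
        router.register("", _StubProvider({"image", "audio"}))  # LiteLLM catch-all
        return router
```

functools.cached_property gives the "registers providers on first access" behavior with no manual sentinel attribute; a plain property plus a cached field would work equally well.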

Before → After

Before (ai_with_vision, line 1402):

if model.startswith("fal-ai/") or model.startswith("fal/"):
    return await self._fal_provider.generate_image(...)
elif model.startswith("openrouter/"):
    return await vision.generate_image_openrouter(...)
else:
    return await vision.generate_image_litellm(...)

After:

provider = self._media_router.resolve(model, "image")
return await provider.generate_image(prompt=prompt, model=model, size=size, quality=quality, **kwargs)

Acceptance Criteria

  • MediaRouter class created in sdk/python/agentfield/media_router.py
  • All 3 existing providers (Fal, OpenRouter, LiteLLM) registered with correct prefixes
  • ai_with_vision(), ai_with_audio(), ai_generate_video() use router instead of if/elif
  • ai_generate_video() no longer raises ValueError for non-fal models (router handles it)
  • All existing tests pass unchanged — behavior is identical, just routing is centralized
  • pytest sdk/python/ passes
  • ruff check sdk/python/ passes

Testing

# Unit test the router itself
import pytest
from types import SimpleNamespace

# Minimal stand-in providers; only supported_modalities matters to resolve()
mock_fal = SimpleNamespace(supported_modalities={"image", "video"})
mock_openrouter = SimpleNamespace(supported_modalities={"image", "video"})
mock_litellm = SimpleNamespace(supported_modalities={"image", "audio"})

router = MediaRouter()
router.register("fal-ai/", mock_fal)
router.register("openrouter/", mock_openrouter)
router.register("", mock_litellm)  # default fallback

assert router.resolve("fal-ai/flux/dev", "image") is mock_fal
assert router.resolve("openrouter/google/veo-3.1", "video") is mock_openrouter
assert router.resolve("dall-e-3", "image") is mock_litellm

# Raises for unsupported capability
with pytest.raises(ValueError):
    router.resolve("fal-ai/flux/dev", "music")  # fal doesn't support music

Notes for Contributors

Severity: MEDIUM — Pure refactor, no behavior change.

This is a prerequisite for all other issues in this milestone. The router pattern must land first so that new providers/modalities plug in cleanly without touching agent_ai.py routing logic.

The LiteLLMProvider registration uses empty prefix "" as the catch-all fallback — this preserves backward compatibility for models like dall-e-3 that don't have a provider prefix.
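
This works because str.startswith("") is True for every string, and the length-descending sort in register() guarantees the empty prefix is tried last:

```python
# Every model name matches the empty prefix, so "" acts as a universal
# fallback once the longer, more specific prefixes have been tried.
assert "dall-e-3".startswith("")
assert "fal-ai/flux/dev".startswith("")

# Sorting by prefix length descending pushes "" to the end of the scan order.
prefixes = ["", "fal-ai/", "openrouter/"]
prefixes.sort(key=len, reverse=True)
assert prefixes == ["openrouter/", "fal-ai/", ""]  # "" checked last
```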

Keep mode="openai_direct" as a special case in ai_with_audio() for now — it's an explicit user opt-in, not model-prefix-based routing.

Metadata

Labels

  • ai-friendly: Well-documented task suitable for AI-assisted development
  • area:ai: AI/LLM integration
  • enhancement: New feature or request
  • refactor: Code quality and refactoring improvements
  • sdk:python: Python SDK related
