
Conversation


@TNAJanssen TNAJanssen commented Nov 20, 2025

| Q             | A
| ------------- | ---
| Bug fix?      | yes
| New feature?  | no
| Docs?         | no
| Issues        | Fix #898
| License       | MIT

Overview

The `ai.model` configuration in `ai.yaml` was validated but never processed or passed to the `ModelCatalog` services. This fix adds the missing wiring so that the configured models reach the `ModelCatalog` constructors.

Changes

- Added a model-processing loop in `AiBundle::loadExtension()` to process all configured models
- Implemented a `processModelConfig()` method (a rough sketch follows this list) to:
  - Map platform names to `ModelCatalog` service IDs (with special handling for `vertexai` and `eleven_labs`)
  - Convert capability strings/enums to `Capability` enum instances
  - Build the `$additionalModels` array in the format expected by `ModelCatalog`
  - Set the arguments on the `ModelCatalog` service definition
- Implemented a `getModelClassForPlatform()` method that maps platform names to their corresponding model classes
- Added support for all 20 platforms that accept `$additionalModels` in their `ModelCatalog` constructors
- Added comprehensive tests verifying model configuration processing for all supported platforms
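
For illustration, here is a rough sketch of what such a processing step can look like. This is not the merged implementation: the method name follows the PR description, while the service-id pattern `ai.platform.model_catalog.<platform>`, the argument name, and the assumption that `Capability` is a string-backed enum are guesses made only for this sketch.

```php
<?php

use Symfony\AI\Platform\Capability;
use Symfony\Component\DependencyInjection\ContainerBuilder;

final class AiBundleSketch
{
    /**
     * Rough sketch of the per-platform processing step described above.
     * Not the merged code: service id, argument name and enum handling are illustrative.
     */
    private function processModelConfig(string $platform, array $models, ContainerBuilder $builder): void
    {
        // Guessed service-id convention for the platform's ModelCatalog service.
        $serviceId = sprintf('ai.platform.model_catalog.%s', $platform);

        // Platforms without a ModelCatalog service are skipped gracefully.
        if (!$builder->hasDefinition($serviceId)) {
            return;
        }

        $additionalModels = [];
        foreach ($models as $name => $model) {
            $additionalModels[$name] = [
                'class' => $model['class'],
                // Accept both "input-text"-style strings and Capability instances.
                'capabilities' => array_map(
                    static fn ($capability) => $capability instanceof Capability
                        ? $capability
                        : Capability::from($capability),
                    $model['capabilities'] ?? [],
                ),
            ];
        }

        // Bind the collected models to the catalog's constructor argument.
        $builder->getDefinition($serviceId)->setArgument('$additionalModels', $additionalModels);
    }
}
```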

Testing

- All existing tests pass
- Added new tests for model configuration processing
- Verified graceful handling of unsupported platforms
- Fixed `Capability` enum handling to support both string and enum instances

@carsonbot
Collaborator

It looks like you unchecked the "Allow edits from maintainer" box. That is fine, but please note that if you have multiple commits, you'll need to squash your commits into one before this can be merged. Or, you can check the "Allow edits from maintainers" box and the maintainer can squash for you.

Cheers!

Carsonbot

@carsonbot carsonbot changed the title from "Fix: Process ai.model configuration and pass to ModelCatalog services" to "Fix: Process ai.model configuration and pass to ModelCatalog services" Nov 20, 2025
@TNAJanssen TNAJanssen changed the title from "Fix: Process ai.model configuration and pass to ModelCatalog services" to "[AI Bundle] Fix: Process ai.model configuration and pass to ModelCatalog services" Nov 20, 2025
@OskarStark OskarStark closed this Nov 21, 2025
OskarStark added a commit that referenced this pull request Nov 21, 2025
… (OskarStark)

This PR was squashed before being merged into the main branch.

Discussion
----------

[AI Bundle] Process model configuration from semantic config

| Q             | A
| ------------- | ---
| Bug fix?      | yes
| New feature?  | no
| Docs?         | no
| Issues        | Fixes #898
| License       | MIT

Replaces #914

## Summary

The `ai.model` configuration in the AI Bundle is now properly processed and passed to `ModelCatalog` services. Previously, the configuration was validated but never used by the bundle, requiring users to manually configure services in `services.yaml`.

## Changes

- Added `processModelConfig()` method to process model configuration from semantic config
- Uses the `class` field from the config (introduced in #920)
- Support for all platforms that have a `ModelCatalog` service
- Proper handling of the `vertexai` platform name, which maps to `vertexai.gemini`

## Example

With this fix, the following configuration now works as expected:

```yaml
ai:
    model:
        ollama:
            custom-model-name:
                class: Symfony\AI\Platform\Bridge\Ollama\Ollama
                capabilities:
                    - input-text
                    - output-text
```

The model will be properly passed to the `OllamaModelCatalog` constructor without requiring manual service configuration.
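
For context, the configuration above corresponds roughly to the following hand-written wiring. The `Ollama` class comes straight from the example; the `Capability` namespace and case names, and the constructor shape hinted at in the final comment, are assumptions for illustration only.

```php
<?php

use Symfony\AI\Platform\Bridge\Ollama\Ollama;
use Symfony\AI\Platform\Capability; // namespace assumed

// Approximation of what the bundle now registers for the YAML above.
$additionalModels = [
    'custom-model-name' => [
        'class' => Ollama::class,
        // Enum case names assumed to mirror the "input-text"/"output-text" strings.
        'capabilities' => [Capability::INPUT_TEXT, Capability::OUTPUT_TEXT],
    ],
];

// Roughly equivalent to: new OllamaModelCatalog(additionalModels: $additionalModels);
```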

Commits
-------

2884d29 [AI Bundle] Process model configuration from semantic config

Linked issue: [AI Bundle] in the semantic configuration, ai.model seems not used