
Conversation

@OskarStark (Contributor)

Summary

Fixes #898

The `ai.model` configuration in the AI Bundle is now properly processed and passed to the `ModelCatalog` services. Previously, the configuration was validated but never used by the bundle, requiring users to manually configure the services in `services.yaml`.

Changes

  • Added processModelConfig() method to process model configuration from semantic config
  • Added getModelClassForPlatform() helper method to map platform names to model classes
  • Added necessary imports for model classes (Claude, Gpt, Gemini, Mistral, Ollama) and Capability enum
  • Support for anthropic, openai, gemini, mistral, and ollama platforms
  • Proper handling of vertexai platform name mapping to vertexai.gemini
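For illustration, the platform-to-class mapping described above could look roughly like this. This is a hypothetical sketch based on the model classes and behavior listed in this PR, not the actual bundle code:

```php
// Hypothetical sketch; Claude, Gpt, Gemini, Mistral, and Ollama stand for the
// model classes imported by this PR (exact namespaces omitted here).
private function getModelClassForPlatform(string $platformName): ?string
{
    return match ($platformName) {
        'anthropic' => Claude::class,
        'openai' => Gpt::class,
        // 'vertexai' uses the Gemini model class; its catalog is registered
        // under 'vertexai.gemini' (see the Changes list above)
        'gemini', 'vertexai' => Gemini::class,
        'mistral' => Mistral::class,
        'ollama' => Ollama::class,
        default => null,
    };
}
```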

Example

With this fix, the following configuration now works as expected:

```yaml
ai:
    model:
        ollama:
            custom-model-name:
                capabilities:
                    - input-text
                    - output-text
```

The model will be properly passed to the OllamaModelCatalog constructor without requiring manual service configuration.
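As a rough illustration of that wiring, `processModelConfig()` might hand the configured models to the catalog service definition along these lines. The service id naming scheme and argument position below are assumptions for the sketch, not the actual bundle code:

```php
// Hypothetical sketch, assuming the model catalogs are registered as
// services whose first constructor argument takes the configured models.
private function processModelConfig(array $modelConfig, ContainerBuilder $container): void
{
    foreach ($modelConfig as $platformName => $models) {
        if (null === $this->getModelClassForPlatform($platformName)) {
            continue; // unknown platform, nothing to register
        }

        // 'vertexai' catalogs are registered under 'vertexai.gemini' (per this PR)
        $catalogName = 'vertexai' === $platformName ? 'vertexai.gemini' : $platformName;

        // hypothetical service id naming scheme
        $serviceId = sprintf('ai.platform.model_catalog.%s', $catalogName);
        if ($container->hasDefinition($serviceId)) {
            // pass the configured models to the catalog's constructor
            $container->getDefinition($serviceId)->replaceArgument(0, $models);
        }
    }
}
```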

Testing

  • All existing tests pass
  • Code follows Symfony coding standards (verified with PHP-CS-Fixer)

@carsonbot changed the title to "Process model configuration from semantic config" on Nov 20, 2025
```php
/**
 * @return class-string|null
 */
private function getModelClassForPlatform(string $platformName): ?string
```
@OskarStark (Contributor, Author)

@chr-hertel we can get rid of this once the class is made required in the config, which feels OK to me. WDYT?

I am talking about:

@chr-hertel (Member)

nice try, but github actions do not fall for this 😆
[image] vs. [image]

@OskarStark OskarStark closed this Nov 21, 2025
@OskarStark OskarStark deleted the fix-model-config-processing branch November 21, 2025 10:43


Development

Successfully merging this pull request may close these issues.

[AI Bundle] in the semantic configuration, ai.model seems not used
