Problem
When an AI model gains a new capability that our platform already supports, users cannot use it without a library release. Example: the platform supports tool-calling, but the model initially doesn't. When the model gains tool-calling support, users must wait for a library update to use it.
Proposed Solution
Allow users to declare model capabilities at runtime using DSN syntax:
model-name@capability1,capability2,capability3
As a bonus, we could get rid of all the model classes.
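For illustration, here is a minimal parsing sketch in PHP; the function name and return shape are hypothetical and not part of the library:

// Hypothetical sketch: split "model@cap1,cap2" into the model name and the
// list of declared capability strings.
function parseModelDsn(string $dsn): array
{
    [$name, $capabilities] = array_pad(explode('@', $dsn, 2), 2, null);

    return [
        'name' => $name,
        'capabilities' => null === $capabilities ? [] : explode(',', $capabilities),
    ];
}

// parseModelDsn('gpt-4@tool-calling,output-streaming')
// => ['name' => 'gpt-4', 'capabilities' => ['tool-calling', 'output-streaming']]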
Examples
Scenario
- Platform supports: input-image, tool-calling, output-streaming
- Model GPT-4 initially supports: input-image
- Model GPT-4 later adds: tool-calling (no library release needed!)
// Before: Limited to hardcoded capabilities in library
$model = 'gpt-4'; // Only uses capabilities defined in library code
// After: User can declare newly supported capabilities
$model = 'gpt-4@tool-calling'; // Uses tool-calling without library update
Usage Examples
// Enable single capability
'gpt-4@tool-calling'
// Enable multiple capabilities
'claude-3@tool-calling,output-streaming'
// Enable multiple capabilities
'gpt-4o@input-image,tool-calling,output-streaming'
// No additional capabilities (default behavior)
'gpt-4'
Available Capabilities (from the Capability enum)
- input-audio, input-image, input-messages, input-multiple, input-pdf, input-text
- output-audio, output-image, output-streaming, output-structured, output-text
- tool-calling
- text-to-speech, speech-to-text
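To connect the DSN tokens to the enum, a mapping like the following could be used. This is a sketch only: it assumes the enum is string-backed with the values listed above and lives at Symfony\AI\Platform\Capability; the real enum shape and case names may differ.

use Symfony\AI\Platform\Capability;

// Sketch only: map declared capability tokens to enum cases.
// Capability::from() throws a \ValueError for unknown tokens, which already
// gives a meaningful failure for typos in the DSN.
function resolveCapabilities(array $tokens): array
{
    return array_map(
        static fn (string $token): Capability => Capability::from($token),
        $tokens,
    );
}

// resolveCapabilities(['input-image', 'tool-calling']) returns the
// corresponding Capability cases (exact case names assumed).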
Benefits
- Zero-day capability adoption: Use new model capabilities immediately when API supports them
- No library releases needed for capability updates
- Forward compatible: Platform already knows the capability, just needs model declaration
- Backward compatible: Models without @ work as before
Bundle
In the bundle it could look like:
ai:
    agent:
        my_agent:
            platform: 'ai.platform.openai'
-           model:
-               class: 'Symfony\AI\Platform\Bridge\OpenAi\Gpt'
-               name: 'gpt-4.1-mini'
+           model: 'gpt-4.1-mini@tool-calling'
Note
- The API validates whether the model supports the requested capabilities and returns meaningful errors if it does not.
- You cannot forbid certain capabilities or limit the model to specific capabilities, as those are implemented at the platform level.
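As a rough illustration of the first note (regardless of whether the check runs behind the provider API or in the platform layer), declared capabilities could be compared against the ones the platform implements and rejected with an explicit message; all names below are hypothetical:

// Hypothetical sketch: reject declared capabilities the platform does not
// implement, listing the offending tokens in the error message.
function assertCapabilitiesSupported(array $declared, array $platformSupported): void
{
    $unsupported = array_diff($declared, $platformSupported);

    if ([] !== $unsupported) {
        throw new \InvalidArgumentException(sprintf(
            'The capabilities "%s" are not supported by this platform.',
            implode('", "', $unsupported)
        ));
    }
}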
EDIT:
Updated Proposal: Extended Configuration Support
Based on community feedback, the proposal has been extended to include configuration parameters alongside capabilities. The DSN syntax now supports:
- Capabilities Declaration (original proposal): Use @ followed by comma-separated capabilities. Example: gpt-4@tool-calling,streaming
- Configuration Parameters (new addition): Use ? followed by query-string style parameters. Example: gpt-4?temperature=0.5&max_tokens=1000
- Combined Usage: Both capabilities and parameters can be used together. Example: gpt-4@tool-calling?temperature=0.5
This enhancement allows complete runtime configuration of AI models, covering both feature capabilities and operational parameters, all within a single, familiar DSN-like string. The syntax follows established conventions (similar to database DSNs), making it intuitive for developers while providing maximum flexibility for model configuration.
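To show how the extended syntax could be split, here is a hypothetical parsing sketch; parse_str() handles the query-string part, and none of the names below exist in the library:

// Hypothetical sketch: split "name@caps?params" into model name, declared
// capabilities, and configuration options.
function parseExtendedModelDsn(string $dsn): array
{
    $options = [];
    [$base, $query] = array_pad(explode('?', $dsn, 2), 2, null);
    if (null !== $query) {
        parse_str($query, $options); // e.g. temperature=0.5&max_tokens=1000
    }

    [$name, $capabilities] = array_pad(explode('@', $base, 2), 2, null);

    return [
        'name' => $name,
        'capabilities' => null === $capabilities ? [] : explode(',', $capabilities),
        'options' => $options,
    ];
}

// parseExtendedModelDsn('gpt-4@tool-calling?temperature=0.5')
// => ['name' => 'gpt-4', 'capabilities' => ['tool-calling'], 'options' => ['temperature' => '0.5']]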