Bug description
When the application is configured with a reasoning model, e.g.:

```yaml
ai:
  openai:
    chat:
      options:
        model: o3-mini
```

and any of the available `QueryTransformer`s is applied, the application throws `Unsupported parameter: 'temperature'` during the `Query transform(Query query)` phase. This happens because, in the following section, the `QueryTransformer` uses the default available model, and the `temperature` param is always provided, regardless of the model type:
```java
public class CompressionQueryTransformer implements QueryTransformer {
    ...
    @Override
    public Query transform(Query query) {
        ...
        var compressedQueryText = this.chatClient.prompt()
                .user(user -> user.text(this.promptTemplate.getTemplate())
                        .param("history", formatConversationHistory(query.history()))
                        .param("query", query.text()))
                // temperature is always set here, even for models that reject it
                .options(ChatOptions.builder().temperature(0.0).build())
                .call()
                .content();
        ...
    }
}
```
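As a temporary workaround, here is a minimal sketch of a custom `QueryTransformer` that performs a similar rewrite but never sets `temperature`. The class name and prompt text are mine, and the package locations follow recent Spring AI releases, so they may need adjusting for 1.0.0-M5:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.rag.Query;
import org.springframework.ai.rag.preretrieval.query.transformation.QueryTransformer;

// Hypothetical workaround, not part of Spring AI: same shape as
// CompressionQueryTransformer, minus the hardcoded temperature option.
public class ReasoningSafeQueryTransformer implements QueryTransformer {

    private final ChatClient chatClient;

    public ReasoningSafeQueryTransformer(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    @Override
    public Query transform(Query query) {
        // No .options(...) call: the configured model's defaults apply,
        // so reasoning models like o3-mini never receive 'temperature'.
        var rewrittenText = this.chatClient.prompt()
                .user(user -> user.text("Rewrite the following query so it is self-contained: {query}")
                        .param("query", query.text()))
                .call()
                .content();
        return query.mutate().text(rewrittenText).build();
    }
}
```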
I thought of making a PR that extends `ChatOptions` with a new `reasoningEffort` param (see the sketch below), but I'm not sure this is a good idea, since it would affect all `ChatOptions` descendants, including those whose providers have not introduced reasoning models yet, e.g. `MistralAiChatOptions`.
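To make the idea concrete, a hypothetical sketch of the proposed change (the `getReasoningEffort()` getter does not exist today; it is only an illustration of the proposal):

```java
public interface ChatOptions extends ModelOptions {

    String getModel();

    Double getTemperature();

    // ... other existing getters: getMaxTokens(), getTopP(), etc.

    // Proposed addition (hypothetical): providers without reasoning
    // models would simply return null here.
    String getReasoningEffort();
}
```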
If you can suggest the right way to approach this issue, I'd love to submit a PR with a fix.
Environment
Spring AI: reproduced with 1.0.0-M5 and the latest main branch.
Steps to reproduce
- Configure a Spring AI app with a reasoning model (e.g. `o1` or `o3-mini`)
- Apply any of the available `QueryTransformer`s (e.g. `CompressionQueryTransformer`)
- Call the `Query transform(Query query)` method of the provided `QueryTransformer` (a reproduction fragment follows this list)
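A minimal reproduction fragment, following the builder usage from the Spring AI reference documentation (the conversation content is made up, and `chatClientBuilder` is assumed to be the auto-configured `ChatClient.Builder` backed by the reasoning model):

```java
Query query = Query.builder()
        .text("And what is its second largest city?")
        .history(new UserMessage("What is the capital of Denmark?"),
                new AssistantMessage("Copenhagen is the capital of Denmark."))
        .build();

QueryTransformer queryTransformer = CompressionQueryTransformer.builder()
        .chatClientBuilder(chatClientBuilder)
        .build();

// Throws: Unsupported parameter: 'temperature'
Query transformedQuery = queryTransformer.transform(query);
```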
Expected behavior
`Query transform(Query query)` should work with reasoning models.