Description
My scenario: I would like to use a reasoning model for the first step of a conversation and then switch to a normal model for the subsequent steps, to balance quality and cost. This does not appear to be supported currently.
const response = await streamText({
  model: customModel(this.modelIdentifier),
  maxSteps: 21,
  onStepFinish: () => {
    // I would like to switch to a different (non-reasoning) model here
    // for the remaining steps, but there is no way to change the model
    // between steps.
  },
})
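Until per-step model switching is supported, a possible workaround is to split the run into two calls and stitch the conversation back together. The sketch below is only an illustration, not a tested solution: the model ids are placeholders, customModel stands in for the factory used above, and it assumes an AI SDK version where the result exposes response.messages (older versions used responseMessages) so the first step's output can be appended before continuing with the cheaper model.

import { generateText, streamText, type CoreMessage, type LanguageModel } from 'ai';

// Placeholder for the model factory used in the snippet above.
declare function customModel(id: string): LanguageModel;

async function runWithTwoModels(messages: CoreMessage[]) {
  // First step: let the reasoning model handle the initial turn.
  const first = await generateText({
    model: customModel('reasoning-model'), // assumed model id
    messages,
    maxSteps: 1,
  });

  // Remaining steps: continue the same conversation with the normal model,
  // appending the messages generated by the first step.
  return streamText({
    model: customModel('normal-model'), // assumed model id
    messages: [...messages, ...first.response.messages],
    maxSteps: 20,
  });
}

This costs an extra request boundary and loses the single-call streaming/step bookkeeping, which is why first-class support for changing the model per step would be preferable.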
AI SDK Version
No response