Set lower temperature for ChatGPT calls (#28959)
Updated the ChatCompletionRequest in the agent model to include the temperature parameter. Temperature controls the randomness of the model's responses; a lower value makes output more conservative and focused. Here it is set to 0.3 to produce more focused and consistent results (the API default is 1.0; the maximum is 2.0).
jakule committed Jul 12, 2023
1 parent c50e6e6 commit d39c110
Showing 1 changed file with 4 additions and 3 deletions.
lib/ai/model/agent.go — 4 additions & 3 deletions
@@ -244,9 +244,10 @@ func (a *Agent) plan(ctx context.Context, state *executionState) (*AgentAction,
 	stream, err := state.llm.CreateChatCompletionStream(
 		ctx,
 		openai.ChatCompletionRequest{
-			Model:    openai.GPT4,
-			Messages: prompt,
-			Stream:   true,
+			Model:       openai.GPT4,
+			Messages:    prompt,
+			Temperature: 0.3,
+			Stream:      true,
 		},
 	)
 	if err != nil {
