chore: Improve OpenAI JSON rendering (#8666)
We have been observing JSON parsing errors in responses from GPT. Switching to the gpt-4-1106-preview model and setting response_format has significantly improved the JSON returned by OpenAI, hence the switch in code.

ref: https://openai.com/blog/new-models-and-developer-products-announced-at-devday
fixes: #CW-2931
sojan-official committed Jan 9, 2024
1 parent e34ab59 commit 046ce68
Showing 1 changed file with 2 additions and 2 deletions.
enterprise/lib/chat_gpt.rb
@@ -4,7 +4,7 @@ def self.base_uri
   end
 
   def initialize(context_sections = '')
-    @model = 'gpt-4'
+    @model = 'gpt-4-1106-preview'
     @messages = [system_message(context_sections)]
   end

@@ -53,7 +53,7 @@ def system_content(context_sections)
 
   def request_gpt
     headers = { 'Content-Type' => 'application/json', 'Authorization' => "Bearer #{ENV.fetch('OPENAI_API_KEY')}" }
-    body = { model: @model, messages: @messages }.to_json
+    body = { model: @model, messages: @messages, response_format: { type: 'json_object' } }.to_json
     Rails.logger.info "Requesting Chat GPT with body: #{body}"
     response = HTTParty.post("#{self.class.base_uri}/v1/chat/completions", headers: headers, body: body)
     Rails.logger.info "Chat GPT response: #{response.body}"
