I followed the example here and the call to client.runs.create always fails with the following error:
Faraday::ParsingError: Empty input at line 1, column 1 [parse.c:926] in 'event: thread.run.created...
ruby-openai version: 7.3.1
Everything works fine in the non-streaming version. But the moment I pass a Proc (even a very simple one that does nothing, or prints a constant string without even referring to chunk) to the stream parameter, I get the above parsing error. Here's a more complete log, though I have redacted most of the actual data JSON strings:
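To make the expected behavior concrete, here is roughly the kind of Proc I'm passing as `stream` (the handler and the sample event below are illustrative; the field names follow the Assistants streaming delta shape). Calling it directly with a single parsed event works exactly as expected, so the handler itself isn't the problem:

```ruby
require "json"

# A deliberately trivial stream handler, of the kind passed as the
# :stream parameter to client.runs.create -- per the gem's docs it
# should receive each event's data as an already-parsed Hash.
tokens = []
stream_handler = proc do |chunk|
  if chunk["object"] == "thread.message.delta"
    tokens << chunk.dig("delta", "content", 0, "text", "value")
  end
end

# Exercising the handler locally with one sample delta event works fine:
sample = JSON.parse(<<~JSON)
  {"object": "thread.message.delta",
   "delta": {"content": [{"index": 0, "type": "text", "text": {"value": "Hello"}}]}}
JSON
stream_handler.call(sample)
tokens # => ["Hello"]
```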
You can see that the API itself is behaving as expected, returning tokens one at a time. But it seems that the ruby-openai code is choking on the response, trying to treat the entire thing as JSON:
def to_json_stream(user_proc:)
  parser = EventStreamParser::Parser.new

  proc do |chunk, _bytes, env|
    if env && env.status != 200
      raise_error = Faraday::Response::RaiseError.new
      raise_error.on_complete(env.merge(body: try_parse_json(chunk)))
    end

    parser.feed(chunk) do |_type, data|
      user_proc.call(JSON.parse(data)) unless data == "[DONE]"
    end
  end
end
I believe the call to JSON.parse above is what's failing: it's either being handed this non-JSON SSE text, or it's being handed the entire stream at once rather than each event's data element.
Is there a reason you can think of why parser.feed would result in data being the entire string of all events, rather than the data element of just one event?
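To illustrate what I mean (plain Ruby, not the gem's code): a raw SSE chunk can carry several events, and only each event's `data:` payload is valid JSON on its own, so parsing has to happen per event. Parsing the whole chunk in one JSON.parse call would fail, which matches the Faraday::ParsingError above:

```ruby
require "json"

# A raw server-sent-events chunk holding two events (sample payloads):
raw_chunk = <<~SSE
  event: thread.run.created
  data: {"id":"run_1","object":"thread.run"}

  event: thread.message.delta
  data: {"object":"thread.message.delta","delta":{"content":[{"text":{"value":"Hi"}}]}}

SSE

# Events are separated by blank lines; each data: line is JSON by itself.
events = raw_chunk.split("\n\n").reject(&:empty?).map do |block|
  type = block[/^event: (.+)$/, 1]
  data = block[/^data: (.+)$/, 1]
  [type, JSON.parse(data)]
end

events.each { |type, data| puts "#{type}: #{data['object']}" }
# JSON.parse(raw_chunk) on the whole chunk would raise JSON::ParserError.
```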