
Faraday::ParsingError when creating an Assistants run with streaming #564

Open
speriosu opened this issue Feb 11, 2025 · 1 comment

@speriosu

I followed the example here and the call to client.runs.create always fails with the following error:

Faraday::ParsingError: Empty input at line 1, column 1 [parse.c:926] in 'event: thread.run.created...

ruby-openai version: 7.3.1
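
For reference, the call is essentially the streaming Assistants example from the README; a minimal sketch of what I'm running (the thread and assistant IDs below are placeholders):

    require "openai"

    client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_ACCESS_TOKEN"))

    # "thread_abc123" / "asst_abc123" stand in for real IDs.
    client.runs.create(
      thread_id: "thread_abc123",
      parameters: {
        assistant_id: "asst_abc123",
        # Even a trivial proc here triggers the error above.
        stream: proc do |chunk, _bytesize|
          if chunk["object"] == "thread.message.delta"
            print chunk.dig("delta", "content", 0, "text", "value")
          end
        end
      }
    )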

Everything works fine in the non-streaming version, but the moment I pass a Proc to the stream parameter (even a very simple one that does nothing, or that just prints a constant string without referring to chunk), I get the parsing error above. Here's a more complete log, with most of the data JSON strings redacted:

Empty input at line 1, column 1 [parse.c:926] in 'event: thread.run.created
data: {...}

event: thread.run.queued
data: {...}

event: thread.run.in_progress
data: {...}

event: thread.run.step.created
data: {...}

event: thread.run.step.in_progress
data: {...}

event: thread.message.created
data: {...}

event: thread.message.in_progress
data: {...}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello","annotations":[]}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"!"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" I'm"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" doing"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" well"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":","}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" thank"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" you"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"."}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" How"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" can"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" I"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" assist"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" you"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" today"}}]}}

event: thread.message.delta
data: {"id":"msg_8O2EBU0EnMB0Bb7GRvpqadQv","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"?"}}]}}

event: thread.message.completed
data: {...}

event: thread.run.step.completed
data: {...}

event: thread.run.completed
data: {...}

event: done
data: [DONE]

You can see that the API itself is behaving as expected, returning tokens one at a time. But it seems that the ruby-openai code is choking on the response, trying to treat the entire thing as JSON:

    def to_json_stream(user_proc:)
      parser = EventStreamParser::Parser.new

      proc do |chunk, _bytes, env|
        if env && env.status != 200
          raise_error = Faraday::Response::RaiseError.new
          raise_error.on_complete(env.merge(body: try_parse_json(chunk)))
        end

        parser.feed(chunk) do |_type, data|
          user_proc.call(JSON.parse(data)) unless data == "[DONE]"
        end
      end
    end

I believe the call to JSON.parse above is what's choking on this non-JSON response, or that it's trying to parse the entire payload at once rather than each event's data element.

Is there a reason you can think of why parser.feed would result in data being the entire string of all events, rather than the data element of just one event?
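
To illustrate what I mean, here is a self-contained sketch (not the gem's code) using the same event_stream_parser gem and a payload trimmed from the log above. Feeding complete events through the parser yields one data element at a time, which JSON.parse handles fine, whereas handing the whole SSE body to JSON.parse fails, which is the kind of error I see wrapped in Faraday::ParsingError:

    require "event_stream_parser"
    require "json"

    sse_body = <<~SSE
      event: thread.message.delta
      data: {"id":"msg_123","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello"}}]}}

      event: done
      data: [DONE]

    SSE

    # Expected behaviour: the parser dispatches each event separately, so
    # JSON.parse only ever sees a single event's data field.
    parser = EventStreamParser::Parser.new
    parser.feed(sse_body) do |type, data|
      next if data == "[DONE]"
      puts "#{type}: #{JSON.parse(data).dig("delta", "content", 0, "text", "value")}"
    end
    # => thread.message.delta: Hello

    # What the error message suggests is happening instead: the entire SSE body
    # reaches JSON.parse in one go, which fails because "event: ..." is not JSON.
    begin
      JSON.parse(sse_body)
    rescue JSON::ParserError => e
      puts e.class # Faraday surfaces this as Faraday::ParsingError
    end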

@alexrudall
Owner

Hi @speriosu - thanks for this - would you mind testing if ruby-openai v8 fixes this?
