How do I respond to stream results via API - Grape #2367
Comments
Can you please elaborate a little? I am not sure what "respond to stream results" means. Are you trying to do something on the server or client? What is not working in the code you've provided? |
I want the results returned from my API to be the same as the results returned from OpenAI. |
I see. Start by doing this without Grape. Does |
Hi @dblock.
I haven't seen any article saying Grape can respond on a streaming endpoint, so I need your help. |
Care to show a working example with OpenAI in pure Ruby?
I don't know either. Let's figure it out. Looks like @urkle contributed much of the streaming implementation in #1520, maybe he is around to help? |
I know what the problem is @datpmt - what web server are you using? I got it working with Puma, but not Webrick. Let us know if this helps? |
Thank you for your help @dblock! When I create a new file `test.rb`:

```ruby
# test.rb
require 'grape'
require 'rack'
require 'rack/handler/puma'

class MyStream
  def each
    3.times do
      yield "data: #{{ time: Time.now }.to_json}\n\n"
      sleep 1
    end
  end
end

class ChatGpt < Grape::API
  resource :chat_gpt do
    get do
      stream MyStream.new
      content_type 'text/event-stream'
      status 200
    end
  end
end

Rack::Handler::Puma.run ChatGpt.new
```

it works well (screen recording attached).

But when I start my main project (a Rails app) with `rails s -p 3005`, the code in my Rails app is:

```ruby
# /api/v1/chat_gpt
module API
  module V1
    class MyStream
      def each
        3.times do
          yield "data: #{{ time: Time.zone.now }.to_json}\n\n"
          sleep 1
        end
      end
    end

    class ChatGpt < Grape::API
      resource :chat_gpt do
        get do
          stream MyStream.new
          content_type 'text/event-stream'
          status 200
        end
      end
    end
  end
end
```

There seems to be no difference (screen recording attached). I tried adding
I think the problem is related to |
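For reference, Grape's `stream` helper works because a Rack streaming body only needs to respond to `#each` and yield String chunks, which the server flushes to the client as they are produced. A minimal sketch of that contract (the class name and chunk payload are illustrative, not part of the thread's code):

```ruby
require 'json'

# Any object responding to #each that yields Strings can serve as a
# streaming Rack body; the web server flushes each yielded chunk.
class TickStream
  def initialize(count)
    @count = count
  end

  def each
    @count.times do |i|
      yield "data: #{{ tick: i }.to_json}\n\n"
    end
  end
end

# Consuming the body the way a server would:
TickStream.new(3).each { |chunk| print chunk }
```

Whether the client actually sees the chunks one at a time then depends on the server (Puma vs. WEBrick, as noted above) and on Rack not buffering the body.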
Let's try to isolate it. Care to port my sample that I added in ruby-grape/grape-on-rack#56 to https://github.com/ruby-grape/grape-on-rails? Make a (non-working) PR? |
hi @dblock |
I do not, but you should find time to write one, and I will find time to dig into why it doesn't work. |
I am going to close the issue here since it's clearly not a Grape problem, as grape returns all the right things per spec and works on Puma. |
According to rails/rails#38780 (comment), this is not a Rails problem but an inconsistent Rack version. |
Did upgrading Rack to >2.2 or <=2.1 fix it in your application? |
Downgrade |
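If downgrading is the route taken, a sketch of the Gemfile pin (the exact constraint is an assumption; it depends on which side of the 2.2 boundary works for your app):

```ruby
# Gemfile (illustrative): pin Rack below 2.2 to sidestep the streaming
# behavior change discussed above; relax the pin once the incompatibility
# is resolved in the app.
gem 'rack', '< 2.2'
```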
Recently I have been using the ChatGPT API and I've found a very useful method of sending information called streaming. Instead of sending a long piece of text all at once, ChatGPT sends it in small chunks until it's finished. For example, the text "Hello! How can I assist you today?" would be divided as follows:
0.1s: Hello
0.2s: !
0.3s: How
...
0.9s: today
1s: ?
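As a sketch of what that looks like on the wire, assuming the server-sent-events framing used earlier in this thread (the word-splitting logic is illustrative, not the actual OpenAI tokenizer):

```ruby
require 'json'

# Split a reply into word-sized chunks and frame each one as an SSE event,
# mimicking how a streamed ChatGPT reply arrives piece by piece.
def sse_chunks(message)
  message.split(' ').map do |token|
    "data: #{{ delta: token }.to_json}\n\n"
  end
end

sse_chunks('Hello! How can I assist you today?').each { |chunk| print chunk }
```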
Thank you for everyone's contributions!