
Support streaming openai outside Node (ie. without axios) #247

Closed
nfcampos opened this issue Mar 8, 2023 · 9 comments · Fixed by #526
Labels
help wanted This would make a good PR

Comments

@nfcampos
Collaborator

nfcampos commented Mar 8, 2023

Use this in the axios adapter: https://github.com/Azure/fetch-event-source

@nfcampos added the help wanted (This would make a good PR) label on Mar 8, 2023
@esaounkine

Could you provide a bit more context on the desired outcome?

@nfcampos
Collaborator Author

nfcampos commented Mar 9, 2023

The official openai library (which we use) uses axios to make HTTP requests.
Axios off the shelf doesn't work in non-Node environments.
For non-streaming output we already use an axios adapter (https://github.com/hwchase17/langchainjs/blob/main/langchain/src/util/axios-fetch-adapter.js) to make it work outside Node. This issue is about updating that adapter to support the SSE responses that OpenAI uses for streaming.
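For context, OpenAI's streaming endpoint returns Server-Sent Events: `data:` lines separated by blank lines, terminated by a `data: [DONE]` sentinel. A fetch-based adapter has to buffer incoming chunks and split out the complete events, carrying any trailing partial line over to the next chunk. A minimal sketch of that parsing step (illustrative names, not the adapter's actual code):

```typescript
// Minimal sketch of SSE chunk parsing for OpenAI streaming responses.
// `parseSSEChunk` is an illustrative helper, not LangChain's adapter code.
function parseSSEChunk(buffer: string): { events: string[]; rest: string } {
  const events: string[] = [];
  const lines = buffer.split("\n");
  // The last element may be an incomplete line; keep it for the next chunk.
  const rest = lines.pop() ?? "";
  for (const line of lines) {
    const trimmed = line.trim();
    if (trimmed.startsWith("data:")) {
      const data = trimmed.slice("data:".length).trim();
      // [DONE] signals the end of the stream and carries no payload.
      if (data !== "[DONE]") events.push(data);
    }
  }
  return { events, rest };
}
```

Each returned event is a JSON string that the adapter would then `JSON.parse` and hand to the streaming callback.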

@alberduris

Is it possible now to make streaming OpenAI requests, for example, on Cloudflare Workers? If so, how?

@nfcampos
Collaborator Author

To set up Cloudflare Workers see #212 (comment)
To set up streaming see https://js.langchain.com/docs/modules/models/llms/additional_functionality#streaming-responses
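The streaming docs linked above describe a per-token callback pattern. The sketch below models it with a plain token collector so the shape can be exercised without an API key; the commented-out LangChain wiring is illustrative of the langchain@0.0.x-era API described in those docs and is an assumption, not verified against a specific release.

```typescript
// Sketch of the per-token streaming callback pattern. `makeTokenCollector`
// is an illustrative helper mirroring the handleLLMNewToken callback shape
// from the linked streaming docs (assumption, not LangChain source).
function makeTokenCollector() {
  const tokens: string[] = [];
  return {
    // Called once per streamed token.
    handleLLMNewToken(token: string): void {
      tokens.push(token);
    },
    // Full text accumulated so far.
    text(): string {
      return tokens.join("");
    },
  };
}

// Illustrative wiring against langchain@0.0.x (assumed API, not run here):
// import { OpenAI } from "langchain/llms/openai";
// import { CallbackManager } from "langchain/callbacks";
// const collector = makeTokenCollector();
// const model = new OpenAI({
//   streaming: true,
//   callbackManager: CallbackManager.fromHandlers({
//     handleLLMNewToken: async (token: string) =>
//       collector.handleLLMNewToken(token),
//   }),
// });
// await model.call("Tell me a joke.");
```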

@alberduris

It's not working. I have followed those instructions and (after fixing several `process is not defined` errors) I'm still getting the `✘ [ERROR] Uncaught (in response) TypeError: adapter is not a function` error described in https://github.com/fern-openai/openai-node/issues/7 and openai/openai-node#30 (comment).

@nfcampos
Collaborator Author

Which version of langchain are you using?

@alberduris

> Which version of langchain are you using?

From your package.json specification:

  "dependencies": {
    "langchain": "^0.0.35"
  }

I get:

  $ npm view langchain version
  0.0.44

@nfcampos
Collaborator Author

Update to the latest version with `npm install langchain@latest` and try again

@alberduris

alberduris commented Mar 30, 2023

`npm view langchain version` still shows the same version (0.0.44), but it is working now, thanks!

Is there an example of how to stream the ChatOpenAI response back to the client from a Cloudflare Worker?

Edit: I got it working here, in case someone is interested: https://gist.github.com/alberduris/32e4ad5827cb01c28022ded982bfd8bc
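In case the gist link goes stale: the usual Workers pattern is to create a `TransformStream`, write each streamed token into its writable side (e.g. from the `handleLLMNewToken` callback), and return the readable side as the `Response` body. A minimal sketch under those assumptions (illustrative names, not the gist's exact code):

```typescript
// Sketch of streaming tokens back to the client from a Cloudflare Worker.
// `makeStreamingResponse` is an illustrative helper: tokens written via
// `write` appear incrementally in the Response body.
function makeStreamingResponse(): {
  response: Response;
  write: (token: string) => Promise<void>;
  close: () => Promise<void>;
} {
  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();
  return {
    // The readable side becomes the response body; it stays open until close().
    response: new Response(readable, {
      headers: { "Content-Type": "text/event-stream" },
    }),
    // Encode and enqueue one token for the client.
    write: (token) => writer.write(encoder.encode(token)),
    // Call after the model finishes streaming to end the response.
    close: () => writer.close(),
  };
}
```

Inside the worker's `fetch` handler you would return `response` immediately, kick off the model call without awaiting it, and call `close()` when streaming ends.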
