Very slow ChatOpenAI response when called from Vercel #785
Unanswered · aramboyajyan asked this question in Help
Replies: 1 comment
-
@aramboyajyan Assuming you're using the Node.js runtime, try switching to the Edge runtime. Read this for more info.
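For context, opting into the Edge runtime on Vercel with Next.js is a one-line route config. This is a minimal sketch assuming a Next.js App Router project; the route path `app/api/chat/route.js` and the handler body are hypothetical placeholders, not the asker's actual code.

```javascript
// app/api/chat/route.js (hypothetical route path)

// Route segment config: run this handler on the Edge runtime
// instead of the default Node.js runtime.
export const runtime = "edge";

export async function POST(req) {
  // ...invoke ChatOpenAI here, ideally streaming the response
  // so long generations don't hit function timeouts...
  return new Response("ok");
}
```

On the older Pages Router, the equivalent is `export const config = { runtime: "edge" }` in the API route file.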
-
Hello!
I'm building a chatbot that will be hosted on Vercel. It's using Langchain, Pinecone, local MySQL DB and OpenAI.
During the development there were no performance issues; everything worked fast and there were no timeouts.
As soon as the chatbot was hosted on Vercel, the performance issues started showing up. I timed different responses using `console.time` and `console.timeEnd`, and noticed that the bottleneck is the ChatOpenAI invocation. Response time varies from 10s to over 5 minutes, which obviously breaks the chat.
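For anyone reproducing this, the timing approach described above can be sketched as a small wrapper. `callModel` here is a hypothetical stand-in for the real `model.invoke(...)` call to ChatOpenAI; the label name is arbitrary.

```javascript
// Hypothetical stand-in for the actual ChatOpenAI call being timed.
async function callModel() {
  return new Promise((resolve) => setTimeout(() => resolve("hello"), 50));
}

// Times any async call with console.time/console.timeEnd under a label,
// so individual steps (Pinecone query, DB lookup, model call) can be compared.
async function timed(label, fn) {
  console.time(label);
  try {
    return await fn();
  } finally {
    console.timeEnd(label); // prints e.g. "chat-openai: 51.2ms"
  }
}

async function main() {
  const reply = await timed("chat-openai", callModel);
  console.log(reply);
}

main();
```

Wrapping each external call separately like this is what isolates the model invocation as the slow step, rather than the vector store or database round trips.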
Do you have any ideas on why the same ChatOpenAI call is taking so much more time from Vercel? I'm even using the same API keys.
Any thoughts on where to further debug this?
Thank you!