Use Node.js runtime & move more components to RSC #5
Conversation
@leerob is attempting to deploy a commit to the morphic Team on Vercel. A member of the Team first needs to authorize it.
Also, added you here 😄 https://vercel.com/templates/next.js/morphic-ai-answer-engine-generative-ui
Thank you for reviewing this project code!
I also didn't write this line, and it worked fine in my local environment. Do I need to change the Vercel settings?
So cool! 🚀
@leerob Can you explain how streaming is faster on the Node.js runtime than on Edge? I'm imagining it has to do with the CPU speed. Are there significant price differences to using a full Node runtime?
Hey @aleksa-code, reviewed your comment on #85 briefly. Vercel has recently moved away from Edge. The argument for using the Node runtime is interesting, but I would need more sources to be convinced. I think the argument goes as follows (can't remember if I saw this on X or not): with more compute power, text streams faster, so the function spends less time executing; although the Node runtime costs more per unit of time, the end result in price is about the same, if not better. So with the Node runtime you get faster streaming and potential cost savings. I'd like to see whether that streaming speedup is significant and whether using the Node runtime opens you up to other costs.
Hi @albertdbio. My comment was related to the 405 error mentioned in #85 😅, which seems to be resolved now, so I removed the Vercel CLI version flag that I had in my project. I think Morphic should be working fine with the newest version of Next.js as well, without giving 405.

Regarding the Edge and Node runtimes, I don't have an answer about why streaming is not working. I also learned about Vercel ditching Edge from Theo's video and from finding this PR. In my case, I only use Edge when necessary, and I would happily switch to using only Node; that would remove the confusion about when to use it.

What I am curious about is what this means for the free Hobby Vercel tier. Since that tier cannot switch to more CPU and memory, requests that stream responses from the OpenAI API, for example, will time out. Does that mean the Hobby tier will be upgraded to Standard (CPU/memory) in the future? Or if we want to stay on the free tier, would we still need to use the Edge runtime?
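For context, the Edge/Node choice being discussed is made per route in the Next.js App Router via a segment config export. A minimal sketch, assuming an App Router route handler (the file path is illustrative, not an actual Morphic file):

```typescript
// app/api/chat/route.ts (illustrative path)
// Opt this route into the Node.js runtime instead of the Edge runtime.
export const runtime = 'nodejs'

export async function POST(req: Request) {
  // Streaming still works on the Node.js runtime; this sketch simply
  // echoes the request body back as a streamed response.
  return new Response(req.body, {
    headers: { 'Content-Type': 'text/plain' },
  })
}
```

Removing the export (or setting `runtime = 'edge'`) is what switches the route between the two runtimes.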
👋 There was a bug up until 14.2.2 with streaming + server actions + Node.js that has since been resolved. Thanks for helping pinpoint things here, appreciate it. Should work now, if you want to approve that deployment. |
Could you also make sure you're on "Standard" performance here? https://vercel.com/changelog/manage-your-vercel-functions-cpu-and-memory-in-the-dashboard |
It has become the standard. |
Changed maxDuration to 60 seconds and the error was resolved.
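For anyone hitting the same timeout: `maxDuration` is another route segment config export in the Next.js App Router. A minimal sketch, assuming an App Router route file (the path is illustrative):

```typescript
// app/api/chat/route.ts (illustrative path)
// Allow the Vercel function for this route to run for up to 60 seconds
// before timing out (subject to your plan's limits).
export const maxDuration = 60
```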
@leerob Thank you for your great contribution! 🥇 |
Hey! I saw you ran into a few issues streaming with the edge runtime. This PR does a few things: it moves the `"use client"` directive placement, which means that the initial `page` will be a server component, as well as `components/chat`.
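The directive-placement change can be sketched like this, pushing `"use client"` down to the interactive leaf so its parents stay server components. The component and file names here are illustrative, not the actual Morphic files:

```typescript
// components/chat.tsx (illustrative) — no "use client" directive here,
// so this component renders on the server.
import { ChatInput } from './chat-input'

export function Chat() {
  return (
    <div>
      {/* server-rendered content ... */}
      <ChatInput />
    </div>
  )
}

// components/chat-input.tsx (illustrative) — only the interactive leaf
// opts into the client, so hooks and event handlers are allowed here.
'use client'

import { useState } from 'react'

export function ChatInput() {
  const [value, setValue] = useState('')
  return <input value={value} onChange={(e) => setValue(e.target.value)} />
}
```

The design choice: `"use client"` marks a boundary, and everything imported below it becomes client code, so placing it as low as possible keeps the rest of the tree eligible for server rendering.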