
Use Node.js runtime & move more components to RSC #5

Merged
miurla merged 9 commits into miurla:main from leerob:fixes on May 8, 2024

Conversation

@leerob (Contributor) commented Apr 9, 2024

Hey! I saw you ran into a few issues streaming with the edge runtime. This PR does a few things:

  1. Moves your client boundaries lower in the tree by changing the "use client" placement. This means the initial page, as well as components/chat, will be a server component.
  2. Moves from the edge runtime to the Node.js runtime. You should pair this with selecting 1 vCPU in your Function settings on Vercel. This will help you get faster function responses (faster than edge, which might not be obvious). A sketch of both changes follows this list.
  3. Removes a few bits of unused code that I saw in the editor.
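A minimal sketch of changes 1 and 2, assuming an illustrative file layout and export names rather than the PR's exact diff:

```tsx
// app/page.tsx (illustrative, not the PR's exact diff).
// No "use client" directive and no `export const runtime = 'edge'` export,
// so this page renders as a Server Component on the default Node.js runtime.
import { Chat } from '@/components/chat'

export default function Page() {
  return <Chat />
}
```

The "use client" directive then sits only at the top of the deeper components that actually need interactivity (hooks, event handlers), so everything above them stays on the server.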

vercel bot commented Apr 9, 2024

@leerob is attempting to deploy a commit to the morphic Team on Vercel.

A member of the Team first needs to authorize it.

@leerob (Contributor, Author) commented Apr 9, 2024

Also, added you here 😄

https://vercel.com/templates/next.js/morphic-ai-answer-engine-generative-ui

vercel bot commented Apr 10, 2024

The latest updates on your projects. Learn more about Vercel for Git.

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| morphic | ✅ Ready | Visit Preview | 💬 Add feedback | May 8, 2024 9:06pm |

@miurla (Owner) commented Apr 10, 2024

@leerob

Thank you for reviewing this project's code!

```ts
export const runtime = 'edge'
```

I didn't write this line originally either, and everything worked fine in my local environment.
(It isn't even included in the code in the ai/rsc docs.)
However, when I deploy to Vercel, it doesn't stream. That's why I added it.
Related: vercel/ai#1187

Do I need to change any Vercel settings?
1 vCPU is already set in the Function settings.
Your PR also works fine locally, but it doesn't stream in the Vercel environment.


@miurla (Owner) commented Apr 10, 2024

> Also, added you here 😄
>
> https://vercel.com/templates/next.js/morphic-ai-answer-engine-generative-ui

So cool! 🚀

@albertdbio (Contributor)

@leerob Can you explain how streaming is faster on the Node.js runtime than on edge? I'm imagining it has to do with CPU speed. Are there significant price differences when using the full Node.js runtime?

@miurla (Owner) commented Apr 29, 2024

@leerob

Thank you for your commit with the fix. We recently encountered an error with Next.js v14.2.3, so we are pinning v14.2.1. Related: #85
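For reference, pinning the exact version in package.json would look like this (a minimal fragment; the real dependency list is longer):

```json
{
  "dependencies": {
    "next": "14.2.1"
  }
}
```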

@albertdbio (Contributor)

Hey @aleksa-codes, I reviewed your comment on #85 briefly.
Do you believe the text not streaming when using the Node.js runtime is related to the Next.js/Vercel CLI version?

Vercel has recently moved away from edge.

The argument for using the Node.js runtime is interesting, but I would need more sources to be convinced. I think the argument goes as follows (I can't remember if I saw this on X or not): with more compute power, text streams faster, so the function spends less time executing; although the Node.js runtime costs more, the end price is about the same, if not better. So with the Node.js runtime you get faster streaming and potential cost savings.

I'd like to see whether that streaming speed is significantly faster and whether using the Node.js runtime opens you up to other costs.
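The shape of that argument is easy to sketch numerically. The numbers below are purely hypothetical, not actual Vercel pricing; they only illustrate how a pricier runtime that finishes faster can still bill the same or less:

```ts
// Back-of-the-envelope sketch of the cost argument.
// All numbers are hypothetical and are NOT actual Vercel pricing.
const edge = { pricePerSecond: 0.5, durationSeconds: 10 } // cheaper per second, streams slower
const node = { pricePerSecond: 1.0, durationSeconds: 4 } // pricier per second, streams faster

const cost = (fn: { pricePerSecond: number; durationSeconds: number }) =>
  fn.pricePerSecond * fn.durationSeconds

console.log(cost(edge)) // 5 units billed
console.log(cost(node)) // 4 units billed: faster AND cheaper in this hypothetical
```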

@aleksa-codes commented May 4, 2024

Hi @albertdbio. My comment was related to the 405 error mentioned in #85 😅, which seems to be resolved now, so I removed the Vercel CLI version flag that I had in my project. I think Morphic should also work fine with the newest version of Next.js, without the 405.

Regarding the Edge and Node runtimes, I don't have an answer as to why streaming is not working. I also learned about Vercel ditching Edge from Theo's video and by finding this PR here. In my case, I only use Edge when necessary, and I would happily switch to using only Node. That would remove the confusion about when to use it.

What I am curious about is what this means for Vercel's free hobby tier. Since you cannot switch to more CPU and memory on that tier, requests that stream responses from, for example, the OpenAI API will time out. Does that mean the hobby tier will be upgraded to Standard (CPU/memory) in the future? Or, if we want to stay on the free tier, would we still need to use the Edge runtime?

@leerob (Contributor, Author) commented May 8, 2024

👋 There was a bug with streaming + Server Actions + the Node.js runtime up until 14.2.2 that has since been resolved. Thanks for helping pinpoint things here, appreciate it. It should work now, if you want to approve that deployment.

@leerob (Contributor, Author) commented May 8, 2024

Could you also make sure you're on "Standard" performance here?

https://vercel.com/changelog/manage-your-vercel-functions-cpu-and-memory-in-the-dashboard

@miurla (Owner) commented May 8, 2024

> Could you also make sure you're on "Standard" performance here?
>
> https://vercel.com/changelog/manage-your-vercel-functions-cpu-and-memory-in-the-dashboard

It is now set to Standard.

@miurla (Owner) commented May 8, 2024

> Vercel Runtime Timeout Error: Task timed out after 30 seconds

For our application, 30 seconds is not enough.

> Task timed out after 15.02 seconds

Seems to time out at 15 seconds.

@miurla (Owner) commented May 8, 2024

Changed maxDuration to 60 seconds and the error was resolved.
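For reference, maxDuration is a Next.js route segment config export; a minimal sketch of the change (the thread doesn't show the exact file it was added to):

```ts
// In the page or route handler that streams the response:
// raise this route's Vercel function timeout to 60 seconds.
export const maxDuration = 60
```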

@miurla (Owner) commented May 8, 2024

@leerob Thank you for your great contribution! 🥇

@miurla miurla merged commit edb5ca3 into miurla:main May 8, 2024
2 checks passed
@leerob leerob deleted the fixes branch May 9, 2024 01:14