Nuxt Error on Deployment using 'vercel-edge' preset: "An internal error occurred with Vercel. BAD_GATEWAY". Works locally. #135
Comments
Just a note that this still does not work even with the new updates.
@pi0 has a demo here: https://nuxt-openai-demo.vercel.app/ Can you share a link to a deployment?
@MaxLeiter I just clicked the "deploy to Vercel" button and it deployed, but it says "edge function crashed":
Hi. Checking with the latest example in your repository, I can reproduce as well. My demo was using
Currently, with the latest Nitro/Nuxt versions, edge is supported out of the box. Normal lambda functions require a new experimental flag for the build output that will be enabled very soon in a stable release. (In the meantime you can use the Nuxt edge channel.)
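For reference, opting into the Nuxt edge channel at the time was typically done by aliasing the `nuxt` dependency in `package.json`. This is a hedged sketch; the exact alias name has changed over time, so check the current Nuxt docs:

```json
{
  "devDependencies": {
    "nuxt": "npm:nuxt3@latest"
  }
}
```

After changing the alias, rerun your package manager's install step so the lockfile picks up the edge build.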
Looks like the example was updated from The module that causes the problem is
That was indeed the issue. Thanks!
@pi0 @Hebilicious For some reason, the deployment fails again. I did get it to work previously after updating the
Demo link: https://vercel-ai-chat-nuxt-openai-git-main-dosstx.vercel.app/ Public repo: https://github.com/dosstx/vercel-ai-chat-nuxt-openai
Thanks for sharing the reproduction, @dosstx. Can you please also share the function errors and any (possible) build warnings? (To help debug more easily, set
URL: https://vercel-ai-chat-nuxt-openai-2dqtuctth-dosstx.vercel.app/ I made a new deployment and this is what I have:
Deployment build log:
Logs for
Hi! The good news is that it seems to work using the latest. I have updated my repo with it (pi0/nuxt-openai-demo@a9e4554) and it just works (demo).
@pi0 I will give that a try. Is there a reason to use Vercel's AI dependency? I am confused as to whether I could just use the OpenAI SDK with Nuxt. If Vercel has a tighter integration with Nuxt, I'm all for it!
Hey @dosstx, the OpenAI SDK and the AI SDK solve different problems and can be used together. You use the OpenAI SDK to get an LLM response, and the AI SDK makes it easy to stream it to the client and to handle things like function calls.
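To make that division of labor concrete, here is a minimal, self-contained TypeScript sketch of what "streaming to the client" involves: turning an async iterator of tokens (a stand-in for an LLM response) into a `ReadableStream` of bytes, roughly what the AI SDK's stream helpers do under the hood. The names `fakeTokens`, `toReadableStream`, and `drain` are illustrative, not part of either SDK:

```typescript
// Stand-in for a token-by-token LLM response.
async function* fakeTokens(): AsyncGenerator<string> {
  for (const t of ['Hello', ', ', 'world']) yield t
}

// Convert an async iterable of tokens into a byte stream the
// client can consume incrementally (one enqueue per token).
function toReadableStream(tokens: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  return new ReadableStream({
    async start(controller) {
      for await (const t of tokens) controller.enqueue(encoder.encode(t))
      controller.close()
    },
  })
}

// Drain the stream chunk by chunk, showing that pieces arrive
// individually rather than as one buffered response.
async function drain(stream: ReadableStream<Uint8Array>): Promise<string[]> {
  const decoder = new TextDecoder()
  const reader = stream.getReader()
  const chunks: string[] = []
  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(decoder.decode(value))
  }
  return chunks
}
```

Incremental delivery is the whole point: each chunk reaches the reader as soon as it is enqueued, instead of after the full response completes.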
Yep, just realized that. Thanks! Sounds good, I'll be giving that a try. |
Yes, it works. I was also able to deploy to Netlify Edge with this. Thanks!
When setting the Nitro preset to `vercel-edge` and deploying the example code from here to Vercel, I get the following server response:

It WORKS locally, but not when deployed to Vercel.

Also, when not using the `vercel-edge` preset, the code works when deployed, EXCEPT it will not stream the data (it only shows the server response once it is complete, instead of incrementally). Anyone got any ideas on how to get streaming to work using `vercel-edge`?

My `nuxt.config.ts` file:
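The original config was not captured here. As a hedged sketch of the configuration under discussion, a minimal `nuxt.config.ts` selecting this preset would look roughly like the following (`preset` is the documented Nitro option; everything else is illustrative):

```typescript
// nuxt.config.ts — sketch of a config targeting Vercel Edge Functions
import { defineNuxtConfig } from 'nuxt/config'

export default defineNuxtConfig({
  nitro: {
    // Build server routes as Vercel Edge Functions instead of
    // regular serverless (lambda) functions.
    preset: 'vercel-edge',
  },
})
```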