This repository was archived by the owner on Feb 20, 2025. It is now read-only.

Conversation

@dtinth
Contributor

@dtinth dtinth commented Oct 5, 2020

Running Nuxt and API on the same server is a common use case.

Initially, this was not supported. Quoting @danielroe from #332 (comment):

I suspect you might be trying to connect to your serverMiddleware API on server side, which will not work in a serverless environment.

Later, in #341, it was suggested that using $VERCEL_URL could work around this issue. I previously filed #373, which adds documentation for this workaround. Although it works, it is quite hacky and performs poorly:

  • It requires extra configuration.
  • It relies on $VERCEL_URL for Nuxt to be able to invoke the API on the same deployment.
  • At runtime, it launches 2 separate instances of the Lambda function, doubling the response time (due to waiting for 2 cold boots: one for Nuxt and another for the API). This made the serverless function time out a lot.

Today, I took a quick look at the source code and discovered a simple fix: just launch an HTTP server on port 3000 (which happens to be the default port configured by @nuxtjs/axios) and the problem is solved. Using @nuxtjs/axios to call the serverMiddleware API on the server side “just works” now.

How to try it out

I published my fork to npm, so you can set "use": "@dtinth/nuxtjs__vercel-builder@prerelease" to test it.
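For example, a vercel.json pointing the builder at the fork might look like this (a sketch; the `src` path assumes a standard Nuxt project layout):

```json
{
  "builds": [
    {
      "src": "nuxt.config.js",
      "use": "@dtinth/nuxtjs__vercel-builder@prerelease"
    }
  ]
}
```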

@danielroe
Member

danielroe commented Nov 3, 2020

@dtinth great idea - but would you make it opt-in rather than on by default, as I imagine it has a performance cost at cold start? (I'd also appreciate it if you could run some perf benchmarks.)

@dtinth
Contributor Author

dtinth commented Nov 8, 2020

@danielroe I strongly doubt there would be any performance hit at cold start, if one at all.

  1. This code change does not create 2 instances of the Nuxt server; it only creates 2 HTTP servers sharing the same request handler.
  2. It takes, on average, about 3ms to create a server and start listening on a port. However, the listen() call is non-blocking, and on average it blocks JS execution for only 0.3ms.

The internal server is now launched only:

* if there is serverMiddleware in `nuxt.config`
* if it is enabled in package config
@danielroe danielroe changed the title Add support for calling APIs defined in serverMiddleware from within Nuxt feat: add internal server for serverMiddleware Nov 9, 2020
@danielroe danielroe merged commit 4935cab into nuxt:main Nov 9, 2020
@dtinth
Contributor Author

dtinth commented Nov 10, 2020

Thanks @danielroe!

@JonathanAlumbaugh

This is awesome. I was literally just wishing I could use serverMiddleware this way an hour or two after @dtinth forked this, and then bam, it's merged in! You guys rock.

@joshrouwhorst

I'm just going to throw this out there for anyone, like me, who is struggling to get an Express API running at all and can NOT figure out why it works locally but not on Vercel. Make sure express is in your dependencies, not your devDependencies!

@danielroe danielroe mentioned this pull request Jan 10, 2021
@AhmedSarhan

AhmedSarhan commented Mar 17, 2021

I am lost, what's the solution? I already have a project ready for deployment, and when I deploy it to Vercel it fails 😅 and I still don't know what to do.
My network tab shows that I am sending requests to localhost:3000. This worked on my local machine, but I think it should be different on Vercel.
