
Streaming API Responses Not Working on Vercel #38736

Closed
1 task done
jamespantalones opened this issue Jul 17, 2022 · 6 comments
Assignees
Labels
Runtime Related to Node.js or Edge Runtime with Next.js.

Comments

@jamespantalones

Verify canary release

  • I verified that the issue exists in the latest Next.js canary release

Provide environment information

    Operating System:
      Platform: darwin
      Arch: x64
      Version: Darwin Kernel Version 20.6.0: Tue Feb 22 21:10:41 PST 2022; root:xnu-7195.141.26~1/RELEASE_X86_64
    Binaries:
      Node: 16.15.0
      npm: 8.5.5
      Yarn: 1.22.17
      pnpm: 6.16.0
    Relevant packages:
      next: 12.2.3-canary.10
      eslint-config-next: 12.2.0
      react: 18.2.0
      react-dom: 18.2.0

What browser are you using? (if relevant)

Chrome 103.0.5060.114

How are you deploying your application? (if relevant)

Vercel

Describe the Bug

I am using the experimental-edge runtime for an API route, /api/stream. I create a custom ReadableStream inside this API handler and return it as the Response body. This works great locally; however, once I deploy to Vercel, the response resolves with no body. The documentation mentions streaming support, so I'm not quite sure what I'm doing wrong.

I'm not sure whether this bug is on the Next.js side or on Vercel's.

Below is the simplest repro I can make.

import type { NextRequest, NextResponse } from "next/server";

export const config = {
  runtime: "experimental-edge",
  api: {
    bodyParser: false,
    responseLimit: false,
  },
};

export default async function handler(req: NextRequest, res: NextResponse) {

  let count = 0;

  const resultStream = new ReadableStream(
    {
      pull(controller) {
        if (count < 10) {
          controller.enqueue(JSON.stringify({ foo: "bar" }) + "\n");
          count++;
        } else {
          controller.close();
        }
      },
    },
    {
      highWaterMark: 1,
      size(chunk) {
        return 1;
      },
    }
  );

  return new Response(resultStream, {
    status: 200,
    headers: {
      "content-type": "text/plain",
      "Cache-Control": "no-cache",
    },
  });
}

Expected Behavior

I expect the stream to be returned in chunks, similar to local dev: {"foo": "bar"}\n should be returned 10 times.
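For reference, this is roughly how such a streamed response would be consumed with the Streams API. This is a sketch: in the browser the Response would come from fetch("/api/stream"); here a locally constructed Response stands in so the snippet is self-contained.

```typescript
// Sketch: reading a streamed Response chunk by chunk with the Streams API.
// In the browser, `res` would be `await fetch("/api/stream")`.
async function readAll(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let received = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` handles multi-byte characters split across chunks.
    received += decoder.decode(value, { stream: true });
  }
  return received;
}

// A local Response stands in for the deployed /api/stream route.
const sample = new Response('{"foo":"bar"}\n{"foo":"bar"}\n');
const text = await readAll(sample);
console.log(text);
```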

Link to reproduction

https://github.com/jamespantalones/next-js-stream-bug

To Reproduce

  1. npm install
  2. npm run dev
  3. Visit http://localhost:3000/api/stream and you should see the following:

[Screenshot: Screen Shot 2022-07-17 at 4 06 05 PM]

  4. Visit the same app on Vercel: https://next-js-stream-bug.vercel.app/api/hello
  5. You should see nothing returned.
@jamespantalones jamespantalones added the bug Issue was opened via the bug report template. label Jul 17, 2022
@balazsorban44 balazsorban44 added Runtime Related to Node.js or Edge Runtime with Next.js. kind: bug and removed bug Issue was opened via the bug report template. labels Jul 18, 2022
@jamespantalones jamespantalones changed the title Streaming API Responses Not Working in Production Streaming API Responses Not Working on Vercel Jul 24, 2022
@crazy-slot

What is your build command: next build && next export or only next build?

@jamespantalones
Author

@balazsorban44 does #38862 address this?

@jamespantalones
Copy link
Author

@Onyelaudochukwuka do you mind deleting all your comments? I think you are referring to a separate issue entirely. This one is only about ReadableStreams on experimental-edge API routes.

@jamespantalones
Author

@crazy-slot just next build

@javivelasco
Member

The problem in this example is that you're enqueuing strings into your ReadableStream, while it should only accept Uint8Array chunks. In Next.js it worked because you respond with a stream from the Edge Runtime, and it is then the Next.js server (in Node.js) that processes the stream, wiring it into the Node response, which does accept strings. When deployed on Vercel it failed because strings can't be enqueued into the Response stream, but you wouldn't see an error due to a bug in catching and logging the unhandled rejection.

We have already fixed this both in Next.js, making it consistently fail when strings are enqueued, and in Vercel, making sure logs with the error are correctly delivered. If you run the example on the latest Next.js canary you will see an error stating TypeError: This ReadableStream did not return bytes. When you deploy to Vercel you will see the same error.

To fix the code, you must first encode the text using new TextEncoder().encode().
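A minimal sketch of that fix, mirroring the repro above (makeResultStream is a hypothetical helper name; the control flow is the same as the original handler):

```typescript
// Each chunk is encoded to a Uint8Array with TextEncoder before being
// enqueued, which is what the Edge Runtime's Response body requires.
const encoder = new TextEncoder();

export function makeResultStream(): ReadableStream<Uint8Array> {
  let count = 0;
  return new ReadableStream<Uint8Array>({
    pull(controller) {
      if (count < 10) {
        // Bytes, not strings, go into the stream.
        controller.enqueue(encoder.encode(JSON.stringify({ foo: "bar" }) + "\n"));
        count++;
      } else {
        controller.close();
      }
    },
  });
}

// The handler then returns the byte stream unchanged:
// export default async function handler() {
//   return new Response(makeResultStream(), {
//     status: 200,
//     headers: { "content-type": "text/plain", "Cache-Control": "no-cache" },
//   });
// }
```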

@javivelasco javivelasco self-assigned this Oct 14, 2022
@github-actions
Contributor

This closed issue has been automatically locked because it had no new activity for a month. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 13, 2022