MaxFileSizeUploadError can't be caught and crashes the server #45

Closed · Vincz opened this issue Feb 9, 2018 · 16 comments · Fixed by #81

Comments

@Vincz commented Feb 9, 2018

Hi!

I have a problem with a custom middleware and the use of the processRequest function.
Here is my custom middleware:

try {
  return processRequest(request, options)
    .then(body => {
      console.log("body is: ", body);
      ctx.request.body = body;
      return next();
    })
    .catch(error => {
      console.log("Error caught: ", error);
      if (error.status && error.expose) response.status(error.status);
      return next(error);
    });
} catch (error) {
  console.log("Error caught again: ", error);
}

If I try to upload a file bigger than the allowed limit, I get the following error and my app crashes.

events.js:137
      throw er; // Unhandled 'error' event
      ^

MaxFileSizeUploadError: File truncated as it exceeds the size limit.
    at FileStream.file.stream.once (/Volumes/lake/projects/evibe/api-lovejs/node_modules/apollo-upload-server/lib/middleware.js:35:13)
    at Object.onceWrapper (events.js:255:19)
    at FileStream.emit (events.js:160:13)

I can't catch the error.
Any ideas?

Thanks :)

@Vincz (Author) commented Feb 9, 2018

To follow up, I tried the apollo-upload-examples project with these options:


{
  maxFieldSize: 1000,
  maxFileSize: 1000,
  maxFiles: 5
}

And I get the same crash:

(node:77346) ExperimentalWarning: The ESM module loader is experimental.
Serving http://localhost:3001 for development.
events.js:137
      throw er; // Unhandled 'error' event
      ^

MaxFileSizeUploadError: File truncated as it exceeds the size limit.
    at FileStream.file.stream.once (file:///Volumes/lake/labs/apollo-upload-examples/api/node_modules/apollo-upload-server/lib/middleware.mjs:29:13)
    at Object.onceWrapper (events.js:255:19)
    at FileStream.emit (events.js:160:13)
    at PartStream.onData (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/busboy/lib/types/multipart.js:220:18)
    at PartStream.emit (events.js:160:13)
    at addChunk (_stream_readable.js:269:12)
    at readableAddChunk (_stream_readable.js:256:11)
    at PartStream.Readable.push (_stream_readable.js:213:10)
    at Dicer._oninfo (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/dicer/lib/Dicer.js:191:36)
    at Dicer._oninfo (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/dicer/lib/Dicer.js:199:14)
    at SBMH.<anonymous> (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/dicer/lib/Dicer.js:127:10)
    at SBMH.emit (events.js:160:13)
    at SBMH._sbmh_feed (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/streamsearch/lib/sbmh.js:188:10)
    at SBMH.push (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/streamsearch/lib/sbmh.js:56:14)
    at Dicer._write (/Volumes/lake/labs/apollo-upload-examples/api/node_modules/dicer/lib/Dicer.js:109:17)
    at doWrite (_stream_writable.js:407:12)
[nodemon] app crashed - waiting for file changes before starting...

@jaydenseric (Owner) commented Feb 10, 2018

I don't know why this isn't working; I could swear it used to work.

I added a (failing) test for when maxFileSize is exceeded. We should try to fix the Upload class and processRequest function used in the middleware to get this test to pass.

For some reason an error event emitted when maxFileSize is exceeded (see here) cannot be listened for on the resolved upload promise stream. Renaming the event to anything other than error works, so it has something to do with the weirdness of error events in Node.js streams. I know they can't be forwarded when piping a stream, but I can't see any pipe happening between where the error is emitted and where we are trying to listen for it in the test.
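
As a minimal sketch of that Node.js weirdness (illustrative only, not code from this repo): an EventEmitter treats the error event name specially when nothing is listening.

const EventEmitter = require('events');

const emitter = new EventEmitter();

// A custom event name with no listener attached is silently dropped.
emitter.emit('oversized', new Error('File too big'));

// An 'error' event with no listener is thrown instead, crashing the
// process unless something like an 'uncaughtException' handler intervenes.
emitter.emit('error', new Error('File too big'));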

Help on this would be appreciated. Worst case, we need to drop the error event monkeypatch and publish a new major version that just uses the busboy API for oversized files: a limit event.
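
For reference, a rough sketch of that busboy API (assuming busboy v0.x and an incoming HTTP request; not this package's code): with a fileSize limit set, busboy truncates the file stream and emits limit on it, sidestepping the unhandled error event problem entirely.

const Busboy = require('busboy');

function parse(request) {
  const busboy = new Busboy({
    headers: request.headers,
    limits: { fileSize: 1000 },
  });

  busboy.on('file', (fieldname, stream, filename) => {
    // Instead of an 'error' event, an oversized file stream is
    // truncated at the limit and emits 'limit'.
    stream.on('limit', () => {
      console.log(`"${filename}" exceeded the size limit`);
    });
    stream.resume();
  });

  request.pipe(busboy);
}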

@jaydenseric (Owner) commented Feb 10, 2018

If you want to take a crack at it, just clone the repo and run npm install && npm run watch in it.

@roytan883 commented:

I have the same problem here: maxFileSize crashes the server.

const apolloServer = new ApolloServer({
  typeDefs,
  resolvers,
  debug: config.baseConfig.isDev,
  uploads: {
    maxFileSize: 200,
  },
});

apolloServer.applyMiddleware({
  app,
  path: config.serviceConfig.fileUploadRoute,
});

The output is:

18-08-16_11:26:35.143: MaxFileSizeUploadError: File truncated as it exceeds the size limit.
    at FileStream.file.stream.once (/home/mydev/projects/rts/server/node_modules/apollo-server-koa/node_modules/apollo-upload-server/lib/middleware.js:35:13)
    at Object.onceWrapper (events.js:313:30)
    at emitNone (events.js:106:13)
    at FileStream.emit (events.js:208:7)
    at PartStream.onData (/home/mydev/projects/rts/server/node_modules/busboy/lib/types/multipart.js:220:18)
    at emitOne (events.js:116:13)
    at PartStream.emit (events.js:211:7)
    at addChunk (_stream_readable.js:263:12)
    at readableAddChunk (_stream_readable.js:250:11)
    at PartStream.Readable.push (_stream_readable.js:208:10)
    at Dicer._oninfo (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:191:36)
    at Dicer._oninfo (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:199:14)
    at SBMH.<anonymous> (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:127:10)
    at emitMany (events.js:147:13)
    at SBMH.emit (events.js:224:7)
    at SBMH._sbmh_feed (/home/mydev/projects/rts/server/node_modules/streamsearch/lib/sbmh.js:159:14)

@jaydenseric (Owner) commented:

@roytan883 Apollo Server uses an old version that has bugs :(

@roytan883 commented:

The bug is that the exception is thrown to the top level; I can catch it with the process object:

process.on('uncaughtException', (err) => {
  console.log("process uncaughtException = ", err);
});

The handler logs:

18-08-16_11:41:07.458: process uncaughtException =  { MaxFileSizeUploadError: File truncated as it exceeds the size limit.
    at FileStream.file.stream.once (/home/mydev/projects/rts/server/node_modules/apollo-server-koa/node_modules/apollo-upload-server/lib/middleware.js:35:13)
    at Object.onceWrapper (events.js:313:30)
    at emitNone (events.js:106:13)
    at FileStream.emit (events.js:208:7)
    at PartStream.onData (/home/mydev/projects/rts/server/node_modules/busboy/lib/types/multipart.js:220:18)
    at emitOne (events.js:116:13)
    at PartStream.emit (events.js:211:7)
    at addChunk (_stream_readable.js:263:12)
    at readableAddChunk (_stream_readable.js:250:11)
    at PartStream.Readable.push (_stream_readable.js:208:10)
    at Dicer._oninfo (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:191:36)
    at Dicer._oninfo (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:199:14)
    at SBMH.<anonymous> (/home/mydev/projects/rts/server/node_modules/dicer/lib/Dicer.js:127:10)
    at emitMany (events.js:147:13)
    at SBMH.emit (events.js:224:7)
    at SBMH._sbmh_feed (/home/mydev/projects/rts/server/node_modules/streamsearch/lib/sbmh.js:159:14) name: 'MaxFileSizeUploadError' }

@jaydenseric (Owner) commented Aug 16, 2018

@roytan883 commented:

@jaydenseric I use apollo-server-koa; how can I fix this? Do I have to wait for apollo-server-koa to use the new version of apollo-upload-server?

@jaydenseric (Owner) commented:

Yes, you have to wait for them to update unfortunately. That is the cost of magic packages that abstract a lot of things; you lose control 😓

Apollo aims to support Node.js v6+, but new versions of this package support v8.5+ making them hesitant to upgrade despite all the improvements. We're looking at what has to happen to get Node.js v6.10+ support for this package for future releases, but it is not a sure thing as it is really complicated dealing with API differences in aborted network requests, destroying streams, etc. It would be brilliant for Apollo to update support to v8.5+. I consider that the first modern version, with native async/await, ESM, etc.

@jaydenseric (Owner) commented:

@roytan883 you might be interested in graphql-api-koa.

@roytan883 commented:

@jaydenseric So far, I can only manually force use of the newer apollo-upload-server version by deleting the nested copy:

rm -rf node_modules/apollo-server-koa/node_modules/apollo-upload-server

@Awalgawe commented:

I had the same problem.

@jaydenseric After some investigation, it looks like the resolve(operations) call (https://github.com/jaydenseric/graphql-upload/blob/master/public/processRequest.js#L243) should probably be moved to https://github.com/jaydenseric/graphql-upload/blob/master/public/processRequest.js#L343, but I don't know all the implications.

@jaydenseric (Owner) commented:

@Awalgawe there are no known bugs about this in the current version of graphql-upload; are you sure you are using the latest version?

The operations need to resolve before the entire request has finished parsing so that we can have an API that has file upload streams in resolvers, like the "async" version in this graphic:

https://github.com/jaydenseric/graphql-multipart-request-spec/blob/master/sync-vs-async-graphql-multipart-request-middleware.svg
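
As a brief illustration of that "async" style (the resolver and field names here are hypothetical, not from the spec), the upload promise can resolve and its stream can be consumed while later files in the same multipart request are still being parsed:

const resolvers = {
  Mutation: {
    async singleUpload(parent, { file }) {
      // Resolves as soon as this file's part begins streaming in,
      // not when the whole request has finished parsing.
      const { filename, createReadStream } = await file;
      const stream = createReadStream();
      // ... pipe `stream` to storage here ...
      return { filename };
    },
  },
};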

@mi-mazouz commented:

I still have this error using graphqlUploadExpress and graphql-upload 11.0.0.

@jaydenseric (Owner) commented:

@mi-mazouz if you are sure everything is set up correctly in your project and that there might be a graphql-upload bug, please add a test in a draft PR that shows a failure. Here's the existing maxFileSize option test:

tests.add('`processRequest` with option `maxFileSize`.', async () => {

It should be pretty easy to work on:

  1. Fork graphql-upload.
  2. Clone your fork to your local machine.
  3. In Terminal, go to the cloned project directory and run npm install.
  4. Update the test code and then run npm test.

If you can produce a failing test case then we can figure out a patch against that.
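
As a starting point, here is a rough sketch of what such a reproduction could look like (assuming graphql-upload v11's processRequest(request, response, options) signature and the form-data and node-fetch packages; the inline server and names are illustrative, not the repo's actual test):

const http = require('http');
const FormData = require('form-data');
const fetch = require('node-fetch');
const { processRequest } = require('graphql-upload');

async function reproduce() {
  const server = http.createServer(async (request, response) => {
    const operation = await processRequest(request, response, {
      maxFileSize: 10,
    });
    const { createReadStream } = await operation.variables.file.promise;
    const stream = createReadStream();

    // The reported bug: this listener should receive the size limit
    // error, but the error surfaces as an uncaught exception instead.
    stream.on('error', error => console.log('Caught:', error.message));
    stream.resume();
    response.end();
  });

  await new Promise(resolve => server.listen(0, resolve));
  const { port } = server.address();

  // Build a request per the GraphQL multipart request spec, with a
  // file body larger than the 10 byte maxFileSize set above.
  const body = new FormData();
  body.append(
    'operations',
    JSON.stringify({ query: '{ a }', variables: { file: null } })
  );
  body.append('map', JSON.stringify({ 1: ['variables.file'] }));
  body.append('1', 'a'.repeat(100), { filename: 'a.txt' });

  await fetch(`http://localhost:${port}`, { method: 'POST', body });
  server.close();
}

reproduce();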

@Luchanso commented:

I have the same problem; in my case the file stream throws an error and no handler exists for that error.

PayloadTooLargeError: File truncated as it exceeds the 3145728 byte size limit.
    at FileStream.<anonymous> (/app/node_modules/graphql-upload/public/processRequest.js:289:21)
    at FileStream.emit (node:events:390:28)
    at PartStream.onData (/app/node_modules/busboy/lib/types/multipart.js:221:18)
    at PartStream.emit (node:events:390:28)
    at addChunk (node:internal/streams/readable:315:12)
    at readableAddChunk (node:internal/streams/readable:289:9)
    at PartStream.Readable.push (node:internal/streams/readable:228:10)
    at Dicer._oninfo (/app/node_modules/dicer/lib/Dicer.js:190:36)
    at SBMH.<anonymous> (/app/node_modules/dicer/lib/Dicer.js:126:10)
    at SBMH.emit (node:events:390:28) {
  locations: [ [Object] ],
  path: [ 'createProduct' ]
}

The solution in my case was to wrap the stream in a promise:

export const resolvers = {
  // ... other resolvers ...
  Mutation: {
    uploadImageResolver: async (_, { image }) => {
      const { createReadStream } = await image;
      const stream = createReadStream();

      // Wrap stream consumption in a promise and attach the 'error'
      // handler before reading, so an oversized file rejects the
      // promise instead of crashing the server.
      await new Promise((resolve, reject) => {
        stream.on("error", reject);

        // ... some code using that stream, like saving the file ...
        // saveFile(stream).then(resolve, reject);
        resolve(); // success
      });
    },
  },
};
