
Parallel upload? #1

Closed
wmertens opened this issue Feb 24, 2017 · 6 comments

Comments

@wmertens

I'm wondering if this project could transparently support parallel uploads? If you provide a FileList, it could perhaps send 3 at a time?

Or would that break a single-form-post assumption on the server?

Also, I'm not sure it's worth it if parallel uploading takes just as long as serial uploading.

@jaydenseric
Owner

Interesting, I am not sure how clients natively handle multipart form streams.

Or would that break a single-form-post assumption on the server?

Perhaps, handling an interruption would be tricky.

Server performance needs to be considered; one user could choke it up uploading a lot of files at once.

Found this looking around: https://github.com/flowjs/flow.js.

@wmertens
Author

wmertens commented Feb 24, 2017 via email

@jaydenseric
Owner

I think I will close this for now as it is not something I intend to work on. It is likely to introduce a lot of complexity and what we have now is a fairly standard approach that seems pretty efficient. If anyone has any interesting ideas or benchmarks feel free to necro this issue.

In the meantime, I suppose you could handle parallel uploads in your app by firing separate mutations at once, and awaiting them all to finish before doing a followup mutation. It would then be up to you to decide what happens if some fail or the operation is interrupted.
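The workaround above can be sketched roughly like this; `uploadFile` and `finalize` are hypothetical app-level functions wrapping your own Apollo mutations, not part of this library:

```javascript
// Sketch: fire one upload mutation per file in parallel, then run a follow-up
// mutation once every upload has finished. `uploadFile` and `finalize` are
// hypothetical functions wrapping your app's Apollo mutations.
async function uploadAllThenFinalize (files, uploadFile, finalize) {
  // Start all uploads at once; each entry records success or failure.
  const results = await Promise.allSettled(files.map(file => uploadFile(file)))

  const failed = results.filter(result => result.status === 'rejected')
  // It is up to the app to decide what happens if some uploads fail.
  if (failed.length > 0) {
    throw new Error(`${failed.length} upload(s) failed`)
  }

  // All uploads succeeded; run the follow-up mutation with their results.
  return finalize(results.map(result => result.value))
}
```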

@wmertens
Author

I've been mulling this over for a while, and I thought I'd just write down my thoughts on this:

  • As long as the footprint is small, having a single go-to solution for uploads for everyone is best
  • This means that side-loading to e.g. S3 should also be possible
  • The concept is that the query gets processed in the NetworkInterface before it hits the server's GraphQL handler, and that works for all use cases I can think of
  • The current implementation shortcuts everything by using form upload for everything, and that's fine. It performs fine vs the normal JSON-body POST and the API is asynchronous on both ends, so any behavior can be injected.
  • If a need later arises for parallel uploads or S3, all you need to do is perform the uploads and replace them with relevant tokens for the graphql query. You can then use any transport for the regular query.
  • One thing that is missing is progress support; it could easily be added for supporting transports by adding an id per upload and exposing a subscription function to get upload progress updates.
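The per-upload progress idea in the last bullet could look something like this, using XHR since fetch exposes no upload progress events; every name here is a hypothetical illustration, nothing in the library exists under these names:

```javascript
// Hypothetical sketch of per-upload progress: assign each upload an id and let
// the app subscribe to progress updates for that id. XHR is used because fetch
// has no upload progress events.
const progressListeners = new Map()

// Subscribe to progress for one upload id; returns an unsubscribe function.
function onUploadProgress (id, listener) {
  progressListeners.set(id, listener)
  return () => progressListeners.delete(id)
}

// POST the form data, reporting progress (0..1) to any subscribed listener.
function uploadWithProgress (id, url, formData) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest()
    xhr.upload.onprogress = event => {
      const listener = progressListeners.get(id)
      if (listener && event.lengthComputable) {
        listener(event.loaded / event.total)
      }
    }
    xhr.onload = () => resolve(xhr.responseText)
    xhr.onerror = () => reject(new Error('upload failed'))
    xhr.open('POST', url)
    xhr.send(formData)
  })
}
```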

So in conclusion, this library is good to use now and is easily updatable to more complex requirements. I would love it if it added flow.js as a transport mechanism, falling back to fetch if not supported, since that would make the user experience much nicer.

@jaydenseric One question though, how open are you to PRs that implement extra features, like switching to XHR to get progress support?

@jaydenseric
Owner

jaydenseric commented Mar 28, 2017

I'm open to PRs, but will be picky. Keep in mind minified flow.js is 14.5 KB.

How would progress work with Apollo client? The data loading state is just a boolean. I'm all ears if there is something clever we can do.

These packages (client and server) expressly exist to allow files to be uploaded to your GraphQL server. If you do not want to upload to your GraphQL server, but to Amazon instead, then there is no problem: use whatever API they provide on the client, get the result, and send it in a regular mutation as strings.

If we were to make the upload method configurable, this line here is the place to start. file would be set via a function that takes the actual file and returns the required metadata. Should we standardize the allowed metadata, or just have people set the type on the server to match the method?

The network interface would be set up something like this:

import ApolloClient from 'apollo-client'
import {createNetworkInterface} from 'apollo-upload-client'

// Hypothetical upload method: side-load the file to Amazon
// and return the metadata to use in the query.
async function amazonMethod (file) {
  const {name, type, size, url} = await myAmazonAPI(file)
  return {
    name,
    type,
    size,
    url
  }
}

const client = new ApolloClient({
  networkInterface: createNetworkInterface({
    uri: '/graphql',
    uploadMethod: amazonMethod
  })
})

But the current setup assumes a multipart form and the server specifically looks for files; it would need to be rewritten. Really, it should be a different package because it's a different concept.

Since I added batching support, the server expects 100% of requests to be multipart forms, so currently you can't hit the API from a client other than apollo-upload-client. This is unintentional and temporary, so once I get the server to support both regular requests and special multipart form file upload requests again this might be easier.

I'm this close to just base64 encoding files and sending them up as vanilla mutation variables. No engineering or config needed, just a roughly 33% larger upload.
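For a sense of what that fallback costs, every 3 raw bytes encode to 4 base64 characters; a tiny illustration, with Node's Buffer standing in for the browser's FileReader/btoa:

```javascript
// Size overhead of sending a file as a base64 mutation variable:
// every 3 raw bytes become 4 base64 characters.
function bytesToBase64 (bytes) {
  return Buffer.from(bytes).toString('base64')
}

const raw = new Uint8Array(3000)          // 3000 raw bytes
const encoded = bytesToBase64(raw)
console.log(raw.length, encoded.length)   // 3000 4000
```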

@wmertens
Author

wmertens commented Mar 29, 2017 via email
