
.size option not supported on res.body #1149

Open

juliangruber opened this issue May 5, 2021 · 11 comments

@juliangruber

The README doesn't mention that when you use res.body directly, instead of res.json() etc., the .size option will not trigger any response-size-limiting errors.

@xxczaki
Member

xxczaki commented May 5, 2021

I'm not sure whether I quite understand what you mean - could you please provide more details? Example code would be very useful.

@juliangruber
Author

If you use res.json() for example, an error will be thrown once the internal stream has reached opts.size bytes.

If you use res.body, for example like this:

res.body.pipe(fs.createWriteStream('./out.txt'))

...then no error will be emitted, even if the response is larger than opts.size.
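
For example (a sketch - example.com here stands in for any URL whose response is larger than the limit):

import fs from 'fs'
import fetch from 'node-fetch'

// Consuming through .text()/.json() rejects with a FetchError of type 'max-size':
const res1 = await fetch('https://example.com', { size: 10 })
await res1.text().catch(err => console.error(err.type)) // 'max-size'

// Piping .body ignores opts.size entirely:
const res2 = await fetch('https://example.com', { size: 10 })
res2.body.pipe(fs.createWriteStream('./out.txt')) // finishes without error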

@juliangruber
Author

I'm solving this internally by replacing res.body with a stream that emits an error once opts.size bytes have been reached.

@xxczaki
Member

xxczaki commented May 5, 2021

I would say that this is obvious - when you utilize a function like json(), the stream is processed by node-fetch. We have a function to consume the body, which respects the size:

if (data.size > 0 && accumBytes + chunk.length > data.size) {

However, response.body is not a function, but rather just a ReadableStream. You must process it yourself and make sure that you respect options.size - the Stream.pipe() function doesn't know about it.
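
A minimal sketch of doing that yourself (fs, url and size are assumed to be in scope):

let accumBytes = 0
res.body.on('data', chunk => {
  accumBytes += chunk.length
  if (size > 0 && accumBytes > size) {
    // destroy the stream with an error once the limit is exceeded
    res.body.destroy(new Error(`content size at ${url} over limit: ${size}`))
  }
})
res.body.pipe(fs.createWriteStream('./out.txt'))
res.body.on('error', err => console.error(err)) // fires once the limit is hit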

@xxczaki
Member

xxczaki commented May 5, 2021

We can mention this fact in the README, but I'm not sure whether that's necessary - would like to hear what @node-fetch/reviewers think.

@juliangruber
Author

juliangruber commented May 5, 2021

I would say that this is obvious

It wasn't obvious to me. I expected res.body to emit an "error" once opts.size bytes had been received. The default way of writing the response to disk is to use the res.body stream (assuming you don't want to buffer the whole response in memory), and for this use-case opts.size doesn't work.

As an alternative to adjusting the documentation, it would be possible for this library to make the stream emit an error too.

@juliangruber
Author

If it helps, here is my basic implementation:

import { Transform } from 'stream'
import fetch, { FetchError } from 'node-fetch'

// Pass-through stream that emits a 'limit' event once more than
// `limit` bytes have flowed through it.
class LimitStream extends Transform {
  constructor (limit) {
    super()
    this._bytesLimit = limit
    this._bytesWritten = 0
  }

  _transform (chunk, _, cb) {
    this._bytesWritten += chunk.length
    if (this._bytesWritten > this._bytesLimit) {
      this.emit('limit')
    }
    this.push(chunk)
    cb()
  }
}

// url and opts are defined elsewhere
const res = await fetch(url, opts)

const { body } = res
const limitStream = new LimitStream(opts.size)
limitStream.on('limit', () => {
  // only emit the error once
  limitStream.removeAllListeners('limit')
  body.emit('error', new FetchError(`content size at ${url} over limit: ${opts.size}`, 'max-size'))
})
body.pipe(limitStream)
// forward errors from the original body to the replacement stream
body.on('error', limitStream.emit.bind(limitStream, 'error'))
// expose the limited stream as res.body
Object.defineProperty(res, 'body', { get: () => limitStream })

@tinovyatkin
Member

size is a node-fetch extension to the spec, and as such has very little documented behaviour for edge cases.
In my personal opinion, it should be removed, as (per @juliangruber's example) streaming the response allows implementing a size limit for any particular need. But if we want to keep it, then we certainly must respect it in all types of responses.

@jimmywarting
Collaborator

jimmywarting commented May 13, 2021

Here is my take on enforcing the size limit in a roughly cross-platform way, while tracking download progress at the same time:

const ctrl = new AbortController()
const limit = 1024
const res = await fetch(url, { signal: ctrl.signal })
const knownLength = Number(res.headers.get('content-length') || 0)
const notEncoded = !res.headers.get('content-encoding')
const decoder = new TextDecoder()
let responseText = ''
let bytesWritten = 0

if (knownLength > limit) {
  ctrl.abort()
  throw new Error(`content size at ${res.url} over limit`)
}

for await (const chunk of res.body) {
  bytesWritten += chunk.byteLength
  if (bytesWritten > limit) {
    ctrl.abort()
    throw new Error(`content size at ${res.url} over limit`)
  }
  if (notEncoded && knownLength) {
    console.log(`progress = ${bytesWritten / knownLength}`)
  }

  // Do something else with chunk
  responseText += decoder.decode(chunk, { stream: true })
}

responseText += decoder.decode() // flush the remaining bytes
const json = JSON.parse(responseText)

@juliangruber
Author

Nice, thanks for sharing - this is a lot simpler using async iteration.

@jimmywarting
Collaborator

While attempting to implement Node's new stream consumers, this became more apparent to me.
Right now I feel like removing the size option altogether: it's not part of the spec and it is not implemented in any other fetch implementation.
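
To illustrate (a sketch assuming Node >= 16.7, where the stream/consumers helpers landed - they read the stream directly, so opts.size is never consulted):

import { text } from 'stream/consumers'
import fetch from 'node-fetch'

const res = await fetch(url, { size: 1024 })
// text() consumes res.body on its own; the size option only guards
// node-fetch's built-in body helpers, so no 'max-size' error is thrown here
const body = await text(res.body)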
