.size option not supported on res.body #1149
I'm not sure whether I quite understand what you mean - could you please provide more details? Example code would be very useful.
If you use `res.body.pipe(fs.createWriteStream('./out.txt'))`, then no error will be emitted, even if the response is larger than the `size` option.
I'm solving this internally by replacing `res.body` with a size-limiting stream.
I would say that this is obvious - when you utilize a consuming function like `res.json()`, the body is read through the internal consumer, which enforces the `size` option (Line 200 in ffef5e3). However, `res.body` exposes the raw stream directly, so that check never runs.
We can mention this fact in the README, but I'm not sure whether that's necessary - would like to hear what @node-fetch/reviewers think.
It wasn't obvious to me. I expected the `size` option to apply to `res.body` as well. As an alternative to adjusting the documentation, it would be possible for this library to make the stream emit an error too.
If it helps, here is my basic implementation:

```js
import fetch, { FetchError } from 'node-fetch'
import { Transform } from 'stream'

class LimitStream extends Transform {
  constructor (limit) {
    super()
    this._bytesLimit = limit
    this._bytesWritten = 0
  }

  _transform (chunk, _, cb) {
    this._bytesWritten += chunk.length
    if (this._bytesWritten > this._bytesLimit) {
      this.emit('limit')
    }
    this.push(chunk)
    cb()
  }
}

const res = await fetch(url, opts)
const { body } = res
const limitStream = new LimitStream(opts.size)

limitStream.on('limit', () => {
  limitStream.removeAllListeners('limit')
  body.emit('error', new FetchError(`content size at ${url} over limit: ${opts.size}`, 'max-size'))
})

body.pipe(limitStream)
body.on('error', limitStream.emit.bind(limitStream, 'error'))
Object.defineProperty(res, 'body', { get: () => limitStream })
```
Here is my take on checking the size limit in a cross-platform-ish way while tracking download progress at the same time:

```js
// Assumes global fetch, AbortController and TextDecoder (Node 18+ or browsers).
const ctrl = new AbortController()
const limit = 1024
const res = await fetch(url, { signal: ctrl.signal })
const knownLength = Number(res.headers.get('content-length') || 0)
const notEncoded = !res.headers.get('content-encoding')
const decoder = new TextDecoder()
let responseText = ''
let bytesWritten = 0

if (knownLength > limit) {
  ctrl.abort()
  throw new Error(`content size at ${res.url} over limit`)
}

for await (const chunk of res.body) {
  bytesWritten += chunk.byteLength
  if (bytesWritten > limit) {
    ctrl.abort()
    throw new Error(`content size at ${res.url} over limit`)
  }
  if (notEncoded && knownLength) {
    console.log(`progress = ${bytesWritten / knownLength}`)
  }
  // Do something else with chunk
  responseText += decoder.decode(chunk, { stream: true })
}
responseText += decoder.decode() // flush the remaining bytes
const json = JSON.parse(responseText)
```
Nice, thanks for sharing, this is a lot simpler using async iteration.
While attempting to implement Node's new stream consumers, this became more apparent to me.
It's not mentioned in the README that when using `res.body`, and not `res.json()` etc., the `.size` option will not trigger any response size limiting errors.