Implement maxUploadFileSize
#256
Changes from 12 commits
```diff
@@ -1,6 +1,6 @@
-import stream from 'stream'
+import stream, { Readable, PassThrough } from 'stream'
 import { DriverConstructor, DriverStatics } from './driverModel'
 import S3Driver from './drivers/S3Driver'
 import AzDriver from './drivers/AzDriver'
```
```diff
@@ -43,6 +43,14 @@ export function getDriverClass(driver: DriverName): DriverConstructor & DriverStatics {
   }
 }

+export function megabytesToBytes(megabytes: number) {
+  return megabytes * 1024 * 1024
+}
+
+export function bytesToMegabytes(bytes: number, decimals = 2) {
+  return Number.parseFloat((bytes / 1024 / 1024).toFixed(decimals))
+}
+
+export function dateToUnixTimeSeconds(date: Date) {
+  return Math.round(date.getTime() / 1000)
+}
```

Review comment on `bytesToMegabytes`: "Not a bug, but I'd prefer"

Reply: "Fixed"
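To make the unit conversions above concrete, here is a small standalone sketch with the helper names copied from the diff and a couple of worked values (the specific inputs are illustrative, not from the PR):

```typescript
// Unit-conversion helpers, as added in the diff above.
export function megabytesToBytes(megabytes: number): number {
  return megabytes * 1024 * 1024
}

export function bytesToMegabytes(bytes: number, decimals = 2): number {
  return Number.parseFloat((bytes / 1024 / 1024).toFixed(decimals))
}

export function dateToUnixTimeSeconds(date: Date): number {
  return Math.round(date.getTime() / 1000)
}

// A 20 MB maxUploadFileSize setting becomes 20 * 1024 * 1024 bytes.
console.log(megabytesToBytes(20))     // 20971520
console.log(bytesToMegabytes(1572864)) // 1.5
```

Note these use binary megabytes (MiB, 1024 * 1024 bytes), so `bytesToMegabytes(megabytesToBytes(n))` round-trips exactly for inputs that survive the two-decimal rounding.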
|
```diff
@@ -79,6 +87,59 @@ export function timeout(milliseconds: number): Promise<void> {
   })
 }

+export interface StreamProgressCallback {
+  /**
+   * A callback that is invoked each time a chunk passes through the stream.
+   * If this callback throws an error, it will be propagated through the stream
+   * pipeline.
+   * @param totalBytes Total bytes read (includes the current chunk bytes).
+   * @param chunkBytes Bytes read in the current chunk.
+   */
+  (totalBytes: number, chunkBytes: number): void;
+}
+
+export interface MonitorStreamResult {
+  monitoredStream: Readable;
+  pipelinePromise: Promise<void>;
+}
+
+export function monitorStreamProgress(
+  inputStream: Readable,
+  progressCallback: StreamProgressCallback
+): MonitorStreamResult {
+
+  // Create a PassThrough stream to monitor streaming chunk sizes.
+  let monitoredContentSize = 0
+  const monitorStream = new PassThrough({
+    transform: (chunk: Buffer, _encoding, callback) => {
+      monitoredContentSize += chunk.length
+      try {
+        progressCallback(monitoredContentSize, chunk.length)
+        // Pass the chunk Buffer through, untouched. This takes the fast
+        // path through the stream pipe lib.
+        callback(null, chunk)
+      } catch (error) {
+        callback(error)
+      }
+    }
+  })
+
+  // Use the stream pipe API to monitor a stream with correct back pressure
+  // handling. This avoids buffering entire streams in memory and hooks up
+  // all the correct events for cleanup and error handling.
+  // See https://nodejs.org/api/stream.html#stream_three_states
+  // https://nodejs.org/ja/docs/guides/backpressuring-in-streams/
+  const monitorPipeline = pipelineAsync(inputStream, monitorStream)
+
+  const result: MonitorStreamResult = {
+    monitoredStream: monitorStream,
+    pipelinePromise: monitorPipeline
+  }
+
+  return result
+}
```
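Given the PR title, the intended use of `monitorStreamProgress` is presumably to enforce `maxUploadFileSize` by throwing from the progress callback once the byte count passes the limit. The hunk above does not show where `pipelineAsync` is defined; the sketch below assumes it is a promisified `stream.pipeline`, and the `readWithLimit` consumer and its limit values are hypothetical, not part of the PR:

```typescript
import { Readable, PassThrough, pipeline } from 'stream'
import { promisify } from 'util'

// Assumption: `pipelineAsync` (not shown in the hunks above) is a
// promisified stream.pipeline.
const pipelineAsync = promisify(pipeline)

interface MonitorStreamResult {
  monitoredStream: Readable
  pipelinePromise: Promise<void>
}

function monitorStreamProgress(
  inputStream: Readable,
  progressCallback: (totalBytes: number, chunkBytes: number) => void
): MonitorStreamResult {
  let monitoredContentSize = 0
  const monitorStream = new PassThrough({
    transform: (chunk: Buffer, _encoding, callback) => {
      monitoredContentSize += chunk.length
      try {
        // A throw here destroys the stream and rejects pipelinePromise.
        progressCallback(monitoredContentSize, chunk.length)
        callback(null, chunk)
      } catch (error) {
        callback(error as Error)
      }
    }
  })
  return {
    monitoredStream: monitorStream,
    pipelinePromise: pipelineAsync(inputStream, monitorStream)
  }
}

// Hypothetical consumer: drains a stream while enforcing a byte limit,
// the way a maxUploadFileSize setting could be enforced without relying
// on a Content-Length header.
async function readWithLimit(input: Readable, limitBytes: number): Promise<number> {
  const { monitoredStream, pipelinePromise } = monitorStreamProgress(
    input,
    totalBytes => {
      if (totalBytes > limitBytes) {
        throw new Error(`Max upload file size exceeded: ${totalBytes} > ${limitBytes}`)
      }
    }
  )
  let received = 0
  // Consume the monitored stream; a real driver would pipe this to storage.
  monitoredStream.on('data', (chunk: Buffer) => {
    received += chunk.length
  })
  await pipelinePromise
  return received
}
```

Because the limit check runs per chunk, an oversized upload is aborted as soon as it crosses the threshold rather than after being buffered in full.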
```diff
 export class AsyncMutexScope {
```
Review comment: I think we should mention somewhere in the README that a `Content-Length` header is necessary -- there's no support for chunked encoding.

Reply: Per my comment #256 (comment), I removed the requirement for providing a `Content-Length` header. After extensive testing, this is a frustrating header to provide given CORS whitelist constraints. Additionally, it excludes future upload features that do not necessarily know the content length in advance.
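The discussion above implies a two-tier policy: when a `Content-Length` header happens to be present, the server can fail fast before reading the body; otherwise it falls back to streaming enforcement. A hedged sketch of such a guard (the function name and header shape are illustrative, not from the PR):

```typescript
// Hypothetical fast-path check: reject early if the client declared a
// body size over the limit. Returning false does NOT mean the upload is
// within the limit -- absent or small headers still require the streaming
// check, since the header is optional and unverified.
function exceedsDeclaredLimit(
  headers: Record<string, string | undefined>,
  limitBytes: number
): boolean {
  const declared = headers['content-length']
  if (declared === undefined) {
    return false // no header: rely on stream monitoring instead
  }
  const declaredBytes = Number.parseInt(declared, 10)
  return Number.isFinite(declaredBytes) && declaredBytes > limitBytes
}
```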