fix(cli/buffer): allow Buffer to store MAX_SIZE bytes #6568
@@ -11,11 +11,19 @@ import {
  unitTest,
} from "./test_util.ts";

const MAX_SIZE = 2 ** 32 - 2;
Review comment (on MAX_SIZE): Is it really necessary to allocate 4 GB of memory? Does the crash occur at smaller allocations?

Reply: I can replicate the crash on readFrom/readFromSync. Can't replicate with a smaller amount because […]
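Whether an allocation of MAX_SIZE (2 ** 32 - 2 bytes, roughly 4 GiB) actually succeeds depends on the runtime and on available memory, which is why the diff below guards the new tests behind a try/catch probe. A minimal sketch of that probe pattern (`canAllocate` is a hypothetical helper for illustration, not part of the patch):

```typescript
// MAX_SIZE mirrors the constant in the diff: 2 ** 32 - 2 bytes.
const MAX_SIZE = 2 ** 32 - 2;

// Probe whether the runtime can hand out `size` contiguous bytes.
// ArrayBuffer throws (e.g. a RangeError or an out-of-memory error)
// when the allocation is not possible, so try/catch is the test.
function canAllocate(size: number): boolean {
  try {
    new ArrayBuffer(size);
    return true;
  } catch {
    return false;
  }
}

// Small allocations always succeed; MAX_SIZE may or may not.
console.log(canAllocate(16 * 1024)); // true
```

The diff uses the same idea inline to set an `ignoreMaxSizeTests` flag, so the 4 GiB tests are skipped rather than crashing on memory-constrained machines.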
// N controls how many iterations of certain checks are performed.
const N = 100;
let testBytes: Uint8Array | null;
let testString: string | null;

let ignoreMaxSizeTests = false;
try {
  new ArrayBuffer(MAX_SIZE);
} catch (e) {
  ignoreMaxSizeTests = true;
}

function init(): void {
  if (testBytes == null) {
    testBytes = new Uint8Array(N);
@@ -167,6 +175,100 @@ unitTest(async function bufferTooLargeByteWrites(): Promise<void> {
  );
});
unitTest(
  { ignore: ignoreMaxSizeTests },
  function bufferGrowWriteMaxBuffer(): void {
    const bufSize = 16 * 1024;
    const capacities = [MAX_SIZE, MAX_SIZE - 1];
    for (const capacity of capacities) {
      let written = 0;
      const buf = new Deno.Buffer();
      const writes = Math.floor(capacity / bufSize);
      for (let i = 0; i < writes; i++) {
        written += buf.writeSync(repeat("x", bufSize));
      }

      if (written < capacity) {
        written += buf.writeSync(repeat("x", capacity - written));
      }

      assertEquals(written, capacity);
    }
  }
);

unitTest(
  { ignore: ignoreMaxSizeTests },
  function bufferGrowReadSyncCloseToMaxBuffer(): void {
    const capacities = [MAX_SIZE, MAX_SIZE - 1];
    for (const capacity of capacities) {
      const reader = new Deno.Buffer(new ArrayBuffer(capacity));
      const buf = new Deno.Buffer();
      buf.readFromSync(reader);

      assertEquals(buf.length, capacity);
    }
  }
);

unitTest(
  { ignore: ignoreMaxSizeTests },
  async function bufferGrowReadCloseToMaxBuffer(): Promise<void> {
    const capacities = [MAX_SIZE, MAX_SIZE - 1];
    for (const capacity of capacities) {
      const reader = new Deno.Buffer(new ArrayBuffer(capacity));
      const buf = new Deno.Buffer();
      await buf.readFrom(reader);
      assertEquals(buf.length, capacity);
    }
  }
);

unitTest(
  { ignore: ignoreMaxSizeTests },
  async function bufferGrowReadCloseMaxBufferPlus1(): Promise<void> {
    const reader = new Deno.Buffer(new ArrayBuffer(MAX_SIZE + 1));
    const buf = new Deno.Buffer();

    await assertThrowsAsync(
      async () => {
        await buf.readFrom(reader);
      },
      Error,
      "grown beyond the maximum size"
    );
  }
);

unitTest(
  { ignore: ignoreMaxSizeTests },
  function bufferGrowReadSyncCloseMaxBufferPlus1(): void {
    const reader = new Deno.Buffer(new ArrayBuffer(MAX_SIZE + 1));
    const buf = new Deno.Buffer();

    assertThrows(
      () => {
        buf.readFromSync(reader);
      },
      Error,
      "grown beyond the maximum size"
    );
  }
);

unitTest(
  { ignore: ignoreMaxSizeTests },
  async function bufferReadCloseToMaxBufferWithInitialGrow(): Promise<void> {
    const capacities = [MAX_SIZE, MAX_SIZE - 1, MAX_SIZE - 512];
    for (const capacity of capacities) {
      const reader = new Deno.Buffer(new ArrayBuffer(capacity));
      const buf = new Deno.Buffer();
      buf.grow(MAX_SIZE);
      await buf.readFrom(reader);
      assertEquals(buf.length, capacity);
    }
  }
);

unitTest(async function bufferLargeByteReads(): Promise<void> {
  init();
  assert(testBytes);
Review comment (on the readFrom/readFromSync changes): Calling this.#grow without knowing the actual amount that needs to grow was causing trouble, so I went with the simpler and less error-prone code: calling writeSync, which calls this.#grow with the size of the read buffer. In a few minutes, I'll push an alternative reading directly to the buffer. If someone else wants to try a different approach, I'm more than happy.
Reply: Posted an alternative implementation in #6570.
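The "read directly to the buffer" alternative mentioned above can be sketched roughly as follows: reserve spare capacity first, then let the reader fill the backing storage in place, skipping the temporary-chunk copy. This is an illustrative guess at the general pattern, not the actual code from #6570 (DirectBuffer is a hypothetical name):

```typescript
interface SyncReader {
  readSync(p: Uint8Array): number | null; // null => EOF
}

class DirectBuffer {
  #buf: Uint8Array = new Uint8Array(0);
  #len = 0; // bytes actually filled

  get length(): number {
    return this.#len;
  }

  readFromSync(r: SyncReader): number {
    const CHUNK = 16 * 1024;
    let total = 0;
    while (true) {
      // Ensure at least CHUNK spare bytes of capacity before each read.
      if (this.#buf.length - this.#len < CHUNK) {
        const grown = new Uint8Array(this.#len + CHUNK);
        grown.set(this.#buf.subarray(0, this.#len));
        this.#buf = grown;
      }
      // Read straight into the spare tail of the backing storage;
      // subarray shares memory, so no intermediate copy is made.
      const n = r.readSync(this.#buf.subarray(this.#len));
      if (n === null) return total;
      this.#len += n;
      total += n;
    }
  }
}

// Demonstration reader over a fixed byte source.
class BytesReader implements SyncReader {
  #off = 0;
  constructor(private readonly src: Uint8Array) {}
  readSync(p: Uint8Array): number | null {
    if (this.#off >= this.src.length) return null;
    const n = Math.min(p.length, this.src.length - this.#off);
    p.set(this.src.subarray(this.#off, this.#off + n));
    this.#off += n;
    return n;
  }
}

const direct = new DirectBuffer();
const got = direct.readFromSync(new BytesReader(new Uint8Array(40_000)));
console.log(got, direct.length); // 40000 40000
```

Compared with the writeSync-based version, this avoids one copy per chunk but has to track filled length separately from capacity, which is the kind of bookkeeping the review thread calls error-prone.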