fix(cli/buffer): allow Buffer to store MAX_SIZE bytes #6568

Closed
wants to merge 5 commits
27 changes: 9 additions & 18 deletions cli/js/buffer.ts
@@ -7,11 +7,6 @@
 import { Reader, Writer, ReaderSync, WriterSync } from "./io.ts";
 import { assert } from "./util.ts";

-// MIN_READ is the minimum ArrayBuffer size passed to a read call by
-// buffer.ReadFrom. As long as the Buffer has at least MIN_READ bytes beyond
-// what is required to hold the contents of r, readFrom() will not grow the
-// underlying buffer.
-const MIN_READ = 512;
 const MAX_SIZE = 2 ** 32 - 2;

 // `off` is the offset into `dst` at which to begin writing values
@@ -133,17 +128,17 @@ export class Buffer implements Reader, ReaderSync, Writer, WriterSync {
       // we instead let capacity get twice as large so we
       // don't spend all our time copying.
       copyBytes(this.#buf.subarray(this.#off), this.#buf);
-    } else if (c > MAX_SIZE - c - n) {
+    } else if (c + n > MAX_SIZE) {
       throw new Error("The buffer cannot be grown beyond the maximum size.");
     } else {
       // Not enough space anywhere, we need to allocate.
-      const buf = new Uint8Array(2 * c + n);
+      const buf = new Uint8Array(Math.min(2 * c + n, MAX_SIZE));
       copyBytes(this.#buf.subarray(this.#off), buf);
       this.#buf = buf;
     }
     // Restore this.#off and len(this.#buf).
     this.#off = 0;
-    this.#reslice(m + n);
+    this.#reslice(Math.min(m + n, MAX_SIZE));
     return m;
   };
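The clamp in the new allocation path can be illustrated in isolation. Below is a minimal sketch of the fixed growth rule as a standalone helper (`grownCapacity` is a name invented here for illustration, not part of Deno's Buffer): growth past MAX_SIZE is rejected up front, and the doubled allocation is clamped so capacity can land exactly on MAX_SIZE instead of overshooting it.

```typescript
// Sketch of the fixed growth rule: reject growth past MAX_SIZE, then
// double capacity but clamp the allocation at MAX_SIZE.
const MAX_SIZE = 2 ** 32 - 2;

function grownCapacity(c: number, n: number): number {
  // `c` is the current capacity, `n` the number of bytes to be written.
  if (c + n > MAX_SIZE) {
    throw new Error("The buffer cannot be grown beyond the maximum size.");
  }
  // Without the clamp (the old code), this allocated 2 * c + n directly,
  // which can exceed MAX_SIZE even when c + n itself would still fit.
  return Math.min(2 * c + n, MAX_SIZE);
}

console.log(grownCapacity(0, 512)); // 512
console.log(grownCapacity(2 ** 31, 2 ** 31 - 2)); // 4294967294 (exactly MAX_SIZE)
```

The old guard `c > MAX_SIZE - c - n` rejected whenever the *doubled* size overflowed; checking `c + n > MAX_SIZE` instead only rejects writes that genuinely cannot fit.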

@@ -157,31 +152,27 @@

   async readFrom(r: Reader): Promise<number> {
     let n = 0;
+    const buf = new Uint8Array(32 * 1024);
marcosc90 (Contributor, Author) commented on Jun 29, 2020:

Calling this.#grow without knowing the actual amount that needs to grow was causing trouble, so I went with the simpler, less error-prone approach: calling writeSync, which in turn calls this.#grow with the size of the read buffer.

In a few minutes, I'll push an alternative that reads directly into the buffer. If someone else wants to try a different approach, I'm more than happy.

marcosc90 (Contributor, Author) commented:

Posted alternative implementation in #6570.

     while (true) {
-      const i = this.#grow(MIN_READ);
-      this.#reslice(i);
-      const fub = new Uint8Array(this.#buf.buffer, i);
-      const nread = await r.read(fub);
+      const nread = await r.read(buf);
       if (nread === null) {
         return n;
       }
-      this.#reslice(i + nread);
       n += nread;
+      this.writeSync(buf.subarray(0, nread));
     }
   }

   readFromSync(r: ReaderSync): number {
     let n = 0;
+    const buf = new Uint8Array(32 * 1024);
     while (true) {
-      const i = this.#grow(MIN_READ);
-      this.#reslice(i);
-      const fub = new Uint8Array(this.#buf.buffer, i);
-      const nread = r.readSync(fub);
+      const nread = r.readSync(buf);
       if (nread === null) {
         return n;
       }
-      this.#reslice(i + nread);
       n += nread;
+      this.writeSync(buf.subarray(0, nread));
     }
   }
}
102 changes: 102 additions & 0 deletions cli/tests/unit/buffer_test.ts
@@ -11,11 +11,19 @@ import {
unitTest,
} from "./test_util.ts";

const MAX_SIZE = 2 ** 32 - 2;
Member commented:

Is it really necessary to allocate 4 GB of memory? Does the crash occur at smaller allocations?

marcosc90 (Contributor, Author) commented on Jul 3, 2020:

I can replicate the crash on master using write/writeSync when writing ~2 GB, but not for readFrom; the latter allocates a maximum buffer of 4294966784 bytes, 510 bytes short of MAX_SIZE.

const buf = new Deno.Buffer();
buf.writeSync(repeat("x", 2147483647));
buf.writeSync(repeat("x", 1));
// throws The buffer cannot be grown beyond the maximum size.

readFrom/Sync:

const reader = new Deno.Buffer(new ArrayBuffer(4294966784 + 1));
const buf = new Deno.Buffer();
buf.readFromSync(reader);
// throws The buffer cannot be grown beyond the maximum size.

Can't replicate with a smaller amount because buf.readFrom calls buf.#grow with 512, which always tops out at a buffer of 4294966784 bytes when reading anything smaller than that, so the crash isn't reachable with less data.
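The ceiling described above can be sketched independently: under the old code, each readFrom iteration asked #grow for MIN_READ (512) bytes, and #grow allocated `2 * c + n` unless that would exceed MAX_SIZE, so capacity followed c' = 2c + 512, i.e. 512 * (2^k − 1). A minimal standalone model of that sequence (not the actual Deno class):

```typescript
// Model of the OLD growth sequence: starting from an empty buffer,
// readFrom repeatedly requested MIN_READ bytes, and #grow refused once
// the doubled allocation `2 * c + n` would exceed MAX_SIZE.
const MAX_SIZE = 2 ** 32 - 2;
const MIN_READ = 512;

const capacities: number[] = [];
let c = 0;
while (2 * c + MIN_READ <= MAX_SIZE) {
  c = 2 * c + MIN_READ; // capacity follows 512 * (2^k - 1)
  capacities.push(c);
}

console.log(capacities.slice(0, 3)); // [ 512, 1536, 3584 ]
console.log(capacities[capacities.length - 1]); // 4294966784 — 510 bytes short of MAX_SIZE
```

So the largest reachable capacity was 4294966784, and the very next doubling overshot MAX_SIZE and threw — which is why the crash only reproduced at that specific size.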

// N controls how many iterations of certain checks are performed.
const N = 100;
let testBytes: Uint8Array | null;
let testString: string | null;

let ignoreMaxSizeTests = false;
try {
new ArrayBuffer(MAX_SIZE);
} catch (e) {
ignoreMaxSizeTests = true;
}

function init(): void {
if (testBytes == null) {
testBytes = new Uint8Array(N);
@@ -167,6 +175,100 @@ unitTest(async function bufferTooLargeByteWrites(): Promise<void> {
);
});

unitTest(
{ ignore: ignoreMaxSizeTests },
function bufferGrowWriteMaxBuffer(): void {
const bufSize = 16 * 1024;
const capacities = [MAX_SIZE, MAX_SIZE - 1];
for (const capacity of capacities) {
let written = 0;
const buf = new Deno.Buffer();
const writes = Math.floor(capacity / bufSize);
for (let i = 0; i < writes; i++)
written += buf.writeSync(repeat("x", bufSize));

if (written < capacity) {
written += buf.writeSync(repeat("x", capacity - written));
}

assertEquals(written, capacity);
}
}
);

unitTest(
{ ignore: ignoreMaxSizeTests },
function bufferGrowReadSyncCloseToMaxBuffer(): void {
const capacities = [MAX_SIZE, MAX_SIZE - 1];
for (const capacity of capacities) {
const reader = new Deno.Buffer(new ArrayBuffer(capacity));
const buf = new Deno.Buffer();
buf.readFromSync(reader);

assertEquals(buf.length, capacity);
}
}
);

unitTest(
{ ignore: ignoreMaxSizeTests },
async function bufferGrowReadCloseToMaxBuffer(): Promise<void> {
const capacities = [MAX_SIZE, MAX_SIZE - 1];
for (const capacity of capacities) {
const reader = new Deno.Buffer(new ArrayBuffer(capacity));
const buf = new Deno.Buffer();
await buf.readFrom(reader);
assertEquals(buf.length, capacity);
}
}
);

unitTest(
{ ignore: ignoreMaxSizeTests },
async function bufferGrowReadCloseMaxBufferPlus1(): Promise<void> {
const reader = new Deno.Buffer(new ArrayBuffer(MAX_SIZE + 1));
const buf = new Deno.Buffer();

await assertThrowsAsync(
async () => {
await buf.readFrom(reader);
},
Error,
"grown beyond the maximum size"
);
}
);

unitTest(
{ ignore: ignoreMaxSizeTests },
function bufferGrowReadSyncCloseMaxBufferPlus1(): void {
const reader = new Deno.Buffer(new ArrayBuffer(MAX_SIZE + 1));
const buf = new Deno.Buffer();

assertThrows(
() => {
buf.readFromSync(reader);
},
Error,
"grown beyond the maximum size"
);
}
);

unitTest(
{ ignore: ignoreMaxSizeTests },
async function bufferReadCloseToMaxBufferWithInitialGrow(): Promise<void> {
const capacities = [MAX_SIZE, MAX_SIZE - 1, MAX_SIZE - 512];
for (const capacity of capacities) {
const reader = new Deno.Buffer(new ArrayBuffer(capacity));
const buf = new Deno.Buffer();
buf.grow(MAX_SIZE);
await buf.readFrom(reader);
assertEquals(buf.length, capacity);
}
}
);

unitTest(async function bufferLargeByteReads(): Promise<void> {
init();
assert(testBytes);